3D tracking by catadioptric vision based on particle filters


Abstract

This paper presents a robust tracking system for autonomous robots equipped with omnidirectional cameras. The proposed method uses a 3D shape- and color-based object model, which makes it possible to tackle the difficulties that arise when the tracked object is located above the ground plane. Tracking under these conditions faces two major difficulties: first, observation with omnidirectional sensors strongly deforms the target's shape; second, an object of interest embedded in a dynamic scenario may suffer from occlusion, overlap and ambiguities. To surmount these difficulties, we use a 3D particle filter to represent the target's state space: position and velocity with respect to the robot. The likelihood of each particle is computed from the following features: (i) image color; (ii) the mismatch between the target's color and the background color. We test the accuracy of the algorithm in a RoboCup Middle Size League scenario, with both static and moving targets. © 2008 Springer-Verlag Berlin Heidelberg.
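To make the abstract's pipeline concrete, here is a minimal sketch of a 3D particle filter with a constant-velocity motion model and a resample step. This is not the authors' implementation: the likelihood function below is a hypothetical stand-in that scores particles by distance to an assumed target position, whereas the paper projects a 3D shape model into the omnidirectional image and compares target color against background color. The particle count, noise levels, and frame interval are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500  # assumed number of particles
# State: 3D position and 3D velocity w.r.t. the robot -> 6D state vector
particles = rng.normal(0.0, 1.0, size=(N, 6))
weights = np.full(N, 1.0 / N)

DT = 0.1           # assumed frame interval [s]
PROC_NOISE = 0.05  # assumed process-noise standard deviation

def predict(particles):
    """Constant-velocity motion model: position += velocity * dt, plus noise."""
    out = particles.copy()
    out[:, :3] += out[:, 3:] * DT
    out += rng.normal(0.0, PROC_NOISE, size=out.shape)
    return out

def likelihood(particles, target):
    """Hypothetical stand-in for the paper's color-based likelihood:
    a Gaussian score on the distance from each particle's position to a
    known target position. The real system evaluates image color and the
    target/background color mismatch at each particle's projected shape."""
    d = np.linalg.norm(particles[:, :3] - target, axis=1)
    return np.exp(-0.5 * (d / 0.2) ** 2)

def resample(particles, weights):
    """Systematic resampling to counter particle degeneracy."""
    positions = (np.arange(len(weights)) + rng.random()) / len(weights)
    cs = np.cumsum(weights)
    cs[-1] = 1.0  # guard against floating-point drift in the last bin
    idx = np.searchsorted(cs, positions)
    return particles[idx]

target = np.array([1.0, 0.5, 0.3])  # hypothetical (static) target position
for _ in range(30):
    particles = predict(particles)
    weights = likelihood(particles, target)
    weights /= weights.sum()
    particles = resample(particles, weights)

estimate = particles[:, :3].mean(axis=0)  # posterior-mean position estimate
```

With a static target, the particle cloud concentrates around the true position after a few iterations; the same loop tracks a moving target because the state carries velocity through the prediction step.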

Citation (APA)

Taiana, M., Gaspar, J., Nascimento, J., Bernardino, A., & Lima, P. (2008). 3D tracking by catadioptric vision based on particle filters. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5001 LNAI, pp. 77–88). https://doi.org/10.1007/978-3-540-68847-1_7
