A. Comport, E. Marchand, F. Chaumette. Robust and real-time image-based tracking for markerless augmented reality. IRISA Research Report No. 1534, April 2003.
Download the article: Adobe portable document (pdf)
Copyright: The documents contained in these directories are made available by the contributing authors as a means of ensuring the timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are retained by the authors and by the copyright holders, notwithstanding that they have offered their works here in electronic form. Anyone copying this information is expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be made available elsewhere without the explicit permission of the copyright holder.
Augmented reality has now progressed to the point where real-time applications are being considered and needed. At the same time, it is important that synthetic elements are rendered and aligned in the scene in an accurate and visually acceptable way. To address these issues, a real-time, robust and efficient 3D model-based tracking algorithm is proposed for a 'video see-through' monocular vision system. Tracking the objects in the scene amounts to computing the pose between the camera and the objects; virtual objects can then be projected into the scene using this pose. Here, non-linear pose computation is formulated by means of a virtual visual servoing approach. In this context, the derivation of point-to-curve interaction matrices is given for different features including lines, circles, cylinders and spheres. A local moving-edges tracker is used to provide real-time tracking of points normal to the object contours. A method is proposed for combining local position uncertainty and global pose uncertainty in an efficient and accurate way by propagating uncertainty. Robustness is obtained by integrating an M-estimator into the visual control law via an iteratively re-weighted least squares implementation. The method presented in this paper has been validated on several complex image sequences, including outdoor environments. Results show the method to be robust to occlusion, changes in illumination and mis-tracking.
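As an illustration of the robust estimation step mentioned in the abstract, the sketch below shows a generic iteratively re-weighted least squares loop with an M-estimator weight function applied to a linearised system J v = e. It is not the authors' implementation: the weight function (Huber, threshold k=1.345), the MAD scale estimate, and the function names huber_weights and irls_solve are illustrative assumptions. In the virtual visual servoing context, J would play the role of the stacked interaction matrix of the distance features and e the feature error; here both are treated as plain arrays.

    import numpy as np

    def huber_weights(r, k=1.345):
        # Huber M-estimator weights (illustrative choice): 1 inside the
        # threshold, k/|r| outside, so large residuals are down-weighted.
        w = np.ones_like(r)
        large = np.abs(r) > k
        w[large] = k / np.abs(r[large])
        return w

    def irls_solve(J, e, n_iter=10):
        # Robustly solve J v ~= e: at each iteration, re-estimate the
        # residual scale, compute weights, and re-solve the weighted
        # normal equations (J^T W J) v = J^T W e.
        v = np.linalg.lstsq(J, e, rcond=None)[0]
        for _ in range(n_iter):
            r = J @ v - e
            # Robust scale via the median absolute deviation.
            scale = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
            w = huber_weights(r / scale)
            JW = J * w[:, None]
            v = np.linalg.solve(JW.T @ J, JW.T @ e)
        return v

In the paper the re-weighting is embedded directly in the visual control law, so the weights computed at each iteration modulate the contribution of each tracked point to the pose update rather than being applied to a fixed linear system as in this simplified sketch.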
@TechReport{Comport03b,
Author = {Comport, A. and Marchand, E. and Chaumette, F.},
Title = {Robust and real-time image-based tracking for markerless augmented reality},
Number = {1534},
Institution = {IRISA},
Month = {April},
Year = {2003}
}
Download the EndNote reference (.ref)