Mariottini, G.L., Prattichizzo, D. (2008). Image-based Visual Servoing with Central Catadioptric Camera. THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 27(1), 41-57 [10.1177/0278364907084320].
Image-based Visual Servoing with Central Catadioptric Camera
Mariottini G.L.; Prattichizzo D.
2008-01-01
Abstract
This paper presents an image-based visual servoing strategy for the autonomous navigation of a holonomic mobile robot from a current toward a desired pose, specified only through a current and a desired image acquired by the on-board central catadioptric camera. This kind of vision sensor combines lenses and mirrors to enlarge the field of view. The proposed visual servoing does not need any metric information about the viewed 3-D scene and is mainly based on a novel geometric property, the auto-epipolar condition, which occurs when two catadioptric views (current and desired) are related by a pure translation. This condition can be detected in real time in the image domain by observing when a set of so-called disparity conics have a common intersection. The auto-epipolar condition and the pixel distances between current and target image features are used to design the image-based control law. A Lyapunov-based stability analysis and simulation results demonstrate the parametric robustness of the proposed method. Experimental results show the applicability of the proposed visual servoing in a real context.
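As a rough illustration of the detection step described in the abstract (the auto-epipolar condition holds when all disparity conics share a common intersection point), the following is a minimal numerical sketch, not the paper's actual algorithm: it only checks whether a given candidate point in homogeneous image coordinates lies on every conic in a set, using the standard condition that a point x lies on a conic with symmetric matrix C iff x^T C x = 0. All function names and the tolerance handling are assumptions for illustration.

```python
import numpy as np

def point_on_conic(C, x, tol=1e-6):
    # A point x (homogeneous 3-vector) lies on the conic with symmetric
    # 3x3 matrix C iff x^T C x = 0; test this up to a scaled tolerance.
    x = np.asarray(x, dtype=float)
    return abs(x @ C @ x) < tol * max(np.linalg.norm(C), 1.0)

def have_common_intersection(conics, x, tol=1e-6):
    # Sketch of the common-intersection check: the candidate point x
    # must lie on every disparity conic in the set.
    return all(point_on_conic(C, x, tol) for C in conics)
```

In practice the disparity conics would be estimated from matched features in the current and desired catadioptric images, and the common intersection would be searched for rather than supplied; this sketch only shows the membership test.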
File | Type | License | Size | Format
---|---|---|---|---
41.full.pdf | Post-print | Restricted (private access) | 1.23 MB | Adobe PDF

The file is not available for download; a copy can be requested.
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11365/22632
Notice: the data shown have not been validated by the university.