Uncalibrated Visual Compass from Omnidirectional Line Images with Application to Attitude MAV Estimation

Scheggi, Stefano; Prattichizzo, Domenico
2013-01-01

Abstract

This paper presents a new algorithm, building on the authors' previous results, for estimating the yaw angle of an omnidirectional camera/robot undergoing a 6-DoF rigid motion. The real-time algorithm is uncalibrated, robust to noisy data, and relies only on the projections of 3-D parallel lines as image features. Numerical and real-world experiments conducted with an eye-in-hand robot manipulator, used to emulate the 3-D motion of a Micro Aerial Vehicle (MAV), demonstrate the accuracy and reliability of the estimation algorithm.
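The visual-compass idea behind the paper is that the heading of the camera can be recovered from how the image of a fixed bundle of 3-D parallel lines rotates between frames. As a purely illustrative sketch (not the authors' uncalibrated omnidirectional method), assuming the dominant parallel-line direction has already been extracted as a unit 3-D vector in each frame, the yaw increment between two frames can be read off from the horizontal projection of that direction:

```python
import math

def yaw_from_directions(d_ref, d_cur):
    """Illustrative yaw estimate: angle between the horizontal (x-y)
    projections of the dominant parallel-line direction vector,
    as seen in a reference frame and in the current frame.

    d_ref, d_cur: 3-D direction vectors (x, y, z); hypothetical
    inputs assumed to come from a separate line-extraction step.
    """
    # Project each direction onto the horizontal plane and take its bearing.
    bearing_ref = math.atan2(d_ref[1], d_ref[0])
    bearing_cur = math.atan2(d_cur[1], d_cur[0])
    # Signed yaw increment, wrapped to (-pi, pi].
    dyaw = bearing_cur - bearing_ref
    return math.atan2(math.sin(dyaw), math.cos(dyaw))

# Example: a line direction rotated by 90 degrees about the vertical axis.
print(math.degrees(yaw_from_directions((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))))
```

Note that this sketch presumes the line direction is available in a metric frame; the appeal of the paper's method is precisely that it avoids camera calibration and works directly on the omnidirectional line images.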
ISBN: 9781467363587
Scheggi, S., F., M., Prattichizzo, D. (2013). Uncalibrated Visual Compass from Omnidirectional Line Images with Application to Attitude MAV Estimation. In Proc. 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 1602-1607). IEEE. doi: 10.1109/IROS.2013.6696563.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/46469