In this paper, we present a novel cooperative navigation control strategy for human–robot teams. Assuming that a human wants to reach a final location in a large environment with the help of a mobile robot, the robot must steer the human from the initial to the target position. The challenges posed by cooperative human–robot navigation are typically addressed by using haptic feedback via physical interaction. In contrast, in this paper, we describe a different approach, in which the human–robot interaction is achieved via wearable vibrotactile armbands. In the proposed work, the subject is free to decide her/his own pace. A warning vibrational signal is generated by the haptic armbands when the robot detects a large deviation with respect to the desired pose. The proposed method has been evaluated in a large indoor environment, where 15 blindfolded human subjects were asked to follow the haptic cues provided by the robot. The participants had to reach a target area while avoiding static and dynamic obstacles. Experimental results revealed that the blindfolded subjects were able to avoid the obstacles and safely reach the target in all of the performed trials. A comparison is provided between the results obtained with blindfolded users and experiments performed with sighted people.
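The warning logic described in the abstract, where a vibrotactile cue fires only when the human's deviation from the desired pose exceeds a tolerance, can be illustrated with a minimal sketch. The function names, the threshold value, and the left/right cue convention below are illustrative assumptions, not the authors' implementation:

```python
import math

def heading_deviation(human_heading, desired_heading):
    """Smallest signed angular difference in radians, in [-pi, pi)."""
    return (human_heading - desired_heading + math.pi) % (2 * math.pi) - math.pi

def vibration_command(human_heading, desired_heading, threshold=0.35):
    """Return which armband to vibrate ('left' or 'right'), or None.

    Within the tolerance no cue is generated, so the user is free to
    keep her/his own pace; beyond it a corrective cue is issued.
    The side convention here is a hypothetical choice.
    """
    d = heading_deviation(human_heading, desired_heading)
    if abs(d) <= threshold:
        return None
    return 'left' if d > 0 else 'right'
```

For example, `vibration_command(0.1, 0.0)` returns `None` (small deviation), while a large positive deviation such as `vibration_command(1.0, 0.0)` triggers a cue.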
|Title:||Cooperative Navigation for Mixed Human–Robot Teams Using Haptic Feedback|
|Citation:||Scheggi, S., Aggravi, M., & Prattichizzo, D. (2017). Cooperative Navigation for Mixed Human–Robot Teams Using Haptic Feedback. IEEE Transactions on Human-Machine Systems, 47(4), 462-473.|
|Appears in collections:||1.1 Journal article|
Files in this item:
|IEEE_THMS_2016.pdf||Accepted version © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. Digital Object Identifier (DOI): 10.1109/THMS.2016.2608936||N/A||PUBLIC - Public with Copyright||Open Access View/Open|