Ficuciello, F., Villani, A., Lisini Baldi, T., Prattichizzo, D. (2021). A Human Gesture Mapping Method to Control a Multi‐Functional Hand for Robot‐Assisted Laparoscopic Surgery: The MUSHA Case. Frontiers in Robotics and AI, 8. doi:10.3389/frobt.2021.741807
A Human Gesture Mapping Method to Control a Multi‐Functional Hand for Robot‐Assisted Laparoscopic Surgery: The MUSHA Case
Villani A.; Lisini Baldi T.; Prattichizzo D.
2021-01-01
Abstract
This work presents a novel technique to control a multi-functional hand for robot-assisted laparoscopic surgery. We tested the technique using the MUSHA multi-functional hand, a robot-aided minimally invasive surgery tool with more degrees of freedom than the standard commercial end-effector of the da Vinci robot. These extra degrees of freedom require a proper control strategy to guarantee high performance without increasing the complexity of the control console. However, developing reliable control algorithms while reducing the mechanical complexity of the control side is still an open challenge. In the proposed solution, we present a control strategy that projects human hand motions into the robot actuation space. The human hand motions are tracked by a LeapMotion camera and mapped into the actuation space of the virtualized end-effector. The effectiveness of the proposed method was evaluated in a twofold manner: first, we verified the Lyapunov stability of the algorithm; then, a user study with 10 subjects assessed the intuitiveness and usability of the system.
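The core idea described above, tracking the hand with a camera and projecting the motion into a lower-dimensional actuation space, can be sketched as a linear mapping followed by joint-limit saturation. This is only an illustrative sketch: the dimensions, the mapping matrix `A`, and the function names are hypothetical and not taken from the paper, which derives its mapping from the virtualized end-effector.

```python
import numpy as np

# Illustrative dimensions (hypothetical, not from the paper):
N_HAND = 9   # e.g. three tracked fingertip positions, three coordinates each
N_ACT = 4    # actuated degrees of freedom of the surgical end-effector

# Stand-in mapping matrix; in a real system it would be identified from the
# kinematics of the virtualized end-effector, not drawn at random.
A = np.random.default_rng(0).normal(size=(N_ACT, N_HAND))

def map_hand_to_actuation(hand_vec, q_min=-1.0, q_max=1.0):
    """Project a flattened hand-motion vector into actuation commands,
    saturated to the joint limits of the tool."""
    q = A @ np.asarray(hand_vec, dtype=float)
    return np.clip(q, q_min, q_max)

# One tracked frame (here a zero vector standing in for LeapMotion data):
q_cmd = map_hand_to_actuation(np.zeros(N_HAND))
print(q_cmd.shape)  # (4,)
```

In practice the tracker would supply `hand_vec` at each frame, and the saturation step keeps the commanded configuration inside the tool's feasible range.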
File | Access | Type | License | Size | Format
---|---|---|---|---|---
frobt-08-741807.pdf | open access | Publisher PDF | Creative Commons | 1.89 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11365/1213935