Paolocci, G. (2021). Guiding Humans through Wearable Haptics [10.25434/paolocci-gianluca_phd2021].
Guiding Humans through Wearable Haptics
Paolocci, Gianluca
2021-01-01
Abstract
This manuscript reports the research work carried out during my Ph.D. on the applications of haptic technology to guide humans, i.e. on the design of devices and strategies for instructing human users by means of haptic stimulation. The basic concept presented in this thesis is the exploitation of the tactile channel, which is the most underused yet most widely distributed sensory input channel, to provide users with relevant and otherwise inaccessible information, e.g. environmental awareness or task-related instructions. Over the past years, several wearable haptic devices have been developed to stimulate the users’ skin receptors and induce a variety of touch perceptions, from texture rendering to temperature and skin indentation. This manuscript investigates applications of these haptic interfaces in guidance scenarios, with particular interest in the design of haptic patterns that deliver minimal, intuitive, and effective cues. Indeed, the design of a haptic policy has to take into account that guiding humans is different from guiding robots. Robots can receive large amounts of data, process them, and use them to plan and correct motions in remarkably little time. Applying the same approach to humans would most likely yield instructions that are difficult to understand and apply, leading to poor task performance. A better understanding of human physical and cognitive capabilities is necessary to optimize communication with operators and to foster their acceptance of, and trust in, the technology. For this reason, the first part of this thesis reviews the background literature on human locomotion, neural entrainment, and haptic stimulation. The dissertation then moves on to specific facets of haptic-mediated human guidance in individual and cooperative scenarios. The second chapter addresses the problem of instructing humans to modify their walking direction and velocity by means of haptic cues, for instance for indoor and outdoor navigation, and explores the sharing of tactile perceptions between users in a remote social walking experience. The third chapter presents developments in human-human cooperation scenarios mediated by wearable devices, i.e. coordinating a formation of humans through haptic stimuli to accomplish a common objective. The fourth chapter reports two smaller projects on haptic guidance. The No-Face Touch system was developed during the Covid-19 pandemic to support the population by detecting face-touch attempts and alerting the user. The guidance provided by this system does not instruct specific motions; rather, it leverages gesture detection to notify users of unwanted behaviors, thereby relieving them from constantly monitoring their own actions. The second project proposes a novel approach to Augmented Reality designed to minimize the encumbrance on the users’ hands, so that the augmented experience can adapt to different tasks and provide users with support and guidance through visual and haptic cues.
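To make the guidance concept described above concrete, the following minimal Python sketch (not taken from the thesis; all names, thresholds, and the two-motor layout are assumptions) illustrates one common way navigation cues of this kind can be generated: mapping the error between a walker's current heading and the bearing to the next waypoint onto the intensities of two vibrotactile motors worn on the left and right side of the body.

```python
import math

# Hypothetical sketch: map the bearing error between a walker's heading and
# the direction to the next waypoint onto left/right vibration intensities.
DEAD_ZONE_RAD = math.radians(10)   # assumed comfort threshold: small errors give no cue
MAX_ERROR_RAD = math.radians(90)   # errors of 90 degrees or more map to full intensity

def direction_cue(heading_rad: float, bearing_to_goal_rad: float) -> tuple[float, float]:
    """Return (left_intensity, right_intensity) in [0, 1] for two vibrotactile motors."""
    # Signed error wrapped to (-pi, pi]; positive means the goal lies to the left.
    error = math.atan2(math.sin(bearing_to_goal_rad - heading_rad),
                       math.cos(bearing_to_goal_rad - heading_rad))
    if abs(error) < DEAD_ZONE_RAD:
        return 0.0, 0.0                          # roughly on course: stay silent
    intensity = min(abs(error) / MAX_ERROR_RAD, 1.0)
    return (intensity, 0.0) if error > 0 else (0.0, intensity)

# Example: heading due north (0 rad), goal 45 degrees to the left -> (0.5, 0.0)
print(direction_cue(0.0, math.radians(45)))
```

The dead zone and the proportional mapping are design choices in the spirit of the "minimal, intuitive and effective" cues the abstract advocates: the device stays silent while the walker is roughly on course and only vibrates, on the side to turn toward, when a correction is actually needed.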
| File | Description | Access | Type | License | Size | Format |
|---|---|---|---|---|---|---|
| phd_unisi_569207_1.pdf | Doctoral thesis, Part 1 of 5 | Open access | Publisher's PDF | PUBLIC - Public with Copyright | 7.48 MB | Adobe PDF |
| phd_unisi_569207_2.pdf | Doctoral thesis, Part 2 of 5 | Open access | Publisher's PDF | PUBLIC - Public with Copyright | 7.18 MB | Adobe PDF |
| phd_unisi_569207_3.pdf | Doctoral thesis, Part 3 of 5 | Open access | Publisher's PDF | PUBLIC - Public with Copyright | 6.66 MB | Adobe PDF |
| phd_unisi_569207_4.pdf | Doctoral thesis, Part 4 of 5 | Open access | Publisher's PDF | PUBLIC - Public with Copyright | 6.21 MB | Adobe PDF |
| phd_unisi_569207_5.pdf | Doctoral thesis, Part 5 of 5 | Open access | Publisher's PDF | PUBLIC - Public with Copyright | 9.09 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/11365/1144448