For a human-robot interface, it is important to have a good model of how the human subject operates. Since such a model is difficult to obtain, the interface must instead accurately observe the subject's behaviour during interaction. We present a new human-robot interface for active interaction with the cognitive and emotional domains of the human subject. Because eye movements convey rich information about a subject's cognitive and emotional state, our interface uses a video-based Eye-Tracker (ET) to observe the subject's line of gaze. Since we are also interested in using the interface to study and treat depression, it can deliver stimulating inputs to the subject through both a Transcranial Magnetic Stimulator (TMS) and a visual stimulus. The latter elicits the subject's emotions and consists of a set of pictures of facial expressions, shown according to a novel visualization protocol called Memory-Guided Filtering (MGF). Its effectiveness has been verified through extensive experiments. We also present the application of our human-robot interface to preliminary studies of new cognitive rehabilitation strategies for depression.
|Title:||Human-Robotics Interface for the Interaction with Cognitive and Emotional Human Domains|
|Citation:||G. L., M., Prattichizzo, D., C., S., M., D.B., Rufa, A., A., D.C., et al. (2007). Human-Robotics Interface for the Interaction with Cognitive and Emotional Human Domains. In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2007 (pp. 528-533). IEEE.|
|Appears in collections:||4.1 Conference proceedings contribution|