A Multimodal Perception Framework for Users Emotional State Assessment in Social Robotics
In this work, we present an unobtrusive and non-invasive perception framework based on the synergy between two main acquisition systems: the Touch-Me Pad, consisting of two electronic patches for physiological signal extraction and processing; and the Scene Analyzer, a visual-auditory perception system...
Main Authors: Lorenzo Cominelli, Nicola Carbonaro, Daniele Mazzei, Roberto Garofalo, Alessandro Tognetti, Danilo De Rossi
Format: Article
Language: English
Published: MDPI AG, 2017-08-01
Series: Future Internet
Online Access: https://www.mdpi.com/1999-5903/9/3/42
Similar Items
- Designing the Mind of a Social Robot
  by: Nicole Lazzeri, et al.
  Published: (2018-02-01)
- Multimodal Human–Robot Interaction for Human‐Centric Smart Manufacturing: A Survey
  by: Tian Wang, et al.
  Published: (2024-03-01)
- Editorial: Language and Robotics
  by: Tadahiro Taniguchi, et al.
  Published: (2021-04-01)
- A Multimodal User Interface for an Assistive Robotic Shopping Cart
  by: Dmitry Ryumin, et al.
  Published: (2020-12-01)
- Investigation of Methods to Create Future Multimodal Emotional Data for Robot Interactions in Patients with Schizophrenia: A Case Study
  by: Kyoko Osaka, et al.
  Published: (2022-05-01)