Multimodal Emotion Recognition of Hand-Object Interaction / Niewiadomski, Radoslaw; Sciutti, Alessandra. - ELECTRONIC. - (2021), pp. 351-355. (Paper presented at the IUI 2021 conference, held in College Station, TX, USA, 14th-17th April 2021) [10.1145/3397481.3450636].

Multimodal Emotion Recognition of Hand-Object Interaction

Niewiadomski, Radoslaw; Sciutti, Alessandra
2021-01-01

Abstract

In this paper, we investigate whether information about the touches and rotations applied to an object can be used to classify the emotion of the agent manipulating it. We focus on sequences of basic actions (e.g., grasping, rotating), which are constituents of daily interactions. Using the iCube, a 5 cm cube covered with tactile sensors and embedded with an accelerometer, we collect a new dataset in which 11 participants perform action sequences associated with four emotions: anger, sadness, excitement, and gratitude. We then propose 17 high-level hand-crafted features based on the tactile and kinematic data from the iCube. Twelve of these features vary significantly with the emotional context in which the action sequence was performed. In particular, a larger surface of the object is engaged in physical contact for anger and excitement than for sadness; the average duration of interactions labeled as sad is longer than for the remaining three emotions; and more rotations are performed for anger and excitement than for sadness and gratitude. A classification experiment over the four emotions reaches an accuracy of 0.75. This result shows that emotion recognition during hand-object interactions is possible and may foster the development of new intelligent user interfaces.
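To make the feature pipeline concrete, below is a minimal Python sketch (not the authors' code) of three features of the kind the abstract describes: contact surface, interaction duration, and rotation count. The tactile cell count, sampling rate, and the availability of an angular-speed stream are all assumptions made for illustration.

    # Illustrative sketch only; data layout and rates are assumed, not taken
    # from the paper.
    import numpy as np

    def contact_surface(tactile: np.ndarray) -> float:
        """Mean fraction of tactile cells in contact per frame.
        tactile: (n_frames, n_cells) boolean array."""
        return float(tactile.mean())

    def interaction_duration(n_frames: int, fs: float) -> float:
        """Length of the recorded interaction in seconds, at fs Hz."""
        return n_frames / fs

    def rotation_count(angular_speed: np.ndarray, fs: float,
                       turn_deg: float = 90.0) -> int:
        """Rough rotation count: accumulate angular speed (deg/s) over time
        and count every `turn_deg` degrees of total rotation. The real iCube
        reports accelerometer data; deriving angular speed is assumed here."""
        total_deg = float(np.abs(angular_speed).sum()) / fs
        return int(total_deg // turn_deg)

    # Toy usage: random data standing in for one recorded action sequence.
    rng = np.random.default_rng(0)
    tactile = rng.random((500, 96)) < 0.3        # 500 frames, 96 cells (assumed)
    angular_speed = rng.normal(0.0, 40.0, 500)   # deg/s magnitude (assumed)

    features = np.array([contact_surface(tactile),
                         interaction_duration(tactile.shape[0], fs=50.0),
                         rotation_count(angular_speed, fs=50.0)])
    print(features)  # a 3-dim vector; the paper uses 17 features per sequence

In a full pipeline, one such feature vector per action sequence would feed a standard four-class classifier.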
2021
26th International Conference on Intelligent User Interfaces
New York
Association for Computing Machinery
9781450380171
Niewiadomski, Radoslaw; Sciutti, Alessandra
Files in this record:
File: IUI21_niewiadomskietal.pdf
Access: archive administrators only
Type: Publisher's version (publisher's layout)
License: All rights reserved
Size: 3.01 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/314902
Citations
  • PMC: not available
  • Scopus: 3
  • Web of Science (ISI): 2