We investigate the recognition of the affective states of a person performing an action with an object, by processing object-sensed data. We focus on sequences of basic actions such as grasping and rotating, which are constituents of daily-life interactions. iCube, a 5 cm cube, was used to collect tactile and kinematic data consisting of tactile maps (without information on the pressure applied to the surface) and rotations. We conduct two studies: classification of i) emotions and ii) vitality forms. In both, participants perform a semi-structured task composed of basic actions. For emotion recognition, 237 trials by 11 participants associated with anger, sadness, excitement, and gratitude were used to train models on 10 hand-crafted features. Classifier accuracy reaches up to 82.7%. Interestingly, the same classifier trained exclusively on the tactile data performs on par with its counterpart modeled with all 10 features. For the second study, 1135 trials by 10 participants were used to classify two vitality forms. The best-performing model differentiated gentle actions from rude ones with an accuracy of 84.85%. The results also confirm that people touch objects differently when performing these basic actions with different affective states and attitudes.
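The abstract describes training classifiers on 10 hand-crafted features extracted from object-sensed tactile and kinematic data. The sketch below illustrates the general idea with a minimal nearest-centroid classifier over toy feature vectors; the feature values, labels, and classification rule are illustrative assumptions, not the authors' actual features or model.

```python
# Hypothetical sketch: classifying affective states from hand-crafted
# tactile/kinematic features, in the spirit of the paper's setup.
# The nearest-centroid rule and all values are illustrative only.
import math

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest_centroid_predict(train, x):
    """train: dict mapping label -> list of feature vectors; x: one vector.
    Returns the label whose centroid is closest to x (Euclidean distance)."""
    best_label, best_dist = None, math.inf
    for label, vecs in train.items():
        d = math.dist(centroid(vecs), x)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Toy 2-D stand-ins for the paper's 10-D feature vectors (e.g. something
# like mean contact area and rotation speed); data are fabricated.
train = {
    "gentle": [[0.2, 0.1], [0.3, 0.2]],
    "rude":   [[0.9, 0.8], [1.0, 0.7]],
}
print(nearest_centroid_predict(train, [0.25, 0.15]))  # → gentle
```

In practice the paper reports stronger models trained on the full 10-feature set; this fragment only shows the shape of the problem (per-trial feature vector in, affect label out).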

Affect Recognition in Hand-Object Interaction Using Object-sensed Tactile and Kinematic Data / Niewiadomski, Radoslaw; Beyan, Cigdem; Sciutti, Alessandra. - In: IEEE TRANSACTIONS ON HAPTICS. - ISSN 1939-1412. - 14 (2022), pp. 1-8. [10.1109/TOH.2022.3230643]

Affect Recognition in Hand-Object Interaction Using Object-sensed Tactile and Kinematic Data

Niewiadomski, Radoslaw (first author);
Beyan, Cigdem (second author);
Sciutti, Alessandra

2022-01-01

Files in this record:

File: IJ19_Affect_Recognition_in_Hand-Object.pdf
Access: open access
Type: Refereed author's manuscript (post-print)
License: Creative Commons
Size: 2.33 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/364688
Citations
  • PubMed Central: no data
  • Scopus: 5
  • Web of Science: 5
  • OpenAlex: no data