
A System for Real-Time Multimodal Analysis of Nonverbal Affective Social Interaction in User-Centric Media

Giovanna Varni;
2010-01-01

Abstract

This paper presents a multimodal system for real-time analysis of nonverbal affective social interaction in small groups of users. The focus is on two major aspects of affective social interaction: the synchronization of affective behavior within a small group and the emergence of functional roles, such as leadership. A small group of users is modeled as a complex system consisting of single interacting components that can self-organize and show global properties. Techniques are developed for computing quantitative measures of both synchronization and leadership. Music is selected as the experimental test-bed, since it is a clear example of an interactive and social activity in which affective nonverbal communication plays a fundamental role. The system has been implemented as software modules for the EyesWeb XMI platform (http://www.eyesweb.org). It has been used in experimental frameworks (a violin duo and a string quartet) and in real-world applications (user-centric applications for active music listening). Further application scenarios include entertainment, edutainment, therapy and rehabilitation, cultural heritage, and museum applications. Research has been carried out in the framework of the EU-ICT FP7 Project SAME (http://www.sameproject.eu).
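As an illustrative sketch only (the abstract does not detail the authors' actual algorithms), synchronization between two performers' movement time series can be quantified with a lagged cross-correlation, where the peak value measures synchronization strength and the lag of the peak hints at who leads whom. The function names and the toy signals below are hypothetical, not taken from the paper:

```python
import math

def norm_xcorr(x, y, lag):
    """Pearson correlation of x against y shifted by `lag` samples.

    For lag >= 0, x[lag + i] is paired with y[i]; a high score there
    means y's pattern shows up in x `lag` samples later (y leads).
    """
    if lag >= 0:
        a, b = x[lag:], y[:len(y) - lag] if lag else y[:]
    else:
        a, b = x[:len(x) + lag], y[-lag:]
    n = min(len(a), len(b))
    a, b = a[:n], b[:n]
    mean_a, mean_b = sum(a) / n, sum(b) / n
    num = sum((ai - mean_a) * (bi - mean_b) for ai, bi in zip(a, b))
    den_a = math.sqrt(sum((ai - mean_a) ** 2 for ai in a))
    den_b = math.sqrt(sum((bi - mean_b) ** 2 for bi in b))
    return num / (den_a * den_b) if den_a and den_b else 0.0

def sync_and_lead(x, y, max_lag):
    """Return (peak correlation, lag at peak) over lags in [-max_lag, max_lag].

    The peak value measures synchronization; a positive peak lag
    suggests y leads x, a negative one suggests x leads y.
    """
    scores = {lag: norm_xcorr(x, y, lag)
              for lag in range(-max_lag, max_lag + 1)}
    best = max(scores, key=scores.get)
    return scores[best], best

# Toy example: x is y delayed by 3 samples, so y should emerge as leader.
y = [math.sin(0.3 * t) for t in range(100)]
x = [0.0] * 3 + y[:97]
corr, lag = sync_and_lead(x, y, max_lag=5)
```

In this toy case the peak correlation is found at lag 3, correctly identifying y as the leading signal. Real systems of this kind typically work on continuously extracted movement features (e.g., motion energy) rather than raw signals, and use windowed analysis to track how synchronization and leadership evolve over time.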
A System for Real-Time Multimodal Analysis of Nonverbal Affective Social Interaction in User-Centric Media / Varni, Giovanna; Volpe, Gualtiero; Camurri, Antonio. - In: IEEE TRANSACTIONS ON MULTIMEDIA. - ISSN 1520-9210. - 12:6(2010), pp. 576-590. [10.1109/tmm.2010.2052592]
Files for this product:
No files are associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/372930
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 86
  • Web of Science: 56
  • OpenAlex: not available