A system for mobile active music listening based on social interaction and embodiment

Varni, G.
2011-01-01

Abstract

Social interaction and embodiment are key issues for future User Centric Media. Social networks and games are increasingly characterized by active, physical participation of their users. The integration into mobile devices of a growing number of sensors capturing users' physical activity (e.g., accelerometers, cameras) and context information (e.g., GPS, location) supports novel systems capable of connecting audiovisual content processing and communication to users' social behavior, including joint movement and physical engagement. This paper presents a system enabling a novel paradigm for the social, active experience of sound and music content. An instance of such a system, named Sync'n'Move, is introduced: it allows two users to explore a multi-channel pre-recorded music piece as the result of their social interaction, and in particular of their synchronization. This research has been developed in the framework of the EU-ICT Project SAME ( www.sameproject.eu ) and was presented at the Agora Festival (IRCAM, Centre Pompidou, Paris, June 2009). On that occasion, Sync'n'Move was evaluated by both expert and non-expert users, and the results are briefly presented. Perspectives on the impact of such a novel paradigm and system on future User Centric Media are finally discussed, with a specific focus on the social, active experience of audiovisual content. © 2010 Springer Science+Business Media, LLC.
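
As an illustration of the interaction paradigm described in the abstract, the sketch below shows one plausible way a client could turn two users' accelerometer streams into a synchronization index and map it to per-channel gains of a multi-track recording. This is a minimal, assumption-laden example, not the actual Sync'n'Move implementation: the function names, the correlation-based synchronization measure, and the channel-gating mapping are all hypothetical simplifications of the behavior the abstract describes.

"""Illustrative sketch (not the authors' implementation) of the Sync'n'Move idea:
estimate how synchronized two users' movements are and use that estimate to
control how many channels of a multi-track recording are rendered.
All function and parameter names are hypothetical."""

import numpy as np


def movement_energy(acc, window=32):
    """Smoothed magnitude of a 3-axis accelerometer stream (N x 3 array)."""
    mag = np.linalg.norm(acc, axis=1)
    kernel = np.ones(window) / window
    return np.convolve(mag, kernel, mode="same")


def synchronization_index(sig_a, sig_b):
    """Pearson correlation of the two users' movement-energy signals,
    rescaled to [0, 1]; 1 means perfectly synchronized movement."""
    r = np.corrcoef(sig_a, sig_b)[0, 1]
    return 0.5 * (r + 1.0)


def channel_gains(sync, n_channels=4):
    """Map the synchronization index to per-channel gains: the more
    synchronized the users are, the more channels are opened."""
    active = int(round(sync * n_channels))
    return np.array([1.0 if i < active else 0.0 for i in range(n_channels)])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    acc_a = rng.normal(size=(512, 3))                  # user A accelerometer frames
    acc_b = acc_a + 0.3 * rng.normal(size=(512, 3))    # user B loosely following A
    sync = synchronization_index(movement_energy(acc_a), movement_energy(acc_b))
    print("sync index:", round(sync, 2), "gains:", channel_gains(sync))

In a deployment such as the one described in the paper, the gain vector would be applied to the pre-recorded multi-channel stems on the mobile devices, so that tighter synchronization between the two users reveals more of the musical texture.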
2011
3
Varni, G.; Mancini, M.; Volpe, G.; Camurri, A.
A system for mobile active music listening based on social interaction and embodiment / Varni, G.; Mancini, M.; Volpe, G.; Camurri, A.. - In: MOBILE NETWORKS AND APPLICATIONS. - ISSN 1383-469X. - 16:3(2011), pp. 375-384. [10.1007/s11036-010-0256-4]
Files associated with this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/437349
Warning! The data shown have not been validated by the university.

Citations
  • PubMed Central: not available
  • Scopus: 17
  • Web of Science: 10
  • OpenAlex: not available