Human movement expressivity for mobile active music listening / Mancini, M.; Varni, G.; Kleimola, J.; Volpe, G.; Camurri, A. - In: JOURNAL ON MULTIMODAL USER INTERFACES. - ISSN 1783-7677. - 4:1 (2010), pp. 27-35. [10.1007/s12193-010-0047-z]
Human movement expressivity for mobile active music listening
Mancini, M.; Varni, G.; Kleimola, J.; Volpe, G.; Camurri, A.
2010-01-01
Abstract
In this paper we describe the SAME networked platform for context-aware, experience-centric mobile music applications, and we present an implementation of the SAME active music listening paradigm: the Mobile Conductor. It allows the user to conduct a virtual ensemble playing a MIDI piece of music by means of her mobile phone. The phone detects the user's hand movement and molds the performance style by modulating its speed, volume, and intonation. © 2010 OpenInterface Association.
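The abstract describes mapping hand movement captured by the phone to modulation of the music performance. As a minimal illustrative sketch only (the paper's actual features, thresholds, and mappings are not given here; all names and numeric ranges below are assumptions), one could map the energy of accelerometer samples to tempo and volume scale factors along these lines:

```python
import math
from typing import Sequence, Tuple

def motion_energy(accel_samples: Sequence[Tuple[float, float, float]]) -> float:
    """Mean magnitude of 3-axis accelerometer samples over a short window."""
    if not accel_samples:
        return 0.0
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_samples]
    return sum(mags) / len(mags)

def performance_modulation(energy: float,
                           energy_range: Tuple[float, float] = (0.5, 3.0)
                           ) -> Tuple[float, float]:
    """Map hand-motion energy to (tempo_scale, volume_scale).

    Smooth, small gestures slow the piece down and soften it; broad,
    energetic gestures speed it up and raise the volume. The ranges
    below are illustrative assumptions, not values from the paper.
    """
    lo, hi = energy_range
    t = min(max((energy - lo) / (hi - lo), 0.0), 1.0)  # normalise to [0, 1]
    tempo_scale = 0.7 + 0.6 * t    # 0.7x .. 1.3x of the nominal tempo
    volume_scale = 0.5 + 0.5 * t   # 50% .. 100% of the nominal MIDI velocity
    return tempo_scale, volume_scale

# Example: one short window of fairly energetic samples
window = [(0.2, 1.1, 9.8), (0.5, 1.9, 10.4), (0.1, 2.4, 9.5)]
print(performance_modulation(motion_energy(window)))
```

The resulting scale factors would then be applied to the MIDI playback engine (tempo and note velocities); intonation shaping, also mentioned in the abstract, is omitted from this sketch.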