Turchet, L.; Rodà, A. (2017). Emotion rendering in auditory simulations of imagined walking styles. IEEE Transactions on Affective Computing, 8(2), 241–253. ISSN 1949-3045. Print. [DOI: 10.1109/TAFFC.2016.2520924]
Emotion rendering in auditory simulations of imagined walking styles
Turchet, L.; Rodà, A.
2017-01-01
Abstract
This paper investigated how different emotional states of a walker can be rendered and recognized by means of footstep sound synthesis algorithms. In a first experiment, participants were asked to render, according to imagined walking scenarios, five emotions (aggressive, happy, neutral, sad, and tender) by manipulating the parameters of synthetic footstep sounds simulating various combinations of surface materials and shoe types. The results made it possible to identify, for the involved emotions and sound conditions, the mean values and ranges of variation of two parameters: sound level and temporal distance between consecutive steps. These results were in accordance with those reported in previous studies on real walking, suggesting that the expression of emotions in walking is independent of whether the motor activity is real or imagined. In a second experiment, participants were asked to identify the emotions portrayed by walking sounds synthesized by setting the synthesis engine parameters to the mean values found in the first experiment. Results showed that the involved algorithms were successful in conveying the emotional information at a level comparable with previous studies. Both experiments involved musicians and non-musicians, and a similar general trend was found between the two groups in both experiments.
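To illustrate the two control parameters the abstract refers to (sound level and temporal distance between consecutive steps), the following minimal Python sketch places scaled copies of a single footstep sample at regular onsets. This is not the authors' synthesis engine; the function name and all parameter values below are invented for illustration, and the measured mean values from the first experiment are not reproduced here.

```python
def render_walk(step_sound, level_db, inter_step_s, n_steps, sr=44100):
    """Place gain-scaled copies of one footstep sample at regular onsets.

    level_db     -- overall sound level relative to the sample, in dB
    inter_step_s -- temporal distance between consecutive steps, in seconds
    """
    gain = 10 ** (level_db / 20.0)        # dB -> linear amplitude
    hop = int(round(inter_step_s * sr))   # samples between step onsets
    out = [0.0] * (hop * (n_steps - 1) + len(step_sound))
    for k in range(n_steps):
        start = k * hop
        for i, s in enumerate(step_sound):
            out[start + i] += gain * s    # overlap-add each footstep
    return out

# Illustrative parameter pairs (invented, NOT the experimentally found
# means): a slower, quieter walk vs. a faster, louder one.
sad = render_walk([1.0, 0.5, 0.25], level_db=-12.0, inter_step_s=0.9, n_steps=4)
aggressive = render_walk([1.0, 0.5, 0.25], level_db=0.0, inter_step_s=0.45, n_steps=4)
```

The sketch only shows how the two parameters shape the output: the inter-step interval sets the onset spacing, and the level sets a global gain applied to every step.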
| File | Type | License | Size | Format | Access |
|---|---|---|---|---|---|
| Emotion rendering in auditory simulations of imagined walking styles.pdf | Publisher's layout (editorial version) | All rights reserved | 563.9 kB | Adobe PDF | Archive administrators only |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.