
TRAINING SPATIAL HEARING SKILLS IN VIRTUAL REALITY THROUGH A SOUND-REACHING TASK / Valzolgher, Chiara; Capra, Sara; Sum, Kevin; Pavani, Francesco. - In: PROCEEDINGS OF FORUM ACUSTICUM. - ISSN 2221-3767. - (2023). (Paper presented at Forum Acusticum 2023, 10th Convention of the European Acoustics Association, held in Torino, 11-15 September 2023).

TRAINING SPATIAL HEARING SKILLS IN VIRTUAL REALITY THROUGH A SOUND-REACHING TASK

Chiara Valzolgher; Sara Capra; Kevin Sum; Francesco Pavani

Abstract

Sound localization is crucial for interacting with the surrounding world. This ability can be learned over time and improved by multisensory and motor cues. In the last decade, studying the contributions of multisensory and motor cues has been facilitated by the increased adoption of virtual reality (VR). In a recent study, sound localization was trained through a task in which the visual stimuli were rendered through a VR headset and the auditory ones through a loudspeaker moved around by the experimenter. Physically reaching to sound sources reduced sound localization errors faster and to a greater extent than naming the sources' positions. Interestingly, training efficacy also extended to hearing-impaired people. Yet, this approach is unfeasible for rehabilitation at home. Fully virtual approaches have been used to study spatial hearing learning processes, using acoustic simulations rendered over headphones. In the present study, we investigate whether the effects of our reaching-based training can be observed when taking advantage of such simulations, showing that the improvement is comparable between the full-VR and blended-VR conditions. This validates the use of training paradigms that are based entirely on portable equipment and do not require an external operator, opening new perspectives in the field of remote rehabilitation.
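For context on the "acoustic simulations rendered over headphones" mentioned in the abstract, the sketch below illustrates the basic principle of binaural rendering: a mono source is convolved with a left-ear and a right-ear head-related impulse response (HRIR) for the desired direction, so that headphones present the sound as coming from that direction. This is only a minimal illustration under stated assumptions, not the authors' actual rendering pipeline; the numpy/scipy usage and the placeholder delta-and-delay impulse responses are assumptions (real systems use measured HRTF sets, e.g. loaded from SOFA files).

```python
# Minimal sketch of headphone-rendered spatialization via HRIR convolution.
# The HRIRs here are hypothetical placeholders, not measured filters.
import numpy as np
from scipy.signal import fftconvolve

FS = 44100  # sample rate in Hz (an assumption)

def render_binaural(mono: np.ndarray,
                    hrir_left: np.ndarray,
                    hrir_right: np.ndarray) -> np.ndarray:
    """Convolve a mono source with left/right head-related impulse
    responses for one source direction, yielding a 2-channel signal."""
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    out = np.stack([left, right], axis=1)
    return out / np.max(np.abs(out))  # normalize to avoid clipping

# Toy example: a 0.5 s noise burst and delta-plus-delay placeholder HRIRs.
# The interaural time and level differences they encode are the kinds of
# cues a measured HRTF set would provide.
burst = np.random.randn(FS // 2)
hrir_l = np.zeros(256)
hrir_l[0] = 1.0            # near (left) ear: no delay, full level
hrir_r = np.zeros(256)
hrir_r[30] = 0.6           # far (right) ear: delayed and attenuated
stereo = render_binaural(burst, hrir_l, hrir_r)  # perceived off to the left
```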
Year: 2023
Published in: Proceedings of the 10th Convention of the European Acoustics Association, Forum Acusticum 2023
Venue: Torino, Italy
Publisher: Politecnico di Torino
ISBN: 978-88-88942-67-4
Authors: Valzolgher, Chiara; Capra, Sara; Sum, Kevin; Pavani, Francesco
Files in this record:
File: Conferencepaper_Forumacusticum_2023.pdf (open access)
Type: Publisher's version (publisher's layout)
License: Creative Commons
Size: 465.07 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/400750