Active listening in sound localization: Multisensory and motor contributions to perceiving and re-learning the auditory space / Valzolgher, Chiara. - (2021 Oct 27), pp. 1-192. DOI: 10.15168/11572_321729
Active listening in sound localization: Multisensory and motor contributions to perceiving and re-learning the auditory space
Valzolgher, Chiara
2021-10-27
Abstract
Sound localization is the ability to perceive the position of auditory events. Humans locate sounds in the environment by interpreting the acoustic signals that reach the ears. When these signals are altered, as in many conditions of hearing loss, locating sounds can become challenging. Adaptation mechanisms allow sound localization abilities to be retrained, even in adulthood. In this thesis, I explored the cognitive mechanisms involved in sound localization and tested the impact of multisensory and motor variables when training acoustic space perception in normal-hearing adults, ageing people with presbycusis, and cochlear implant users. To these aims, I used an innovative virtual reality (VR) approach. In the first two studies, I explored the effect of visual and motor information on sound localization and validated the overall VR approach in young (Chapter 2) and ageing participants (Chapter 3). In the subsequent three studies, I demonstrated the effectiveness of a reaching-to-sound training, based on multisensory feedback and active listening, in normal-hearing people with one ear plugged. First, I showed that this training improves performance more than a comparable control condition (Chapter 4); next, I showed that reaching-to-sound training generalizes to a different auditory spatial task in normal-hearing people with one ear plugged (Chapter 5); finally, I showed that training and generalization effects can also be observed in people with deafness who use bilateral cochlear implants (Chapter 6). These results extend current models of acoustic space relearning and point to multisensory-motor training approaches with the potential to transfer to clinical and applied contexts.
| File | Size | Format |
|---|---|---|
| phd_unitn_Chiara_Valzolgher.pdf | 5.33 MB | Adobe PDF |

Access: Open access
Type: Doctoral thesis
License: Creative Commons