Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm

Rabini G.; Pavani F.
2019-01-01

Abstract

Although sound position is initially coded in head-centred (egocentric) coordinates, the brain can also represent sounds relative to one another (allocentric coordinates). Whether reference frames for spatial hearing are independent or interact has remained largely unexplored. Here we developed a new allocentric spatial-hearing training procedure and tested whether it could improve egocentric sound-localisation performance in normal-hearing adults listening with one ear plugged. Two groups of participants (N = 15 each) performed an egocentric sound-localisation task (pointing to a syllable) under monaural listening, before and after 4 days of multisensory training on triplets of white-noise bursts paired with occasional visual feedback. Critically, one group performed an allocentric task (auditory bisection), whereas the other processed the same stimuli in an egocentric task (pointing to a designated sound of the triplet). Unlike most previous work, we also tested a no-training group (N = 15). Egocentric sound-localisation abilities in the horizontal plane improved for all groups in the space ipsilateral to the ear plug. This unexpected finding highlights the importance of including a no-training control group when studying sound-localisation re-learning. Yet performance changes were qualitatively different in trained compared with untrained participants, providing initial evidence that allocentric and multisensory procedures may prove useful for promoting sound-localisation re-learning.
Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm / Rabini, G.; Altobelli, E.; Pavani, F.. - In: SCIENTIFIC REPORTS. - ISSN 2045-2322. - 9:1(2019). [10.1038/s41598-019-44267-3]
Files in this record:

Rabini et al. (2019).pdf

Access: open access
Description: Main article
Type: Publisher's version (publisher's layout)
Licence: Creative Commons
Size: 1.7 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11572/252329
Citations
  • PMC: not available
  • Scopus: 14
  • Web of Science: 12