

Symbolic categorization of novel multisensory stimuli in the human brain

Vigano, S.; Borghesani, V.; Piazza, M.
2021

Abstract

When primates (both human and non-human) learn to categorize simple visual or acoustic stimuli through non-verbal matching tasks, two types of changes occur in their brain: early sensory cortices increase the precision with which they encode sensory information, and parietal and lateral prefrontal cortices develop a categorical response to the stimuli. Unlike non-human animals, however, our species mostly constructs categories using linguistic labels. Moreover, we naturally tend to define categories by means of multiple sensory features of the stimuli. Here we trained adult subjects to parse a novel audiovisual stimulus space into 4 orthogonal categories, by associating each category with a specific symbol. We then used multi-voxel pattern analysis (MVPA) to show that, during a cross-format category repetition detection task, three neural representational changes were detectable. First, visual and acoustic cortices increased both precision and selectivity for their preferred sensory feature, displaying increased sensory segregation. Second, a frontoparietal network developed a multisensory object-specific response. Third, the right hippocampus and, at least to some extent, the left angular gyrus developed a shared representational code common to symbols and objects. In particular, the right hippocampus displayed the highest level of abstraction and generalization from one format to the other, and also predicted symbolic categorization performance outside the scanner. Taken together, these results indicate that when humans categorize multisensory objects by means of language, the set of changes occurring in the brain only partially overlaps with that described by classical models of non-verbal unisensory categorization in primates.
Symbolic categorization of novel multisensory stimuli in the human brain / Vigano, S.; Borghesani, V.; Piazza, M.. - In: NEUROIMAGE. - ISSN 1053-8119. - 235:(2021), p. 118016. [10.1016/j.neuroimage.2021.118016]
Files for this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: http://hdl.handle.net/11572/314495
Warning: the displayed data have not been validated by the university.

Citations
  • PubMed Central 0
  • Scopus 0
  • Web of Science 1