
Temporal dynamics of access to amodal representations of category-level conceptual information / Leonardelli, Elisa; Fait, Elisa; Fairhall, Scott L.. - In: SCIENTIFIC REPORTS. - ISSN 2045-2322. - 9:1(2019), pp. 2391-2399. [10.1038/s41598-018-37429-2]

Temporal dynamics of access to amodal representations of category-level conceptual information

Leonardelli, Elisa; Fait, Elisa; Fairhall, Scott L.
2019-01-01

Abstract

Categories describe semantic divisions between classes of objects, and category-based models are widely used to investigate the conceptual system. One critical issue in this endeavour is isolating conceptual from perceptual contributions to category differences. An unambiguous way to address this confound is to combine multiple input modalities. To this end, we showed participants person/place stimuli in both name and picture modalities. Using multivariate methods, we searched for category-sensitive neural patterns shared across input modalities and thus independent of perceptual properties. The millisecond temporal resolution of magnetoencephalography (MEG) allowed us to consider the precise timing of conceptual access and, by comparing latencies between the two modalities ("time generalization"), how the latency of processing depends on the input modality. Our results identified category-sensitive conceptual representations common to both modalities at three stages, and showed that conceptual access for words was delayed by about 90 msec with respect to pictures. We also show that for pictures, the first conceptual pattern of activity (shared between words and pictures) occurs as early as 110 msec. Collectively, our results indicate that conceptual access at the category level is a multistage process and that different delays in access across these two input modalities determine when these representations are activated.
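The time-generalization analysis mentioned in the abstract can be illustrated with a minimal sketch: a classifier is trained on the neural pattern at each time point of one modality and tested at every time point of the other, yielding a training-time × testing-time accuracy matrix whose off-diagonal structure reveals latency shifts between modalities. The code below uses synthetic data and scikit-learn; all variable names and the injected signal latencies are illustrative assumptions, not the paper's actual pipeline.

```python
# Sketch of cross-modal time-generalization decoding on synthetic "MEG" data.
# Assumptions (not from the paper): 32 sensors, 20 time points, a weak
# category signal injected at an earlier latency for "pictures" than "words".
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_sensors, n_times = 40, 32, 20

# Synthetic epochs for two modalities; labels 0/1 = person/place category.
X_pic = rng.normal(size=(n_trials, n_sensors, n_times))
X_word = rng.normal(size=(n_trials, n_sensors, n_times))
y = rng.integers(0, 2, size=n_trials)

# Inject the same category code at different latencies in each modality.
X_pic[y == 1, :, 5:10] += 0.5
X_word[y == 1, :, 12:17] += 0.5

# Generalization matrix: entry [t_train, t_test] is the cross-modal
# accuracy of a classifier trained on pictures and tested on words.
gen = np.zeros((n_times, n_times))
for t_train in range(n_times):
    clf = LogisticRegression(max_iter=1000).fit(X_pic[:, :, t_train], y)
    for t_test in range(n_times):
        gen[t_train, t_test] = clf.score(X_word[:, :, t_test], y)

# Above-chance accuracy off the diagonal (train ~5-10, test ~12-17) would
# indicate the same category representation appearing with a latency shift.
print(gen.shape)
```

Because the classifier is trained on one modality and scored on the other, above-chance generalization cannot be driven by modality-specific perceptual features, which is the logic the abstract describes.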
Files in this item:
41598_2018_Article_37429.pdf

Open access

Type: Publisher's version (publisher's layout)
License: Creative Commons
Size: 3.32 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11572/229642
Citations
  • PubMed Central: 5
  • Scopus: 9
  • Web of Science: 10