Shape-independent object category responses revealed by MEG and fMRI decoding

Kaiser, Daniel Sebastian; Azzalini, Damiano C; Peelen, Marius Vincent
2016-01-01

Abstract

Neuroimaging research has identified category-specific neural response patterns to a limited set of object categories. For example, faces, bodies, and scenes evoke activity patterns in visual cortex that are uniquely traceable in space and time. It is currently debated whether these apparently categorical responses truly reflect selectivity for categories or instead reflect selectivity for category-associated shape properties. In the present study, we used a cross-classification approach on fMRI and MEG data to reveal both category-independent shape responses and shape-independent category responses. Participants viewed human body parts (hands and torsos) and pieces of clothing that were closely shape-matched to the body parts (gloves and shirts). Category-independent shape responses were revealed by training multivariate classifiers on discriminating shape within one category (e.g., hands versus torsos) and testing these classifiers on discriminating shape within the other category (e.g., gloves versus shirts). This analysis revealed significant decoding in large clusters in visual cortex (fMRI), starting from 90 ms after stimulus onset (MEG). Shape-independent category responses were revealed by training classifiers on discriminating object category (bodies, clothes) within one shape (e.g., hands versus gloves) and testing these classifiers on discriminating category within the other shape (e.g., torsos versus shirts). This analysis revealed significant decoding in bilateral occipitotemporal cortex (fMRI), and from 130 to 200 ms after stimulus onset (MEG). Together, these findings provide evidence for concurrent shape and category selectivity in high-level visual cortex, including category-level responses that are not fully explicable by 2D shape properties.
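To make the cross-classification logic concrete, the sketch below illustrates it on simulated data. This is a hypothetical example (invented variable names, random placeholder data, a scikit-learn linear support vector classifier), not the authors' analysis pipeline: a classifier is trained to discriminate one stimulus pair and then tested on the shape- or category-matched pair, so that above-chance accuracy can only reflect category-independent shape information or shape-independent category information, respectively.

import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical single-trial response patterns: trials x features (voxels or MEG sensors).
n_per_cell = 50   # trials per condition (assumed)
n_features = 200  # voxels / sensors (assumed)

# The four conditions from the study, coded by a shape label (hand-like / torso-like)
# and a category label (body / clothing).
conditions = [("hand", "body"),    # hands
              ("torso", "body"),   # torsos
              ("hand", "cloth"),   # gloves (shape-matched to hands)
              ("torso", "cloth")]  # shirts (shape-matched to torsos)

X, shape, category = [], [], []
for s, c in conditions:
    X.append(rng.normal(size=(n_per_cell, n_features)))  # placeholder data only
    shape += [s] * n_per_cell
    category += [c] * n_per_cell
X = np.vstack(X)
shape = np.array(shape)
category = np.array(category)

def cross_decode(X, y, train_mask, test_mask):
    """Train a linear classifier on one subset of trials, test on an independent subset."""
    clf = LinearSVC()
    clf.fit(X[train_mask], y[train_mask])
    return clf.score(X[test_mask], y[test_mask])

# Category-independent shape decoding: train hand vs torso on body stimuli,
# test hand-like vs torso-like (gloves vs shirts) on clothing stimuli.
acc_shape = cross_decode(X, shape,
                         train_mask=(category == "body"),
                         test_mask=(category == "cloth"))

# Shape-independent category decoding: train body vs clothing on hand-shaped
# stimuli (hands vs gloves), test on torso-shaped stimuli (torsos vs shirts).
acc_category = cross_decode(X, category,
                            train_mask=(shape == "hand"),
                            test_mask=(shape == "torso"))

print(f"cross-decoded shape accuracy:    {acc_shape:.2f}")    # ~0.50 for random data
print(f"cross-decoded category accuracy: {acc_category:.2f}")

In an actual analysis the procedure would be run in both train/test directions (and across cross-validation folds) and the accuracies averaged; the data and names above are purely illustrative.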
File available with this record: Shape-independent object ....pdf (open access; publisher's layout; Adobe PDF, 881.44 kB; license: all rights reserved).


Use this identifier to cite or link to this document: https://hdl.handle.net/11572/134378