On the tip of the tongue: modulation of the primary motor cortex during audiovisual speech perception

Cattaneo, Luigi
2010-01-01

Abstract

Recent neurophysiological studies show that cortical brain regions involved in the planning and execution of speech gestures are also activated in processing speech sounds. These findings suggest that speech perception is in part mediated by reference to the motor actions afforded in the speech signal. Since interactions between auditory and visual modalities are beneficial in speech perception and face-to-face communication, we used single-pulse transcranial magnetic stimulation (TMS) to investigate whether audiovisual speech perception might induce excitability changes in the left tongue-related primary motor cortex and whether acoustic and visual speech inputs might differentially modulate motor excitability. To this end, motor-evoked potentials obtained with focal TMS applied over the left tongue primary motor cortex were recorded from participants’ tongue muscles during the perception of matching and conflicting audiovisual syllables incorporating tongue- and/or lip-related phonemes (i.e. visual and acoustic /ba/, /ga/ and /da/; visual /ba/ and acoustic /ga/; visual /ga/ and acoustic /ba/). Compared to the presentation of the congruent /ba/ syllable, which primarily involves lip movements when pronounced, exposure to syllables incorporating visual and/or acoustic tongue-related phonemes induced greater excitability of the left tongue primary motor cortex as early as 100–200 ms after the consonantal onset of the acoustically presented syllable. These results provide evidence that both the visual and the auditory modality specifically modulate activity in the tongue primary motor cortex at an early stage of audiovisual speech perception. Because no interaction between the two modalities was observed, these results suggest that information from each sensory channel is recoded separately in the primary motor cortex at that point in time. These findings are discussed in relation to theories assuming a link between perception and action in the human speech processing system and to theoretical models of audiovisual interaction.
2010
6
Sato, M.; Buccino, G.; Gentilucci, M.; Cattaneo, Luigi
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/94150
Citations
  • PMC: ND
  • Scopus: 42
  • ISI: 37
  • OpenAlex: ND