

Understanding the causes and consequences of variability in infant ERP editing practices


Abstract

The current study examined the effects of variability in infant event-related potential (ERP) data editing methods. A widespread approach for analysing infant ERPs is a trial-by-trial editing process: researchers identify electroencephalogram (EEG) channels containing artifacts and reject trials judged to contain excessive noise. This process can be performed manually by experienced researchers, partially automated by specialized software, or completely automated using an artifact-detection algorithm. Here, we compared the editing process from four different editors (three human experts and an automated algorithm) on the final ERP from an existing infant EEG dataset. Findings reveal that agreement between editors was low, both for the number of included trials and for the number of interpolated channels. Critically, this variability resulted in differences in the morphology of the final ERP and in the statistical results of the target ERP that each editor obtained. We also analysed sources of disagreement by estimating the EEG characteristics that each human editor considered when accepting an ERP trial. In sum, our study reveals significant variability in ERP data editing pipelines, with important consequences for the final ERP results. These findings represent an important step towards developing best practices for ERP editing methods in infancy research.
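
The abstract contrasts manual trial-by-trial editing with fully automated artifact detection. As a rough illustration of what one common flavour of automated pass can look like, the short NumPy sketch below flags channels whose peak-to-peak amplitude exceeds a threshold and rejects trials with too many flagged channels. The function name, threshold values, and channel limit are illustrative assumptions, not the pipeline or parameters used in this study.

    import numpy as np

    def edit_epochs(epochs_uv, chan_ptp_uv=200.0, trial_ptp_uv=100.0, max_bad_channels=3):
        # epochs_uv: array of shape (n_trials, n_channels, n_samples), in microvolts.
        # Peak-to-peak amplitude per trial and channel.
        ptp = epochs_uv.max(axis=2) - epochs_uv.min(axis=2)
        # Channels exceeding the threshold would be flagged for interpolation on that trial.
        bad_chans = ptp > chan_ptp_uv
        # Reject trials with too many flagged channels or with globally high amplitude.
        keep = (bad_chans.sum(axis=1) <= max_bad_channels) & (np.median(ptp, axis=1) < trial_ptp_uv)
        return np.flatnonzero(keep), [np.flatnonzero(row) for row in bad_chans]

    # Example with simulated EEG: 40 trials, 32 channels, 500 samples.
    rng = np.random.default_rng(0)
    epochs = rng.normal(0.0, 10.0, size=(40, 32, 500))
    kept, bad_per_trial = edit_epochs(epochs)
    erp = epochs[kept].mean(axis=0)  # averaged ERP over accepted trials
    print(f"kept {len(kept)} of {epochs.shape[0]} trials")
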
Understanding the causes and consequences of variability in infant ERP editing practices / Monroy, C.; Domínguez-Martínez, E.; Taylor, B.; Portolés Marin, O.; Parise, E.; Reid, V. M. - In: DEVELOPMENTAL PSYCHOBIOLOGY. - ISSN 0012-1630. - 63:8 (2021), pp. e2221701-e2221712. [10.1002/dev.22217]
Files in this record:
Developmental Psychobiology - 2021 - Monroy - Understanding the causes and consequences of variability in infant ERP.pdf
Access: open access
Type: Publisher's version (publisher's layout)
License: Creative Commons
Size: 702.95 kB
Format: Adobe PDF


Use this identifier to cite or link to this document: https://hdl.handle.net/11572/321556
Citations
  • PubMed Central: 1
  • Scopus: 6
  • Web of Science: 6