
Emotion Recognition of Playing Musicians from EEG, ECG, and Acoustic Signals / Turchet, Luca; O'Sullivan, Barry; Ortner, Rupert; Guger, Christoph. - In: IEEE TRANSACTIONS ON HUMAN-MACHINE SYSTEMS. - ISSN 2168-2291. - 54:5(2024), pp. 619-629. [10.1109/THMS.2024.3430327]

Emotion Recognition of Playing Musicians from EEG, ECG, and Acoustic Signals


Abstract

This article investigated the automatic recognition of felt and musically communicated emotions using electroencephalogram (EEG), electrocardiogram (ECG), and acoustic signals, recorded from eleven musicians instructed to perform music so as to communicate happiness, sadness, relaxation, and anger. Musicians' self-reports indicated that the emotions they musically expressed were highly consistent with those they actually felt. Results showed that the best classification performance, in a subject-dependent classification using a k-nearest neighbors (KNN) classifier, was achieved by using features derived from both the EEG and ECG (with an accuracy of 98.11%), which was significantly more accurate than using ECG features alone but not significantly more accurate than using EEG features alone. The use of acoustic features, alone or in combination with EEG and/or ECG features, did not lead to better performance than that achieved with EEG plus ECG or EEG alone. Our results suggest that the emotions of playing musicians, both felt and musically communicated, when coherent, can be classified more reliably using physiological features than acoustic features. The reported machine learning results are a step toward the development of affective brain-computer interfaces capable of automatically inferring the emotions of a playing musician in real time.
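The abstract describes subject-dependent classification with a KNN classifier on fused EEG and ECG features. A minimal sketch of that kind of pipeline, using scikit-learn and entirely synthetic data, is shown below; the feature dimensions, labels, and hyperparameters here are illustrative assumptions, not the authors' actual setup.

```python
# Hypothetical sketch of subject-dependent emotion classification with k-NN
# on concatenated (early-fused) EEG + ECG feature vectors.
# All data, dimensions, and parameters are synthetic assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials = 120                          # trials for one musician (subject-dependent)
labels = rng.integers(0, 4, n_trials)   # 0..3: happiness, sadness, relaxation, anger

# Synthetic per-trial features; real features would be e.g. EEG band powers
# and ECG/heart-rate-variability statistics.
eeg = rng.normal(size=(n_trials, 32)) + labels[:, None] * 0.5
ecg = rng.normal(size=(n_trials, 8)) + labels[:, None] * 0.3

# Early fusion: concatenate the EEG and ECG feature vectors per trial.
fused = np.hstack([eeg, ecg])

# Standardize features, then classify with k-NN; evaluate via cross-validation.
clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(clf, fused, labels, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

In a subject-dependent setting like the one reported, a model of this form would be trained and evaluated separately on each musician's own trials.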
Turchet, Luca; O'Sullivan, Barry; Ortner, Rupert; Guger, Christoph
Files in this record:
Emotion_Recognition_of_Playing_Musicians_From_EEG_ECG_and_Acoustic_Signals.pdf

Open access

Type: Publisher's version (publisher's layout)
License: Creative Commons
Size: 1.28 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/437572
Citations
  • PMC: ND
  • Scopus: 2
  • Web of Science (ISI): 1
  • OpenAlex: ND