AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups / Miranda-Correa, J. A.; Abadi, M. K.; Sebe, N.; Patras, I.. - In: IEEE TRANSACTIONS ON AFFECTIVE COMPUTING. - ISSN 1949-3045. - 12:2(2021), pp. 479-493. [10.1109/TAFFC.2018.2884461]

AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups

Sebe N.;
2021-01-01

Abstract

We present AMIGOS, A dataset for Multimodal research of affect, personality traits and mood on Individuals and GrOupS. Unlike other databases, we elicited affect using both short and long videos in two social contexts, one with individual viewers and one with groups of viewers. The database allows the multimodal study of individuals' affective responses, by means of neurophysiological signals, in relation to their personality and mood, and with respect to the social context and the videos' duration. The data was collected in two experimental settings. In the first one, 40 participants watched 16 short emotional videos. In the second one, the participants watched 4 long videos, some of them alone and the rest in groups. The participants' signals, namely electroencephalogram (EEG), electrocardiogram (ECG) and galvanic skin response (GSR), were recorded using wearable sensors. Participants' frontal HD video and both RGB and depth full-body videos were also recorded. Participants' emotions have been annotated with both self-assessment of affective levels (valence, arousal, control, familiarity, liking and basic emotions) felt during the videos and external assessment of levels of valence and arousal. We present a detailed correlation analysis of the different dimensions as well as baseline methods and results for single-trial classification of valence and arousal, personality traits, mood and social context. The database is made publicly available.
2021
2
Miranda-Correa, J. A.; Abadi, M. K.; Sebe, N.; Patras, I.
Files in this record:
File: AMIGOS_A_Dataset_for_Affect_Personality_and_Mood_Research_on_Individuals_and_Groups.pdf
Access: Archive administrators only
Type: Publisher's version (publisher's layout)
License: All rights reserved
Size: 2.01 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/326170
Citations
  • PMC: ND
  • Scopus: 256
  • Web of Science: 249