Subramanian, Ramanathan; Wache, Julia; Abadi, Mojtaba Khomami; Vieriu, Radu L.; Winkler, Stefan; Sebe, Nicu. "Ascertain: Emotion and personality recognition using commercial sensors." IEEE Transactions on Affective Computing, vol. 9, no. 2, pp. 147-160, 2018. ISSN 1949-3045. DOI: 10.1109/TAFFC.2016.2625250.
Ascertain: Emotion and personality recognition using commercial sensors
Subramanian, Ramanathan; Wache, Julia; Abadi, Mojtaba Khomami; Vieriu, Radu L.; Sebe, Nicu
2018-01-01
Abstract
We present ASCERTAIN, a multimodal databaASe for impliCit pERsonaliTy and Affect recognitIoN using commercial physiological sensors. To our knowledge, ASCERTAIN is the first database to connect personality traits and emotional states via physiological responses. ASCERTAIN contains Big-Five personality scales and emotional self-ratings of 58 users along with their Electroencephalogram (EEG), Electrocardiogram (ECG), Galvanic Skin Response (GSR) and facial activity data, recorded using off-the-shelf sensors while the users viewed affective movie clips. We first examine relationships between users' affective ratings and personality scales in the context of prior observations, and then study linear and non-linear physiological correlates of emotion and personality. Our analysis suggests that the emotion-personality relationship is better captured by non-linear rather than linear statistics. We finally attempt binary emotion and personality trait recognition using physiological features. Experimental results cumulatively confirm that personality differences are better revealed when comparing user responses to emotionally homogeneous videos, and that above-chance recognition is achieved for both affective and personality dimensions.
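The two analyses described in the abstract can be illustrated with a minimal sketch on synthetic stand-in data: (1) comparing a linear statistic (Pearson correlation) with a non-linear one (Spearman rank correlation) between a physiological feature and a personality scale, and (2) binary trait recognition via a median split of the scale and a simple classifier. This is not the authors' pipeline; the feature names (mean heart rate, GSR level, EEG power), the median-split labelling, the Gaussian Naive Bayes classifier, and the 5-fold cross-validation are all assumptions for illustration only.

```python
# Sketch of the abstract's two analyses on synthetic data (assumptions noted above).
import numpy as np
from scipy.stats import pearsonr, spearmanr
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_users = 58  # ASCERTAIN has 58 participants

# Synthetic stand-ins: one summary physiological feature per user
# (e.g. mean heart rate from ECG) and one personality scale (e.g. Extraversion).
extraversion = rng.normal(4.0, 1.0, n_users)
mean_hr = 70 + 2.0 * np.tanh(extraversion - 4.0) + rng.normal(0, 1.0, n_users)

# (1) Linear vs. non-linear association between feature and trait.
r_lin, p_lin = pearsonr(mean_hr, extraversion)
r_rank, p_rank = spearmanr(mean_hr, extraversion)
print(f"Pearson  r   = {r_lin:.2f} (p = {p_lin:.3f})")
print(f"Spearman rho = {r_rank:.2f} (p = {p_rank:.3f})")

# (2) Binary trait recognition: median split into high/low Extraversion,
# then classification from a small physiological feature vector per user.
features = np.column_stack([mean_hr,
                            rng.normal(size=n_users),   # stand-in for GSR level
                            rng.normal(size=n_users)])  # stand-in for EEG power
labels = (extraversion > np.median(extraversion)).astype(int)
acc = cross_val_score(GaussianNB(), features, labels, cv=5).mean()
print(f"Cross-validated accuracy: {acc:.2f} (chance = 0.50)")
```

The same median-split-and-classify scheme applies to the affective dimensions (e.g. high/low valence or arousal self-ratings), with per-clip physiological features in place of the per-user summaries used here.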
File: 07736040.pdf
Access: Archive administrators only
Type: Publisher's layout (Versione editoriale)
License: All rights reserved
Size: 814.53 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.