
Analysis of Face-Touching Behavior in Large Scale Social Interaction Dataset / Beyan, C.; Bustreo, M.; Shahid, M.; Bailo, G. L.; Carissimi, N.; Del Bue, A. - ELECTRONIC. - (2020), pp. 24-32. (Paper presented at the 22nd ACM International Conference on Multimodal Interaction, ICMI 2020, held in Utrecht, 25-29/10/2020) [10.1145/3382507.3418876].

Analysis of Face-Touching Behavior in Large Scale Social Interaction Dataset

Beyan C.;
2020

Abstract

We present the first publicly available annotations for the analysis of face-touching behavior. These annotations are for a dataset composed of audio-visual recordings of small group social interactions, comprising 64 videos in total, each lasting between 12 and 30 minutes and showing a single person participating in a four-person meeting. The annotations were performed by 16 annotators in total, with almost perfect agreement (Cohen's Kappa = 0.89) on average. In total, 74K video frames were labelled as face-touch and 2M as no-face-touch. Given the dataset and the collected annotations, we also present an extensive evaluation of several methods for face-touching detection: rule-based, supervised learning with hand-crafted features, and feature learning and inference with a Convolutional Neural Network (CNN). Our evaluation indicates that the CNN performed best among all methods, reaching an 83.76% F1-score and a 0.84 Matthews Correlation Coefficient. To foster future research on this problem, the code and dataset were made publicly available (github.com/IIT-PAVIS/Face-Touching-Behavior), providing all video frames, face-touch annotations, body pose estimations including face and hand key-point detections, face bounding boxes, as well as the implemented baseline methods and the cross-validation splits used for training and evaluating our models.
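For reference, the two metrics reported in the abstract (F1-score and Matthews Correlation Coefficient) can be computed directly from binary confusion-matrix counts. The sketch below is illustrative only; the counts used are made up and are not results from the paper.

```python
import math

def f1_and_mcc(tp, fp, fn, tn):
    """Compute F1-score and Matthews Correlation Coefficient
    from binary confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    # MCC is balanced even when the two classes differ greatly in size,
    # which matters here (74K face-touch vs. 2M no-face-touch frames).
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom
    return f1, mcc

# Illustrative counts only (not the paper's actual results)
f1, mcc = f1_and_mcc(tp=80, fp=15, fn=16, tn=900)
print(round(f1, 3), round(mcc, 3))  # → 0.838 0.821
```

Unlike F1, MCC also rewards correct no-face-touch predictions (the `tn` term), which is why it is a useful companion metric on a dataset this imbalanced.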
ICMI 2020 - Proceedings of the 2020 International Conference on Multimodal Interaction
New York
Association for Computing Machinery, Inc
9781450375818
Beyan, C.; Bustreo, M.; Shahid, M.; Bailo, G. L.; Carissimi, N.; Del Bue, A.
Files associated with this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: http://hdl.handle.net/11572/294672
Warning: the data displayed has not been validated by the university.

Citations
  • PubMed Central: not available
  • Scopus: 2
  • Web of Science: not available