
State-Specific and Supraordinal Components of Facial Response to Pain / Dirupo, Giada; Garlasco, Paolo; Chappuis, Cyrielle; Sharvit, Gil; Corradi-Dell'Acqua, Corrado. - In: IEEE TRANSACTIONS ON AFFECTIVE COMPUTING. - ISSN 1949-3045. - 13:2(2022), pp. 793-804. [10.1109/TAFFC.2020.2965105]

State-Specific and Supraordinal Components of Facial Response to Pain

Corrado Corradi-Dell'Acqua (last author)
2022-01-01

Abstract

Inadequate treatment of pain is frequent in modern society, with major medical, ethical, and financial implications. In many healthcare environments, pain is quantified predominantly through subjective measures, such as patients’ self-reports or healthcare providers’ personal experience. Recently, automatic diagnostic tools have been developed to detect and quantify pain more “objectively” from facial expressions. However, it is still unclear whether these approaches can distinguish pain from other aversive (but painless) states. In this article, we analyzed the facial responses from a database of video-recorded facial reactions evoked by comparably unpleasant painful and disgusting stimuli. We modeled this information as a function of subjective unpleasantness, as well as the specific state evoked by the stimuli (pain vs. disgust). Results show that a machine learning algorithm could predict subjective pain unpleasantness from facial information, but mistakenly detected unpleasant disgust as well, especially in models relying to a great extent on the brow lowerer. Importantly, pain and disgust could be disentangled using an ad hoc algorithm that relies on combined information from the eyes and the mouth. Overall, the facial expression of pain contains both state-specific information and unpleasantness-related information shared with disgust. Automatic diagnostic tools should be designed to account for this confounding effect.
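The abstract describes the analysis only at a high level. Purely as an illustration of the kind of approach mentioned there (this is not the authors' pipeline; the RidgeCV estimator, the Action Unit features, and all variable names are assumptions), the Python sketch below shows how one might regress subjective unpleasantness ratings on facial Action Unit (AU) intensities from pain trials, and then probe whether the pain-trained model also responds to disgust trials, i.e., the confound the paper discusses.

```python
# Hypothetical sketch only (not the published pipeline): predicting subjective
# unpleasantness from facial Action Unit (AU) intensities, then testing how a
# pain-trained model behaves on disgust trials.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Synthetic stand-in data: rows = trials, columns = AU intensities
# (e.g., AU4 "brow lowerer", eye-region AUs, mouth-opening AUs).
n_trials, n_aus = 200, 10
X_pain = rng.random((n_trials, n_aus))
y_pain = 2.0 * X_pain[:, 0] + rng.normal(0, 0.3, n_trials)  # unpleasantness ratings
X_disgust = rng.random((n_trials, n_aus))

# Cross-validated unpleasantness predictor trained on pain trials.
model = RidgeCV(alphas=np.logspace(-3, 3, 13))
y_hat = cross_val_predict(model, X_pain, y_pain, cv=5)
print("CV correlation on pain trials:", np.corrcoef(y_pain, y_hat)[0, 1])

# Apply the pain-trained model to disgust trials: systematically high
# predictions here would flag the state-unspecific (shared unpleasantness)
# component rather than pain-specific information.
model.fit(X_pain, y_pain)
print("Mean predicted unpleasantness on disgust trials:",
      model.predict(X_disgust).mean())
```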
Year: 2022
Issue: 2
Authors: Dirupo, Giada; Garlasco, Paolo; Chappuis, Cyrielle; Sharvit, Gil; Corradi-Dell'Acqua, Corrado
Files in this record:

2019-IEEEAffComp.pdf
Type: Refereed author's manuscript (post-print)
Licence: All rights reserved
Access: Restricted (archive administrators only)
Size: 1.68 MB
Format: Adobe PDF

2022 - IEEE Aff Comp - Supp.pdf
Description: Supplementary Information
Type: Other attachments
Licence: All rights reserved
Access: Restricted (archive administrators only)
Size: 491.43 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/11572/389029
Citations
  • PubMed Central: not available
  • Scopus: 4
  • Web of Science: 5
  • OpenAlex: not available