
Estimation of FAPs and intensities of AUs based on real-time face tracking

Niewiadomski, Radoslaw
2012-01-01

Abstract

Real-time imitation of natural facial behavior remains challenging, particularly for expressions such as laughter and other nonverbal displays. This paper presents our ongoing work on methodologies and tools for estimating Facial Animation Parameters (FAPs) and intensities of Action Units (AUs) in order to imitate lifelike facial expressions with an MPEG-4 compliant Embodied Conversational Agent (ECA), the GRETA agent (Bevacqua et al. 2007). Firstly, we investigate available open source tools for accurate facial landmark localization. Secondly, FAPs and AU intensities are estimated from facial landmarks computed with an open source face tracker. Finally, the paper discusses our ongoing work to determine the better re-synthesis technology between FAP-based and AU-based synthesis, using perceptual studies on: (i) the naturalness of the synthesized facial expressions; and (ii) the similarity perceived by subjects when compared to the original user's behavior.
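For context, MPEG-4 defines FAPs as displacements of facial feature points measured in Facial Animation Parameter Units (FAPUs), which are fractions (1/1024) of reference distances on the neutral face, such as the mouth-nose separation. The following Python snippet is a minimal illustrative sketch of this normalization, not the authors' implementation; the landmark names and the dictionary-based tracker output are assumptions.

```python
# Sketch: estimating one MPEG-4 FAP from tracked 2D facial landmarks.
# Landmark keys ("nose_tip", "upper_lip_mid", "chin") are hypothetical;
# a real tracker would expose its own indexing scheme.
import numpy as np

FAPU_SCALE = 1024.0  # MPEG-4 expresses each FAPU as 1/1024 of a reference distance


def mouth_nose_fapu(neutral):
    """MNS FAPU: mouth-nose separation on the neutral face, in 1/1024 units."""
    d = np.linalg.norm(neutral["nose_tip"] - neutral["upper_lip_mid"])
    return d / FAPU_SCALE


def estimate_open_jaw(neutral, current):
    """FAP 3 (open_jaw): vertical chin displacement from the neutral pose,
    normalized by the MNS FAPU (image y-axis assumed to point downward)."""
    mns = mouth_nose_fapu(neutral)
    dy = current["chin"][1] - neutral["chin"][1]
    return dy / mns


# Example usage with synthetic landmark positions (pixels):
neutral = {"nose_tip": np.array([100.0, 120.0]),
           "upper_lip_mid": np.array([100.0, 140.0]),
           "chin": np.array([100.0, 180.0])}
current = dict(neutral, chin=np.array([100.0, 192.0]))
print(estimate_open_jaw(neutral, current))  # positive value = jaw opened
```

Normalizing by a neutral-face FAPU in this way makes the estimated parameter independent of face size and camera distance, which is what allows the same FAP stream to drive an MPEG-4 agent such as GRETA. AU intensities could be derived analogously from normalized landmark displacements, though the paper's actual mapping is not reproduced here.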
2012
Proceedings of the 3rd Symposium on Facial Analysis and Animation
New York
ACM
Qu, Bingqing; Pammi, Sathish; Niewiadomski, Radoslaw; Chollet, Gérard
Estimation of FAPs and intensities of AUs based on real-time face tracking / Qu, Bingqing; Pammi, Sathish; Niewiadomski, Radoslaw; Chollet, Gérard. - (2012). (Paper presented at FAA 2012, held in Vienna on 21st September 2012) [10.1145/2491599.2491612].
Files in this record:

File: Estimation of FAPs and intensities of AUs based on real-time face tracking - 2491599.2491612.pdf
Access: Repository managers only
Type: Published version (publisher's layout)
License: All rights reserved
Size: 959.36 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/335963
Citations
  • PMC: not available
  • Scopus: 4
  • Web of Science: not available