Facial expression and gesture recognition algorithms are key enabling technologies for human-computer interaction (HCI) systems. State-of-the-art approaches for automatically detecting body movements and analyzing emotions from facial features rely heavily on advanced machine learning algorithms. Most of these methods are designed for the average user, but this one-size-fits-all assumption ignores diversity in cultural background, gender, ethnicity, and personal behavior, limiting their applicability in real-world scenarios. A possible solution is to build personalized interfaces, which in practice means learning person-specific classifiers and usually collecting a significant amount of labeled samples for each novel user. Since data annotation is a tedious and time-consuming process, in this paper we present a framework for personalizing classification models that does not require labeled target data. Personalization is achieved through a novel transfer learning approach. Specifically, we propose a regression framework that exploits auxiliary (source) annotated data to learn the relation between person-specific sample distributions and the parameters of the corresponding classifiers. When a new target user is considered, the classification model is computed by simply feeding the associated (unlabeled) sample distribution into the learned regression function. We evaluate the proposed approach in different applications: pain recognition and action unit detection using visual data, and gesture classification using inertial measurements, demonstrating the generality of our method with respect to different input data types and basic classifiers. We also show the advantages of our approach in terms of accuracy and computational time, both with respect to user-independent approaches and to previous personalization techniques.
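The core idea described in the abstract — learning a regression from person-specific (unlabeled) sample distributions to classifier parameters — can be illustrated with a minimal, hypothetical sketch. This is not the paper's actual formulation: the descriptor choice (per-feature mean and standard deviation), the ridge-regularized least-squares classifiers, and all function names (`user_descriptor`, `fit_param_regression`, etc.) are illustrative assumptions, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def user_descriptor(X):
    # Unlabeled distribution descriptor: per-feature mean and std.
    return np.concatenate([X.mean(axis=0), X.std(axis=0)])

def train_linear_classifier(X, y, lam=1e-2):
    # Ridge-regularized least-squares classifier on +/-1 labels (with bias).
    Xb = np.hstack([X, np.ones((len(X), 1))])
    d = Xb.shape[1]
    return np.linalg.solve(Xb.T @ Xb + lam * np.eye(d), Xb.T @ y)

def fit_param_regression(D, W, lam=1e-2):
    # Ridge regression mapping user descriptors (rows of D)
    # to the parameters of the corresponding classifiers (rows of W).
    Db = np.hstack([D, np.ones((len(D), 1))])
    k = Db.shape[1]
    return np.linalg.solve(Db.T @ Db + lam * np.eye(k), Db.T @ W)

def predict_params(R, d):
    # Classifier parameters for an unseen user, from unlabeled data only.
    return np.append(d, 1.0) @ R

def make_user(shift):
    # Synthetic "person": two Gaussian classes with a user-specific offset.
    X0 = rng.normal(loc=-1 + shift, scale=0.5, size=(40, 2))
    X1 = rng.normal(loc=+1 + shift, scale=0.5, size=(40, 2))
    X = np.vstack([X0, X1])
    y = np.concatenate([-np.ones(40), np.ones(40)])
    return X, y

# Source users: labeled data available, so we can train per-user classifiers.
shifts = rng.normal(scale=0.5, size=10)
users = [make_user(s) for s in shifts]
D = np.array([user_descriptor(X) for X, _ in users])
W = np.array([train_linear_classifier(X, y) for X, y in users])
R = fit_param_regression(D, W)

# Target user: only unlabeled samples are used to obtain the classifier.
Xt, yt = make_user(0.8)
w_t = predict_params(R, user_descriptor(Xt))
pred = np.sign(np.hstack([Xt, np.ones((len(Xt), 1))]) @ w_t)
acc = (pred == yt).mean()
print(f"target-user accuracy: {acc:.2f}")
```

The target labels `yt` are used only to score the predicted classifier, never to train it, mirroring the unsupervised-personalization setting.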
Learning Personalized Models for Facial Expression Analysis and Gesture Recognition / Zen, Gloria; Porzi, Lorenzo; Sangineto, Enver; Ricci, Elisa; Sebe, Niculae. - In: IEEE TRANSACTIONS ON MULTIMEDIA. - ISSN 1520-9210. - 18:4(2016), pp. 775-788.
|Title:|Learning Personalized Models for Facial Expression Analysis and Gesture Recognition|
|Authors:|Zen, Gloria; Porzi, Lorenzo; Sangineto, Enver; Ricci, Elisa; Sebe, Niculae|
|Journal:|IEEE TRANSACTIONS ON MULTIMEDIA|
|Year of publication:|2016|
|Issue:|4|
|Scopus identifier:|2-s2.0-84963858085|
|ISI identifier:|WOS:000372790300018|
|Digital Object Identifier (DOI):|http://dx.doi.org/10.1109/TMM.2016.2523421|
|Citation:|Learning Personalized Models for Facial Expression Analysis and Gesture Recognition / Zen, Gloria; Porzi, Lorenzo; Sangineto, Enver; Ricci, Elisa; Sebe, Niculae. - In: IEEE TRANSACTIONS ON MULTIMEDIA. - ISSN 1520-9210. - 18:4(2016), pp. 775-788.|
|Appears in collections:|03.1 Journal article|
Files in this record:
|TMM2016.pdf|Publisher's layout version|All rights reserved|Administrator|
|Learning Personalized Models-TMM2016-1.pdf|Refereed author's manuscript (post-print)|All rights reserved|Open Access|