Visual Gaze Estimation by Joint Head and Eye Information

Valenti, R.; Lablack, A.; Sebe, N.; Djeraba, C.; Gevers, T.
2010-01-01

Abstract

In this paper, we present an unconstrained visual gaze estimation system. The proposed method extracts the visual field of view of a person looking at a target scene in order to estimate the approximate location of interest (visual gaze). The novelty of the system lies in the joint use of head pose and eye location information to fine-tune the visual gaze estimated from the head pose alone, so that the system can be used in multiple scenarios. The improvements obtained by the proposed approach are validated on the Boston University head pose dataset, on which the standard deviation of the joint visual gaze estimate improved by 61.06% horizontally and 52.23% vertically with respect to the gaze estimated from the head pose alone. A user study shows the potential of the proposed system. © 2010 IEEE.
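
Only the abstract is available in this record. As a rough illustration of the idea it describes (refining a coarse head-pose gaze direction with eye location information), the Python sketch below projects the combined direction onto a frontal target plane. The function name, the additive yaw/pitch combination, and all parameters are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def gaze_point(head_yaw, head_pitch, eye_yaw, eye_pitch, distance):
    """Estimate the point of regard on a frontal plane at `distance`.

    Head pose alone gives a coarse gaze direction; the eye-in-head
    angles refine it. All angles are in radians; the result is in the
    same units as `distance`. The additive model here is an assumption
    made for illustration only.
    """
    yaw = head_yaw + eye_yaw        # refined horizontal gaze angle
    pitch = head_pitch + eye_pitch  # refined vertical gaze angle
    x = distance * np.tan(yaw)      # horizontal offset on the plane
    y = distance * np.tan(pitch)    # vertical offset on the plane
    return x, y

# Example: head turned 10 degrees right, eyes a further 5 degrees right,
# target plane 60 cm in front of the subject.
x, y = gaze_point(np.deg2rad(10), 0.0, np.deg2rad(5), 0.0, 60.0)
print(f"estimated point of regard: ({x:.1f} cm, {y:.1f} cm)")
```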
2010
Proceedings of the 2010 20th International Conference on Pattern Recognition
Washington, DC
IEEE Computer Society
ISBN: 9780769541099
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/84623
Citazioni
  • Scopus 13
  • OpenAlex 12