Eye tracking in the wild

Arthur E C Pece, Dan Witzner Hansen

Research output: Book chapter in conference proceeding or book/report · Research · peer-reviewed


An active contour tracker is presented that can be used for gaze-based interaction with off-the-shelf components. The underlying contour model is based on image statistics and avoids explicit feature detection. The tracker combines particle filtering with the EM algorithm. The method is robust to light changes and camera defocusing; consequently, it is well suited for systems built from off-the-shelf hardware, but may equally well be used in controlled environments, such as IR-based settings. The method can even handle sudden changes between IR and non-IR light conditions without any change of parameters. For the purpose of determining where the user is looking, calibration is usually needed. The number of calibration points used in different methods varies from a few to several thousand, depending on the prior knowledge assumed about the setup and equipment. We examine basic properties of gaze determination when the geometry of the camera, screen, and user is unknown. In particular, we present a lower bound on the number of calibration points needed for gaze determination on planar objects, and we examine degenerate configurations. Based on this lower bound we apply a simple calibration procedure to facilitate gaze estimation. © 2004 Elsevier Inc. All rights reserved.
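To illustrate the calibration idea, one common way to model gaze determination on a planar object with unknown geometry is as a planar homography between measured eye features and screen coordinates; a non-degenerate homography is fixed by four point correspondences, which matches the flavour of the lower bound discussed above. The sketch below is illustrative only (the point values and function names are invented, not taken from the paper) and uses the standard direct linear transform:

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 homography H mapping src -> dst (homogeneous),
    via the direct linear transform (DLT). Needs >= 4 correspondences,
    with no three of the points collinear (a degenerate configuration)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The right singular vector of the smallest singular value solves A h = 0.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def map_point(H, p):
    """Apply H to a 2-D point via homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Hypothetical calibration data: pupil-centre image positions observed
# while the user fixates four known targets at the screen corners.
eye = [(120, 80), (200, 82), (205, 150), (118, 148)]
screen = [(0, 0), (1024, 0), (1024, 768), (0, 768)]

H = fit_homography(eye, screen)
gaze = map_point(H, (160, 115))  # map a new pupil position to the screen
```

With exactly four correspondences the 8 x 9 DLT system has an exact one-dimensional null space, so the calibration points are reproduced exactly; more points would give a least-squares fit instead.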
Translated title of the contribution: Eye tracking in the wild
Original language: English
Title of host publication: Eye tracking in the wild
Number of pages: 27
Publication date: 2005
ISBN (Print): 1077-3142
Publication status: Published - 2005


  • Condensation
  • Contour tracking
  • Expectation maximization
  • Eye tracking
  • Gaze estimation
  • Particle filter
  • Eye detection
  • Gaze tracking
  • Pupil detection
  • Eye localization
  • Eye movements
  • Bayesian model
