Using Priors to Compensate Geometrical Problems in Head-Mounted Eye Trackers

Fabricio Batista Narcizo, Zaheer Ahmed, Dan Witzner Hansen

Research output: Contribution to conference (not published in proceedings or journal) › Conference abstract › Research › peer-review


The use of additional information (a.k.a. priors) to support the eye tracking process is presented as an alternative way to compensate for classical geometrical problems in head-mounted eye trackers. Priors can be obtained from several distinct sources, such as: sensors that collect information related to distance, location, luminance, movement, and speed; information extracted directly from the scene camera; calibration of video capture devices and other components of the eye tracker; and information collected from a fully controlled environment. Priors are thus used to improve the robustness of eye tracking in real applications. For example: (1) if the distance between the subject and the viewed target is known, it is possible to estimate the subject's current point of regard even when the target moves in depth and the estimate is affected by parallax error; and (2) if the three-dimensional head rotation is known, it is possible to compensate for the error induced by head rotations using linear regression. Experiments with simulated eye tracking data and in real elite-sports scenarios have shown that using priors to support eye tracking systems helps produce more accurate and precise gaze estimation, especially for uncalibrated head-mounted setups.
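The head-rotation compensation mentioned in point (2) can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes the three-dimensional head rotation (yaw, pitch, roll) is available as a prior and fits an ordinary least-squares map from rotation angles to the observed gaze error, which is then subtracted from the raw gaze estimate. All variable names and the synthetic error model are illustrative assumptions.

```python
# Hedged sketch: linear-regression compensation of head-rotation-induced
# gaze error, assuming head rotation angles are known as a prior.
import numpy as np

def fit_rotation_compensation(head_angles, gaze_errors):
    """Fit a linear map (bias + yaw/pitch/roll terms) from head rotation
    angles (N, 3) to observed gaze errors (N, 2); returns a (4, 2) matrix."""
    X = np.hstack([np.ones((len(head_angles), 1)), head_angles])
    coeffs, *_ = np.linalg.lstsq(X, gaze_errors, rcond=None)
    return coeffs

def compensate(raw_gaze, head_angles, coeffs):
    """Subtract the regression-predicted error from raw gaze estimates."""
    X = np.hstack([np.ones((len(head_angles), 1)), head_angles])
    return raw_gaze - X @ coeffs

# Synthetic example: a gaze error that grows linearly with yaw and pitch.
rng = np.random.default_rng(0)
angles = rng.uniform(-30, 30, size=(200, 3))          # yaw, pitch, roll (deg)
true_gaze = rng.uniform(-10, 10, size=(200, 2))       # ground-truth gaze (deg)
error = angles @ np.array([[0.05, 0.00],
                           [0.00, 0.08],
                           [0.01, 0.01]])
raw_gaze = true_gaze + error + rng.normal(0, 0.05, size=(200, 2))

coeffs = fit_rotation_compensation(angles, raw_gaze - true_gaze)
corrected = compensate(raw_gaze, angles, coeffs)
print("mean abs error, raw:      ", np.abs(raw_gaze - true_gaze).mean())
print("mean abs error, corrected:", np.abs(corrected - true_gaze).mean())
```

In practice the ground-truth error used to fit the regression would come from a brief calibration or validation phase; here it is simulated so the sketch is self-contained.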
Original language: English
Publication date: 23 Aug 2017
Publication status: Published - 23 Aug 2017
Event: 19th European Conference on Eye Movements - Bergische Universität Wuppertal, Wuppertal, Germany
Duration: 20 Aug 2017 - 24 Aug 2017
Conference number: 19th




