Abstract
This paper investigates how eye tracking and
gaze estimation can help create better mixed-reality
personal computing systems involving
both physical (real-world) and virtual (digital)
objects. The role of gaze is discussed in light
of the situative space model (SSM), which
determines the set of objects (physical and
virtual) that a given human agent can
perceive, and act on, at any given moment in
time. The analysis and discussion result in
ideas for how to extend the SSM to
better incorporate the role of gaze in everyday
human activities, and for taking advantage of
emerging mobile eye tracking technology.
Original language | English |
---|---|
Title of host publication | IUI '11: Proceedings of the 16th international conference on Intelligent user interfaces : IUI'2011 |
Number of pages | 4 |
Publication date | 2011 |
Pages | 67-70 |
Publication status | Published - 2011 |
Event | IUI '11: 16th International Conference on Intelligent User Interfaces, Palo Alto, United States. Duration: 13 Feb 2011 → 16 Feb 2011. Conference number: 16 |
Conference
Conference | IUI '11: 16th International Conference on Intelligent User Interfaces |
---|---|
Number | 16 |
Country/Territory | United States |
City | Palo Alto |
Period | 13/02/2011 → 16/02/2011 |
Keywords
- Interaction paradigm
- Gaze tracking