Abstract
This paper investigates how eye tracking and
gaze estimation can help create better mixed-reality
personal computing systems involving
both physical (real-world) and virtual (digital)
objects. The role of gaze is discussed in light
of the situative space model (SSM), which
determines the set of objects (physical and
virtual) that a given human agent can
perceive, and act on, at any given moment in
time. The analysis and discussion result in
ideas for extending the SSM to
better incorporate the role of gaze in everyday
human activities, and for taking advantage of
emerging mobile eye-tracking technology.
Original language | English |
---|---|
Title | IUI '11: Proceedings of the 16th International Conference on Intelligent User Interfaces: IUI'2011 |
Number of pages | 4 |
Publication date | 2011 |
Pages | 67-70 |
Status | Published - 2011 |
Event | IUI '11: 16th International Conference on Intelligent User Interfaces - Palo Alto, USA. Duration: 13 Feb 2011 → 16 Feb 2011. Conference number: 16 |
Conference
Conference | IUI '11: 16th International Conference on Intelligent User Interfaces |
---|---|
Number | 16 |
Country/Territory | USA |
City | Palo Alto |
Period | 13/02/2011 → 16/02/2011 |