Investigations of the Role of Gaze in Mixed-Reality Personal Computing

Thomas Pederson, Dan Witzner Hansen, Diako Mardanbeigi

    Publication: Conference article in proceedings or book/report chapter › Conference contribution in proceedings › Research › Peer-reviewed

    Abstract

    This paper investigates how eye tracking and gaze estimation can help create better mixed-reality personal computing systems involving both physical (real-world) and virtual (digital) objects. The role of gaze is discussed in the light of the situative space model (SSM), which determines the set of objects (physical and virtual) that a given human agent can perceive, and act on, at any given moment in time. The analysis and discussion result in ideas for how to extend the SSM model to better incorporate the role of gaze in everyday human activities and to take advantage of emerging mobile eye tracking technology.

    Original language: English
    Title: IUI '11: Proceedings of the 16th international conference on Intelligent user interfaces: IUI'2011
    Number of pages: 4
    Publication date: 2011
    Pages: 67-70
    Status: Published - 2011
    Event: IUI '11: 16th International Conference on Intelligent User Interfaces - Palo Alto, USA
    Duration: 13 Feb 2011 - 16 Feb 2011
    Conference number: 16

    Conference

    Conference: IUI '11: 16th International Conference on Intelligent User Interfaces
    Number: 16
    Country/Territory: USA
    City: Palo Alto
    Period: 13/02/2011 - 16/02/2011

    Keywords

    • Interaction paradigm
    • Gaze tracking
