Investigations of the Role of Gaze in Mixed-Reality Personal Computing

Thomas Pederson, Dan Witzner Hansen, Diako Mardanbeigi

    Research output: Article in proceedings · Research · peer-review

    Abstract

    This paper investigates how eye tracking and gaze estimation can help create better mixed-reality personal computing systems involving both physical (real-world) and virtual (digital) objects. The role of gaze is discussed in the light of the situative space model (SSM), which determines the set of objects (physical and virtual) that a given human agent can perceive, and act on, at any given moment in time. The analysis and discussion result in ideas for how to extend the SSM model to better incorporate the role of gaze in everyday human activities, and for taking advantage of emerging mobile eye tracking technology.
    Original language: English
    Title of host publication: IUI '11: Proceedings of the 16th International Conference on Intelligent User Interfaces
    Number of pages: 4
    Publication date: 2011
    Pages: 67-70
    Publication status: Published - 2011
    Event: IUI '11: 16th International Conference on Intelligent User Interfaces - Palo Alto, United States
    Duration: 13 Feb 2011 – 16 Feb 2011
    Conference number: 16

    Conference

    Conference: IUI '11: 16th International Conference on Intelligent User Interfaces
    Number: 16
    Country/Territory: United States
    City: Palo Alto
    Period: 13/02/2011 – 16/02/2011

    Keywords

    • Interaction paradigm
    • Gaze tracking
