Towards Wearable Gaze Supported Augmented Cognition

Andrew Toshiaki Kurauchi, Carlos Hitoshi Morimoto, Diako Mardanbeigi, Dan Witzner Hansen

    Research output: Contribution to conference (not published in proceedings or journal) › Paper › Research › peer-review

    Abstract

    Augmented cognition applications must deal with the problem of how to present information in an orderly, understandable, and timely fashion. Though context has been suggested as a means to control the kind, amount, and timing of the information delivered, we argue that gaze can be a fundamental tool to reduce the amount of information and provide an appropriate mechanism for low- and divided-attention interaction. We claim that most current gaze interaction paradigms are not appropriate for wearable computing because they are not designed for divided attention. We have used principles suggested by the wearable computing community to develop a gaze supported augmented cognition application with three interaction modes. The application provides information about the person being looked at. The continuous mode updates the information every time the user looks at a different face. The key-activated discrete mode and the head-gesture-activated mode update the information only when the key is pressed or the gesture is performed. A prototype of the system is currently under development and will be used to further investigate these claims.
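
    To make the three update policies concrete, here is a minimal Python sketch of when each mode would refresh the information shown about the person being looked at. The paper does not specify an implementation or API; all names here (GazeEvent, InfoUpdater, display) are hypothetical and the sketch is illustrative only.

    ```python
    # Hypothetical sketch of the three interaction modes described in the
    # abstract: continuous, key-activated discrete, and head-gesture-activated.
    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class GazeEvent:
        face_id: Optional[str]        # identity of the face under gaze, if any
        key_pressed: bool = False     # trigger for the discrete (key) mode
        head_gesture: bool = False    # trigger for the gesture mode

    class InfoUpdater:
        def __init__(self, mode: str, display: Callable[[str], None]):
            self.mode = mode          # "continuous", "key", or "gesture"
            self.display = display    # callback that shows info for a face
            self.last_face: Optional[str] = None

        def on_gaze(self, event: GazeEvent) -> None:
            if event.face_id is None:
                return
            if self.mode == "continuous":
                # Refresh whenever gaze lands on a different face.
                if event.face_id != self.last_face:
                    self.display(event.face_id)
                    self.last_face = event.face_id
            elif self.mode == "key" and event.key_pressed:
                # Refresh only on an explicit key press.
                self.display(event.face_id)
            elif self.mode == "gesture" and event.head_gesture:
                # Refresh only when a head gesture is detected.
                self.display(event.face_id)

    # Usage: continuous mode updates only when a new face is fixated.
    updater = InfoUpdater("continuous", display=lambda f: print(f"info for {f}"))
    updater.on_gaze(GazeEvent(face_id="alice"))   # update
    updater.on_gaze(GazeEvent(face_id="alice"))   # same face: no update
    updater.on_gaze(GazeEvent(face_id="bob"))     # different face: update
    ```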
    Original language: English
    Publication date: 2013
    Publication status: Published - 2013
    Event: CHI 2013: Gaze Interaction in the Post-WIMP World - Paris, France
    Duration: 27 Apr 2013 – 27 Apr 2013
    http://gaze-interaction.net/

    Workshop

    Workshop: CHI 2013
    Country/Territory: France
    City: Paris
    Period: 27/04/2013 – 27/04/2013
    Internet address: http://gaze-interaction.net/

    Keywords

    • gaze interaction
    • wearable computing
    • augmented cognition
