Towards Wearable Gaze Supported Augmented Cognition

Andrew Toshiaki Kurauchi, Carlos Hitoshi Morimoto, Diako Mardanbeigi, Dan Witzner Hansen

    Publication: Conference contribution - NOT published in proceedings or journal › Paper › Research › peer review

    Abstract

    Augmented cognition applications must deal with the problem of how to present information in an orderly, understandable, and timely fashion. Though context has been suggested as a way to control the kind, amount, and timing of the information delivered, we argue that gaze can be a fundamental tool to reduce the amount of information and provide an appropriate mechanism for low and divided attention interaction. We claim that most current gaze interaction paradigms are not appropriate for wearable computing because they are not designed for divided attention. We have used principles suggested by the wearable computing community to develop a gaze supported augmented cognition application with three interaction modes. The application provides information about the person being looked at. The continuous mode updates the information every time the user looks at a different face. The key activated discrete mode and the head gesture activated mode only update the information when the key is pressed or the gesture is performed. A prototype of the system is currently under development and will be used to further investigate these claims.
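
    The update policy of the three interaction modes described above can be sketched as a simple dispatch function. This is an illustrative sketch only; the names (`Mode`, `should_update`) and signatures are assumptions, not taken from the paper's actual prototype.

    ```python
    from enum import Enum

    class Mode(Enum):
        """Hypothetical labels for the paper's three interaction modes."""
        CONTINUOUS = "continuous"
        KEY_DISCRETE = "key"
        HEAD_GESTURE = "gesture"

    def should_update(mode, face_changed=False, key_pressed=False,
                      gesture_detected=False):
        """Decide whether to refresh the information displayed about
        the person currently being looked at."""
        if mode is Mode.CONTINUOUS:
            # Continuous mode: update whenever gaze lands on a different face.
            return face_changed
        if mode is Mode.KEY_DISCRETE:
            # Discrete mode: update only on an explicit key press.
            return key_pressed
        if mode is Mode.HEAD_GESTURE:
            # Gesture mode: update only when the head gesture is performed.
            return gesture_detected
        return False
    ```

    The explicit-trigger modes (key and gesture) reflect the divided-attention argument in the abstract: the wearer, not the system, decides when new information appears.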
    Original language: English
    Publication date: 2013
    Status: Published - 2013
    Event: CHI 2013: Gaze Interaction in the Post-WIMP World - Paris, France
    Duration: 27 Apr 2013 - 27 Apr 2013
    http://gaze-interaction.net/

    Workshop

    Workshop: CHI 2013
    Country/Territory: France
    City: Paris
    Period: 27/04/2013 - 27/04/2013

    Keywords

    • gaze interaction
    • wearable computing
    • augmented cognition

