Gaze-Based Controlling a Vehicle

Diako Mardanbeigi, Dan Witzner Hansen

Publication: Conference contribution - Not published in proceedings or journal › Paper › Research › peer review

Abstract

Research on and applications of gaze interaction have mainly been conducted on two-dimensional surfaces (usually screens) for controlling a computer or the movements of a robot. Emerging wearable and mobile technologies, such as Google Glass, may shift how gaze is used as an interactive modality if gaze trackers are embedded into head-mounted devices. The domain of gaze-based interactive applications expands dramatically once interaction is no longer constrained to 2D displays. This paper proposes a general framework for gaze-based control of a non-stationary robot (a vehicle) as an example of a complex gaze-based task in the environment. It discusses the possibilities and limitations of gaze interaction for controlling vehicles, not only with a remote gaze tracker but also in more challenging situations where both the user and the robot are mobile and the movements may involve several degrees of freedom (e.g. flying). A case study is also presented in which a mobile gaze tracker is used to control a Roomba vacuum cleaner.
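
The paper itself contains no code; purely as an illustrative sketch (the class, field names, and thresholds below are assumptions, not the authors' method), a mapping from head-mounted gaze-tracker output to drive commands for a simple ground vehicle such as a robot vacuum could look roughly like this, with horizontal gaze angle steering the vehicle and a downward glance acting as a crude pause gesture:

```python
import time
from dataclasses import dataclass

# Illustrative only: names, thresholds, and the command interface are assumptions.

@dataclass
class GazeSample:
    yaw_deg: float    # horizontal gaze angle relative to the vehicle heading
    pitch_deg: float  # vertical gaze angle
    timestamp: float

class GazeDriveController:
    """Maps gaze direction to (speed, turn_rate) commands for a ground vehicle."""

    def __init__(self, max_speed=0.3, max_turn=1.0, dead_zone_deg=5.0):
        self.max_speed = max_speed        # m/s when looking straight ahead
        self.max_turn = max_turn          # rad/s at the edge of the steering range
        self.dead_zone_deg = dead_zone_deg

    def command(self, sample: GazeSample):
        # Ignore small gaze deviations so the vehicle drives straight
        # while the user looks roughly ahead.
        yaw = sample.yaw_deg if abs(sample.yaw_deg) > self.dead_zone_deg else 0.0
        # Steer proportionally to the horizontal gaze angle, clamped to +/-30 degrees.
        turn = self.max_turn * max(-1.0, min(1.0, yaw / 30.0))
        # Slow down while turning sharply; stop when the user looks far down.
        if sample.pitch_deg < -20.0:
            speed = 0.0
        else:
            speed = self.max_speed * (1.0 - 0.5 * abs(turn) / self.max_turn)
        return speed, turn

if __name__ == "__main__":
    controller = GazeDriveController()
    # Fake gaze stream: look ahead, glance right, then look down to stop.
    for yaw, pitch in [(0.0, 0.0), (20.0, 0.0), (0.0, -30.0)]:
        speed, turn = controller.command(GazeSample(yaw, pitch, time.time()))
        print(f"gaze yaw={yaw:>6.1f}  ->  speed={speed:.2f} m/s, turn={turn:.2f} rad/s")
```

The same interface could in principle be extended with extra degrees of freedom (e.g. altitude for a flying vehicle), but how the paper's framework actually partitions gaze and head gestures across those degrees of freedom is not specified here.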
Original language: English
Publication date: 2013
Number of pages: 6
Status: Published - 2013
Event: CHI 2013: Gaze Interaction in the Post-WIMP World - Paris, France
Duration: 27 Apr 2013 – 27 Apr 2013
http://gaze-interaction.net/

Workshop

Workshop: CHI 2013
Country/Territory: France
City: Paris
Period: 27/04/2013 – 27/04/2013
Internet address

Keywords

  • Gaze-based interaction
  • robot
  • vehicle
  • craft
  • head gestures
  • eye tracking
  • driving
