Interacting with Objects in the Environment by Gaze and Hand Gestures

Jeremy Hales, Diako Mardanbeigi, David Rozado

    Publication: Conference abstract in journal › Research › peer review


    A head-mounted wireless gaze tracker in the form of gaze tracking glasses is used here for continuous and mobile monitoring of a subject's point of regard on the surrounding environment. We combine gaze tracking and hand gesture recognition to allow a subject to interact with objects in the environment by gazing at them, and controlling the object using hand gesture commands. The gaze tracking glasses were made from low-cost hardware consisting of a safety glasses' frame and wireless eye tracking and scene cameras. An open source gaze estimation algorithm is used for eye tracking and estimation of the user's gaze. A visual marker recognition library is used to identify objects in the environment through the scene camera. A hand gesture classification algorithm is used to recognize hand-based control commands. When all these elements are combined, the resulting system permits a subject to move freely in an environment, select the object to interact with using gaze (identification), and transmit a command to it by performing a hand gesture (control). The system identifies the target for interaction by using visual markers. This innovative HCI paradigm opens up new forms of interaction with objects in smart environments.
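    The identification/control split described in the abstract can be sketched as a small selection-and-dispatch loop. This is a minimal illustration, not the authors' implementation: the names (`Marker`, `select_target`, `COMMANDS`, `interact`) and the gesture-to-command mapping are all assumptions made for the example.

    ```python
    # Hypothetical sketch of combining gaze-based selection with
    # gesture-based control; names and mappings are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class Marker:
        """A visual marker detected in the scene camera image."""
        object_id: str
        x: float
        y: float
        w: float
        h: float  # bounding box in scene-image coordinates

    def select_target(gaze, markers):
        """Identification: return the marked object the gaze point falls on, if any."""
        gx, gy = gaze
        for m in markers:
            if m.x <= gx <= m.x + m.w and m.y <= gy <= m.y + m.h:
                return m.object_id
        return None

    # Control: map classified hand gestures to device commands (assumed mapping).
    COMMANDS = {"swipe_up": "power_on", "swipe_down": "power_off"}

    def interact(gaze, markers, gesture):
        """Combine gaze selection with a gesture command into one interaction."""
        target = select_target(gaze, markers)
        if target is None or gesture not in COMMANDS:
            return None
        return (target, COMMANDS[gesture])
    ```

    For example, with a marker for a lamp at (100, 60) with size 50x40, a gaze point of (120, 80) together with a `"swipe_up"` gesture would yield `("lamp", "power_on")`, while a gaze point outside every marker yields no interaction.
    
    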
    Journal: Journal of E M D R Practice and Research
    Issue number: 3
    Status: Published - 2013
    Event: 17th European Conference on Eye Movements - AF Borgen, Lund, Sweden
    Duration: 11 Aug 2013 - 16 Aug 2013



