Abstract
A head-mounted wireless gaze tracker in the form of gaze tracking glasses is used here for continuous and mobile monitoring of a subject's point of regard on the surrounding environment. We combine gaze tracking and hand gesture recognition to allow a subject to interact with objects in the environment by gazing at them and controlling them with hand gesture commands. The gaze tracking glasses were built from low-cost hardware consisting of a safety glasses frame and wireless eye tracking and scene cameras. An open source gaze estimation algorithm is used for eye tracking and estimation of the user's gaze, and a visual marker recognition library identifies objects in the environment through the scene camera. A hand gesture classification algorithm recognizes hand-based control commands. Combined, these elements form a system that permits a subject to move freely in an environment, select the object they want to interact with using gaze (identification) and transmit a command to it by performing a hand gesture (control). The system identifies the target for interaction by means of visual markers. This innovative HCI paradigm opens up new forms of interaction with objects in smart environments.
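The abstract does not name the specific libraries or the fusion logic, so the following is only a minimal sketch of how the gaze-based identification step and the gesture-based control step could be combined for each scene-camera frame. All identifiers here (`Marker`, `select_object`, `interaction_step`, `send_command`) and the point-in-polygon hit test are hypothetical placeholders standing in for the authors' gaze estimator, visual marker library and gesture classifier.

```python
# Hypothetical sketch of the gaze + gesture interaction loop described in the
# abstract; none of these names come from the paper.
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]          # pixel coordinates in the scene image


@dataclass
class Marker:
    object_id: int                   # identity of the tagged object
    polygon: List[Point]             # marker corners in the scene image


def point_in_polygon(p: Point, poly: List[Point]) -> bool:
    """Ray-casting test: does the gaze point fall inside the marker outline?"""
    x, y = p
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside


def select_object(gaze: Point, markers: List[Marker]) -> Optional[int]:
    """Identification: the object whose marker contains the current gaze point."""
    for m in markers:
        if point_in_polygon(gaze, m.polygon):
            return m.object_id
    return None


def interaction_step(gaze: Point, markers: List[Marker],
                     gesture: Optional[str], send_command) -> Optional[int]:
    """One frame of the loop: gaze selects the target, a recognized gesture controls it."""
    target = select_object(gaze, markers)
    if target is not None and gesture is not None:
        send_command(target, gesture)        # e.g. "toggle", "volume_up"
    return target


# Toy usage: one marker in view, gaze resting on it, a "toggle" gesture recognized.
if __name__ == "__main__":
    lamp = Marker(object_id=7, polygon=[(100, 100), (200, 100),
                                        (200, 200), (100, 200)])
    interaction_step(gaze=(150, 160), markers=[lamp], gesture="toggle",
                     send_command=lambda oid, cmd: print(f"object {oid}: {cmd}"))
```

In this sketch the gazed-at object is re-evaluated on every frame and a command is forwarded only when a gesture label is present, mirroring the identification/control split described in the abstract.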
Original language | English |
---|---|
Journal | Journal of EMDR Practice and Research |
Volume | 6 |
Issue number | 3 |
ISSN | 1933-3196 |
Publication status | Published - 2013 |
Event | 17th European Conference on Eye Movements, AF Borgen, Lund, Sweden, 11 Aug 2013 → 16 Aug 2013, http://ecem2013.eye-movements.org |
Conference
Conference | 17th European Conference on Eye Movements |
---|---|
Location | AF Borgen |
Country/Territory | Sweden |
City | Lund |
Period | 11/08/2013 → 16/08/2013 |
Internet address | http://ecem2013.eye-movements.org |
Keywords
- gaze tracking
- hand gesture recognition
- human-computer interaction
- smart environments
- visual markers recognition