Abstract
While the new generation of eyewear computers has raised expectations of wearable computers, providing input to these devices remains challenging. Hand-held devices, voice commands, and hand gestures have already been explored as input methods for wearable devices. In this paper, we examine the use of head and eye movements for pointing on the graphical user interface of a wearable computer. Users' performance with head and eye pointing was compared against mouse pointing as a baseline. The results of our experiment show that eye pointing is significantly faster than head or mouse pointing; however, our participants perceived head pointing as more accurate and convenient.
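A minimal sketch of the head-pointing idea described above, assuming head orientation (yaw/pitch from a head-mounted sensor) is mapped linearly onto cursor coordinates on the eyewear display. The screen resolution, angular range, and function names below are illustrative assumptions, not details taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical display and mapping parameters (not from the paper).
SCREEN_W, SCREEN_H = 1280, 720      # display resolution in pixels
FOV_H_DEG, FOV_V_DEG = 30.0, 17.0   # head-rotation range mapped onto the display


@dataclass
class HeadPose:
    yaw_deg: float    # left/right rotation; 0 = looking at screen centre
    pitch_deg: float  # up/down rotation; 0 = looking at screen centre


def head_pose_to_cursor(pose: HeadPose) -> tuple[int, int]:
    """Map head yaw/pitch linearly onto pixel coordinates, clamped to the screen."""
    # Normalise each angle to [0, 1] over the assumed angular range.
    nx = (pose.yaw_deg / (FOV_H_DEG / 2) + 1.0) * 0.5
    ny = (pose.pitch_deg / (FOV_V_DEG / 2) + 1.0) * 0.5
    # Screen y grows downward, so invert the pitch axis.
    x = min(max(nx * SCREEN_W, 0), SCREEN_W - 1)
    y = min(max((1.0 - ny) * SCREEN_H, 0), SCREEN_H - 1)
    return int(x), int(y)


if __name__ == "__main__":
    # Looking 5 degrees to the right and 2 degrees above centre.
    print(head_pose_to_cursor(HeadPose(yaw_deg=5.0, pitch_deg=2.0)))
```

Eye pointing would replace the head-pose source with gaze angles from an eye tracker while keeping the same screen mapping; the comparison in the paper concerns which input signal drives the cursor, not the mapping itself.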
Original language | English |
---|---|
Title | Proceedings of the 2014 11th International Conference on Wearable and Implantable Body Sensor Networks Workshops |
Number of pages | 4 |
Publisher | IEEE Computer Society Press |
Publication date | 19 Jun 2014 |
ISBN (Print) | 978-1-4799-6136-8 |
Status | Published - 19 Jun 2014 |
Keywords
- Eye pointing
- Gaze tracking
- Head pointing
- Head tracking
- Head-mounted display
- Wearable computing
Projects
- 1 Finished
- iCareNet: Intelligent Context-Aware Systems for Healthcare, Wellness, and Assisted Living
  Bardram, J. (PI), Houben, S. (CoI), Pederson, T. (CoI) & Jalaliniya, S. (CoI)
  01/01/2011 → 31/12/2014
  Projects: Project › Research