Evaluating gaze-based interface tools to facilitate point-and-select tasks with small targets

Henrik Skovsgaard, Julio C. Mateo, John Paulin Hansen

    Research output: Journal article · Research · Peer-reviewed


    Gaze interaction affords hands-free control of computers. Pointing to and selecting small targets using gaze alone is difficult because of the limited accuracy of gaze pointing. This is the first experimental comparison of gaze-based interface tools for small-target (e.g. <12 × 12 pixels) point-and-select tasks. We conducted two experiments comparing the performance of dwell, magnification and zoom methods in point-and-select tasks with small targets in single- and multiple-target layouts. Both magnification and zoom showed higher hit rates than dwell. Hit rates were higher when using magnification than when using zoom, but total pointing times were shorter using zoom. Furthermore, participants perceived magnification as more fatiguing than zoom. The higher accuracy of magnification makes it preferable when interacting with small targets. Our findings may guide the development of interface tools to facilitate access to mainstream interfaces for people with motor disabilities and other users in need of hands-free interaction.
    Original language: English
    Journal: Behaviour and Information Technology
    Issue number: 6
    Pages (from-to): 821-831
    Number of pages: 9
    Publication status: Published - 2011


