Touch-less interaction with medical images using hand & foot gestures

Shahram Jalaliniya, Jeremiah Smith, Miguel Sousa, Lars Büthe, Thomas Pederson

    Research output: Conference Article in Proceeding or Book/Report chapter · Article in proceedings · Research · peer-review

    Abstract

    Sterility restrictions in surgical settings make touch-less interaction an attractive solution for surgeons who need to interact directly with digital images. The HCI community has already explored several methods for touch-less interaction, including camera-based gesture tracking and voice control. In this paper, we present a system for gesture-based interaction with medical images that relies on a single wristband sensor and capacitive floor sensors, allowing for both hand and foot gesture input. A first, limited evaluation showed an acceptable level of accuracy for 12 different hand & foot gestures, and users found the combined hand and foot gestures intuitive for providing input.
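
    The abstract describes combining hand gestures from a single wristband sensor with foot gestures from capacitive floor sensors to issue commands to a medical image viewer. The sketch below is a minimal, hypothetical illustration of how such two-modality input could be fused into viewer commands; the gesture labels, command names, and time-window logic are assumptions and do not reflect the paper's actual gesture set or implementation.

    ```python
    # Hypothetical sketch (not the authors' implementation): fuse a foot gesture
    # from a capacitive floor sensor with a hand gesture from a wristband sensor
    # into a single touch-less image-viewer command.

    from dataclasses import dataclass

    # Assumed mapping: the foot gesture selects the mode, the hand gesture the action.
    COMMAND_MAP = {
        ("left_tap", "swipe_left"): "previous_image",
        ("left_tap", "swipe_right"): "next_image",
        ("right_tap", "rotate_cw"): "zoom_in",
        ("right_tap", "rotate_ccw"): "zoom_out",
        ("both_heels", "fist"): "lock_view",
    }

    @dataclass
    class GestureEvent:
        source: str      # "wristband" or "floor"
        label: str       # classifier output, e.g. "swipe_left" or "left_tap"
        timestamp: float # seconds

    def fuse(foot: GestureEvent, hand: GestureEvent, window_s: float = 1.0):
        """Return a viewer command if the two gestures co-occur within a time window."""
        if abs(foot.timestamp - hand.timestamp) > window_s:
            return None
        return COMMAND_MAP.get((foot.label, hand.label))

    if __name__ == "__main__":
        foot = GestureEvent("floor", "left_tap", 12.30)
        hand = GestureEvent("wristband", "swipe_right", 12.45)
        print(fuse(foot, hand))  # -> "next_image"
    ```

    Treating the foot gesture as a mode selector and the hand gesture as the action is only one plausible way to combine the two input channels; the paper itself defines 12 hand & foot gestures whose exact pairing is not given in the abstract.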
    Original language: English
    Title of host publication: UbiComp '13 Adjunct: Proceedings of the 2013 ACM conference on Pervasive and ubiquitous computing adjunct publication
    Number of pages: 10
    Publisher: Association for Computing Machinery
    Publication date: 2013
    ISBN (Print): 978-1-4503-2215-7
    DOIs
    Publication status: Published - 2013

    Keywords

    • gesture-based interaction
    • touch-less interaction in hospital
    • wearable sensor
    • floor sensor
