Body-Based User Interfaces

Paul Strohmeier, Henning Pohl, Jess Mcintosh, Aske Mottelson, Jarrod Knibbe, Yvonne Jansen, Joanna Bergström, Kasper Hornbæk

Research output: Conference Article in Proceeding or Book/Report chapter › Book chapter › Research › peer-review

Abstract

The relation between the body and computer interfaces has undergone several shifts since the advent of computing. In early models of interaction, the body was treated as a periphery to the mind, much like a keyboard is peripheral to a computer. The goal of interface designers was to optimize the information flow between the brain and the computer, using these imperfect peripheral devices. Toward the end of the previous century, the social body, as well as the material body and its physical manipulation skills, started receiving increased consideration from interaction designers. The goal of the interface designer shifted, requiring the designer to understand the role of the body in a given context, to adapt the interface to respect the social context, and to make use of the tacit knowledge that the body has of how the physical world functions. Currently, we are witnessing another shift in the role of the body. It is no longer merely something that requires consideration in interface design. Instead, advances in technology and our understanding of interaction allow the body to become part of the interface. We call these body-based user interfaces. We identify four ways in which the body becomes part of the interface: (1) The interaction might occur on or in the human body, for example using implanted tactile stimulation or touch interfaces on the body. Here the material of the body becomes part of the interface. (2) The interaction changes the morphology of the body and corresponding control structures, for example by providing users with additional skills, such as drawing or playing instruments, or additional limbs that help complete complex tasks. Here the shape of the body, or the corresponding ability to act, is affected by the interface. (3) The interaction engages with or modifies how we perceive the world, for example by manipulating the sense of direction in VR or allowing users to experience non-existent stimuli, such as mid-air friction. Here, the idiosyncrasies of multi-modal perception and perceptive acts become part of the user interface. Finally, (4) the interaction might engage with the experience of having a body, for example by manipulating the sense of body ownership, location, or agency. Here the introspective access to one’s own body is used in the design of the interface. In this chapter, we present a brief history of the body’s role in human–computer interaction, leading up to a definition of body-based user interfaces. We follow this by presenting examples of interfaces that reflect the different ways in which interfaces can be body-based. We conclude by presenting outlooks on benefits, drawbacks, and possible futures of body-based user interfaces.
Original language: English
Title of host publication: Routledge Handbook of Bodily Awareness
Editors: Adrian Alsmith, Matthew Longo
Number of pages: 24
Volume: 1
Place of publication: United Kingdom
Publisher: Routledge
Publication date: Nov 2022
Edition: 1
Pages: 478
Article number: 31
Chapter: 7
ISBN (Print): 9780367337315
Publication status: Published - Nov 2022

Keywords

  • Human-Computer Interaction
  • Body-Based User Interfaces
  • Tactile Stimulation
  • Morphological Augmentation
  • Perceptual Modulation
