Abstract
People typically interact with information visualizations using a mouse. Their physical movement, orientation, and distance to visualizations are rarely used as input. We explore how to use such spatial relations among people and visualizations (i.e., proxemics) to drive interaction with visualizations, focusing here on the spatial relations between a single user and visualizations on a large display. We implement interaction techniques that zoom and pan, query and relate, and adapt visualizations based on tracking of users' position in relation to a large high-resolution display. Alternative prototypes are tested in three user studies and compared with baseline conditions that use a mouse. Our aim is to gain empirical data on the usefulness of a range of design possibilities and to generate more ideas. Among other things, the results show promise for changing zoom level or visual representation with the user's physical distance to a large display. We discuss possible benefits and potential issues to avoid when designing information visualizations that use proxemics.
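One of the techniques the abstract mentions is changing zoom level with the user's physical distance to the display. A minimal sketch of that idea (not the authors' implementation; all distance and zoom ranges here are assumed, purely illustrative values) could map a tracked distance to a zoom factor by linear interpolation, with closer positions yielding greater detail:

```python
def zoom_for_distance(distance_m, min_dist=0.5, max_dist=3.0,
                      min_zoom=1.0, max_zoom=8.0):
    """Map the user's distance from the display (in meters) to a zoom factor:
    standing far away gives an overview, stepping closer zooms in.

    min_dist/max_dist and min_zoom/max_zoom are illustrative parameters,
    not values from the paper.
    """
    # Clamp the tracked distance to the supported range.
    d = max(min_dist, min(max_dist, distance_m))
    # Linear interpolation: max_zoom at min_dist, min_zoom at max_dist.
    t = (d - min_dist) / (max_dist - min_dist)
    return max_zoom + t * (min_zoom - max_zoom)
```

A real system would feed this function per-frame positions from a user-tracking sensor and might smooth the output to avoid jitter; the paper's prototypes explore several such mappings against mouse baselines.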
| Original language | English |
|---|---|
| Journal | IEEE Transactions on Visualization and Computer Graphics |
| Volume | 19 |
| Issue number | 12 |
| Pages (from-to) | 2386-2395 |
| Number of pages | 10 |
| ISSN | 1077-2626 |
| DOI | |
| Status | Published - 2013 |
| Published externally | Yes |
Keywords
- data visualization
- navigation
- information filters
- encoding
- distance
- proxemics
- information visualization
- user study
- large displays
- user tracking
- movement
- orientation