Resized Grasping in VR: Estimating Thresholds for Object Discrimination

Joanna Bergström, Aske Mottelson, Jarrod Knibbe

Research output: Conference article in proceedings · Research · peer-review


Previous work in VR has demonstrated how individual physical objects can represent multiple virtual objects in different locations by redirecting the user's hand. We show how individual objects can represent multiple virtual objects of different sizes by resizing the user's grasp. We redirect the positions of the user's fingers with visual translation gains, inducing an illusion that can make physical objects seem larger or smaller. We present a discrimination experiment to estimate the thresholds for resizing virtual objects relative to physical objects without the user reliably noticing a difference. The results show that the size difference is easily detected when a physical object is used to represent an object less than 90% of its size. When physical objects represent larger virtual objects, however, the scaling is tightly coupled to the physical object's size: smaller physical objects allow more virtual resizing (up to a 50% larger virtual size). Resized Grasping considerably broadens the scope of using illusions to provide rich haptic experiences in virtual reality.
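The core idea of applying a visual translation gain to finger positions can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the centroid-based grasp center, and the uniform gain are all assumptions made for clarity.

```python
import numpy as np

def resize_grasp(finger_positions, gain):
    """Apply a visual translation gain to tracked finger positions.

    Scales each finger's offset from the grasp centroid by `gain`, so the
    rendered virtual fingers appear spread as if grasping an object `gain`
    times the physical object's size. The centroid-based formulation here
    is an illustrative assumption, not the authors' exact method.
    """
    finger_positions = np.asarray(finger_positions, dtype=float)
    centroid = finger_positions.mean(axis=0)  # approximate grasp center
    return centroid + gain * (finger_positions - centroid)

# Physical thumb and index tracked 4 cm apart (positions in meters);
# a gain of 1.5 renders them 6 cm apart, so the virtual object
# appears 50% larger than the physical prop.
physical = [[-0.02, 0.0, 0.0], [0.02, 0.0, 0.0]]
virtual = resize_grasp(physical, 1.5)
```

With a gain below 1, the same mapping shrinks the rendered grasp, making the physical prop stand in for a smaller virtual object.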
Original language: Undefined/Unknown
Title of host publication: Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery
Publication date: 2019
ISBN (Print): 9781450368162
Publication status: Published - 2019
Externally published: Yes
Series: UIST '19