TY - PAT
T1 - Computer-implemented gaze interaction method and apparatus
AU - Hansen, Dan Witzner
AU - Mardanbegi, Diako
N1 - Application number: US201515126596 20150316
Priority number(s): DKPA201470128 20140317 ; WO2015EP55435 20150316
PY - 2017/05/04
Y1 - 2017/05/04
N2 - A computer-implemented method of communicating via interaction with a user-interface based on a person's gaze and gestures, comprising: computing an estimate of the person's gaze comprising computing a point-of-regard on a display through which the person observes a scene in front of him; by means of a scene camera, capturing a first image of a scene in front of the person's head (and at least partially visible on the display) and computing the location of an object coinciding with the person's gaze; by means of the scene camera, capturing at least one further image of the scene in front of the person's head, and monitoring whether the gaze dwells on the recognised object; and while gaze dwells on the recognised object: firstly, displaying a user interface element, with a spatial expanse, on the display face in a region adjacent to the point-of-regard; and secondly, during movement of the display, awaiting and detecting the event that the point-of-regard coincides with the spatial expanse of the displayed user interface element. The event may be processed by communicating a message.
KW - Gaze estimation
KW - User-interface interaction
KW - Point-of-regard
KW - Scene camera
KW - Gesture-based communication
UR - https://worldwide.espacenet.com/publicationDetails/biblio?II=0&ND=3&adjacent=true&locale=en_EP&FT=D&date=20170504&CC=US&NR=2017123491A1&KC=A1
UR - https://www.google.com/patents/US20170123491
M3 - Patent
M1 - PCT/EP2015/055435
ER -