
GazeProjector: Location-independent gaze interaction on and across multiple displays

Christian Lander, Sven Gehring, Antonio Krüger, Sebastian Boring, Andreas Bulling

DFKI Research Reports, pp. 1–10, 2015.


Abstract

Mobile gaze-based interaction with multiple displays may occur from arbitrary positions and orientations. However, maintaining high gaze estimation accuracy still represents a significant challenge. To address this, we present GazeProjector, a system that combines accurate point-of-gaze estimation with natural feature tracking on displays to determine the mobile eye tracker's position relative to a display. The detected eye positions are transformed onto that display, allowing for gaze-based interaction. This enables seamless gaze estimation and interaction (1) on multiple displays of arbitrary sizes and (2) independently of the user's position and orientation relative to the display. In a user study with 12 participants, we compared GazeProjector to existing, well-established methods such as visual on-screen markers and a state-of-the-art motion capture system. Our results show that our approach is robust to varying head poses, orientations, and distances to the display, while still providing high gaze estimation accuracy across multiple displays without re-calibration. The system represents an important step towards the vision of pervasive gaze-based interfaces.
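
The mapping step described in the abstract can be illustrated with a short sketch: natural features are matched between the eye tracker's scene camera image and the known display content, a planar homography is estimated from those matches, and the gaze point is projected through it into display coordinates. This is a minimal Python/OpenCV illustration of that idea; the function names and the ORB/RANSAC pipeline are assumptions chosen for clarity, not the authors' actual implementation.

```python
# Sketch of a GazeProjector-style gaze mapping: match natural features
# between the scene camera frame and the display content, estimate a
# planar homography, and project the gaze point into display coordinates.
# The pipeline below (ORB + brute-force matching + RANSAC) is an
# illustrative assumption, not the authors' implementation.

import cv2
import numpy as np

def estimate_display_homography(scene_frame, display_content):
    """Estimate a homography mapping scene-camera pixels to display pixels."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp_s, des_s = orb.detectAndCompute(scene_frame, None)
    kp_d, des_d = orb.detectAndCompute(display_content, None)
    if des_s is None or des_d is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_s, des_d)
    if len(matches) < 4:
        return None  # a homography needs at least 4 correspondences
    src = np.float32([kp_s[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_d[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

def project_gaze(gaze_xy, H):
    """Map a 2D gaze point from scene-camera to display coordinates."""
    pt = np.float32([[gaze_xy]])  # shape (1, 1, 2) for perspectiveTransform
    return cv2.perspectiveTransform(pt, H)[0, 0]
```

Because the scene-to-display mapping is re-estimated continuously and per display, such a pipeline can follow gaze across displays of different sizes from varying positions without re-calibrating the eye tracker, which is the property the abstract emphasizes.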

Links

Video: https://www.youtube.com/watch?v=peuL4WRfrRM

BibTeX

@techreport{lander15_techrep,
  author      = {Lander, Christian and Gehring, Sven and Kr{\"{u}}ger, Antonio and Boring, Sebastian and Bulling, Andreas},
  title       = {GazeProjector: Location-independent gaze interaction on and across multiple displays},
  volume      = {1},
  year        = {2015},
  pages       = {1--10},
  institution = {German Research Center for Artificial Intelligence (DFKI)},
  video       = {https://www.youtube.com/watch?v=peuL4WRfrRM}
}