
3DGazeSim: 3D Gaze Estimation from 2D Pupil Positions on Monocular Head-Mounted Eye Trackers

We collected eye tracking data from 14 participants aged between 22 and 29 years. Ten recordings were collected from each participant: two (calibration and test) at each of five distances from a public display (1 m, 1.25 m, 1.5 m, 1.75 m, and 2 m). The display measured 121.5 cm × 68.7 cm. We used a 5×5 grid pattern to display 25 calibration points and an inner 4×4 grid to display 16 test points. A target marker was moved randomly across these grid positions while images were captured from the eye and scene cameras at 30 Hz. We then performed marker detection on the target points using the ArUco library to compute their 3D coordinates with respect to the scene camera. In addition, the 2D pupil centre position in each frame of the eye camera is provided by a state-of-the-art dark-pupil head-mounted eye tracker (PUPIL, software v0.5.4), which consists of a 1280×720 resolution scene camera and a 640×360 resolution eye camera.
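
For illustration, the two display grids could be generated as in the following sketch. The exact placement of the inner 4×4 test grid relative to the 5×5 calibration grid is an assumption here; consult the dataset README for the actual layout.

```python
import numpy as np

# Display dimensions from the recording setup (cm).
DISPLAY_W, DISPLAY_H = 121.5, 68.7

def grid_points(nx, ny, width, height):
    """Return nx*ny positions (cm) evenly spaced over a width x height area."""
    xs = np.linspace(0.0, width, nx)
    ys = np.linspace(0.0, height, ny)
    return np.array([(x, y) for y in ys for x in xs])

# 5x5 outer grid: 25 calibration points spanning the display.
calibration_points = grid_points(5, 5, DISPLAY_W, DISPLAY_H)

# 4x4 inner grid: 16 test points. Placing them at the centres of the
# outer grid's cells is an assumption, not the documented layout.
cell_w, cell_h = DISPLAY_W / 4, DISPLAY_H / 4
test_points = grid_points(4, 4, 3 * cell_w, 3 * cell_h) + [cell_w / 2, cell_h / 2]
```

The marker then visits each of these positions in random order during a recording.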

The data were collected in an indoor setting and add up to over 7 hours of eye tracking. The dataset includes per-frame ArUco marker tracking results for every recording, along with per-frame pupil tracking results from the PUPIL eye tracker for every eye video. We have also included the camera intrinsic parameters for both the eye and scene cameras, as well as post-processed results such as the frames corresponding to the gaze intervals for every grid point. For more information on the data format and how to use it, please refer to the README file inside the dataset. To access the raw videos from the scene and eye cameras, please contact the authors.
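
The intrinsic parameters allow, for example, projecting a marker's 3D coordinates back into scene-camera pixels for verification. A minimal pinhole-projection sketch follows; the intrinsic matrix below uses placeholder values, not the dataset's actual calibration, which ships with the data.

```python
import numpy as np

# Placeholder intrinsics for the 1280x720 scene camera (focal length and
# principal point are illustrative values only; use the calibration
# included in the dataset).
K = np.array([[700.0,   0.0, 640.0],
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])

def project(point_3d, K):
    """Project a 3D point in camera coordinates to pixel coordinates."""
    p = K @ np.asarray(point_3d, dtype=float)
    return p[:2] / p[2]  # perspective divide
```

A point straight ahead of the camera, e.g. `project([0.0, 0.0, 1.0], K)`, lands on the principal point.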

Our evaluations on this data show the effectiveness of our new 2D-to-3D mapping approach, together with calibration data collected at multiple depths, in reducing gaze estimation error.
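
The mapping approach itself is described in the paper below. Purely as a hedged illustration of the general idea, and not the authors' exact formulation, a regression from 2D pupil positions to 3D gaze directions could be sketched with second-order polynomial features:

```python
import numpy as np

def poly_features(p):
    """Second-order polynomial features of a 2D pupil position (x, y)."""
    x, y = p
    return np.array([1.0, x, y, x * y, x * x, y * y])

def fit_mapping(pupil_2d, gaze_dirs):
    """Least-squares fit from pupil features to 3D gaze direction components.

    pupil_2d:  sequence of (x, y) pupil positions from calibration.
    gaze_dirs: (N, 3) array of corresponding 3D gaze directions.
    """
    A = np.array([poly_features(p) for p in pupil_2d])
    W, *_ = np.linalg.lstsq(A, np.asarray(gaze_dirs, dtype=float), rcond=None)
    return W

def estimate_gaze(W, p):
    """Map a 2D pupil position to a unit 3D gaze direction."""
    g = poly_features(p) @ W
    return g / np.linalg.norm(g)
```

Calibration at multiple depths supplies the (pupil position, gaze direction) pairs for `fit_mapping`; the feature set and fitting procedure here are an assumption for illustration.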

More information on this dataset and the corresponding analysis can be found here.

Download: Please download the full dataset here (81.4 MB).

Contact: Andreas Bulling

The data is only to be used for non-commercial scientific purposes. If you use this dataset in a scientific publication, please cite the following paper:

  1. 3D Gaze Estimation from 2D Pupil Positions on Monocular Head-Mounted Eye Trackers

     Mohsen Mansouryar, Julian Steil, Yusuke Sugano, Andreas Bulling

     Proc. ACM International Symposium on Eye Tracking Research and Applications (ETRA), pp. 197-200, 2016.