Robust Recognition of Reading Activity in Transit Using Wearable Electrooculography
Abstract
This dataset was recorded to investigate the problem of recognising reading activity from eye movements. The experimental setup was designed with two main objectives in mind: (1) to record eye movements in an unobtrusive manner in a mobile real-world setting, and (2) to evaluate how well reading can be recognised for persons in transit. We defined a scenario of travelling to and from work containing a semi-naturalistic set of reading activities. It involved subjects reading freely chosen text without pictures while engaged in a sequence of activities such as sitting at a desk, walking along a corridor, walking along a street, waiting at a tram stop and riding a tram.
Download (20.2 MB)
The data is only to be used for non-commercial scientific purposes. If you use this dataset in a scientific publication, please cite the following paper:
- Multimodal Recognition of Reading Activity in Transit Using Body-Worn Sensors. ACM Transactions on Applied Perception (TAP), 9 (1), pp. 1–21, 2012.
- Robust Recognition of Reading Activity in Transit Using Wearable Electrooculography. Proc. International Conference on Pervasive Computing (Pervasive), pp. 19–37, 2008.
The dataset has the following characteristics:
* ~6 hours of eye movement data recorded using a wearable Electrooculography (EOG) system
* 8 participants (4 female, 4 male), aged between 23 and 35 years
* 4 experimental runs for each participant: calibration (walking around a circular corridor for approximately 2 minutes while reading continuously), baseline (walk and tram ride to and from work without any reading), and two runs of reading in the same scenario
* separate horizontal and vertical EOG channels, joint sampling frequency of 128 Hz
* fully ground-truth annotated (reading vs. not reading) using a wireless Wii Remote controller
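As an illustration of how the horizontal EOG channel could be used for reading recognition, the sketch below computes a simple saccade rate from a 128 Hz signal. This is not the method from the cited papers; the velocity threshold, window length, and the synthetic demo signal are all assumptions chosen only to show the idea that reading produces frequent small horizontal saccades.

```python
import numpy as np

FS = 128  # sampling frequency in Hz, as stated in the dataset description

def saccade_rate(eog_h, fs=FS, window_s=2.0, thresh=None):
    """Count velocity peaks (candidate saccades) per second in sliding windows.

    eog_h:  1-D array holding the horizontal EOG channel (arbitrary units).
    thresh: velocity threshold; if None, use 3x the median absolute velocity
            (an illustrative heuristic, not a value from the dataset's papers).
    Returns one saccade-rate value per non-overlapping window.
    """
    vel = np.abs(np.diff(eog_h)) * fs            # first-difference velocity estimate
    if thresh is None:
        thresh = 3 * np.median(vel) + 1e-9
    win = int(window_s * fs)
    rates = []
    for start in range(0, len(vel) - win + 1, win):
        seg = vel[start:start + win]
        above = seg > thresh
        # rising edges of the above-threshold mask = candidate saccade onsets
        onsets = np.sum(above[1:] & ~above[:-1]) + int(above[0])
        rates.append(onsets / window_s)
    return np.array(rates)

# Synthetic demo: a sawtooth-like horizontal trace mimicking reading
# (repeated small rightward drifts ending in a large return sweep),
# compared against a flat fixation trace with no saccades.
t = np.arange(0, 10 * FS)
reading = (t % 32) / 32.0                        # one return sweep every 0.25 s
idle = np.zeros_like(reading, dtype=float)
print(saccade_rate(reading).mean(), saccade_rate(idle).mean())
```

A classifier could then threshold this rate (or feed it, together with vertical-channel features, into a standard classifier) to separate reading from non-reading segments.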