In the Blink of an Eye: Combining Head Motion and Eye Blink Frequency for Activity Recognition with Google Glass
Shoya Ishimaru, Jens Weppner, Kai Kunze, Koichi Kise, Andreas Dengel, Paul Lukowicz, Andreas Bulling
Proc. ACM Augmented Human International Conference (AH), pp. 1–4, 2014.
Abstract
We demonstrate how information about eye blink frequency and head motion patterns derived from Google Glass sensors can be used to distinguish different types of high-level activities. While it is well known that eye blink frequency is correlated with user activity, our aim is to show that (1) eye blink frequency data from an unobtrusive, commercial platform that is not a dedicated eye tracker is good enough to be useful, and (2) adding head motion pattern information significantly improves the recognition rates. The method is evaluated on a data set from an experiment with eight participants covering five activity classes (reading, talking, watching TV, mathematical problem solving, and sawing), showing 67% recognition accuracy for eye blinking alone and 82% when extended with head motion patterns.
Links
Paper: ishimaru14_ah.pdf
BibTeX
@inproceedings{ishimaru14_ah,
author = {Ishimaru, Shoya and Weppner, Jens and Kunze, Kai and Kise, Koichi and Dengel, Andreas and Lukowicz, Paul and Bulling, Andreas},
title = {In the Blink of an Eye: Combining Head Motion and Eye Blink Frequency for Activity Recognition with Google Glass},
booktitle = {Proc. ACM Augmented Human International Conference (AH)},
year = {2014},
pages = {1--4},
doi = {10.1145/2582051.2582066}
}
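The abstract describes classifying activities from two feature types: blink frequency and head motion. A minimal sketch of that idea is below; the specific features (blinks per minute, mean squared angular velocity) and the nearest-centroid classifier are illustrative assumptions, not the authors' implementation, and the centroid values are made up.

```python
# Hedged sketch: combine a blink-frequency feature with a head-motion
# feature for simple activity classification. Feature definitions and the
# nearest-centroid classifier are assumptions for illustration only.
import math

def blink_frequency(blink_times, window_s):
    """Blinks per minute within a window of length window_s seconds."""
    return len(blink_times) / window_s * 60.0

def motion_energy(gyro_samples):
    """Mean squared angular velocity as a crude head-motion feature."""
    return sum(g * g for g in gyro_samples) / len(gyro_samples)

def nearest_centroid(feature, centroids):
    """Assign the activity whose centroid is closest in feature space."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return min(centroids, key=lambda label: dist(feature, centroids[label]))

# Illustrative centroids (blinks/min, motion energy) -- made-up values.
centroids = {
    "reading": (6.0, 0.2),
    "talking": (20.0, 1.5),
    "sawing": (15.0, 8.0),
}

blinks = [1.2, 9.8, 21.5]       # blink timestamps (s) in a 30 s window
gyro = [0.1, -0.2, 0.15, 0.05]  # angular-velocity samples (rad/s)
feature = (blink_frequency(blinks, 30.0), motion_energy(gyro))
print(nearest_centroid(feature, centroids))  # -> "reading"
```

In the paper's setting, such features would be computed over sliding windows of Glass's proximity-sensor (blink) and IMU (head motion) data; the two-feature combination is what lifts accuracy from 67% to 82% in the authors' evaluation.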