OpenGaze toolkit released
We are very happy to announce that today we released OpenGaze, the first open-source toolkit for camera-based gaze estimation and interaction. Gaze estimation methods that only require an off-the-shelf camera have improved significantly in recent years and promise a wide range of new applications in gaze-based interaction and attentive user interfaces. However, these methods are not yet widely used in the human-computer interaction (HCI) community. With OpenGaze, we aim to democratize their use in HCI and have therefore designed the toolkit specifically for gaze interface designers.
OpenGaze is described in more detail in the following paper:
- Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 1–13, 2019.