Gaze Gesture Recognition by Graph Convolutional Networks
Lei Shi, Cosmin Copot, Steve Vanlanduit
Frontiers in Robotics and AI, 8, 2021.
Abstract
Gaze gestures are widely used in interactions with agents, computers, and robots. Whether captured with remote eye trackers or head-mounted devices (HMDs), they offer the advantage of hands-free interaction. Previous studies have demonstrated the success of applying machine learning techniques to gaze gesture recognition. More recently, graph neural networks (GNNs) have shown great potential in research areas such as image classification, action recognition, and text classification, yet they remain rarely applied in eye tracking research. In this work, we propose a graph convolutional network (GCN)-based model for gaze gesture recognition. We train and evaluate the GCN model on the HideMyGaze! dataset. The results show that the accuracy, precision, and recall of the GCN model are 97.62%, 97.18%, and 98.46%, respectively, higher than those of the compared conventional machine learning algorithms, the artificial neural network (ANN), and the convolutional neural network (CNN).
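To make the idea concrete, below is a minimal, illustrative sketch of how a GCN can classify a gaze gesture when gaze samples are treated as graph nodes. This is not the authors' published architecture or the HideMyGaze! preprocessing: the node features (x, y coordinates), chain-style graph construction, layer sizes, and class count are all assumptions made for the example.

```python
# Illustrative sketch only: a minimal graph convolution over a gaze-point graph,
# not the model described in the paper. Feature choices and graph construction
# are hypothetical placeholders.
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # Add self-loops and symmetrically normalise the adjacency matrix.
        a_hat = adj + torch.eye(adj.size(0))
        deg_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        norm_adj = deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)
        return torch.relu(self.linear(norm_adj @ x))


class GazeGestureGCN(nn.Module):
    """Two GCN layers, mean pooling over nodes, then a linear classifier."""

    def __init__(self, in_dim=2, hidden=32, num_classes=4):
        super().__init__()
        self.gc1 = GCNLayer(in_dim, hidden)
        self.gc2 = GCNLayer(hidden, hidden)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x, adj):
        h = self.gc2(self.gc1(x, adj), adj)
        return self.head(h.mean(dim=0))  # graph-level gesture logits


# Toy usage: 10 gaze samples as (x, y) nodes, chained in recording order.
pts = torch.rand(10, 2)
adj = torch.diag(torch.ones(9), 1) + torch.diag(torch.ones(9), -1)
logits = GazeGestureGCN()(pts, adj)
print(logits.shape)  # torch.Size([4])
```

The chain adjacency simply links consecutive gaze samples; in practice the graph and node features would follow whatever representation the paper defines for its gestures.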
BibTeX
@article{shi21_frai,
  author  = {Shi, Lei and Copot, Cosmin and Vanlanduit, Steve},
  title   = {Gaze Gesture Recognition by Graph Convolutional Networks},
  journal = {Frontiers in Robotics and AI},
  year    = {2021},
  volume  = {8},
  paper   = {222},
  doi     = {10.3389/frobt.2021.709952}
}