
GazeEMD: Detecting Visual Intention in Gaze-Based Human-Robot Interaction

Lei Shi, Cosmin Copot, Steve Vanlanduit

Robotics, 10(2), pp. 1–18, 2021.


Abstract

In gaze-based Human-Robot Interaction (HRI), it is important to determine human visual intention for interacting with robots. A typical HRI scenario is that a human selects an object by gaze and a robotic manipulator picks up the object. In this work, we propose an approach, GazeEMD, that can be used to detect whether a human is looking at an object in HRI applications. We use Earth Mover’s Distance (EMD) to measure the similarity between the hypothetical gazes at objects and the actual gazes. Then, the similarity score is used to determine whether the human visual intention is on the object. We compare our approach with a fixation-based method and HitScan with a run length in the scenario of selecting daily objects by gaze. Our experimental results indicate that the GazeEMD approach has higher accuracy and is more robust to noise than the other approaches. Hence, users can reduce their cognitive load by using our approach in real-world HRI scenarios.
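The core idea described in the abstract, scoring how well actual gaze samples match hypothetical gazes placed on an object with EMD and thresholding that score, can be illustrated with a minimal sketch. This is not the paper's implementation: the equally sized point sets, uniform weights, Euclidean cost, and the pixel threshold below are illustrative assumptions only.

import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def gaze_emd(actual_gaze: np.ndarray, hypothetical_gaze: np.ndarray) -> float:
    """EMD between two equally sized 2D point sets with uniform weights.
    In this special case, EMD reduces to an optimal one-to-one assignment."""
    cost = cdist(actual_gaze, hypothetical_gaze)   # pairwise Euclidean costs
    row, col = linear_sum_assignment(cost)         # optimal transport plan
    return cost[row, col].mean()                   # average moving cost

def is_looking_at_object(actual_gaze, hypothetical_gaze, threshold=50.0):
    """Decide visual intention by thresholding the EMD score.
    The threshold (in pixels) is a hypothetical value chosen for illustration."""
    return gaze_emd(actual_gaze, hypothetical_gaze) < threshold

# Example: noisy gaze samples around an object centred at (320, 240).
rng = np.random.default_rng(0)
obj_points = rng.normal(loc=(320, 240), scale=5.0, size=(30, 2))    # hypothetical gazes
gaze_points = rng.normal(loc=(322, 243), scale=15.0, size=(30, 2))  # actual gazes
print(is_looking_at_object(gaze_points, obj_points))

A lower EMD score means the gaze distribution is concentrated on the object; the decision rule is simply a threshold on that score.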

Links


BibTeX

@article{shi21_robotics,
  author  = {Shi, Lei and Copot, Cosmin and Vanlanduit, Steve},
  title   = {{GazeEMD}: Detecting Visual Intention in Gaze-Based Human-Robot Interaction},
  journal = {Robotics},
  year    = {2021},
  volume  = {10},
  number  = {2},
  paper   = {68},
  pages   = {1--18},
  doi     = {10.3390/robotics10020068}
}