
A Performance Analysis of Invariant Feature Descriptors in Eye Tracking based Human Robot Collaboration

Lei Shi, Cosmin Copot, Stijn Derammelaere, Steve Vanlanduit

Proc. International Conference on Control, Automation and Robotics (ICCAR), pp. 256–260, 2019.


Abstract

For eye tracking applications in Human Robot Collaboration (HRC), it is essential that the robot is aware of where the human gaze is located in the scene. Using feature detectors and feature descriptors, the human gaze can be projected onto the image, from which the robot can determine where the human is looking. However, the motion that occurs during the collaboration may affect the performance of the descriptors. In this paper, we analyse the performance of the SIFT, SURF, AKAZE, BRISK and ORB feature descriptors in a real eye-tracking scene for HRC, in which different types of image variation co-exist. Instead of testing directly on eye tracking glasses, we use a robotic arm and two cameras so that different accelerations can be tested quantitatively. The results show that BRISK, AKAZE and SURF are the most favourable descriptors in terms of accuracy, stability and computation time.
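As an illustration of the pipeline the abstract describes, below is a minimal OpenCV sketch (not the authors' implementation) that matches binary feature descriptors between the eye-tracker scene image and the robot camera image, then projects a gaze point through the estimated homography. The file names and gaze coordinates are placeholder assumptions; AKAZE can be swapped for BRISK or ORB to compare descriptors.

import cv2
import numpy as np

# Placeholder inputs: the eye-tracker scene view, the robot camera view,
# and a gaze point (pixel coordinates) in the scene image.
scene = cv2.imread("scene_camera.png", cv2.IMREAD_GRAYSCALE)
robot = cv2.imread("robot_camera.png", cv2.IMREAD_GRAYSCALE)
gaze = np.array([[[320.0, 240.0]]], dtype=np.float32)

# AKAZE is one of the descriptors evaluated in the paper; swap in
# cv2.BRISK_create() or cv2.ORB_create() for the other binary descriptors.
detector = cv2.AKAZE_create()
kp1, des1 = detector.detectAndCompute(scene, None)
kp2, des2 = detector.detectAndCompute(robot, None)

# Hamming distance for binary descriptors, with Lowe's ratio test
# to discard ambiguous matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
good = []
for pair in matcher.knnMatch(des1, des2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

# Estimate a homography from the inlier matches (RANSAC) and
# project the gaze point into the robot camera image.
src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
projected = cv2.perspectiveTransform(gaze, H)
print("Gaze in robot camera image:", projected.ravel())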

BibTeX

@inproceedings{shi19_iccar,
  author    = {Shi, Lei and Copot, Cosmin and Derammelaere, Stijn and Vanlanduit, Steve},
  title     = {A Performance Analysis of Invariant Feature Descriptors in Eye Tracking based Human Robot Collaboration},
  booktitle = {Proc. International Conference on Control, Automation and Robotics (ICCAR)},
  year      = {2019},
  pages     = {256--260},
  doi       = {10.1109/ICCAR.2019.8813478}
}