Our goal is to develop next-generation human-machine interfaces that offer human-like interactive capabilities. To this end, we research fundamental computational methods as well as ambient and on-body systems to sense, model, and analyze everyday non-verbal human behavior and cognition.
Spotlight
- NeurIPS'20: Improving Natural Language Processing Tasks with Human Gaze-Guided Neural Attention
- ETRA'20: Combining Gaze Estimation and Optical Flow for Pursuits Interaction
- CHI'20: Quantification of Users’ Visual Attention During Everyday Mobile Device Interactions
- MPaL'20: ‘Long nose’ and ‘naso lungo’: Establishing the need for retrodiction in computational models of word learning
- IMWUT'19: Classifying Attention Types with Thermal Imaging and Eye Tracking
- TPAMI'19: MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation
- ICMI'19: Emergent Leadership Detection Across Datasets
- ETRA'19: Reducing Calibration Drift in Mobile Eye Trackers by Exploiting Mobile Phone Usage
- ETRA'19: Privacy-Aware Eye Tracking Using Differential Privacy
- ETRA'19: PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features
- CHI'19: Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications
- ETRA'19: A fast approach to refraction-aware 3D eye-model fitting and gaze prediction
- CHI'19: A Design Space for Gaze Interaction on Head-mounted Displays
- ComCo'19: Predicting Gaze Patterns: Text Saliency for Integration into Machine Learning Tasks
- MM'19: Moment-to-Moment Detection of Internal Thought during Video Viewing from Eye Vergence Behavior