Our goal is to develop next-generation human-machine interfaces that offer human-like interactive capabilities. To this end, we research fundamental computational methods as well as ambient and on-body systems to sense, model, and analyse everyday non-verbal human behaviour and cognition.
Spotlight
- GAZE'23: Multimodal Integration of Human-Like Attention in Visual Question Answering
- INTERACT'23: Exploring Natural Language Processing Methods for Interactive Behaviour Modelling
- CogSci'23: Improving Neural Saliency Prediction with a Cognitive Model of Human Visual Attention
- COGAIN'23: GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays
- CHI'23: Impact of Privacy Protection Methods of Lifelogs on Remembered Memories
- TOCHI'22: Understanding, Addressing, and Analysing Digital Eye Strain in Virtual Reality Head-Mounted Displays
- TVCG'22: VisRecall: Quantifying Information Visualisation Recallability via Question Answering
- GMML'22: Facial Composite Generation with Iterative Human Feedback
- CHI'22: Designing for Noticeability: The Impact of Visual Importance on Desktop Notifications
- COLING'22: Neuro-Symbolic Visual Dialog
- TVCG'21: EHTask: Recognizing User Tasks from Eye and Head Movements in Immersive Virtual Reality
- TVCG'21: FixationNet: Forecasting Eye Fixations in Task-Oriented Virtual Environments
- CoNLL'21: VQA-MHUG: A gaze dataset to study multimodal neural attention in VQA
- CHI'21: A Critical Assessment of the Use of SSQ as a Measure of General Discomfort in VR Head-Mounted Displays
- ICCV'21: Neural Photofit: Gaze-based Mental Image Reconstruction
- NeurIPS'20: Improving Natural Language Processing Tasks with Human Gaze-Guided Neural Attention
- ETRA'20: Combining Gaze Estimation and Optical Flow for Pursuits Interaction
- CHI'20: Quantification of Users’ Visual Attention During Everyday Mobile Device Interactions
- IMWUT'19: Classifying Attention Types with Thermal Imaging and Eye Tracking
- TPAMI'19: MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation
- '19: A Review of EEG Features for Emotion Recognition (in Chinese)
- ETRA'19: A fast approach to refraction-aware 3D eye-model fitting and gaze prediction
- CHI'19: A Design Space for Gaze Interaction on Head-mounted Displays
- MM'19: Moment-to-Moment Detection of Internal Thought during Video Viewing from Eye Vergence Behavior
- ETRA'19: Reducing Calibration Drift in Mobile Eye Trackers by Exploiting Mobile Phone Usage
- ICMI'19: Emergent Leadership Detection Across Datasets
- ComCo'19: Predicting Gaze Patterns: Text Saliency for Integration into Machine Learning Tasks
- ETRA'19: PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features
- ETRA'19: Privacy-Aware Eye Tracking Using Differential Privacy
- CHI'19: Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications