Our goal is to develop next-generation human-machine interfaces that offer human-like interactive capabilities. To this end, we research fundamental computational methods as well as ambient and on-body systems to sense, model, and analyse everyday non-verbal human behaviour and cognition.
PhD Position Available
We have a PhD position available within the Eyes4ICU project. The topic is “Gaze Behaviour Analysis for Early Diagnosis and Monitoring of Autism”.
If you are highly motivated, capable of tackling scientifically challenging problems, and interested in doing research in an internationally oriented and highly successful team, please apply by following the instructions on our jobs page.
HiWi Positions Available
Three HiWi positions are available: Deep Learning Approaches for Theory of Mind, Human Modeling for Human-AI Cooperation with Inverse Reinforcement Learning and Generative Models, and Web Interface for Annotation Workflow. See our jobs page for more information.
Latest News
Spotlight
- INTERACT'23: Exploring Natural Language Processing Methods for Interactive Behaviour Modelling
- CogSci'23: Improving Neural Saliency Prediction with a Cognitive Model of Human Visual Attention
- COGAIN'23: GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays
- CHI'23: Impact of Privacy Protection Methods of Lifelogs on Remembered Memories
- UIST'23: SUPREYES: SUPer Resolution for EYES Using Implicit Neural Representation Learning
- GAZE, CVPRW'23: Multimodal Integration of Human-Like Attention in Visual Question Answering
- UIST'23: Usable and Fast Interactive Mental Face Reconstruction
- TOCHI'22: Understanding, Addressing, and Analysing Digital Eye Strain in Virtual Reality Head-Mounted Displays
- TVCG'22: VisRecall: Quantifying Information Visualisation Recallability via Question Answering
- CHI'22: Designing for Noticeability: The Impact of Visual Importance on Desktop Notifications
- COLING'22: Neuro-Symbolic Visual Dialog
- TVCG'21: EHTask: Recognizing User Tasks from Eye and Head Movements in Immersive Virtual Reality
- TVCG'21: FixationNet: Forecasting Eye Fixations in Task-Oriented Virtual Environments
- CoNLL'21: VQA-MHUG: A Gaze Dataset to Study Multimodal Neural Attention in VQA
- CHI'21: A Critical Assessment of the Use of SSQ as a Measure of General Discomfort in VR Head-Mounted Displays
- ICCV'21: Neural Photofit: Gaze-based Mental Image Reconstruction
- NeurIPS'20: Improving Natural Language Processing Tasks with Human Gaze-Guided Neural Attention
- ETRA'20: Combining Gaze Estimation and Optical Flow for Pursuits Interaction
- CHI'20: Quantification of Users’ Visual Attention During Everyday Mobile Device Interactions
- IMWUT'19: Classifying Attention Types with Thermal Imaging and Eye Tracking
- TPAMI'19: MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation
- '19: A Review of EEG Features for Emotion Recognition (in Chinese)
- ETRA'19: A Fast Approach to Refraction-Aware 3D Eye-Model Fitting and Gaze Prediction
- CHI'19: A Design Space for Gaze Interaction on Head-mounted Displays
- MM'19: Moment-to-Moment Detection of Internal Thought during Video Viewing from Eye Vergence Behavior
- ETRA'19: Reducing Calibration Drift in Mobile Eye Trackers by Exploiting Mobile Phone Usage
- ICMI'19: Emergent Leadership Detection Across Datasets
- ComCo'19: Predicting Gaze Patterns: Text Saliency for Integration into Machine Learning Tasks
- ETRA'19: PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features
- ETRA'19: Privacy-Aware Eye Tracking Using Differential Privacy
- CHI'19: Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications