
Visual Scanpath Prediction for Mobile User Interfaces


Description: In contrast to modelling visual attention through saliency, which generally provides only a single 2D distribution of visual attention over a stimulus, scanpaths fully capture the temporal dynamics of gaze behaviour and provide a temporal ordering of human fixations. Consequently, methods to predict scanpaths (Bao & Chen 2020, Assens et al. 2018), i.e. to generate plausible and human-like sequences of fixations on new visual stimuli, have many practical applications in human-computer interaction and beyond. However, there are currently no methods to predict scanpaths on mobile user interfaces (UIs).
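To make the notion of a scanpath concrete: it is an ordered sequence of fixations, and a common family of comparison measures quantises fixations into grid cells and applies string-edit (Levenshtein) distance between the resulting sequences. The sketch below illustrates this idea; the class names, grid size, and normalisation convention are illustrative choices, not part of the project materials.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # horizontal position, normalised to [0, 1]
    y: float          # vertical position, normalised to [0, 1]
    duration_ms: float

def to_grid_string(scanpath, rows=5, cols=5):
    """Quantise each fixation into a grid-cell index, turning the
    scanpath into a sequence of symbols for string-edit comparison."""
    return [min(int(f.y * rows), rows - 1) * cols
            + min(int(f.x * cols), cols - 1)
            for f in scanpath]

def edit_distance(a, b):
    """Levenshtein distance between two cell-index sequences."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (ca != cb))  # substitution
    return dp[len(b)]

def scanpath_similarity(p, q, rows=5, cols=5):
    """Normalised similarity in [0, 1]: 1 means identical cell sequences."""
    sa, sb = to_grid_string(p, rows, cols), to_grid_string(q, rows, cols)
    return 1.0 - edit_distance(sa, sb) / max(len(sa), len(sb), 1)
```

Such string-edit measures are coarse (they ignore fixation durations and exact positions), which is one reason more specialised metrics and models exist in the literature cited below.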

In this project, we will explore and propose new methods for scanpath prediction on mobile UIs. We will first evaluate existing methods that predict scanpaths on natural images, and then investigate ways to improve upon them and design a method specifically tailored to this use case. For evaluation, we will use the Mobile UI Saliency dataset (Leiva et al. 2020).
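A natural baseline when starting from a saliency dataset is to turn a 2D saliency map into a fixation sequence by repeatedly selecting the most salient location and suppressing its neighbourhood (a simple winner-take-all scheme with inhibition of return). The sketch below is a minimal illustration of this idea; the function name, map resolution, and suppression radius are assumptions for the example, not part of the project or the dataset's tooling.

```python
import numpy as np

def sample_scanpath(saliency, n_fixations=8, ior_radius=2):
    """Greedy winner-take-all scanpath from a 2D saliency map.

    Repeatedly pick the most salient cell as the next fixation, then
    zero out a disc around it (inhibition of return) so the sequence
    moves on to other salient regions.
    """
    sal = saliency.astype(float).copy()
    h, w = sal.shape
    path = []
    for _ in range(n_fixations):
        y, x = np.unravel_index(np.argmax(sal), sal.shape)
        path.append((int(x), int(y)))          # (column, row) order
        yy, xx = np.ogrid[:h, :w]
        sal[(yy - y) ** 2 + (xx - x) ** 2 <= ior_radius ** 2] = 0.0
    return path
```

Comparing such a baseline against learned predictors (e.g. the models cited below) on the Mobile UI Saliency data would be one way to quantify how much the temporal ordering benefits from dedicated modelling.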

Supervisor: Mihai Bâce

Distribution: 30% Literature, 10% Data Preparation, 30% Implementation, 30% Analysis and Evaluation

Requirements: Experience with Tensorflow/PyTorch

Literature: Bao, Wentao and Zhenzhong Chen. 2020. Human scanpath prediction based on deep convolutional saccadic model. Neurocomputing, 404, pp. 154-164.

Assens, Marc, Xavier Giro-i-Nieto, Kevin McGuinness and N. O'Connor. 2018. PathGAN: Visual Scanpath Prediction with Generative Adversarial Networks. arXiv:1809.00567.

Leiva, Luis A., Yunfei Xue, Avya Bansal, Hamed Tavakoli, Tuğçe Köroğlu, Jingzhou Du, Niraj Dayama and Antti Oulasvirta. 2020. Understanding visual saliency in mobile user interfaces. Proceedings of the 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI).