DisMouse: Disentangling Information from Mouse Movement Data

Guanhua Zhang, Zhiming Hu, Andreas Bulling

Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 1–13, 2024.




Abstract

Mouse movement data contain rich information about users, performed tasks, and user interfaces, but separating the respective components remains challenging and unexplored. As a first step to address this challenge, we propose DisMouse – the first method to disentangle user-specific and user-independent information and stochastic variations from mouse movement data. At the core of our method is an autoencoder trained in a semi-supervised fashion, consisting of a self-supervised denoising diffusion process and a supervised contrastive user identification module. Through evaluations on three datasets, we show that DisMouse 1) captures complementary information of mouse input, hence providing an interpretable framework for modelling mouse movements, 2) can be used to produce refined features, thus enabling various applications such as personalised and variable mouse data generation, and 3) generalises across different datasets. Taken together, our results underline the significant potential of disentangled representation learning for explainable, controllable, and generalised mouse behaviour modelling.

BibTeX

@inproceedings{zhang24_uist,
  title     = {DisMouse: Disentangling Information from Mouse Movement Data},
  author    = {Zhang, Guanhua and Hu, Zhiming and Bulling, Andreas},
  year      = {2024},
  pages     = {1--13},
  booktitle = {Proc. ACM Symposium on User Interface Software and Technology (UIST)},
  doi       = {}
}

Acknowledgements

The authors thank the International Max Planck Research School for Intelligent Systems (IMPRS-IS) for supporting G. Zhang. Z. Hu was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy (EXC 2075 - 390740016). A. Bulling was funded by the European Research Council (ERC; grant agreement 801708). We acknowledge the support of the Stuttgart Center for Simulation Science (SimTech). We would like to thank Mayar Elfares and Yanzhou Chen for their technical support, and the anonymous reviewers for their helpful feedback.