Hussain, Z.; Shawe-Taylor, J.; Pasupa, K. (2010) Learning relevant eye movement feature spaces across users. In: Eye Tracking Research and Applications Symposium (ETRA), pp. 181-186. doi:10.1145/1743666.1743711.
In this paper we predict the relevance of images based on a low-dimensional feature space found using several users' eye movements. Each user is given an image-based search task, during which their eye movements are recorded using a Tobii eye tracker. The users also provide explicit feedback regarding the relevance of images. We demonstrate that by applying a greedy Nyström algorithm to the eye movement features of different users, we can find a suitable low-dimensional feature space for learning. We validate the suitability of this feature space by projecting the eye movement features of a new user into it, training an online learning algorithm on the projected features, and showing that the number of mistakes (regret over time) made in predicting relevant images is lower than when using the original eye movement features. We also plot Recall-Precision and ROC curves, and use a sign test to verify the statistical significance of our results. © 2010 ACM.
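The abstract does not specify the exact selection criterion of the greedy Nyström algorithm. The sketch below illustrates one common greedy variant, diagonal-pivoted incomplete Cholesky, which at each step picks the point with the largest residual diagonal entry and builds a rank-k factor G with K ≈ GGᵀ; the rows of G then serve as low-dimensional features. The function name and the RBF kernel in the usage example are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def greedy_nystrom(K, k):
    """Greedy Nystrom approximation of a PSD kernel matrix K via
    diagonal-pivoted incomplete Cholesky (one common greedy rule;
    the paper's exact criterion may differ). Returns the selected
    landmark indices and an (n x k) factor G with K ~= G @ G.T."""
    n = K.shape[0]
    d = np.diag(K).astype(float).copy()   # residual diagonal of K
    G = np.zeros((n, k))
    idx = []
    for j in range(k):
        i = int(np.argmax(d))             # pivot: largest residual variance
        idx.append(i)
        # Cholesky update: orthogonalize column i against chosen columns
        G[:, j] = (K[:, i] - G[:, :j] @ G[i, :j]) / np.sqrt(d[i])
        d -= G[:, j] ** 2
        d[d < 0] = 0.0                    # guard against round-off
    return idx, G

# Illustrative usage: RBF kernel on random "eye movement features"
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
K = np.exp(-0.5 * ((X[:, None] - X[None]) ** 2).sum(-1))
idx, G = greedy_nystrom(K, 5)             # rows of G are 5-dim features
```

A full-rank run (k = n) recovers K exactly for a positive-definite kernel, while small k gives the compressed feature space used downstream by the online learner.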
Title: Learning relevant eye movement feature spaces across users
UCL classification: UCL > School of BEAMS > Faculty of Engineering Science > Computer Science