Steptoe, W and Oyekoya, O and Murgia, A and Wolff, R and Rae, J and Guimaraes, E and Roberts, D and Steed, A (2009) Eye Tracking for Avatar Eye Gaze Control During Object-Focused Multiparty Interaction in Immersive Collaborative Virtual Environments. In: Steed, A and Reiners, D and Lindeman, RW, (eds.) IEEE Virtual Reality 2009, Proceedings. (pp. 83-90). IEEE Computer Society.
In face-to-face collaboration, eye gaze serves both as a bidirectional signal to monitor and indicate the focus of attention and action, and as a resource for managing the interaction. In remote interaction supported by Immersive Collaborative Virtual Environments (ICVEs), embodied avatars, each representing and controlled by a participant, share a virtual space. We report on a study designed to evaluate methods of avatar eye gaze control during an object-focused puzzle scenario performed between three networked CAVE(TM)-like systems. We compare tracked gaze, in which avatars' eyes are driven by head-mounted mobile eye trackers worn by participants, with a gaze model that uses head orientation to generate saccades, and with static gaze featuring non-moving eyes. We analyse task performance, subjective user experience, and interactional behaviour. While not providing a statistically significant benefit over static gaze, tracked gaze was observed to be the highest-performing condition. The gaze model, however, resulted in significantly lower task performance and an increased error rate.
Title: Eye Tracking for Avatar Eye Gaze Control During Object-Focused Multiparty Interaction in Immersive Collaborative Virtual Environments
Event: IEEE Virtual Reality Conference 2009
Dates: 2009-03-14 to 2009-03-18
Keywords: Immersive Collaborative Virtual Environments, Eye Tracking, Avatars, Eye Gaze, Behavioural Realism
UCL classification: UCL > School of BEAMS > Faculty of Engineering Science > Computer Science