Trusting to learn: Trust and privacy issues in serious games. In Trust and Trustworthy Computing (pp. 116–130).
Organizations are increasingly investing in technology-enhanced learning systems to improve their employees’ skills. Serious games are one example: the competitive and fun nature of games is expected to motivate employee participation. But any system that records employee data raises issues of privacy and trust. In this paper, we present a study on the privacy and trust implications of serious games in an organizational context. We report findings from 32 interviews with potential end-users of a serious games platform called TARGET. A qualitative analysis of the interviews reveals that participants anticipate privacy risks for the data generated during game playing, and that their decision to trust their fellow employees and managers depends on the presence of specific trust signals. Failure to minimize privacy risks and maximize trust will affect both the acceptance of the system and the learning experience – thus undermining the primary purpose for which it was deployed. Game designers are advised to provide mechanisms for selective disclosure of data by players, and organizations should not use gaming data for appraisal or selection purposes, and should clearly communicate this policy to employees.
|Title:||Trusting to learn: Trust and privacy issues in serious games|
|Event:||4th International Conference, TRUST 2011, Pittsburgh, PA, USA, June 22-24, 2011|
|Open access status:||An open access version is available from UCL Discovery|
|Additional information:||The final publication is available at link.springer.com|
|UCL classification:||UCL > School of Life and Medical Sciences
UCL > School of Life and Medical Sciences > Faculty of Brain Sciences
UCL > School of Life and Medical Sciences > Faculty of Brain Sciences > Psychology and Language Sciences (Division of) > UCL Interaction Centre
UCL > School of BEAMS
UCL > School of BEAMS > Faculty of Engineering Science|