Abstract
Information about interactive virtual environments, such as games, is perceived by users through a virtual camera. While most interactive applications let users control the camera, users often become frustrated with the interaction during complex navigation tasks in 3D environments. In this paper, we propose the inclusion of camera control as a vital component of affective adaptive interaction in games. We investigate the impact of camera viewpoints on the psychophysiology of players through preference surveys collected from a test game. Data is collected from players of a 3D prey/predator game in which player experience is directly linked to camera settings. Computational models of the discrete affective states of fun, challenge, boredom, frustration, excitement, anxiety and relaxation are built on biosignal (heart rate, blood volume pulse and skin conductance) features to predict the pairwise self-reported emotional preferences of the players. For this purpose, automatic feature selection and neuro-evolutionary preference learning are combined, providing highly accurate affective models. The performance of the artificial neural network models on unseen data reveals accuracies above 80% for the majority of the discrete affective states examined. The generality of the obtained models is tested in different test-bed game environments, and the use of the generated models for creating adaptive affect-driven camera control in games is discussed.
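The pairwise preference learning described above can be illustrated with a minimal sketch: a small neural network maps biosignal features of a game session to a scalar affect score, and a simple hill-climbing neuro-evolution loop adjusts its weights so that, for each self-reported preference pair, the preferred session scores higher. All names, network sizes, and the synthetic data below are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_score(x, w1, b1, w2, b2):
    # Single-hidden-layer perceptron mapping a biosignal feature
    # vector (e.g. heart rate, BVP, skin conductance statistics)
    # to a scalar affect score.
    h = np.tanh(x @ w1 + b1)
    return float(h @ w2 + b2)

def pairwise_accuracy(pairs, params):
    # Fraction of preference pairs (preferred, other) where the
    # preferred session receives the higher score.
    correct = sum(mlp_score(a, *params) > mlp_score(b, *params)
                  for a, b in pairs)
    return correct / len(pairs)

def smooth_fitness(pairs, params):
    # Sigmoid of the score margin: a smooth proxy for pairwise
    # accuracy, which gives the evolutionary search a gradient-like
    # signal even when accuracy itself does not change.
    diffs = np.array([mlp_score(a, *params) - mlp_score(b, *params)
                      for a, b in pairs])
    return float(np.mean(1.0 / (1.0 + np.exp(-diffs))))

def evolve(pairs, n_features=3, n_hidden=4, pop=30, gens=50):
    # Toy neuro-evolution: greedy hill climbing over Gaussian
    # weight mutations (an assumption; the paper's method is more
    # elaborate, combining feature selection with preference learning).
    def random_params():
        return (rng.normal(size=(n_features, n_hidden)),
                rng.normal(size=n_hidden),
                rng.normal(size=n_hidden),
                rng.normal())
    def mutate(p):
        return tuple(w + 0.1 * rng.normal(size=np.shape(w)) for w in p)
    best = random_params()
    best_fit = smooth_fitness(pairs, best)
    for _ in range(gens):
        for _ in range(pop):
            cand = mutate(best)
            fit = smooth_fitness(pairs, cand)
            if fit > best_fit:
                best, best_fit = cand, fit
    return best
```

As a usage example, one can generate synthetic sessions where preference follows a hidden rule (here, higher first feature is preferred), evolve a model, and check `pairwise_accuracy` on the pairs; the same evaluation on held-out pairs would correspond to the unseen-data accuracies reported in the abstract.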
Original language | English |
---|---|
Journal | User Modeling and User-Adapted Interaction |
Volume | 20 |
Issue number | 4 |
Pages (from-to) | 313-340 |
ISSN | 0924-1868 |
Status | Published - 2010 |
Keywords
- Camera control
- player experience modeling
- skin conductance
- blood volume pulse
- neuro-evolution
- preference learning