Abstract
Automatic camera control aims to define a framework to control virtual camera movements in dynamic and unpredictable virtual environments while ensuring a set of desired visual properties. We investigate the relationship between camera placement and playing behaviour in games and build a user model of camera behaviour that can be used to control camera movements based on player preferences. For this purpose, we collect eye gaze, camera and game-play data from subjects playing a 3D platform game, cluster the gaze and camera information to identify camera behaviour profiles, and employ machine learning to build predictive models of the virtual camera behaviour. The performance of the models on unseen data reveals accuracies above 70% for all the player behaviour types identified. The characteristics of the generated models, their limits and their use for creating adaptive automatic camera control in games are discussed.
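The abstract describes a two-step pipeline: cluster camera and gaze data into behaviour profiles, then learn a model that predicts a player's profile from game-play data. The following is a minimal sketch of that kind of pipeline, not the authors' implementation; the feature names, the choice of k-means and an MLP classifier, and all data are illustrative assumptions.

```python
# Hedged sketch of a profile-clustering + profile-prediction pipeline.
# All features and model choices are hypothetical placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical per-session camera/gaze features, e.g. mean camera distance,
# camera angular velocity, fraction of gaze time on the avatar.
camera_gaze_features = rng.random((200, 3))

# Step 1: cluster camera/gaze behaviour into k profiles (k chosen arbitrarily here).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
profiles = kmeans.fit_predict(camera_gaze_features)

# Hypothetical game-play features (e.g. jump frequency, collected items,
# time spent idle) used to predict the camera behaviour profile.
gameplay_features = rng.random((200, 5))

X_train, X_test, y_train, y_test = train_test_split(
    gameplay_features, profiles, test_size=0.25, random_state=0)

# Step 2: train a predictive model of camera behaviour; the MLP stands in
# for whichever learner the study actually employed.
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

In such a setup, the held-out accuracy on the test split is the analogue of the above-70% figure reported in the abstract; the real study's features, learners and evaluation protocol are given in the paper itself.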
| Original language | English |
|---|---|
| Book series | Lecture Notes in Computer Science |
| Volume | 6815 |
| Pages (from-to) | 25-36 |
| ISSN | 0302-9743 |
| Publication status | Published - 2011 |
Keywords
- Automatic camera control
- Virtual environments
- Player preferences
- Eye gaze data
- Machine learning models