Fusing Visual and Behavioral Cues for Modeling User Experience in Games

Noor Shaker, Stylianos Asteriadis, Georgios N. Yannakakis, Kostas Karpouzis

Research output: Journal article › Research › peer-review


Estimating affective and cognitive states in conditions of rich human-computer interaction, such as in games, is a field of growing academic and commercial interest. Entertainment and serious games can benefit from recent advances in the field, since access to predictors of the current state of the player (or learner) provides useful input to adaptation mechanisms that aim to maximize engagement or learning effects. In this paper, we introduce a large data corpus derived from 58 participants who played the popular Super Mario Bros platform game, and we attempt to create accurate models of player experience for this game genre. In the present study, features extracted from player gameplay behavior, game levels, and player visual characteristics are utilized as potential indicators of reported affect, expressed as pairwise preferences between different game sessions. Using neuroevolutionary preference learning and automatic feature selection, highly accurate models of reported engagement, frustration, and challenge are constructed (model accuracies reach 91%, 92% and 88% for engagement, frustration and challenge, respectively). As a further step, the derived player experience models can be used to personalize the game level to desired levels of engagement, frustration and challenge, as game content is mapped to player experience through the behavioral and expressivity patterns of each player.
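The neuroevolutionary preference learning named in the abstract can be sketched as follows: a small neural network scores a game session from its feature vector, and the network's weights are evolved so that, for each recorded pairwise preference, the preferred session receives the higher score. This is a minimal illustrative sketch, not the paper's implementation; the synthetic two-feature sessions, the network size, and the (1+10) evolution strategy are all assumptions made for the example.

```python
# Minimal sketch of neuroevolutionary preference learning (illustrative only;
# data, network size, and evolution-strategy parameters are assumptions).
import numpy as np

rng = np.random.default_rng(0)

def score(weights, x):
    """Single-hidden-layer perceptron: feature vector -> scalar preference score."""
    n = len(x)
    w1 = weights[:n * 4].reshape(n, 4)          # input-to-hidden weights
    b1 = weights[n * 4:n * 4 + 4]               # hidden biases
    w2 = weights[n * 4 + 4:]                    # hidden-to-output weights
    h = np.tanh(x @ w1 + b1)
    return float(h @ w2)

def fitness(weights, pairs):
    """Fraction of pairwise preferences (preferred, other) ranked correctly."""
    return float(np.mean([score(weights, a) > score(weights, b) for a, b in pairs]))

# Synthetic data: 2-D session features; the "true" preference follows x[0] - x[1].
sessions = rng.normal(size=(40, 2))
pairs = []
for i in range(0, 40, 2):
    a, b = sessions[i], sessions[i + 1]
    pairs.append((a, b) if a[0] - a[1] > b[0] - b[1] else (b, a))

# (1 + 10) evolution strategy: keep the best weight vector, mutate it with
# Gaussian noise, and accept children that rank at least as many pairs correctly.
n_weights = 2 * 4 + 4 + 4
best = rng.normal(size=n_weights) * 0.1
best_fit = fitness(best, pairs)
for generation in range(100):
    for _ in range(10):
        child = best + rng.normal(scale=0.2, size=n_weights)
        f = fitness(child, pairs)
        if f >= best_fit:
            best, best_fit = child, f

print(f"pairs ranked correctly: {best_fit:.2f}")
```

The same loop structure carries over to the paper's setting: replace the synthetic features with the extracted gameplay, level, and visual features, and the synthetic pairs with the participants' reported preferences between sessions.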
Original language: English
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
Issue number: 6
Pages (from-to): 1519-1531
Publication status: Published - 2013


