Multiplayer Modeling via Multi-Armed Bandits

Robert C Gray, Jichen Zhu, Santiago Ontañón

Research output: Article in proceedings · Research · Peer-reviewed

Abstract

This paper focuses on player modeling in multiplayer adaptive games. While player modeling has received a significant amount of attention, less is known about how to use player modeling in multiplayer games, especially when an experience management AI must make decisions on how to adapt the experience for the group as a whole. Specifically, we present a multi-armed bandit (MAB) approach for modeling groups of multiple players. Our main contributions are a new MAB framework for multiplayer modeling and techniques for addressing the new challenges introduced by the multiplayer context, extending previous work on MAB-based player modeling to account for new group-generated phenomena not present in single-user models. We evaluate our approach via simulation of virtual players in the context of multiplayer adaptive exergames.
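To make the idea concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm) of an epsilon-greedy multi-armed bandit applied to group adaptation: each arm represents a candidate game adaptation, and per-player responses from a simulated group are aggregated into a single group-level reward. The player-preference values and the mean-aggregation rule are hypothetical choices for this example.

```python
import random

class EpsilonGreedyBandit:
    """Illustrative epsilon-greedy MAB: arms are candidate game
    adaptations; a pull's reward is the mean response of the group.
    (A simplified stand-in, not the framework proposed in the paper.)"""

    def __init__(self, n_arms, epsilon=0.1, seed=0):
        self.epsilon = epsilon
        self.counts = [0] * n_arms      # pulls per arm
        self.values = [0.0] * n_arms    # running mean reward per arm
        self.rng = random.Random(seed)

    def select_arm(self):
        # Explore with probability epsilon, otherwise exploit.
        if self.rng.random() < self.epsilon:
            return self.rng.randrange(len(self.counts))
        return max(range(len(self.counts)), key=lambda a: self.values[a])

    def update(self, arm, group_rewards):
        # Aggregate per-player rewards into one group-level signal
        # (simple mean; one of many possible group aggregation rules).
        reward = sum(group_rewards) / len(group_rewards)
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Simulated group of two virtual players; each row holds that player's
# (hypothetical) preference for each of three adaptations.
prefs = [[0.2, 0.8, 0.5],
         [0.3, 0.9, 0.4]]
bandit = EpsilonGreedyBandit(n_arms=3, epsilon=0.1)
for _ in range(500):
    arm = bandit.select_arm()
    rewards = [p[arm] + bandit.rng.gauss(0, 0.05) for p in prefs]
    bandit.update(arm, rewards)
```

Under this setup the bandit concentrates its pulls on arm 1, whose mean group preference (0.85) is highest; the group-generated phenomena the paper addresses arise precisely because such an aggregate can mask disagreement between individual players.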
Original language: English
Title of host publication: 2021 IEEE Conference on Games (CoG)
Number of pages: 8
Publication date: 2021
Pages: 1-8
Publication status: Published - 2021

Keywords

  • Player Modeling
  • Multiplayer Games
  • Adaptive Games
  • Multi-Armed Bandit
  • Experience Management AI
