Multiplayer Modeling via Multi-Armed Bandits

Robert C Gray, Jichen Zhu, Santiago Ontañón

Publication: Conference article in proceedings or book/report chapter › Conference contribution in proceedings › Research › Peer-reviewed

Abstract

This paper focuses on player modeling in multiplayer adaptive games. While player modeling has received a significant amount of attention, less is known about how to use player modeling in multiplayer games, especially when an experience management AI must make decisions on how to adapt the experience for the group as a whole. Specifically, we present a multi-armed bandit (MAB) approach for modeling groups of multiple players. Our main contributions are a new MAB framework for multiplayer modeling and techniques for addressing the new challenges introduced by the multiplayer context, extending previous work on MAB-based player modeling to account for new group-generated phenomena not present in single-user models. We evaluate our approach via simulation of virtual players in the context of multiplayer adaptive exergames.
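The abstract describes an experience manager that treats candidate adaptations as bandit arms and optimizes for the group as a whole. The paper's exact algorithm is not reproduced here; as a minimal sketch under assumed details, an epsilon-greedy bandit whose reward aggregates per-player signals into a single group-level signal could look like this (all class, function, and parameter names are hypothetical, and the averaging aggregation is an illustrative choice, not the paper's):

```python
import random


class EpsilonGreedyBandit:
    """Epsilon-greedy multi-armed bandit over candidate game adaptations."""

    def __init__(self, n_arms, epsilon=0.1, seed=0):
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        self.counts = [0] * n_arms        # pulls per arm
        self.values = [0.0] * n_arms      # running mean reward per arm

    def select(self):
        # Explore with probability epsilon, otherwise exploit the
        # arm with the highest estimated mean reward.
        if self.rng.random() < self.epsilon:
            return self.rng.randrange(len(self.counts))
        return max(range(len(self.counts)), key=lambda a: self.values[a])

    def update(self, arm, reward):
        # Incremental mean update: v += (r - v) / n
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]


def group_reward(per_player_rewards):
    # Hypothetical aggregation: average per-player signals so the
    # manager optimizes for the group as a whole; a real system might
    # instead use the minimum (worst-off player) or a weighted scheme.
    return sum(per_player_rewards) / len(per_player_rewards)
```

In a simulation loop, the manager would call `select()` to pick an adaptation, observe each virtual player's response, collapse those responses with `group_reward`, and feed the result back through `update()`.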
Original language: English
Title: 2021 IEEE Conference on Games (CoG)
Number of pages: 8
Publication date: 2021
Pages: 1-8
DOI
Status: Published - 2021

Keywords

  • Player Modeling
  • Multiplayer Games
  • Adaptive Games
  • Multi-Armed Bandit
  • Experience Management AI

