Towards an experiment on perception of affective music generation using MetaCompose

Marco Scirea, Peter Eklund, Julian Togelius, Sebastian Risi

Research output: Conference Article in Proceeding or Book/Report chapter › Article in proceedings › Research › peer-review

Abstract

MetaCompose is a music generator based on a hybrid evolutionary technique combining FI-2POP and multi-objective optimization. In this paper we employ the MetaCompose music generator to create music in real-time that expresses different mood-states in a game-playing environment (Checkers) and present preliminary results of an experiment focusing on determining (i) if differences in player experience can be observed when using affective-dynamic music compared to static music; and (ii) if any difference is observed when the music supports the game's internal narrative/state. Participants were tasked to play two games of Checkers while listening to two (out of three) different set-ups of game-related generated music. The possible set-ups were: static expression, consistent affective expression, and random affective expression.
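To make the hybrid evolutionary core concrete, the sketch below shows a minimal FI-2POP-style loop combined with Pareto-based multi-objective ranking: infeasible individuals are evolved to reduce constraint violations, while feasible individuals are selected by non-dominance over two objectives. The chord-sequence genome, the no-repeated-chord constraint, and the two objectives are illustrative assumptions for this sketch and are not MetaCompose's actual music representation or fitness functions.

```python
# Minimal sketch of a hybrid FI-2POP / multi-objective evolutionary loop,
# in the spirit of the generator described in the abstract. The genome
# encoding, constraint, and objectives are illustrative placeholders.
import random

CHORDS = ["C", "Dm", "Em", "F", "G", "Am", "Bdim"]
GENOME_LEN = 8
POP_SIZE = 40

def random_genome():
    return [random.choice(CHORDS) for _ in range(GENOME_LEN)]

def constraint_violations(genome):
    # Hypothetical feasibility rule: no chord repeated back-to-back.
    return sum(1 for a, b in zip(genome, genome[1:]) if a == b)

def objectives(genome):
    # Two illustrative objectives to maximize: harmonic variety and
    # affinity to the tonic chord "C".
    variety = len(set(genome)) / len(CHORDS)
    tonic_affinity = genome.count("C") / GENOME_LEN
    return (variety, tonic_affinity)

def dominates(a, b):
    # Pareto dominance for maximization.
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def mutate(genome):
    child = genome[:]
    child[random.randrange(GENOME_LEN)] = random.choice(CHORDS)
    return child

def evolve(generations=100):
    population = [random_genome() for _ in range(POP_SIZE)]
    for _ in range(generations):
        feasible = [g for g in population if constraint_violations(g) == 0]
        infeasible = [g for g in population if constraint_violations(g) > 0]

        # Infeasible population: select for fewer constraint violations.
        infeasible.sort(key=constraint_violations)
        next_infeasible = [mutate(g) for g in infeasible[: POP_SIZE // 2]]

        # Feasible population: keep the non-dominated front of the two
        # objectives, then refill by mutating survivors.
        scored = [(g, objectives(g)) for g in feasible]
        front = [g for g, s in scored
                 if not any(dominates(t, s) for _, t in scored)]
        next_feasible = front + [
            mutate(random.choice(front or [random_genome()]))
            for _ in range(max(0, POP_SIZE // 2 - len(front)))
        ]

        # Offspring land in the feasible or infeasible population on the
        # next iteration depending on the constraint check (the FI-2POP idea).
        population = next_feasible + next_infeasible
        while len(population) < POP_SIZE:
            population.append(random_genome())
    return [g for g in population if constraint_violations(g) == 0]

if __name__ == "__main__":
    for genome in evolve()[:3]:
        print(genome, objectives(genome))
```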
Original language: English
Title of host publication: GECCO '18 Proceedings of the Genetic and Evolutionary Computation Conference Companion
Place of Publication: New York, NY, USA
Publisher: Association for Computing Machinery
Publication date: 2018
Pages: 131-132
ISBN (Electronic): 978-1-4503-5764-7
DOIs
Publication status: Published - 2018
