Can You Feel It?: Evaluation of Affective Expression in Music Generated by MetaCompose

Marco Scirea, Peter Eklund, Julian Togelius, Sebastian Risi

Research output: Conference article in journal · Research · peer-review

Abstract

This paper describes an evaluation conducted on the MetaCompose music generator, which is based on evolutionary computation and uses a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. The main objective of MetaCompose is to create music in real-time that can express different mood-states. The experiment presented here aims to evaluate (i) whether the mood participants perceive in a music score matches the mood the system intends to express, and (ii) whether participants can identify transitions in the expressed mood that occur mid-piece. Music clips, both with static affective states and with mid-piece transitions, were produced by MetaCompose, and a quantitative user study was performed. Participants were asked to annotate the perceived mood and, in real time, any changes in valence. The data collected confirms the hypothesis that people can recognize changes in music mood and that MetaCompose can express perceptibly different levels of arousal. With regard to valence we observe that, while it is mostly perceived as intended, changes in arousal also appear to influence perceived valence, suggesting that one or more of the music features MetaCompose associates with arousal affect valence as well.
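To make the hybrid scheme mentioned above concrete, the following is a minimal, hypothetical sketch of an FI-2POP-style loop: the population is split into feasible individuals (evolved against the real objectives, here with a simple non-dominated ranking standing in for the paper's multi-objective step) and infeasible individuals (evolved to reduce constraint violation). The genome, objectives, and constraint below are illustrative placeholders, not MetaCompose's actual representation.

```python
import random

GENOME_LEN = 8


def random_genome():
    return [random.uniform(-1.0, 1.0) for _ in range(GENOME_LEN)]


def constraint_violation(g):
    # Placeholder constraint: every gene must be non-negative.
    # Returns 0.0 for feasible genomes, > 0.0 otherwise.
    return sum(-x for x in g if x < 0.0)


def objectives(g):
    # Two toy objectives to minimize; MetaCompose's real objectives differ.
    return (sum(abs(x - 0.5) for x in g), sum(abs(x) for x in g))


def dominates(a, b):
    # Pareto dominance for minimization.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def mutate(g, rate=0.2):
    return [x + random.gauss(0, 0.1) if random.random() < rate else x
            for x in g]


def evolve(generations=50, pop_size=40):
    population = [random_genome() for _ in range(pop_size * 2)]
    front = []
    for _ in range(generations):
        # FI-2POP split: sort individuals by feasibility.
        feasible = [g for g in population if constraint_violation(g) == 0.0]
        infeasible = [g for g in population if constraint_violation(g) > 0.0]
        # Feasible population: keep the non-dominated individuals.
        scored = [(objectives(g), g) for g in feasible]
        front = [g for (o, g) in scored
                 if not any(dominates(o2, o)
                            for (o2, g2) in scored if g2 is not g)]
        # Infeasible population: select for low constraint violation,
        # pushing individuals back toward feasibility.
        infeasible.sort(key=constraint_violation)
        parents = ((front or feasible or [random_genome()])
                   + infeasible[:pop_size // 2])
        population = [mutate(random.choice(parents))
                      for _ in range(pop_size * 2)]
    return front


if __name__ == "__main__":
    print(len(evolve()), "non-dominated feasible solutions found")
```

The key design point, under these assumptions, is that infeasible individuals are not discarded: they are kept in their own population and selected toward feasibility, so genetic material near the constraint boundary survives.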
Original language: English
Journal: GECCO '17
Pages (from-to): 211-218
Number of pages: 8
Publication status: Published - 2017
