Mood Expression in Real-Time Computer Generated Music using Pure Data

Marco Scirea, Mark Nelson, Yun-Gyung Cheong, Byung Chull Bae

Publication: Conference article in proceedings › Research › Peer-reviewed


This paper presents an empirical study investigating whether procedurally generated music, based on a set of musical features, can elicit a target mood in the listener. Drawing on the two-dimensional affect model proposed by Russell, we chose intensity, timbre, rhythm, and dissonance as the musical features for expressing mood. The eight moods investigated in this study are bored, content, happy, miserable, tired, fearful, peaceful, and alarmed. We created eight short music clips in the Pure Data (PD) programming language, each representing a particular mood. We carried out a pilot study and present preliminary results.
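The paper's music generator itself is written in Pure Data, but the underlying idea of mapping a point on Russell's valence-arousal plane to musical feature values can be sketched in ordinary code. The mood coordinates and feature formulas below are illustrative assumptions, not values taken from the paper:

```python
# Hypothetical sketch: mapping Russell's two-dimensional affect model
# (valence, arousal) to coarse musical feature values. All coordinates
# and formulas are illustrative assumptions, not the paper's actual mapping.

# The eight moods from the study, placed at assumed (valence, arousal)
# coordinates in [-1, 1] x [-1, 1].
MOODS = {
    "alarmed":   (-0.2,  0.9),
    "fearful":   (-0.6,  0.7),
    "happy":     ( 0.8,  0.5),
    "content":   ( 0.7, -0.3),
    "peaceful":  ( 0.6, -0.6),
    "tired":     (-0.2, -0.8),
    "bored":     (-0.5, -0.6),
    "miserable": (-0.8, -0.2),
}

def mood_to_features(valence, arousal):
    """Derive musical parameters from a point on the affect plane."""
    return {
        # Higher arousal -> louder, faster.
        "intensity": round(0.5 + 0.5 * arousal, 2),   # 0..1 loudness
        "tempo_bpm": round(90 + 60 * arousal),        # ~30..150 BPM
        # Lower valence -> more dissonant interval choices.
        "dissonance": round(0.5 - 0.5 * valence, 2),  # 0..1
        # Timbre brightness tracks arousal (bright vs. mellow).
        "brightness": round(0.5 + 0.5 * arousal, 2),  # 0..1
    }

for mood, (v, a) in MOODS.items():
    print(mood, mood_to_features(v, a))
```

In a real patch, values like these would be sent to PD as control messages driving synthesis parameters (volume, metronome rate, interval selection, filter cutoff).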
Title: Proceedings of the ICMPC-APSCOM 2014 Joint Conference
Number of pages: 5
Publisher: College of Music, Yonsei University
Publication date: Aug 2014
Status: Published - Aug 2014

