Mood Expression in Real-Time Computer Generated Music using Pure Data

Research output: Article in proceedings · Research · Peer-reviewed


This paper presents an empirical study investigating whether procedurally generated music based on a set of musical features can elicit a target mood in the listener. Drawing on the two-dimensional affect model proposed by Russell, we chose intensity, timbre, rhythm, and dissonance as the musical features for expressing mood. The eight moods investigated in this study are bored, content, happy, miserable, tired, fearful, peaceful, and alarmed. We created eight short music clips in the Pure Data (PD) programming language, each representing a particular mood. We carried out a pilot study and present preliminary results.
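The mapping described in the abstract — placing moods in Russell's two-dimensional (valence, arousal) space and deriving musical feature settings from their coordinates — can be sketched as follows. This is an illustrative sketch only, not the authors' actual Pure Data patches; all coordinates and mapping rules below are hypothetical assumptions.

```python
# Hypothetical sketch: map Russell's two-dimensional affect model
# (valence, arousal) to the four musical features named in the paper.
# Coordinates and mapping rules are illustrative assumptions, not the
# study's actual parameters.

# Approximate (valence, arousal) coordinates in [-1, 1] for the eight
# target moods, placed qualitatively following Russell's circumplex.
MOODS = {
    "alarmed":   (-0.4,  0.9),
    "happy":     ( 0.8,  0.5),
    "content":   ( 0.7, -0.3),
    "peaceful":  ( 0.6, -0.6),
    "tired":     (-0.2, -0.8),
    "bored":     (-0.6, -0.5),
    "miserable": (-0.8, -0.1),
    "fearful":   (-0.7,  0.7),
}

def features_for(mood):
    """Derive hypothetical settings for intensity, timbre, rhythm,
    and dissonance from a mood's (valence, arousal) coordinates."""
    valence, arousal = MOODS[mood]
    return {
        "intensity":  round((arousal + 1) / 2, 2),       # louder at high arousal
        "timbre":     "bright" if valence > 0 else "dark",
        "rhythm_bpm": int(60 + 60 * (arousal + 1) / 2),  # 60-120 BPM range
        "dissonance": round((1 - valence) / 2, 2),       # more dissonant at negative valence
    }

for mood in MOODS:
    print(mood, features_for(mood))
```

In a Pure Data realization, each derived value would drive a synthesis parameter (e.g. amplitude for intensity, metronome rate for rhythm), one patch per mood.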
Original language: English
Title of host publication: Proceedings of the ICMPC-APSCOM 2014 Joint Conference
Number of pages: 5
Publisher: College of Music, Yonsei University
Publication date: Aug 2014
Publication status: Published - Aug 2014

Research areas

• music, mood, affective computing



ID: 80253435