This paper presents an empirical study investigating whether procedurally generated music, driven by a set of musical features, can elicit a target mood in the listener. Drawing on the two-dimensional affect model proposed by Russell, we chose intensity, timbre, rhythm, and dissonance as the musical features for expressing mood. The eight moods investigated in this study are bored, content, happy, miserable, tired, fearful, peaceful, and alarmed. We created eight short music clips in the Pure Data (Pd) programming language, each representing a particular mood. We carried out a pilot study and present preliminary results.
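To illustrate the approach described above, the following sketch maps each of the eight moods to an approximate position on Russell's valence-arousal plane and derives illustrative feature settings from it. This is not the authors' Pd implementation: the coordinate values, parameter ranges, and mapping formulas are all assumptions chosen for illustration.

```python
# Approximate valence/arousal positions on Russell's circumplex, in
# the range -1..1. These coordinates are illustrative assumptions.
MOODS = {
    "alarmed":   (-0.3,  0.9),
    "happy":     ( 0.8,  0.5),
    "content":   ( 0.7, -0.4),
    "peaceful":  ( 0.6, -0.7),
    "tired":     (-0.2, -0.9),
    "bored":     (-0.6, -0.6),
    "miserable": (-0.8, -0.2),
    "fearful":   (-0.7,  0.6),
}

def mood_to_features(mood):
    """Derive hypothetical musical-feature settings from a mood's
    valence-arousal coordinates (all mappings are assumptions)."""
    valence, arousal = MOODS[mood]
    return {
        # Higher arousal -> louder and faster.
        "intensity": round(0.5 + 0.5 * arousal, 2),   # 0..1
        "tempo_bpm": round(90 + 50 * arousal),        # illustrative range
        # Lower valence -> more dissonance (0 consonant, 1 dissonant).
        "dissonance": round(0.5 - 0.5 * valence, 2),
        # Brighter timbre for higher arousal (0 dark, 1 bright).
        "brightness": round(0.5 + 0.5 * arousal, 2),
    }
```

For example, under these assumed coordinates a "miserable" clip comes out slow and dissonant while a "happy" clip comes out fast and consonant, mirroring the intended contrast between moods.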
Title of host publication: Proceedings of the ICMPC-APSCOM 2014 Joint Conference
Number of pages: 5
Publisher: College of Music, Yonsei University
Publication date: Aug 2014
Publication status: Published - Aug 2014
Keywords: affective computing