Control of Affective Content in Music Production
Authors
Abstract
Music is a ubiquitous medium in our lives, used in many contexts. The
possibility of selecting music with appropriate affective content can help
adapt music to our emotional needs. Our work aims to design a system to
control affective content in music production. This is done by taking into
account a knowledge base with mappings between affective states
(happiness, sadness, etc.) and music features (rhythm, melody, etc.). The
knowledge base is grounded on background knowledge from Music
Psychology. Our system starts with the reception of an emotional
description specified by the user. Next, mappings are selected from the
knowledge base, according to the emotional description. Music is
retrieved from a music base (recorded sound and MIDI files) according
to similarity metrics between the music features of the selected mappings
and those of the music base. Afterward, the selected music can be subjected
to transformation, sequencing and remixing algorithms and then played. The
inclusion of
third party composition software is also envisaged. To assess the system,
the listener's emotional state can be analysed using psychophysiological or
self-report measures.
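
As a rough illustration of the selection and retrieval steps described above, the sketch below (in Python) shows how mappings from an affective state to music features might be looked up in a knowledge base and how pieces in a music base could be ranked by a simple similarity metric over feature vectors. The affective labels, feature names, values and the similarity function are illustrative assumptions, not taken from the paper.

from math import sqrt

# Hypothetical knowledge base: affective states mapped to target music
# features (values are illustrative placeholders).
KNOWLEDGE_BASE = {
    "happiness": {"tempo": 0.8, "mode": 1.0, "loudness": 0.7, "articulation": 0.8},
    "sadness":   {"tempo": 0.3, "mode": 0.0, "loudness": 0.4, "articulation": 0.3},
}

# Hypothetical music base: each entry holds the same features, assumed to
# have been extracted beforehand from recorded sound or MIDI files.
MUSIC_BASE = [
    {"id": "piece_01.mid", "tempo": 0.75, "mode": 1.0, "loudness": 0.65, "articulation": 0.9},
    {"id": "piece_02.mid", "tempo": 0.35, "mode": 0.0, "loudness": 0.45, "articulation": 0.2},
    {"id": "piece_03.mid", "tempo": 0.55, "mode": 1.0, "loudness": 0.50, "articulation": 0.6},
]

def similarity(target, piece):
    """Similarity between a target feature mapping and a piece's features:
    1 / (1 + Euclidean distance) over the features named in the mapping."""
    dist = sqrt(sum((target[f] - piece[f]) ** 2 for f in target))
    return 1.0 / (1.0 + dist)

def retrieve(emotion, top_n=2):
    """Select the mapping for the requested affective state and rank the
    music base by similarity to it."""
    target = KNOWLEDGE_BASE[emotion]
    ranked = sorted(MUSIC_BASE, key=lambda p: similarity(target, p), reverse=True)
    return ranked[:top_n]

if __name__ == "__main__":
    for piece in retrieve("happiness"):
        print(piece["id"], round(similarity(KNOWLEDGE_BASE["happiness"], piece), 3))

In the system described in the abstract, the pieces retrieved in this way would then be handed to the transformation, sequencing and remixing stages before playback.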
Keywords
Music Computing
Subject
Music Computing
Conference
International Symposium on Performance Science, November 2007
PDF File
Cited by
Year 2011: 1 citation
Rad, R., Firoozabadi, M. and Rezazadeh, I. "Discriminating Affective States in Music Induction Environment Using Forehead Bioelectric Signals". 1st Middle East Conference on Biomedical Engineering, 343 - 346.