CISUC

A Musical System for Emotional Expression

Authors

Abstract

The automatic control of emotional expression in (tonal) music is a challenge that is far from being solved. This thesis presents EDME, a system with such capabilities, used to generate novel musical works that express an emotion specified by the user. The system works with standard MIDI files and operates in two stages: the first offline, the second online. In the first stage, MIDI files are partitioned into segments with uniform emotional content. These segments undergo feature extraction, are classified according to emotional values of valence and arousal, and are stored in a music base. In the second stage, segments are selected and transformed according to the user-specified emotion and then arranged into song-like structures. The modularity, adaptability and flexibility of our system's architecture make it applicable in various contexts, such as video games, theatre, films and healthcare.
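As an illustration of the two-stage architecture described above, the sketch below shows one possible way to organise such a pipeline in Python. All names (Segment, MusicBase, offline_stage, online_stage, classify) are hypothetical; this is not the EDME implementation, only an outline of the offline classification and online selection steps under those assumptions.

```python
# Hypothetical sketch of a two-stage emotion-driven music pipeline:
# an offline stage that classifies segments and fills a music base,
# and an online stage that selects segments for a requested emotion.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Segment:
    notes: list            # MIDI-like note events of the segment
    valence: float = 0.0   # classified valence, e.g. in [-1, 1]
    arousal: float = 0.0   # classified arousal, e.g. in [-1, 1]


class MusicBase:
    """Stores pre-classified segments (the output of the offline stage)."""

    def __init__(self) -> None:
        self.segments: List[Segment] = []

    def add(self, segment: Segment) -> None:
        self.segments.append(segment)

    def nearest(self, valence: float, arousal: float, k: int = 4) -> List[Segment]:
        # Online stage: pick the k segments closest to the requested
        # emotion in the valence-arousal plane.
        return sorted(
            self.segments,
            key=lambda s: (s.valence - valence) ** 2 + (s.arousal - arousal) ** 2,
        )[:k]


def classify(notes: list) -> Tuple[float, float]:
    # Placeholder for feature extraction and knowledge-based classification;
    # returns neutral values in this sketch.
    return 0.0, 0.0


def offline_stage(midi_segments: List[list], base: MusicBase) -> None:
    # Classify each segment and store it in the music base.
    for notes in midi_segments:
        valence, arousal = classify(notes)
        base.add(Segment(notes, valence, arousal))


def online_stage(base: MusicBase, valence: float, arousal: float) -> List[Segment]:
    # Select segments matching the user-specified emotion; arranging them
    # into a song-like structure is left out of this sketch.
    return base.nearest(valence, arousal)
```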

The system uses a knowledge base, grounded in empirical results from Music Psychology, which was refined with experimental data obtained through questionnaires. For the experimental setups, we prepared questionnaires with musical segments of different emotional content. After listening to each segment, each subject classified it with values for valence and arousal. We inferred that the experiments conducted online had a high degree of reliability, despite being carried out in a non-controlled context.
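As a side note on how such reliability can be checked, the sketch below computes Cronbach's alpha over a small set of valence ratings. The ratings matrix is invented for illustration and is not the thesis's questionnaire data; only the method is shown.

```python
# Hedged sketch: inter-rater consistency of questionnaire ratings via
# Cronbach's alpha (raters treated as items).
import numpy as np


def cronbach_alpha(ratings: np.ndarray) -> float:
    """ratings: rows = musical segments, columns = subjects (raters)."""
    k = ratings.shape[1]                          # number of raters
    item_vars = ratings.var(axis=0, ddof=1)       # variance of each rater's scores
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


# Illustrative valence ratings (1-9 scale) for 5 segments by 4 subjects.
ratings = np.array([
    [7, 8, 7, 6],
    [2, 3, 2, 2],
    [5, 5, 6, 5],
    [8, 7, 8, 8],
    [3, 2, 3, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(ratings):.2f}")
```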

We also calibrated and validated EDME in two experiments intended to verify its accuracy in classifying valence and arousal, using experimental data obtained in a controlled environment. The first experiment collected data with questionnaires based on the Self-Assessment Manikin. The second experiment collected behavioral and physiological data. The data show that corrugator muscle activity increases with arousal, heart rate measured in beats per minute increases with arousal, and galvanic skin response increases with both valence and arousal. Only for zygomatic muscle activity is there a significant increase with both valence and arousal.
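A minimal sketch of the kind of analysis behind such statements follows: correlating one physiological measure (heart rate in beats per minute) with self-reported arousal using a Pearson correlation. The data points are invented for illustration; they are not the experiment's results.

```python
# Hedged sketch: relating a physiological measure to reported arousal.
from scipy.stats import pearsonr

arousal = [2, 3, 4, 5, 6, 7, 8]             # illustrative arousal ratings
heart_rate = [64, 66, 65, 70, 72, 75, 78]   # illustrative mean BPM per segment

r, p = pearsonr(arousal, heart_rate)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")  # positive r: HR rises with arousal
```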

Keywords

Knowledge-based system, automatic music production, expression of emotions, music and emotions, real-time system, tonal music.

Subject

Music Computing

PhD Thesis

A Musical System for Emotional Expression 2013


