A Musical System for Emotional Expression

Authors

António Pedro Oliveira, Amílcar Cardoso

Abstract

The automatic control of emotional expression in music is a challenge that is far from being solved. This paper describes research conducted with the aim of developing a system with such capabilities. The system works with standard MIDI files and operates in two stages: the first offline, the second online. In the first stage, MIDI files are partitioned into segments with uniform emotional content. These are subjected to a feature-extraction process, classified according to emotional values of valence and arousal, and stored in a music base. In the second stage, segments are selected and transformed according to the desired emotion, and then arranged in song-like structures.
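
To make the two-stage design concrete, the following is a minimal Python sketch of the idea only, not the paper's implementation: the Segment structure, the example music base and the nearest-neighbour selection rule in the valence-arousal plane are all assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Segment:
    """A MIDI excerpt with uniform emotional content (hypothetical structure)."""
    name: str
    valence: float  # assumed scale: -1.0 (negative) .. 1.0 (positive)
    arousal: float  # assumed scale: -1.0 (calm) .. 1.0 (excited)

# Stage 1 (offline): segments already classified and stored in a "music base".
music_base: List[Segment] = [
    Segment("seg_a", valence=0.8, arousal=0.7),    # e.g. happy, energetic
    Segment("seg_b", valence=-0.6, arousal=-0.4),  # e.g. sad, calm
    Segment("seg_c", valence=0.3, arousal=-0.5),   # e.g. serene
]

# Stage 2 (online): select the stored segment closest to the desired emotion.
def select_segment(target: Tuple[float, float], base: List[Segment]) -> Segment:
    """Nearest-neighbour selection in the valence-arousal plane
    (an assumed selection rule, not necessarily the paper's criterion)."""
    tv, ta = target
    return min(base, key=lambda s: (s.valence - tv) ** 2 + (s.arousal - ta) ** 2)

if __name__ == "__main__":
    chosen = select_segment((0.7, 0.6), music_base)
    print(f"Selected {chosen.name} for a happy, energetic target emotion")
```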
The system uses a knowledge base grounded in empirical results from Music Psychology, refined with data obtained through questionnaires; we also plan to use data obtained with other methods of emotion recognition in the near future. For the experimental setups, we prepared web-based questionnaires with musical segments of different emotional content. After listening to each segment, subjects rated it with values for valence and arousal. The modularity, adaptability and flexibility of our system's architecture make it applicable in various contexts, such as video games, theater, films and healthcare.
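
As an illustration of how such listener ratings could be turned into the per-segment valence and arousal labels stored in the music base, here is a small Python sketch; the -1..1 rating scale and the mean-rating aggregation rule are assumptions for the example, not the procedure reported in the paper.

```python
from statistics import mean
from typing import Dict, List, Tuple

# Hypothetical questionnaire answers: segment id -> list of (valence, arousal)
# ratings, one pair per subject, on an assumed -1..1 scale.
ratings: Dict[str, List[Tuple[float, float]]] = {
    "seg_a": [(0.9, 0.8), (0.7, 0.6), (0.8, 0.7)],
    "seg_b": [(-0.5, -0.3), (-0.7, -0.5)],
}

def aggregate(answers: Dict[str, List[Tuple[float, float]]]) -> Dict[str, Tuple[float, float]]:
    """Collapse each segment's ratings into a single (valence, arousal) label
    by averaging; the mean-rating rule is an assumption for this example."""
    return {
        seg: (mean(v for v, _ in pairs), mean(a for _, a in pairs))
        for seg, pairs in answers.items()
    }

if __name__ == "__main__":
    print(aggregate(ratings))  # e.g. {'seg_a': (0.8, 0.7), 'seg_b': (-0.6, -0.4)}
```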

Keywords

Music Computing

Subject

Music Computing

Journal

Knowledge-Based Systems, Vol. 23, No. 8, pp. 901-913, July 2010

Cited by

Year 2015 : 2 citations

 Eigenfeldt, A., Bizzocchi, J., Thorogood, M. (2015). "Applying Valence and Arousal Values to a Unified Video, Music, and Sound Generative Multimedia Work". GA2015 – XVIII Generative Art Conference.

 Williams, D., Kirke, A., Miranda, E., Daly, I., Hallowell, J., Weaver, J., Malik, A., Roesch, E., Hwang, F., Nasuto, S. (2015). "Investigating Perceived Emotional Correlates of Rhythmic Density in Algorithmic Music Composition", ACM Transactions on Applied Perception, 12(3): article 8. DOI: 10.1145/2749466

Year 2014 : 4 citations

 Morreale, F., De Angeli, A., Masu, R., Rota, P., Conci, N. (2014). "Collaborative creativity: The Music Room". Personal and Ubiquitous Computing, 18(5): 1187-1199.

 Wang, H.-M., Huang, S.-C. (2014). "Musical Rhythms Affect Heart Rate Variability: Algorithm and Models". Advances in Electrical Engineering.

 Pérez Valencia, J. E., Mendoza Lareo, S. E. (2014). "Sonia: interfaz para la producción musical en tiempo real integrando dispositivos de entrada" [Sonia: an interface for real-time music production integrating input devices].

 Alexandraki, C. (2014). "Real-time Machine Listening and Segmental Re-synthesis for Networked Music Performance".

Year 2013 : 3 citations

 Klugel, N., Groh, G. (2013). "Towards Mapping Timbre to Emotional Affect". New Interfaces for Musical Expression.

 Chen, P., Lin, K., Chen, H. (2013). "Emotional Accompaniment Generation System Based on Harmonic Progression". IEEE Transactions on Multimedia.

 Chen, P., Lin, K., Chen, H. (2013). "Automatic accompaniment generation to evoke specific emotion". IEEE International Conference on Multimedia and Expo (ICME), 1-6.

Year 2012 : 1 citation

 Liu, Y., Liu, M., Lu, Z., Song, M. (2012). "Extracting Knowledge from On-Line Forums for Non-Obstructive Psychological Counseling Q&A System". International Journal of Intelligence Science, Scientific Research Publishing, 2(2): 40-48.

Year 2011 : 1 citation

 Wang, H., Lee, Y., Yen, B., Wang, C., Huang, S., Tang, K. (2011). "A physiological valence/arousal model from musical rhythm to heart rhythm". IEEE International Symposium on Circuits and Systems, 1013-1016.