Steven R. Livingstone, Ralf Muhlberger, Andrew R. Brown, and William F. Thompson. Changing Musical Emotion: A Computational Rule System for Modifying Score and Performance. Computer Music Journal, 34:1, pp. 41–64, Spring 2010.
The authors present CMERS, “a Computational Music Emotion Rule System for the real-time control of musical emotion that modifies features at both the score level and the performance level.” The paper compares CMERS with other computer-based algorithms for musical expressiveness, as part of a broader effort to systematically categorize the emotions that music can express and evoke.
The authors first surveyed past efforts to categorize emotions and, after a meta-analysis of the results, devised a two-dimensional emotion space. The vertical axis runs from Active to Passive; the horizontal axis runs from Negative to Positive. The Negative/Active quadrant includes emotions such as anger and agitation; the Passive/Positive quadrant includes serenity and tenderness. The authors then paired each emotion with particular musical devices, both compositional and performative. For example, sadness is correlated with slow tempo, minor mode, low pitch height, complex harmony, legato articulation, soft dynamics, slow note onset, and so on.
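The quadrant-to-features mapping described above can be sketched as a small lookup, shown below in Python. This is not the authors' implementation; the feature names and all values other than the sadness pairings listed in the paper (slow tempo, minor mode, legato articulation, soft dynamics) are illustrative assumptions.

```python
# Minimal sketch of a CMERS-style rule lookup (not the authors' code):
# an emotion is a point in a 2D valence/arousal space, and each quadrant
# maps to score- and performance-level feature settings.
from dataclasses import dataclass


@dataclass
class EmotionPoint:
    valence: float  # -1.0 (Negative) .. +1.0 (Positive)
    arousal: float  # -1.0 (Passive)  .. +1.0 (Active)


# Hypothetical rule table keyed by quadrant. The Negative/Passive entry
# follows the paper's sadness example; the other entries are assumptions
# made for illustration.
RULES = {
    ("Negative", "Active"): {"tempo": "fast", "mode": "minor",
                             "articulation": "staccato", "dynamics": "loud"},
    ("Negative", "Passive"): {"tempo": "slow", "mode": "minor",
                              "articulation": "legato", "dynamics": "soft"},
    ("Positive", "Active"): {"tempo": "fast", "mode": "major",
                             "articulation": "staccato", "dynamics": "loud"},
    ("Positive", "Passive"): {"tempo": "slow", "mode": "major",
                              "articulation": "legato", "dynamics": "soft"},
}


def quadrant(p: EmotionPoint) -> tuple[str, str]:
    """Locate an emotion point in one of the four quadrants."""
    return ("Positive" if p.valence >= 0 else "Negative",
            "Active" if p.arousal >= 0 else "Passive")


def features_for(p: EmotionPoint) -> dict[str, str]:
    """Look up the musical feature settings for an emotion point."""
    return RULES[quadrant(p)]


# Sadness sits in the Negative/Passive quadrant.
sad = EmotionPoint(valence=-0.7, arousal=-0.6)
print(features_for(sad))
# {'tempo': 'slow', 'mode': 'minor', 'articulation': 'legato', 'dynamics': 'soft'}
```

A real-time system like CMERS would additionally interpolate feature values continuously as the target point moves through the space, rather than switching discretely at quadrant boundaries.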