Mood Mapping Technologies Within Hybrid Audio Design

Lars Graugaard and Jens Arnspang
lag@cs.aaue.dk, ja@cs.aaue.dk
Department of Software and Media Technology, Aalborg University Esbjerg

Abstract

Contemporary audio design and performance systems aim at creating a world by using technology for immersion beyond the state of the art of virtual reality. Based on our interactive designs and ongoing European projects, we report research of this type.

1 Attaching Musical Expressivity to a Soundscape

One of interactive music's strongest assets is its ability to extract musically significant elements from a performance in real time and to inject a model of that essence into an accompanying digital soundscape. An approach that correlates such data with an implicit chord structure, for pitch cueing and for augmenting the performer's presence, is used in the interactive opera La Quintrala (Graugaard 2004). La Quintrala is an opera for five singers and interactive, computer-generated sounds, with a duration of 120 minutes excluding intermission. The interactive accompaniment is generated in real time, with sound synthesis algorithms dynamically affected by analysis of the singing. The sound synthesis methods were chosen beforehand and change throughout the course of the opera, whereas pitch content for the synthesized sounds is partly generated by algorithms and partly extracted from a continuous comparison of the voice to a chord structure stored in memory. The chord structure defines the primary supportive pitches and links the notated and electronic scores together, addressing the singers' need for tonal 'indicators' at any given moment. Further audio analysis was needed in order to retain expressive performance data. The loose definition of 'expressive data' centers on flexible timing and on the use of dynamics and articulation; we often refer to this highly important part of music performance as 'phrasing' (Lippe 1998).
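The continuous comparison of the voice to a stored chord structure can be sketched as follows. This is a minimal illustration, not the opera's actual implementation: the function name, the pitch-class chord layout, and the use of MIDI note numbers are all assumptions made for the example.

```python
# Illustrative sketch: compare a detected vocal pitch against the chord
# stored for the current moment of the score, and report the nearest
# supportive pitch for cueing. All names and data here are hypothetical.

def nearest_chord_tone(detected_midi, chord):
    """Return (chord tone closest to the detected pitch, distance in semitones).

    detected_midi -- pitch from the voice analyser as a MIDI note number
                     (may be fractional, e.g. 64.3 for a slightly sharp E).
    chord         -- supportive pitches as pitch classes, C = 0.
    """
    # Compare pitch classes so octave errors in the pitch detector do not matter.
    detected_pc = detected_midi % 12

    def circular_distance(tone):
        # Shortest distance around the 12-semitone pitch-class circle.
        return min((tone - detected_pc) % 12, (detected_pc - tone) % 12)

    best = min(chord, key=circular_distance)
    return best, circular_distance(best)

# Hypothetical chord structure indexed by score position.
chord_structure = {
    0: [0, 4, 7],   # C major
    1: [2, 5, 9],   # D minor
}

# Detected pitch 64.3 (slightly sharp E4) against the first chord:
tone, dist = nearest_chord_tone(64.3, chord_structure[0])
print(tone, round(dist, 2))
```

Mapping the remaining distance onto the accompaniment (e.g. as a detune or cue-strength parameter) is one way such a comparison could both verify the sung pitch and support the singer, in the spirit described above.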
Paradoxically, this expressive layer carries much of the audio content of the composition, because the composition defines and delimits its own 'expressive space' as an implicit consequence of the composer's decisions about the more precisely notated pitch, time, and rhythm. The expressive layer is therefore a hidden yet integral part of any musical composition; it is defined at the moment of composing even though it only comes into existence at the moment of performance. The aim of the interactive relationship between the singers and the composition in La Quintrala is twofold: comparing pitches for chordal verification and support, and projecting the singer's subjective and relative interpretation of expressive parameters onto aspects of the accompanying soundscape.

2 Emotion Technology Usage

During the Brussels MOSART project (Musical Orchestration Systems in Algorithmic Research and Technology), see (Arnspang 2003), one partner provided an interactive opera for the Salzburg Festival in which the main character was schizophrenic: at some points dark and sinister, singing in a low-pitched and sonorous way, and at other points bright and febrile, singing in a high-pitched and vivid way. Cameras watched the actions of the human singer, detected his state of personality from his movements, and modified the singing voice according to the composition, the libretto, and the live performer's choices. In the Brussels BENOGO project (Granum 2002), a connection between digital worlds and the human feeling of being present is being attempted. One tool under discussion is mood mapping. Such attempts are found at the triple point between the discipline of human-computer interaction, the discipline of scene rendering, and the vast body of design knowledge on the cues and parameters essential for scene definition and scene building.

References

Arnspang, J., 2003. MOSART Brussels Project, September 2000 - 2003.
Granum, A., 2002. BENOGO Brussels Project, September 2002 - August 2005.
Graugaard, L., 2004. La Quintrala, interactive opera for five singers and computer-generated accompaniment, dur. 120 minutes, premiered September 2nd, 2004.
Lippe, C., 1998. Real-Time Interaction Among Composers, Performers, and Computer Systems. Proceedings of the 1998 International Computer Music Conference.