EXPLORING NEW COMPOSER/PERFORMER INTERACTIONS USING REAL-TIME NOTATION

Christopher McClelland    Michael Alcorn
Sonic Arts Research Centre, Queen's University Belfast, Belfast BT7 1NN

ABSTRACT

Composers of live and interactive electronic music have found methods of creating an integral relationship between the acoustic sound of an instrument and the electronic sounds produced using technology. In such a relationship, sound is passed from performer to computer and is manipulated by the composer. One feature of interaction that is missing from this relationship is the potential for the composer to provide instructions to the musician during the performance. This composer-performer interaction opens up a unique set of possibilities. For example, simple alterations to the dynamics in the score can be made in real-time, or detailed rhythmic and pitch patterns can be created in real-time and disseminated across an ensemble. Simple display of information can be achieved using applications such as Max/MSP/Jitter, Pure Data and SuperCollider. However, these programs cannot support the complex notational systems required by modern compositional techniques. This paper provides an overview of the main components of a real-time notation environment (eScore), focusing on those points that relate to composer-performer interactions, specifically notation and display. Related tools and techniques, and exemplar projects, are discussed as a basis for its development.

1. INTRODUCTION

The motivation in this field is to explore new compositional and performance practices, some of which occupy an interesting middle ground between the domains of composition and improvisation. Such motivation also provides an opportunity to investigate performer-composer interactions that promise a closer coupling between the generation of instrumental and electronic sounds during performance.

1.1. Performance Models

Computer and network technologies have made composer-performer interaction within music performance easier to implement. The use of real-time notation further enhances the relationship between the performer and composer. The following model summarizes the key types of interaction (see figure 1).

Figure 1. Performance Relationship Models.

Michael Alcorn's composition for string quartet and live score, Leave No Trace [1], explores the close coupling between the composer and real-time score. In this work, the performers and composer are in the same space and in direct communication with each other throughout the performance. The composer chooses fragments of musical material in real-time to be displayed on-screen, which are subsequently performed by the musicians. The performances of these fragments are tightly integrated with the triggering of electronic processes and samples.

An experiment created by McAllister, Alcorn and Strain [9] allows an audience to generate graphic information on PDA devices and relay musically meaningful gestures to a group of performers. Audience participation has also been explored using more traditional notation systems by Kevin C. Baird [2], in a project that uses an audience feedback system to decide aspects of the generated score such as dynamics and articulation.

These modes of interaction can be extended over a network, and in particular through the use of the Internet. The composer of a work can be composing in real-time whilst being geographically displaced from his/her performance.
Georg Hajdu has developed an application, Quintet.net [5], to deliver notation and performance instructions to performers in different locations. This application has been used by musicians to perform pieces that were not originally intended for network performance, for example John Cage's Five, as well as by composers writing pieces specifically for the network [7].

2. EXISTING SOLUTIONS

Whilst the composer of score-based compositions has at his disposal algorithmic tools such as OpenMusic, CLM and Patchwork, and notation applications such as Finale, LilyPond and Sibelius, none of these tools is dedicated to, nor efficient enough at, generating and displaying the desired material in real-time. There have been instances where these applications have been adapted for the production of scores in live circumstances. A notable example of this is found in Passage by Marek Choloniewski [3], which uses Finale to display scrolling notation based on MIDI data generated from an analysis of gestures during a silent performance prior to the beginning of the piece. No Clergy by Kevin Baird [2] uses LilyPond to generate graphic versions of a score on the fly, to be displayed in a web browser for performance by musicians.

Composers have also found methods of displaying instructions using existing applications such as Max/MSP in their compositional process. The lcd object has been used by Winkler [13] in works as early as 1995 as a means of displaying musical graphics on the performers' screens. A yet-to-be-released Max/MSP object, MaxScore [6], should provide a more integrated method of producing scores within Max/MSP. Furthermore, there are a number of stand-alone applications, such as LiveScore [14] and the Active Notation System [8], which concentrate on specific areas of notation and display. LiveScore uses a time-proportional system where space and beam length indicate the duration of notes. The Active Notation System uses a display method which is described later in this paper.

3. eSCORE

Since 2004 [10] the authors have been actively developing a real-time notation solution, eScore. This environment is capable of producing high-quality, multi-faceted notational systems, assisted by algorithmic processes and real-time analysis, and capable of being interfaced with external controllers. eScore can be described as a framework and set of tools [11] that assist a composer in displaying instructions to a performer during a performance. The intention of eScore is to provide a flexible environment for developing interactive components for score-based composition, in a similar fashion to that which composers have enjoyed with live-audio processing applications, using both innovative and familiar methods of working with music notation. The environment began as an application that displayed Penderecki-style notation [10], opting for suggested pitch ranges and graphic noteheads to provide a wide compositional palette whilst avoiding some of the complexities of conventional notation. More traditional notation systems have been explored in subsequent versions of the software. The eScore project has evolved into a cross-platform, stand-alone application for composition and performance that is flexible enough to deal with modern notational ideas, and can interface with common tools already in use by composers in the field of live and interactive music.

3.1. Interoperability

Modern compositional techniques call on a wide range of applications in order to be realised. For example, audio analysis/processing, computer-assisted composition, notation and graphics applications are often required in combination for a project to be fulfilled. Whilst the majority of their features are not required in a real-time context, elements such as analysis information, images and notational ideas are of benefit. As a means of importing and exporting data from these applications, three formats are used in eScore: MusicXML for notation, SVG for vector graphics, and OpenSoundControl (OSC) to communicate with a wide range [4] of sound-based applications.
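As a point of reference, a minimal sketch of reading notation data in MusicXML form is given below. The element names follow the published MusicXML schema; the use of Python and the file name are assumptions for illustration, not part of eScore.

    import xml.etree.ElementTree as ET

    # Minimal MusicXML reader: collect (step, octave, duration) triples
    # from a score-partwise file. "fragment.xml" is a hypothetical input.
    tree = ET.parse("fragment.xml")
    root = tree.getroot()  # <score-partwise>

    notes = []
    for note in root.iter("note"):
        pitch = note.find("pitch")
        if pitch is None:
            continue  # rests carry no <pitch> element
        step = pitch.findtext("step")            # e.g. "C"
        octave = int(pitch.findtext("octave"))   # e.g. 4
        duration = int(note.findtext("duration", "0"))  # in divisions
        notes.append((step, octave, duration))

    print(notes)  # e.g. [("C", 4, 4), ("E", 4, 2), ...]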
3.2. Composition Environment

An OSC namespace is implemented in eScore that enables a composer to use a notational "drawing board": low-level drawing commands such as lines and shapes can be addressed, notational events can be built from noteheads, stems and staves, and predefined material can be called upon. In addition, tools are provided to assist in the building of notational events using generators, together with compatibility with the applications explained above. This enables the composer to restrict and impose different parameters, for example pitch, dynamics and articulation, upon a musical phrase.
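The sketch below illustrates how such a namespace might be addressed from the composer's machine, here using the python-osc package. The paper does not publish eScore's actual address space, so every address and argument shown is a hypothetical stand-in rather than the documented interface.

    from pythonosc.udp_client import SimpleUDPClient

    # Hypothetical eScore-style namespace; addresses are illustrative only.
    client = SimpleUDPClient("192.168.1.20", 9000)  # a performer's machine

    # Low-level drawing command: a line from (x1, y1) to (x2, y2)
    client.send_message("/escore/draw/line", [100, 200, 340, 200])

    # Build a notational event from primitives
    client.send_message("/escore/event/notehead", ["stave1", "C4", "black"])
    client.send_message("/escore/event/stem", ["stave1", "up"])

    # Recall predefined material and impose a parameter on it
    client.send_message("/escore/fragment/recall", [3])
    client.send_message("/escore/fragment/dynamic", [3, "pp"])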
3.3. Performance Environment

During a typical performance using eScore, a computer or laptop is given to each performer and one is used by the composer. The computers are connected to each other via a wired or wireless network, using the client-server model. This design enables the score to be distributed to the members of the ensemble, and also lends itself to distributing algorithmic calculations and the rendering of complex graphics. Other setup possibilities include several displays connected to one computer, or a projector used to display material to the group. A minimal performer-side sketch follows.
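The following sketch, again using python-osc, shows the shape of such a client. It reuses the hypothetical namespace above, and the rendering step is stubbed out, since the point is only to illustrate the client-server arrangement.

    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    # One such process runs on each performer's computer; the composer's
    # machine acts as the sender in the client-server arrangement.
    def on_fragment(address, *args):
        fragment_id = args[0]
        print(f"rendering fragment {fragment_id}")  # a real client would draw it

    dispatcher = Dispatcher()
    dispatcher.map("/escore/fragment/recall", on_fragment)

    server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
    server.serve_forever()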
3.4. Modes of Communication

There are two fundamental issues to be considered in relation to communication between composer and performer during performance: the type of notation sent to the performer, and the most appropriate method of display. These issues are explored in more depth in the following sections.

4. NOTATION

Using eScore, the composer may call upon text, musical notation, video and graphic images in order to develop certain compositional ideas and communicate them to the performer. In addition, the use of a display instead of paper allows extensions of traditional notation, such as colour and animation, to be brought into the notational ideas. A good deal of flexibility can be achieved by using existing fonts along with raster, and particularly vector, graphics. Apart from the various means of import explained earlier, a core database of symbols is provided. The database is extended by an online repository of symbols (http://www.realtimenotation.com/escore/). Similar in function to the offline database, the repository allows users to upload and tag musical symbols, with the option of providing downloadable versions of those symbols for the wider community. An API is to be created so that relevant data from this web application can be imported into eScore, Finale, Illustrator or other applications.

4.1. Leave No Trace

For Michael Alcorn's Leave No Trace for string quartet, a corpus of musical materials (events) was prepared. Nine event types were required, based on traditional musical notation. These were generated using LISP, exported to SVG format, and imported into eScore. These fragments had parameters of the notation removed so that they could be decided during performance. Examples of these fragments are displayed in figure 2.

Figure 2. Some fragments used in Leave No Trace.

The composer uses a graphics tablet (figure 3) to make decisions regarding which event types to display on the performers' screens. Variables such as register, dynamics, rhythm and bow position can be allocated during the performance.

Figure 3. Tablet control of Leave No Trace.

The performers are given an example of each of the fragments and a short video rendition of the display before the performance. In performance the fragments are sent to the screen without preview time, asking the performer to respond as quickly as possible to the new material. Significantly, the structure of the piece is not explained by the composer before the performance, leaving the musicians unaware of how the piece is to unfold. This is an aspect of the work that has evolved successfully from performance to performance.

5. DISPLAY

The authors have found, through experiments with musicians and through other study-like pieces, that there are a number of decisions to be made over the display of musical material on the performer's screen. This process is governed by musical decisions rather than by issues of visual perception. Three methods of display have been identified: pages, scattering and scrolling. The methods have different compose times (the time taken to generate or trigger new material) and provoke different types of performer reaction, factors which often have an effect on the resulting musical material.

5.1. Scatter Display

For optimal visual perception, musical material should be displayed at the centre of the screen with nothing to disrupt peripheral vision; this centralised material must also remain still enough to prevent the performer from being distracted. Moving material around the screen increases the time taken by the musician to register it. In Leave No Trace it was found that moving the material across a wider area of the screen produced a more erratic performance, complementing the desired musical intentions of the composer. The musical material used in this piece can be sent immediately to the screen, and the performer's response can be almost immediate too (depending on the speed of his or her reaction to the changing screen position). Not all material is suited to this method of display; improvisatory and short passages are most successful. The compose time is also very short: in Leave No Trace fragments are generated and displayed in less than 10 milliseconds. A sketch of this placement strategy follows.
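The sketch below illustrates the idea under stated assumptions: the screen dimensions are fixed, and a single "spread" parameter controls how far from the centre a fragment may land, with wider spread producing the more erratic responses noted above. All names here are hypothetical, not taken from eScore.

    import random

    WIDTH, HEIGHT = 1024, 768  # assumed display resolution

    def scatter_position(spread):
        """Pick a screen position for a new fragment.

        spread in [0, 1]: 0 pins material to the centre (optimal
        perception); 1 allows the full screen (more erratic playing).
        """
        cx, cy = WIDTH / 2, HEIGHT / 2
        x = cx + random.uniform(-spread, spread) * cx
        y = cy + random.uniform(-spread, spread) * cy
        return x, y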

5.2. Pages Display

This system, as explored in the study Undecided [12], employs 'preview' and 'current' display areas, or, as Nigel Morgan describes them, 'active' and 'inactive' areas [8]. Performers read the 'current' area just like a page; once they have finished one 'side' they continue to the next, with the computer doing the 'page turning'. The composer, meanwhile, fills the 'preview' area in advance. When the performer has finished the 'current' area, the composer, or the performer through a foot pedal for example, can turn to the next area. This introduces a larger compose time, a factor that depends on the size of the page and the tempo required.

5.3. Scrolling Display

Scrolling chunks of musical material is problematic because the graphics become blurred when moving at fractions of a pixel per frame. An alternative is to have a scrolling line or indicator that shows the performer his or her current position in the score. This method can be coupled with the page view and lends itself to more specific and rhythmical material. Ideally the composer should be composing slightly outside the performer's peripheral vision, to avoid distraction. A sketch combining the two methods is given after figure 4.

Figure 4. An example of a scrolling and pages display combined.
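The sketch below shows one way the two methods could be combined, as in figure 4: a pair of page buffers with computer-driven 'page turning', plus an indicator whose position is derived from elapsed time and tempo. The class and function names are hypothetical; the paper does not specify eScore's internals.

    import time

    class PagesDisplay:
        """Two display areas: the performer reads 'current' while the
        composer fills 'preview' in advance."""

        def __init__(self):
            self.current = []  # events the performer is reading now
            self.preview = []  # events being prepared by the composer

        def turn_page(self):
            # Triggered by the composer, or by the performer via a foot pedal.
            self.current, self.preview = self.preview, []

    def indicator_x(start_time, tempo_bpm, beats_per_line, line_width_px):
        """x position of the scrolling indicator on the current line."""
        beats = (time.time() - start_time) * tempo_bpm / 60.0
        return (beats % beats_per_line) / beats_per_line * line_width_px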
6. CONCLUSION & FUTURE WORK

A great deal of work remains to be done in the area of visual perception, to explore the best methods of presenting musical materials on the display so that they can be performed with the desired musical result. User-based studies will be conducted in order to study this further. Participation will be sought from musicians of different backgrounds, particularly classical musicians, improvisers and non-readers of music. In order to study more complex composer-performer relationships, an ensemble work is planned which will develop more integrated relationships, to be extended in subsequent work to involve audience and network interaction. eScore is intended for public release, and the projected compositional output is intended to supplement its continuing development.

7. REFERENCES

[1] Alcorn, M. "Leave No Trace", String Quartet, eScore and Live-Electronics, http://www.michaelalcorn.org/ (accessed 12 January 2008), 2007.

[2] Baird, K. C. "Real-time Generation of Music Notation via Audience Interaction Using Python and GNU Lilypond", Proceedings of the International Conference on New Interfaces for Musical Expression (NIME05), Vancouver, BC, Canada, 2005.

[3] Choloniewski, M. "Passage", Interactive Octet for Instruments and Computer, http://www.studiomch.art.pl/ (accessed 12 January 2008), 2001.

[4] Freed, A., Schmeder, A., and Zbyszynski, M. "OSC Showcase for Maker Faire", Maker Faire, San Mateo, CA, USA, 2007.

[5] Hajdu, G. "Quintet.net: An Environment for Composing and Performing Music on the Internet", Leonardo Journal, Volume 38, Number 1, 2005, pp. 23-30.

[6] Hajdu, G. "Playing Performers", Music in the Global Village Conference, Budapest, 2007.

[7] Hajdu, G. "Composition and Improvisation on the Net", SMC'04 Conference Proceedings, IRCAM, Paris, 2004.

[8] Legard, P., and Morgan, N. "Re-conceptualizing Performance with 'Active' Notation", ICCMR Series, 2007.

[9] McAllister, G., Alcorn, M., and Strain, P. "Interactive Performance with Wireless PDAs", Proceedings of the International Computer Music Conference, Miami, USA, 2004.

[10] McClelland, C., and Alcorn, M. "Real-time Notation in Interactive and Live Electronic Performance", Live Algorithms for Music Workshop, London, 2005.

[11] McClelland, C., and Alcorn, M. "eScore: Towards a Framework for the Composition and Performance of Real-time Notation", DMRN+2, London, 2007.

[12] McClelland, C. "Undecided", String Quartet and Real-time Score, 2007.

[13] Winkler, G. E. "The Realtime Score: A Missing Link in Computer Music Performance", Proceedings of the Sound and Music Computing Conference, Paris, 2004.

[14] Wulfson, H., Barrett, G. D., and Winter, M. "Automatic Notation Generators", Proceedings of the 7th International Conference on New Interfaces for Musical Expression, New York, 2007.