Interactive Paths Through Tree Music

Judith Shatin
McIntire Department of Music, Box 400176
University of Virginia
+1-434-924-3052
Email: shatin@virginia.edu

Categories and Subject Descriptors
Composition, new interfaces, interactive systems, open source

General Terms
Algorithm, Design, Documentation, Experimentation

Keywords
Composition, new interfaces, interactive systems, open source

Figure 1: Brzezinski's Tree Sculptures

Abstract
This paper discusses the design of Tree Music, an interactive computer music installation created with RTcmix and GAIA (Graphical Audio Interface Application), a new open-source software package by Dave Topper, Technical Director of the Virginia Center for Computer Music. Commissioned by the University of Virginia Art Museum to accompany the New Directions exhibit of tree-trunk sculpture by Emilie Brzezinski (Figure 1), Tree Music involves four networked levels of interactivity, each projecting a different relationship between the composer and the spectator. A wireless camera tracks changes in motion and occlusion; GAIA uses these data to navigate the four networked levels. While people who visit the installation repeatedly may experience some sonic overlap, the probability that they will have an identical musical experience on subsequent visits is vanishingly small.

1 Introduction
As interactive options have proliferated, composers have developed a variety of schemas, ranging from simple triggers to complex interactive systems. They have opened a fascinating space between cause and effect, between sounds of the world and new worlds of sound, between experience and understanding, and between composer and audience. They have also created processes that yield fluctuating rather than fixed results, often leaving room for the audience to directly influence their own sonic experiences. Two major compositional/performance modes have developed. One, which I call the environmental mode, tracks aspects of the surroundings and uses the resulting data to produce music. We might measure changes in the external environment, such as wind, or changes in human activity that affect some aspect of the environment. Pieces that explore the environmental mode include Garth Paine's Map2 [1].

Map2 was presented at NIME 2004 at Shizuoka University in Hamamatsu, Japan; there, people's movement through the space produced perceptible sonic change.

The second compositional/performance mode I call the somatic mode. Here, sensors measure change in the performer's musculature, or in its manifestation in, say, the angle of a performance tool. For example, Joseph Rovan, in his Collide, uses a data glove that measures force, bend and motion in his finger, hand and arm gestures, and maps these data to control both audio and video [2]. The AoBachi, Diana Young and Ichiro Fujinaga's wireless interface for Japanese drumming, tracks gesture parameters as the performer plays the drum [3]. In this mode, the performer may flexibly alter the outcome; and while the audience may have a different experience at each performance, it has no effect on the music.

These modes depend on software that facilitates dynamic interaction. While a number of programs, such as Max/MSP, Pd and SuperCollider, are in common use, along with a variety of custom ones, Topper's GAIA program, built around the RTcmix engine, handily combines a GUI with an accessible Perl scripting environment, supporting a wide variety of interactive schemas. Its constellation of attributes (open source, Linux-based, and combining a GUI with a Perl scripting environment) seemed most promising, and led me to inaugurate GAIA with Tree Music.

Tree Music (installed from 6/27 to 9/14/03 at the University of Virginia Art Museum) fits the environmental mode. It uses motion (change in pixels between frames) and occlusion (change in pixels from the starting frame) to generate data. GAIA uses this information, received via the wireless camera, to trigger four networked response processes:

(a) Entry into the exhibit space triggers level 1: computer selection and performance of a mini set-piece from a collection of set pieces.
(b) Continued changes trigger level 2: the dynamic creation and playback of micro-ensembles of precomposed phrases.
(c) Further changes move the program to level 3, the interactive level, in which the music responds to spectator motion, with particulars depending on which interactive region is chosen.
(d) No change for five minutes sends the program to level 4, a special section in which one of a group of extended polyrhythms is played.

The top-level GAIA script keeps track of the relationships among these processes, creating different pauses between them and monitoring the associated states.
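To make this dispatch logic concrete, here is a minimal Perl sketch of it. The subroutines (read_tracker, play_set_piece, and so on), the 0.05 change threshold and the polling interval are hypothetical stand-ins for GAIA widgets and RTcmix playback, not the installation's actual code.

    #!/usr/bin/perl
    # Minimal sketch of the top-level dispatch among the four levels.
    # All subroutines are hypothetical stand-ins for GAIA widgets and
    # RTcmix playback; thresholds and timings are illustrative only.
    use strict;
    use warnings;

    sub read_tracker         { return (rand, rand) }   # (motion, occlusion)
    sub play_set_piece       { print "level 1: mini set-piece\n" }
    sub play_micro_ensembles { print "level 2: micro-phrase ensembles\n" }
    sub update_instrument    { printf "level 3: motion %.2f\n", $_[0] }
    sub play_polyrhythm      { print "level 4: hypermeasure\n" }

    my $IDLE_LIMIT  = 300;       # five minutes of no change -> level 4
    my $last_change = time;
    my $level       = 0;         # 0 = waiting for a visitor

    while (1) {
        my ($motion, $occlusion) = read_tracker();
        if ($motion > 0.05 || $occlusion > 0.05) {      # change detected
            $last_change = time;
            if    ($level < 1) { $level = 1; play_set_piece(); }
            elsif ($level < 2) { $level = 2; play_micro_ensembles(); }
            else               { $level = 3; update_instrument($motion); }
        }
        elsif (time() - $last_change >= $IDLE_LIMIT) {
            play_polyrhythm();   # enticement section
            $level       = 0;    # the next visitor starts again at level 1
            $last_change = time;
        }
        sleep(4 + int(rand(5))); # randomized pause between checks (4-8 sec)
    }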
2 Tree Music design
Tree Music raised several compositional questions. First, how should the music relate to the sculptures? I wanted to mark the sound with the sculptural process, just as the wood was marked by the physical action of saw and chisel. To do so, I recorded the sculptor as she worked and used those recordings as source material. To convey the sculptural process in the music, I created sounds that ranged from recognizable implements (saw, chisel and broom) to sounds shaped beyond recognition. This created what Field [4] calls a "sonic metaphor": the sounds of the sculpting implements suggest the musical process of building the piece.

The second question concerned interactions between the music, the exhibit visitors, and the sculptures. As mentioned above, I defined four levels of interaction. Each has a different degree of interaction and fixedness, ranging from relatively fixed mini set-pieces, to the aleatoric generation of micro-phrase ensembles, to interactive sections in which the visitors' motions directly affect the music, to an enticement section in which the music responds to the sculpture during periods when there are no people in the room. During the latter section, for which I reserved special sounds (such as the voices of the sculptor and the composer), people might overhear the resultant music from adjoining rooms and be drawn to the exhibit.

Creating Tree Music thus involved four phases: (1) sound source development; (2) network design of the relatively fixed, open and interactive regions, with the creation of multiple paths within and between them; (3) construction of a top-level control system that links time and movement between the sections; and (4) on-site testing. The last was tricky, because we expected the number of visitors to vary widely (roughly between 1 and 20). RTcmix was used for sound palette development and GAIA for the rest of these tasks.

2.1 Network Organization
Each level contains multiple possible pathways and pauses of random duration.
1. Mini Set-Pieces: A person entering the room causes the computer to choose one of the precomposed mini set-pieces and perform it. This is the most fixed part of the work.
2. Micro-phrase Ensembles: After a brief pause (4-8 sec), if the system senses motion, micro-phrase ensembles are created and performed, with controlled-random choices of soundfiles and types of layering based on low-, medium- and high-density families of soundfiles (see the sketch after this list). The resultant ensembles, configured for a variety of listening experiences, last between 0.5 and 3 sec.
3. Interactive: If the system senses change after section 2 ends, it activates one of four interactive instruments. Parameters such as amplitude, register and filter strength change as a function of changing motion in the room.
4. Hypermeasure: Five minutes of no change trigger the enticement section, with its hyper-extended polyrhythms. These unfold over 5 to 32 minutes. This section, like the first, is quite fixed: once chosen, each polyrhythm plays as composed.
Figure 2 summarizes the structure.

Figure 2: Tree Music flow chart
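As an illustration of the controlled-random choices in level 2, here is a small Perl sketch. The soundfile names and family contents are invented for the example, and a real CFG would hand the chosen layers to RTcmix for playback rather than print them.

    # Illustrative sketch of level 2's controlled-random ensemble building.
    # Soundfile names and family contents are invented for this example.
    use strict;
    use warnings;
    use List::Util qw(shuffle);

    my %families = (
        low    => [ 'saw_low1.wav',  'chisel_low1.wav',  'broom_low1.wav'  ],
        medium => [ 'saw_med1.wav',  'chisel_med1.wav',  'broom_med1.wav'  ],
        high   => [ 'saw_high1.wav', 'chisel_high1.wav', 'broom_high1.wav' ],
    );

    # Choose a density family, then layer one to three phrases from it.
    my $density = (keys %families)[ int(rand(scalar keys %families)) ];
    my @layers  = (shuffle(@{ $families{$density} }))[ 0 .. int(rand(3)) ];
    my $dur     = 0.5 + rand(2.5);   # ensembles last between 0.5 and 3 sec

    # A real CFG would schedule these layers through RTcmix; here we print.
    printf "ensemble (%s density, %.2f sec): %s\n",
           $density, $dur, join(', ', @layers);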

The program had to be robust enough to run four hours per day, six days per week. Scripts were written to start GAIA after the machine booted and to restart the system if necessary.

2.2 GAIA configurations
To develop the network of interactive levels and associated states, and to map the video-tracking information, multiple GAIA configurations (CFGs) were developed. Each contains GUI interface objects and associated Perl scripts laid out on what Topper calls a "canvas" [5]. An extensible set of widgets monitors real-time data, video tracking, RTcmix scripts, logical conditions and MIDI. There is also a suite of sliders and data boxes. Data within the different Perl scripts can be set as local to a script or global to the entire system, just as in Perl. Each box with a Perl script has a HIDE or SHOW button, so the CFG display can be kept uncluttered.

The main CFG controls the changing flow within and between sections, as triggered by changes in the motion and occlusion data, and handles the movement between and within the four networked sections. CFGs can be nested. For example, the CFG that handles video capture is represented in the main CFG as a text box with sliders; its other components are hidden, though the video CFG remains accessible in a CFG file list. This enabled me to work with components in small segments, which I then assembled into the main CFG. The combination of hierarchical organization and easy access to component parts was crucial to the design of Tree Music's network. Figure 3 shows a CFG excerpt, including the Perl script that pops up when one clicks SHOW. This CFG sets global variables for occlusion and motion input.

Figure 3: Configuration example
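Since Figure 3 reproduces the script only as a screen capture, the fragment below is a speculative reconstruction of that kind of CFG script: a Perl snippet that publishes the incoming motion and occlusion values as globals that other scripts on the canvas can read. The subroutine and variable names are assumptions, not GAIA's actual API.

    # Speculative sketch of a CFG script that sets global variables for
    # motion and occlusion input; names are assumed, not GAIA's actual API.
    use strict;
    use warnings;

    our ($MOTION, $OCCLUSION);   # package globals, visible to other CFG scripts

    sub on_video_frame {
        my ($motion_in, $occlusion_in) = @_;   # fed by the video-capture widget
        $MOTION    = $motion_in;               # change in pixels between frames
        $OCCLUSION = $occlusion_in;            # change in pixels from start frame
    }

    # Example update, as the tracking widget might deliver it:
    on_video_frame(0.12, 0.40);
    printf "motion=%.2f  occlusion=%.2f\n", $MOTION, $OCCLUSION;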
3 Future Directions
Aesthetic: Interactive music raises numerous aesthetic questions. Should those experiencing the music be aware of correlations between their actions and their musical experience? How fluid should this relationship be? How does the experience of participation affect the perception of the music? Tree Music embodies a dualistic response to these questions, contrasting sections in which spectators have only an indirect effect on the music with sections in which the relationship between their physical movements and the changing sonic response is overt. The goal was to enable the audience to hear "composed" segments as well as "spontaneously" created ones, with fluid movement between them. These issues are ripe for further exploration, and others are already pursuing them. For example, Sile O'Modhrain and Georg Essl's Pebble Box offers a combined haptic/sonic experience in which the user is aware that handling the pebbles is correlated with both the motion of the physical objects and the sound produced, whether of pebbles, water, or a variety of other sounding materials [6].

Technical: GAIA not only provides a flexible interface; it also offers generality and free availability. The hierarchical nesting of CFGs and their associated scripts supports a broad range of compositional schemas, and the ability to harvest data from a variety of devices makes GAIA an ideal system for creating interactive pieces in the environmental and somatic modes described earlier. It thus holds great potential for interactive dance and video as well. The increasing availability of high-resolution, high-speed real-time sensor interfaces, such as those being created by La Kitchen [7], will provide further impetus for GAIA's development. Other uses for GAIA are being explored as part of the computer music curriculum at the University of Virginia.

Acknowledgments
Thanks to Dave Topper for developing GAIA and for his help in working out numerous technical programming details; to Emilie Brzezinski for opening her studio and for her collaborative esprit; to the helpful staff of the University of Virginia Art Museum; to Museum Director Jill Hartz for providing the opportunity to develop this work; and to the Virginia Commission for the Arts for its support.

References
[1] G. Paine, "Map2," installation in the auditorium lobby, International Conference on New Interfaces for Musical Expression (NIME), Hamamatsu, 2004.
[2] J. Rovan, "Program notes for Collide," website, 2002, rovan-note-COLLIDE.doc, retrieved 2/14/2004.
[3] D. Young and I. Fujinaga, "AoBachi: A new interface for Japanese drumming," in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), Hamamatsu, 2004, pp. 23-26.
[4] A. Field, "Simulation and reality: the new sonic objects," in Music, Electronic Media and Culture, S. Emmerson, Ed. Hampshire, England: Ashgate, 2004, pp. 36-55.
[5] J. Shatin and D. Topper, "Tree Music: Composing with GAIA," in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), Hamamatsu, 2004, pp. 51-54.
[6] S. O'Modhrain and G. Essl, "PebbleBox and CrumbleBag: Tactile interfaces for granular synthesis," in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), Hamamatsu, 2004, pp. 74-79.
[7] H. Coduys and C. Coduys, "Toaster and Kroonde: High-resolution and high-speed real-time sensor interfaces," in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), Hamamatsu, 2004, pp. 205-206.