Making Motion Musical: Gesture Mapping Strategies for Interactive Computer Music

Todd Winkler
Brown University
ToddWinkler@Brown.Edu

Abstract

The increasing sophistication of new devices that translate gesture and movement into computer data holds great promise for interactive composition, dance, and the creation of responsive music in virtual reality systems. Data describing human motion can produce musically satisfying results through their impact on sound and musical processes. This paper will take a general look at the use of physical gesture data as primary compositional constraints in interactive music systems. Theoretical concepts for the interpretation and evaluation of these data will be discussed. Finally, these devices and techniques will be shown to be viable in multimedia applications.

Background

The invention of the Theremin in 1919 prophesied the development of music produced by electrical means, and it is even more remarkable as the first musical instrument performed without physical touch. The results produced by moving the hands in space were more subtle and varied than a simple oscillator would suggest, because the sound reflected the expressive quality of human movement. The world caught up with Leon Theremin in the 1960s and 1970s, when several composers rediscovered the exploration of movement to create electronic music. Of particular note is Variations V (1965), a collaborative work featuring music by John Cage and choreography by Merce Cunningham, with a system designed by Gordon Mumma and David Tudor to derive sounds from the movements of dancers, who produced music based on their proximity to several electronic sensors placed on stage. Thus, the entire floor was transformed into a musical instrument responsive to movement throughout the space (Nyman). Since the 1980s, the sophistication and accuracy of digital technology have brought opportunities for the further refinement of motion-based instruments.
Researchers in both dance and music have created numerous systems showing the viability of using computers to interpret data from motion detectors and body sensors. Years of ongoing work at STEIM and the MIT Media Lab are of particular interest in this area. Corporate interests are now pushing their own research, and we will soon be inundated with body suits, gloves, motion detectors, and other devices for virtual reality environments. In addition, physical computer interface devices not intended specifically for music, such as the mouse, have frequently been used to perform or shape computer music processes. While these devices may lack the resolution and refinement of musical instruments, composers have already proven that any device that translates physical movement into numbers is fair game for an interactive musical performance (as witnessed by the frequent adaptation of Mattel's Nintendo PowerGlove). The abundance of successful research will ensure that composers have reliable and accurate movement data. With many of these technicalities solved, some attention can now be shifted away from instrument building and toward musical performance and composition. Specifically, how can composers create music based on movement data? How can movement be used to shape and structure musical material? What specific performance parameters will create and modify sound? It is doubtful that such movement data can simply be applied to pre-existing musical models. An examination of the physical parameters of movement, their limitations, and the methods used for their measurement will yield clues to an appropriate musical response, one where imaginative connections engage performers by their power to shape sound. In his insightful essay on "Effort and Expression," Joel

ICMC PROCEEDINGS 1995, 261

Ryan states that "in fact, the physicality of the performance interface gives definition to the modeling process itself" (Ryan). Interactive music systems can be used to interpret these data, extending the performer's power of expression beyond a simple one-to-one relationship of triggered sound to include the control of compositional processes, musical structure, signal processing, and sound synthesis.

Characteristics of Movement Transducers

In acoustic instruments and their related digital models, motion makes music by exciting physical mechanisms that produce sound or MIDI data; physical energy is transformed into sound through the mechanism of a musical instrument. Two different classes of movement transducers are the focus of this discussion. Although the wide variety of devices for detecting movement have unique sensing apparatuses, capabilities, and features, many share common attributes. Spatial sensors detect location in space, by proximity to hardware sensors or by location within a projected grid. Body sensors measure the angle, force, or position of various parts of the body. These two methods are by no means mutually exclusive. Examples of spatial sensors include the Radio Drum (Mathews), which measures the location of two batons in three dimensions in relation to a rectangular radio receiver; Donald Buchla's Lightning (Rich), which uses an infrared signal to locate a performer within a user-definable two-dimensional grid; and several systems that analyze location and movement in space using video cameras, such as David Rokeby's Very Nervous System (Cooper) and the Virtual Stage Environment (Lovell). Inexpensive pressure triggers, light beams, and motion detectors have been used extensively to identify a person's location within a space.
Examples of body sensors include MIDI Dancer (Coniglio), which analyzes body shape by measuring the angles of arm, leg, and hip joints; BioMuse (Knapp and Lusted), which measures electrical voltages produced by muscle contraction; and a host of hand-based controllers such as The Hands (Waisvisz), which also measures proximity. Regardless of sensor type, it is important to recognize not only what is being measured, but how it is being measured. Most of these devices specify a limited range of motion (left-right, high-low, open-closed) and divide that range into a limited number of discrete steps, with data sent out more or less continuously along this range. The resolution or scale of these steps, as well as the device's sampling rate, will be factors in musical interpretation. These numbers can be filtered, scaled, limited, or otherwise processed by software in preparation for entering an interactive system. Some devices report less continuously, sending out values representing predetermined trigger points. These numbers describe the location or body position of a performer over time within a pre-defined range. Timing and location may be used to determine the speed or force of an action. Software can interpret these numbers to create music based either on a performer's exact location or position, or on movement relative to a previous location or position. The number of parameters transmitted by a device also defines some of its musical capabilities. Spatial sensors work in one, two, or three dimensions. Body sensors may track one or more parts of the body.

Movement as an Instrument

Does human movement have constraints similar to those of musical instruments, constraints that might suggest something akin to idiomatic expression? Is there a distinct character to the movement of the hands? What is finger music? What is running music? What is the sound of one hand clapping?
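Before taking up these questions, the data handling just described (limiting raw values to the device range, scaling them for a compositional algorithm, and deriving relative movement and speed from timed position reports) can be sketched in a few lines. The names below are hypothetical illustrations, not part of any of the systems mentioned above:

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Limit a raw sensor value to the device range, then scale it
    to a range useful for a compositional algorithm."""
    value = max(in_lo, min(in_hi, value))        # limit to device range
    span = (value - in_lo) / (in_hi - in_lo)     # normalize to 0.0-1.0
    return out_lo + span * (out_hi - out_lo)

class MotionTracker:
    """Track one sensor parameter over time, distinguishing exact
    position from movement relative to the previous report."""
    def __init__(self):
        self.last_pos = None
        self.last_time = None

    def update(self, pos, now):
        delta = speed = 0.0
        if self.last_pos is not None:
            delta = pos - self.last_pos          # movement relative to last report
            dt = now - self.last_time
            speed = abs(delta) / dt if dt > 0 else 0.0
        self.last_pos, self.last_time = pos, now
        return pos, delta, speed
```

A 7-bit proximity value (0-127), for example, could be scaled to a 0.0-1.0 range, while the tracker lets software respond either to a performer's absolute position or to how quickly that position is changing.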
These questions may be answered by allowing the physicality of movement to shape musical material and processes. These relationships may be established by viewing the body and space as musical instruments, free from the associations of acoustic instruments, but with similar limitations that can lend character to sound through idiomatic movements. Traditional studies in orchestration, as well as studies in sound synthesis, begin by examining the physical properties of instruments and their methods for producing sound. Physical constraints produce unique timbral characteristics and suggest musical material that will be idiomatic or appropriate for a particular instrument's playing technique. These reflect the weight, force, pressure, speed,

and range used to produce sound. In turn, the sound reflects, in some way, the effort or energy used to create it. The fact that brass tones add upper partials as they grow louder is a classic example. Each part of the body has its unique limitations in terms of direction, weight, range of motion, speed, and force. In addition, actions can be characterized by ease of execution, accuracy, repeatability, fatigue, and response. The underlying physics of movement lends insight into the selection of musical material. Thus, a delicate curling of the fingers should produce a very different sonic result than a violent and dramatic leg kick, since the size, weight, and momentum alone would have different physical ramifications. To achieve this, physical parameters can be appropriately mapped to musical parameters, such as weight to density or register, tension to dissonance, or physical space to simulated acoustical space, although such simple one-to-one correspondences are not always musically successful. The composer's job, then, is not only to map movement data to musical parameters, but to interpret these numbers to produce musically satisfying results. The stage, room, or space also has its boundaries, limiting speed, direction, and maneuverability. Psychological intensity increases or decreases with stage position, as the audience perceives more energy and focus as the performer moves upstage. Apparent gravity pulls the performer downstage with less effort, in a way similar to the very real pull of vertical downward motion. Intensity may also vary with the amount of activity confined to a given space. These physical and psychological properties of space are ripe for musical interpretation. Being aware of the underlying physics of movement does not necessarily imply an obvious musical solution. Indeed, since computers are only simulating real-world events, tampering with the apparent laws of physics is a luxury made possible in virtual environments.
By being aware of these laws, it is possible to alter them for provocative and intriguing artistic effects, creating models of response unique to the computer. More furious and strenuous activity, for example, could result in quieter sounds and silence. At the same time, a small yet deliberate nod of the head could set off an explosion of sound. Such "unnatural" correlations make motion all the more meaningful.

Use in Interactive Music Systems

Interactive music systems use software to interpret human action in order to affect some aspect of music generated or modified by computers. Typically, musical material in the form of MIDI data is analyzed for musical features such as tempo, rhythm, or intervals. However, movement data is equally effective and has frequently been used as input to interactive music systems. Movement data may be filtered, scaled, or selected in preparation as parameter input to compositional algorithms. A performer's actions may translate into immediate musical results, they may have delayed reactions, or they may be analyzed over longer spans of time for direction, repetition, and other patterns. Influences may be used in a less obvious manner when applied to the unfolding of higher-level events in more fully developed musical environments. These techniques give performers the feeling of participating, along with the computer, in the creation of a musical work. The feeling of interactivity depends on the amount of freedom the performer has to produce and perceive significant results, and on the computer's ability to respond in a way that makes sense and naturally elicits the performer's participation. Highly interactive systems are more complex but potentially rewarding. With more parameters available for change, performers will need extra practice time to "learn by ear" the idiosyncrasies of a system capable of intricate connections between movement and music.
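A minimal sketch of one such mapping, including the deliberately "unnatural" inverted correlation described above, might look like the following; the function name and the 0.0-1.0 energy scale are assumptions for illustration, not a published mapping:

```python
def energy_to_velocity(energy, inverted=False):
    """Map movement energy (0.0 = stillness, 1.0 = furious activity)
    to a MIDI note-on velocity (1-127).

    With inverted=True the correlation is 'unnatural': strenuous
    activity yields near-silence, while a small, deliberate gesture
    sets off a loud response."""
    energy = max(0.0, min(1.0, energy))   # limit to the expected range
    if inverted:
        energy = 1.0 - energy             # alter the apparent law of physics
    return 1 + int(round(energy * 126))   # scale to MIDI velocity 1-127
```

The same pattern extends to the other pairings mentioned earlier, such as weight to density or register, and tension to dissonance; the compositional interest lies in deciding which correlations to invert, delay, or complicate rather than apply one-to-one.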
The computer's response must be believable in the sense that it seems appropriate for the action taken, and appropriate for the style of music and movement. Interactive music succeeds when it encourages spontaneity while residing within the boundaries of a dynamic artistic context that is whole and engaging. The quality of interaction lies on a continuum. On the low end lie predetermined sequences or sound files that are triggered by a performer. While robust branching structures and mixing capabilities allow the performer to experience musical selection, the samples or MIDI files, when triggered, produce the same results. A predetermined score could be made slightly interactive by allowing a performer to control only one parameter of the music, such as the tempo or dynamics. Highly interactive pieces are less predictable, since the performer controls many more significant musical parameters. The composition can change

dramatically according to a performer's actions and interpretation of the results. Melodic construction, harmony, tempo, register, dynamics, and timbre can all be influenced in real time by movement information.

Multimedia Extensions

Many of these devices offer artists fascinating environments in which to explore the combination of sound, text, and images through performances or installations, inviting performer and audience participation through their responsiveness to human gesture. Several of the devices mentioned above have already been successfully employed as controllers for video and still images played from laser disk players or projected directly from the computer. One recent example is In Plane, a work by Mark Coniglio that uses MIDI Dancer to expand the role of a solo dancer to include control of video playback, a lighting board, and the position of a video camera on a 24-foot robotic platform. In one section, sensors identify several types of body position and call up similar images stored on laser disk (Coniglio, 1995). Rokeby's Very Nervous System has been used by visual artist Paul Garrin for an installation entitled White Devil. The work uses twelve large monitors to display video of a menacing guard dog who appears to track the movement and proximity of the viewer.

Conclusion

If Leon Theremin were alive today, he would be amused by the corporate giants racing to discover the "killer app" that will bring interactive electronic images and sound to the masses. He gave musicians a seventy-year lead to develop concepts and content exploring motion and sound. Indeed, many of the important discoveries presented at previous ICMC conferences regarding interactive systems, and issues surrounding real-time processing, sensing, and scheduling, seem to have gone largely unnoticed by the corporate world and popular media as they take their first baby steps toward creating intuitive sensors and systems that still await appropriate content.
Undoubtedly, composers will have a role to play in developing these new media, and there will be a strong need, and increased opportunities, to create music in response to human movement.

References

Coniglio, Mark. In Plane. Personal conversation, 1995.

Cooper, Douglas. "Very Nervous System." Wired 3, 3 (1995): 134-170.

Knapp, Ben, and Hugh Lusted. "A Bioelectrical Controller for Computer Music Applications." Computer Music Journal 14, 1 (1990): 42.

Lovell, Robb, and John D. Mitchell. "Using Human Movement to Control Activities in Theatrical Environments." In Proceedings for the Fifth Biennial Symposium for Arts and Technology, ed. N. Zahler. New London: Connecticut College, 1995.

Mathews, Max. "The Radio Drum as a Synthesizer Controller." In Proceedings of the 1989 International Computer Music Conference, ed. T. Wells and D. Bufler. San Francisco: Computer Music Association, 1989.

Nyman, Michael. Experimental Music: Cage and Beyond. New York: Schirmer Books, 1980.

Rich, Robert. "Buchla Lightning MIDI Controller." Electronic Musician 7, 10 (1991): 102-108.

Rowe, Robert. Interactive Music Systems. Cambridge, Mass.: MIT Press, 1992.

Ryan, Joel. "Effort and Expression." In Proceedings of the 1992 International Computer Music Conference, ed. A. Strange. San Francisco: Computer Music Association, 1992.

Waisvisz, Michel. "THE HANDS: A Set of Remote MIDI Controllers." In Proceedings of the 1985 International Computer Music Conference, ed. B. Truax. San Francisco: Computer Music Association, 1985.