IBM Computer Music Center -- Studio Report

James Wright and Daniel V. Oppenheim
Computer Music Center, T.J. Watson Research Center, IBM
Yorktown Heights, NY 10598 USA

Abstract: The Computer Music Center at the IBM T.J. Watson Research Center was founded in 1993. Center themes include both research and the development of commercial applications. Recent work includes the MMorph real-time music morpher (Oppenheim, ICMC'95) and the KidRiffs music creativity program. Current work includes expression modeling, the Nuance music representation, and further development of both the Sonnet visual programming language and the Dmix environment for music composition and performance.

Keywords: music, multimedia, composition, representation, interactive

Introduction

The Computer Music Center is a group within the Mathematical Sciences Department at the IBM T.J. Watson Research Center. Our primary mission is to conduct research and development in computer music, with a focus on tools and systems for interactive music composition and performance. However, because computer music draws on knowledge from many different domains, we also use music as a testbed for building tools and technologies applicable to other areas in computer science, mathematics, and cognition. Center members include David Jameson (manager), Mat Baskin, Peter Desain (visiting scientist), Martin O'Connor, Daniel Oppenheim, Bruce Radtke and Jim Wright.

The Center maintains a studio used for composition and research purposes. Equipment includes a number of synthesizers as well as a Kyma DSP system, multi-track digital recording equipment, and a variety of MIDI controllers and software tools running on PC, Macintosh and AIX platforms. Visiting composers have graced our studio from time to time, and we hope to increase this activity in the future. Over the past 18 months, the Center has hosted concerts and talks by Roger Dannenberg, Robert Rowe and Mari Kimura, Max Mathews, and Patrick Moraz. Recent international performances of Oppenheim's work have included the MediaMix 96 International Conference, York University, U.K., and the 1993 World Music Days (ISCM) in Mexico City. Current activities, papers and other information are available on our Web site.

Recent Work

KidRiffs is a multimedia CD-ROM application that fosters the creative "hands-on" exploration of music. We call it a musical playground. KidRiffs presents instruments, scales, rhythms, patterns and other musical elements in a playful, non-directive manner. It includes some surprising capabilities: users can remap musical phrases into different musical scales (e.g. major, phrygian, eastern) and create mirror effects (where each input note spawns a new instance of a multi-timbral pattern, parameterized by pitch, scale, and velocity). KidRiffs has won several awards, including a 1995 Parents' Choice award and a 1996 CES Innovations award. A special "touch-screen" version was installed in December 1995 at the IBM ThinkPlace Innoventions exhibit at Epcot Center, Disneyworld. A European version (in French, German, Italian and Spanish) was released in Spring 1996.
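The scale remapping can be suggested with a minimal Python sketch; the scale table and nearest-degree rule below are illustrative assumptions on our part, not the KidRiffs implementation.

    # Illustrative sketch of remapping a phrase into a different scale.
    # The scale definitions and mapping rule are assumptions for clarity,
    # not the actual KidRiffs internals.

    SCALES = {
        "major":    [0, 2, 4, 5, 7, 9, 11],
        "phrygian": [0, 1, 3, 5, 7, 8, 10],
    }

    def remap_to_scale(pitches, scale_name, root=60):
        """Snap each MIDI pitch to the nearest degree of the chosen scale."""
        degrees = SCALES[scale_name]
        remapped = []
        for p in pitches:
            offset = (p - root) % 12
            octave = (p - root) // 12
            # choose the scale degree closest to the original chromatic offset
            nearest = min(degrees, key=lambda d: abs(d - offset))
            remapped.append(root + octave * 12 + nearest)
        return remapped

    # e.g. a C-major phrase re-heard in Phrygian
    print(remap_to_scale([60, 64, 67, 72], "phrygian"))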
MMorph (Oppenheim) is a system for compositional morphing of music in real time. Two or more pieces of MIDI-based music are blended in real time using a process of selective interpolation. The resulting output can contain various elements and aspects of any or all of the musical inputs. For example, one can morph continuously among a Bach prelude, a Mozart minuet, and a Brazilian tango to produce extreme shifts, or morph among four different performances of the same piece to move between upbeat, somber, angry, and joyful moods.
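A much-simplified sketch of morphing by weighted interpolation of note parameters follows; the Note fields and the single blending weight are assumptions for illustration, whereas MMorph itself interpolates selectively over richer MIDI material in real time.

    # Simplified illustration of blending two note streams by weighted
    # interpolation; an assumption-based sketch, not MMorph's algorithm.

    from dataclasses import dataclass

    @dataclass
    class Note:
        pitch: float      # MIDI pitch
        velocity: float   # MIDI velocity
        duration: float   # seconds

    def morph(notes_a, notes_b, weight):
        """Blend two equal-length note streams; weight=0 gives A, weight=1 gives B."""
        blended = []
        for a, b in zip(notes_a, notes_b):
            blended.append(Note(
                pitch=round((1 - weight) * a.pitch + weight * b.pitch),
                velocity=(1 - weight) * a.velocity + weight * b.velocity,
                duration=(1 - weight) * a.duration + weight * b.duration,
            ))
        return blended

    # halfway between two short fragments
    a = [Note(60, 80, 0.5), Note(64, 90, 0.5)]
    b = [Note(67, 40, 1.0), Note(72, 50, 0.25)]
    print(morph(a, b, 0.5))

Varying the weight over time, or applying different weights to different parameters, gives the kind of continuous, selective blends described above.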
Current Work

Sonnet (Jameson, Radtke) is a visual programming language designed for manipulating real-time data of many kinds (e.g. musical data, network data). Sonnet was originally designed as a sonification tool for monitoring and debugging programs. It is currently being re-implemented to add additional music capabilities. Sonnet has several features of particular interest for music applications: 1) rapid response for real-time event handling; 2) fast, interactive prototyping (no lengthy compile-link phase required); 3) inherent suitability for distributed applications; 4) the ability to load external modules (e.g. a C++ music engine, VBX or OCX controls) when special features or extra performance are needed.

Nuance (Wright, Oppenheim) is a flexible content representation designed to support the creation, performance, manipulation and analysis of music and other time-based media. Nuance is an open-ended representation which supports the evolution of families of dialects specialized for different needs and domains. It uses simple building blocks and inspectable, role-based specifications to promote clarity, generality and extensibility. Nuance supports explicit modeling of expressive elements, and extends the common "note event" model to include what happens inside a note, between successive notes, and how these are influenced by the overall musical context. Nuance attributes can be either declarative or procedural (generative). Finally, Nuance employs an efficient, language-neutral binary format suitable for persistent, distributed and real-time streams of content, as well as an optional ASCII format.
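The declarative-versus-procedural distinction can be illustrated with a small sketch; the class and attribute names below are our own assumptions and not Nuance syntax.

    # Rough sketch of a note whose attributes may be fixed values (declarative)
    # or functions of musical context (procedural); names are illustrative
    # assumptions, not the Nuance representation itself.

    class NuanceNote:
        def __init__(self, **attributes):
            self.attributes = attributes

        def value(self, name, context=None):
            attr = self.attributes[name]
            # procedural attributes are callables evaluated against the context
            return attr(context) if callable(attr) else attr

    note = NuanceNote(
        pitch=62,                                       # declarative
        loudness=lambda ctx: ctx["phrase_peak"] * 0.8,  # procedural (generative)
    )

    print(note.value("pitch"))
    print(note.value("loudness", {"phrase_peak": 100}))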
Components of Musical Expression is a research project conducted by Peter Desain (visiting scientist at Watson) and his associate Henkjan Honing. The current work focuses on the intended and meaningful deviations in timing and dynamics in music performance, the aspects that make one performance so much more interesting than another. Desain and Honing are currently engaged in research that yields computational models of various kinds of musical expression, both for performance and for perception. Early results of this research were applied in their Expresso system, which supports musically meaningful editing operations.

LeNNY (Oppenheim, Wright) is a research framework for specifying and editing expressive nuance. LeNNY uses expressive attributes to modify or extend traditional score-type parameters. Expression attributes and contours may be associated with different levels of a compositional or performance hierarchy. Performer objects combine score-type and expressive attributes in order to render the final musical result. An expression editor will support the graphical representation and manipulation of hierarchical expressive attributes and contours.

Dmix (Oppenheim) is a research-oriented environment for music composition and performance written in Smalltalk. It was originally developed at CCRMA (Stanford University) and has been in active use for several years. Dmix provides a rich set of objects for handling the many aspects of musical composition, expression and performance, including graphic editors, modifiers, algorithmic input facilities, and real-time editing and interaction components.

A major design goal of Dmix is to allow composers to develop and express their ideas in a natural, uninterrupted fashion, quickly switching between various compositional tools and approaches without breaking their creative flow. Dmix employs the "Slappability" user interface concept to make this complex set of facilities easy to use and interconnect in many different ways. With Slappability, tools can be created by extracting and transforming musical attributes, and users can enhance graphic editors without programming knowledge. A port to ParcPlace VisualWorks 2.5 is largely complete, and will make Dmix available on Mac OS, Windows 95 and possibly other platforms.
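The attribute-extraction idea behind Slappability can be hinted at with a brief sketch; the function names and the velocity-contour example are illustrative assumptions on our part, and Dmix itself is written in Smalltalk rather than Python.

    # Illustrative sketch: build a reusable "tool" by extracting one attribute
    # (here a velocity contour) from one phrase and applying it to another.
    # Names are assumptions; this is not Dmix code.

    def extract_velocity_contour(phrase):
        """phrase: list of (pitch, velocity) pairs -> reusable modifier."""
        contour = [vel for _, vel in phrase]

        def apply_contour(target):
            return [(pitch, contour[i % len(contour)])
                    for i, (pitch, _) in enumerate(target)]
        return apply_contour

    shaper = extract_velocity_contour([(60, 30), (62, 70), (64, 110)])
    print(shaper([(72, 64), (71, 64), (69, 64), (67, 64)]))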