An object-oriented approach to the combination of rule-based and generative algorithmic composition methods for real-time interactive applications

Dr Tony Myatt
Department of Music, University of York, UK

Abstract

This paper describes the production of a system to generate musical compositions in real-time from the motion of kinetic sculptures with chaotic mechanisms. Video tracking methods produce data related to the motion of components within the sculpture. This data is applied to algorithmic composition software running a script that incorporates rule-based mechanisms and iterative algorithms. The combination of these two approaches enables the resulting compositions to interpret, reflect and counterpoint the motions of the sculpture according to the motions of its components. Rule-based systems control how movement-related information is mapped to the parameters which control MIDI events and the formal organisation of the musical output; rules that determine instrumentation are also encoded. Iterative processes respond to the relative movement of objects to generate control parameters for the sound sources. These two methods are implemented to produce a hierarchy: a high-level rule-based system controls the parameters of "instrument-objects", and "instrument-objects" respond to controlling parameters according to pre-defined behaviours and characteristics. This application of algorithmic composition techniques combines descriptive and generative methods, and presents the resulting compositions in a performing environment appropriate to the numerous outcomes of algorithmically generated music.

1 Introduction

My interest in the use of chaotic systems as a source for musical material began with mapping algorithms to primary musical elements such as pitch, loudness and duration.
Following experimentation and the production of several compositions using these methods, I extended my investigations to include naturally occurring chaotic systems in addition to algorithmic models of chaotic behaviour. Computer-based audio tools (envfoll [CDP:1995]) enabled the extraction of data from physical chaotic systems. A typical example of this approach was the application of a gating program to an audio recording of a dripping water tap, adjusted to produce a chaotic flow. Using a gating/amplitude-detection program, timing data was extracted and used as rhythmic information. Importantly, the use of this type of chaotic source allowed some familiarity with the behavioural patterns of the chaotic system, which I found beneficial in determining musical mappings of this data.

The following text outlines further developments of this approach, through the design of a system which enables compositions to be produced in real-time in response to the movement of a kinetic sculpture. The system described also has applications for gestural control of sound synthesis, interaction with algorithmic processes and performance environments controlled by any visible movement.

1.1 Mobiles - Kinetic Art and Music

The concept of the project, Mobiles, is to link the movement of a sculpture to the generation of musical material: to produce soundtracks which directly correspond to, reflect, highlight or counterpoint its motion. The sculptures, designed and constructed by Peter Fluck, are based upon the mechanisms of chaotic pendula and produce a complex and intricate movement of shapes, colours, lines and forms. This paper describes the initial work in the Mobiles series, which uses MIDI synthesisers as sound sources. Future pieces will include algorithmically controlled synthesis, computer animation and responsive sculptures.

1.2 Continuous Music

The final works are intended as pieces which can perform in art galleries or concert halls.
Art galleries can provide a fruitful performance site for algorithmic compositions of this type outside of the context of concert-hall traditions. Audiences are mobile and can choose how much of the piece they wish to hear. Returning to the installation provides a different aural and visual experience on each visit as the musical and physical mechanisms evolve over an extended time period.

2 Hardware and Software Systems

To enable the composition algorithms to respond to the sculpture's movement, information about its motion is passed to the algorithmic composition software. With these sculptures it is not possible to have any physical or electrical connection without unduly loading the mechanism, so the most appropriate detection system is light-based.

Video methods were adopted for early experiments in appropriate musical interpretations of the sculpture's chaotic behaviour. These experiments used a simple brightness-detection method to track a single object. This was implemented using two micro-computers: one derived positional information from a video image, and one processed that information and ran the algorithmic composition software. This work indicated that the movement of the sculpture was capable of generating interesting musical material, and that to give accurate information for real-time work any video tracking software used for the final works would be required to follow multiple objects at a high frame rate.

The computing platform chosen to implement both the video tracking and algorithmic software was the Silicon Graphics Indy running under the IRIX 5.3 operating system. This provided sufficiently fast processing to both track a video image and perform the algorithmic composition tasks on one machine. In addition, it features video inputs in a variety of formats as part of its standard configuration. Video tracking software was written to enable eight objects to be tracked at twenty-five frames per second, based upon colour differences between different elements of the sculpture. The software generates two-dimensional coordinates for each object every twenty-fifth of a second. The tracked object position data is passed to Tabula Vigilans [Orton: 1994a], a script-based algorithmic composition program.

3 Algorithmic Methods

Tabula Vigilans uses a 'C'-like script language to define algorithms and processes [Orton: 1994b]. Each main performance script used to produce compositions from the sculpture contains organisational information, organisational mechanisms and a series of "instrument-objects".

3.1 Instrument-objects

In Mobiles I, information about the movement of the sculpture is used in several different ways:

a. to trigger specific musical events corresponding to the interaction of the sculpture's elements;
b. to iteratively evolve primary musical elements (e.g. pitch, loudness);
c. to determine the formation of musical material from pre-defined sets (e.g. harmonic fields, a range of duration values).

If there is no movement, no positional information is processed; when the sculpture stops, the sound halts (falls silent or sustains, depending upon which "instrument-object" is performing). The faster the movement of each sculptural element, the quicker the resulting musical lines (up to twenty-five events per second).

In this implementation an "instrument-object" is defined as a function that produces a sounding event in response to the movement data from the tracking program. It contains an algorithm which defines how this is achieved, information relating to the type of timbre it produces and the type of movement information it requires.
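In outline, the colour-difference tracking described in section 2 amounts to a per-frame centroid search: for each reference colour, find the pixels close to it and average their coordinates. The following is a minimal sketch of that idea, assuming frames arrive as two-dimensional grids of (r, g, b) triples; the function names, squared-distance measure and tolerance value are illustrative assumptions, not the actual SGI Indy tracker.

```python
# Sketch of per-frame colour tracking: each sculptural element is
# assumed to carry a distinctive reference colour, and its position is
# taken as the centroid of the pixels near that colour.

def colour_distance(a, b):
    """Squared RGB distance between two (r, g, b) triples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def track_objects(frame, targets, tolerance=900):
    """frame: 2-D grid (rows of (r, g, b) pixels); targets: reference
    colours, one per tracked object.  Returns one (x, y) centroid per
    target, or None if that colour is not visible in this frame."""
    positions = []
    for target in targets:
        xs, ys, n = 0, 0, 0
        for y, row in enumerate(frame):
            for x, pixel in enumerate(row):
                if colour_distance(pixel, target) <= tolerance:
                    xs += x; ys += y; n += 1
        positions.append((xs / n, ys / n) if n else None)
    return positions
```

Run once per frame (twenty-five times per second in the system described), this yields the stream of two-dimensional coordinates that the composition script consumes.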
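An "instrument-object" of this kind can be sketched as a small stateful object that turns successive tracked positions into note parameters. This is a minimal illustration only, assuming per-frame (x, y) positions as input; the class name, the MIDI-style pitch and velocity ranges, and the starting values are assumptions, not the Tabula Vigilans script.

```python
# Sketch of one "instrument-object": it compares each new position
# with the previous one and maps the displacement to a note event.
# Mappings follow two examples from the text: vertical displacement
# chooses the interval to the next pitch, and movement to the left
# increases loudness.  If the element has not moved, nothing sounds.

class InstrumentObject:
    def __init__(self, start_pitch=60, start_velocity=64):
        self.pitch = start_pitch
        self.velocity = start_velocity
        self.last = None          # previous (x, y) position

    def respond(self, position):
        """Return a (pitch, velocity) event, or None for no movement."""
        if self.last is None or position == self.last:
            self.last = position
            return None           # sculpture at rest: sound halts
        dx = position[0] - self.last[0]
        dy = position[1] - self.last[1]
        self.last = position
        # vertical displacement sets the interval to the next pitch
        self.pitch = max(0, min(127, self.pitch + round(dy)))
        # leftward movement (dx < 0) increases loudness
        self.velocity = max(1, min(127, self.velocity - round(dx)))
        return (self.pitch, self.velocity)
```

Because the object is only called when a new frame arrives, the event rate tracks the frame rate, giving the "up to twenty-five events per second" behaviour described above.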
Each "instrument-object" has a series of parameters which describe its behaviour: the type of input information it requires; the type of response it will have to this input; the timbre; and the pitch register in which it is most likely to perform. Information about the sculpture's behaviour, or the energy of the overall mechanism, can be obtained by tracking a relatively small number of sculptural elements. Additionally, the relative motion of these elements can also be used to drive instrument algorithms. Simple examples of the behaviour of "instrument-objects" for this implementation are: to determine the interval between the current and next pitch according to the vertical displacement of an object; to determine the loudness of a note according to the horizontal displacement of an object (movement to the left increases loudness, to the right decreases it); to continuously vary the rate and loudness of the resulting musical material according to the relative position of two objects (fast, loud music is played when objects are close together, quiet, slow music when they are far apart); and to trigger a musical event when two objects pass over each other.

3.2 Organisation

The compositions are organised using statistically derived formal schemes and rules relating the parallel and serial combination of the "instrument-objects". A program was written to produce formal schemes based upon a user input of the required performance duration, the number of instrument combination methods available, the minimum time required between instrument changes and the linear density of section events. These parameters are used to generate a varied range of section lengths; to determine the number of section lengths required for the specified performance duration; to distribute the section boundaries according to the chosen linear density; and to allocate the instrument combination methods. Combination rules were then written to define appropriate material for each section.
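In outline, the formal-scheme generator can be sketched as follows. The paper does not specify the statistical distributions it uses, so the uniform random draws, the function name and the parameter handling below are assumptions; only the four inputs (performance duration, number of combination methods, minimum section time, linear density) come from the text.

```python
# Sketch of a formal-scheme generator: from a performance duration, a
# count of instrument-combination methods, a minimum section length
# and a linear density of section boundaries, produce a list of
# (start_time, method_index) sections of varied length.

import random

def formal_scheme(duration, n_methods, min_section, density, seed=None):
    """density: average number of section boundaries per unit time."""
    rng = random.Random(seed)
    n_sections = max(1, round(duration * density))
    spare = duration - min_section * n_sections
    if spare < 0:
        raise ValueError("duration too short for this density")
    # draw varied weights, then share the spare time proportionally so
    # every section is at least min_section long
    weights = [rng.uniform(1.0, 3.0) for _ in range(n_sections)]
    total = sum(weights)
    lengths = [min_section + spare * w / total for w in weights]
    # allocate a combination method to each section
    scheme, t = [], 0.0
    for length in lengths:
        scheme.append((t, rng.randrange(n_methods)))
        t += length
    return scheme
```

A scheme produced this way can then be filtered by the combination rules, which match the attributes of each section against those of the available "instrument-objects".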
These combination rules are high-level descriptions of the characteristics which the "instrument-objects" are capable of producing. In addition to the parameters of each "instrument-object" defining their input requirements and behaviour, each instrument can realise a particular musical function: some are foreground or solo instruments; some generate chords primarily intended to function as background material or accompaniments; some provide percussive interjections, etc. Combining these with the "instrument-object" parameters allows the production of rules which define the type of musical material that can occur within each section of the formal scheme. Rules are also used to determine the choice of instruments in adjacent sections, which, for example, enables certain timbres to remain constant through a number of formal sections or can allow (or inhibit) movement between pitch registers. The combination rules are realised by matching the attributes of each section with those of the "instrument-objects". This allows the use of simple timing routines to trigger the selected instruments during performance.

4 Conclusion and future work

The design and use of the composition system described above has suggested a number of further explorations. There are many possibilities for alternative musical interpretations of kinetic art, for the application of video tracking as a human-computer interface and for the control of algorithmic instruments, for performance applications and multiple-media work. The use of high-level descriptive rules for defining musical compositions is a goal I have had for some time, and I am encouraged by the results from this project to pursue more comprehensive realisations.

Acknowledgments

I would like to thank Dr. Rob Fletcher of the University of York Computing Service for his programming support.

References

[Endrich et al: 1994] Archer Endrich et al., CDP 'Groucho' Sound Processing Programs, Composers Desktop Project, York, UK.
[Orton: 1994a] Richard Orton, Tabula Vigilans, Proceedings of the Symposium on Computer Animation and Computer Music, Canberra, Australia: Australian Centre for the Arts and Technology.

[Orton: 1994b] Richard Orton, Tabula Vigilans Manual Version 1.0, Composers Desktop Project, York, UK.

ICMC Proceedings 1996