A Meta-Trumpet(er)

Jonathan Impett
50 Haliburton Road, Twickenham TW1 1PF, England
Centro di Sonologia Computazionale, Via San Francesco 11, I-35100 Padova

ICMC Proceedings 1994, pp. 147-150

Abstract

This paper describes an instrument / composition environment built around a single instrument - the trumpet - and a particular musician - the writer. An outline of the principles guiding the development of the instrument is followed by a technical description of the system.

1 Introduction

"...it seems that a new type of musician is necessary, an 'artist-conceptor' of new abstract and free forms, tending towards complexities, and then towards generalisations, on several levels of sound organisation." [Xenakis 1985]

The "meta-trumpet" system consists of an instrument fitted with a range of physical sensors (their output converted to MIDI by a STEIM Sensorlab), sound-to-MIDI conversion, and software which processes the incoming data and provides a composition and scheduling environment in which the composer can determine the system's response. In general, the output is directed to MIDI-controlled instruments and processors. Aspects of performance already inherent in playing the trumpet are abstracted and extended to become: a) musical material for compositional purposes, and b) a means of direct control over other parameters. The information generated could be seen as depicting a broader performance situation, of which the sound of the trumpet is one view.

This is thus a "meta-" instrument in that it consists not only of the trumpet itself, but also of the physical actions performed to and with it, the acoustic output, and the musical / logical behaviour of the whole. The sound of the instrument is integrated into the system, being directed to the sound-processing elements, which in turn are controlled by the software, whose input comprises the physical parameters of performance and of the sound itself.
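As a minimal illustration of this control path - a raw sensor value arriving as a 7-bit MIDI controller reading, mapped by the software onto the range of some output parameter - the following C sketch may help. The function name and parameter ranges are assumptions for illustration, not the actual meta-trumpet code.

```c
#include <assert.h>

/* Linearly rescale a 0-127 MIDI controller value into the range [lo, hi].
   Out-of-range input is clamped, as a real-time system must tolerate
   malformed or noisy sensor data. */
double map_control(int value, double lo, double hi)
{
    if (value < 0)   value = 0;
    if (value > 127) value = 127;
    return lo + (hi - lo) * (value / 127.0);
}
```

Any of the continuous sensor outputs described below (valve position, inclination, volume) could pass through such a mapping before being routed onward; the real system layers filtering, analysis and compositional logic on top of this basic step.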
Different views in any number of dimensions can be constructed through the resulting continuous information / sound space, including the possibilities of folding parameters onto or inside each other, and the use of temporal processes.

2 Background and Principles

A central aim in designing the instrument was to permit a close and dynamic relationship between performer, instrument and improvised or composed musical material, whether a computer-stored "score" or constructed in real time from performance data. As in most successful instruments, the roles of sound source and controller can be fully integrated - on a wider surface than the trumpet alone - by effectively incorporating the computer (and thus potentially the composition) in the instrument. The player can maintain a directness and spontaneity of communication whilst retaining "higher" levels of abstraction and formalisation of both structural and sound material. This can be seen as part of a move to re-empower not only the performer and the performing environment, but by extension also the listener. The intention was to build musical presence into the structure of production, by contrast with the pre-accreditation which we tend to demand (risking by not risking). One could then construct situations - compositions - the performer's knowledge of which could develop infinitely, and in a way organic to his own musical personality, without there being any question of "mastering" the work. This is what happens ideally in any musical situation, of course, but our tendency to fix and counterfeit has become so prevalent that it seems useful to design such habits out of a system of musical production.

As far as possible, this is implemented without compromising the richness of the instrument and its technique, or adding extraneous techniques for the performer - most of the actions already form part of conventional performance. In keeping with this idea, it proved possible for the trumpet at the heart of the system to remain inviolate, thanks to the delicate and inventive work of the designer and builder of the electronics, Bert Bongers of Den Haag.

The system described here was developed concurrently with the piece "Mirror-Rite" [Impett 1994]. All of the output, including sound processing, is calculated in real time from performance data. The output devices - sampling machine, synthesisers, processors and digital mixing - are all controlled by the system via MIDI. The instrument is also currently being used in the integrated environment of the IRCAM Signal Processing Workstation [Impett 1993].

3 Performance Parameters

The position of the trumpet is read constantly within a 2 m², two-dimensional screen (a third dimension is to be added) by ultrasound receivers below and to the side of the performer. A cluster of transmitters is attached to the bell of the trumpet, so as to be almost omnidirectional.
Combined with timing information, this data produces a vector describing the direction and speed of movement. A choreography of virtual instruments and processes can be constructed within the resultant space.

Information about the player's physical contact with the trumpet is provided by pressure sensors and mercury switches. Two pressure sensors are mounted on the right of the third valve casing, where ordinarily lie the only two fingers not engaged in moving valves or slides, or in supporting the trumpet. Below the centre of the bell there are two mercury switches, together providing a 4-stage value for the left-to-right inclination of the instrument.

The three valves each contain a shielded magnet inserted into the cavity at the bottom of the piston. Their fields are read by corresponding sensors fitted in specially built extended lower valve caps. The valve positions can be used by the player as controllers, or interpreted as parameters abstracted from the act of performance, and provide a means of cross-checking pitch conversion.

Breath pressure seemed instinctively an essential parameter to use: physically and musically the most direct, and simple to measure in technical terms. In fact what had appeared intuitively quite clear, and is without doubt central to the instrument, proved more difficult to quantify. Inserting a closed tube or balloon (the usual ways of measuring breath pressure) is obviously out of the question. Both within the instrument and inside the player's mouth, the pressure - quite normal in the first case and very high in the latter - varies very little with volume, but changes in a more complex relationship with tessitura. In any case it was impossible to measure breath pressure or speed without compromising either playing technique or the acoustic integrity of the instrument. Ultimately it was concluded that the best representation of this dimly-defined intuitive parameter is simply volume, but measured on different scales according to whether a note is being played. If no MIDI note-on is currently registered, volume is read on a breath-noise scale.

This leads to a thorny issue in the MIDI representation of non-keyboard instruments: the derivation of a single value to denote the intensity or volume of a note, which, however compromised or approximate, is often useful for compositional or analytical purposes. Velocity is precisely that - rate of attack - and has only a tangential relationship to what is required. Instead, a maximum volume value is taken within a variable time window from the note onset. Pitch, pitch-bend and volume are derived by an IVL Pitchrider. Experiments with various more or less sophisticated FFT devices have proved no more satisfactory once tuned down to the robustness required for the unpredictability of live performance.

An interesting aspect of the physical additions to the trumpet (heavier valves and valve caps, a more inert bell, substantial mass added to the final bow) is that they closely resemble modifications arrived at for quite other reasons by builders concerned with the acoustic development of the trumpet. Perhaps a form of evolutionary resonance!

The outputs of all the analogue sensors are converted to MIDI by a STEIM Sensorlab "real world to MIDI interface" (Studio STEIM, Amsterdam). In addition there are six switches mounted next to the valves along the bell, and to the left of the first valve, for changing the function or mapping of other controllers, or for triggering events directly.

4 Software

Performance data is processed on two levels - system and composition - behind which run a variable number of independent schedulers. Each scheduler can be switched in and out separately from the composition level, or have its contents transformed in some way without affecting other generated material.
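A minimal sketch of such a scheduler layer - in C, the language the composition level itself is written in - might look like the following. The structure, field names and time-scaling behaviour are illustrative assumptions rather than the actual system code.

```c
#include <assert.h>
#include <stddef.h>

#define MAX_EVENTS 64

typedef struct {
    double due_ms;          /* when the event is due to fire */
    int    data;            /* e.g. an index into pending MIDI output */
} Event;

typedef struct {
    Event  queue[MAX_EVENTS];
    size_t count;
    int    active;          /* switched in (1) or out (0) */
    double time_scale;      /* 1.0 = as scheduled; 2.0 = twice as slow */
} Scheduler;

/* Dispatch every event due by now_ms, keeping the rest; returns the
   number fired. An inactive scheduler fires nothing, and scaling one
   scheduler's time leaves all others untouched. */
size_t scheduler_run(Scheduler *s, double now_ms)
{
    size_t fired = 0, kept = 0;
    if (!s->active) return 0;
    for (size_t i = 0; i < s->count; i++) {
        if (s->queue[i].due_ms * s->time_scale <= now_ms)
            fired++;                    /* the event would be emitted here */
        else
            s->queue[kept++] = s->queue[i];
    }
    s->count = kept;
    return fired;
}
```

Because each stream owns its queue and its time scale, a composition can stretch, mute or resume one strand of material independently of the others, in the spirit described above.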
Timing information could, for example, be expanded or contracted. This offers the composer a means to group compositional processes or material into streams of any breadth, whilst retaining their autonomy of behaviour and keeping track of their origin, as well as to interrupt or resume separate strands of development.

At the first level, performance data is filtered, mapped and stored according to the system configuration given in an initialising function which each composition contains. By configuring the data buffers and flagging certain types of analysis and filtering in the initialisation, the performance of the system can be optimised for each composition. This may, for example, take account of how far back a composition will refer, and in which parameters. Various continuous parameters are also calculated at this stage, including time-based values such as movement vectors and note density, and other statistical information. The results are posted as system variables, and filtered events are passed on to the appropriate compositional function.

These functions are referred to as "scenes", and are organised by MIDI input channel, for each of which any number of scenes can be given. Scene changes are independent for each input channel, and either can be scheduled initially, or a scene can contain the logic by which a change will occur. All events - system, compositional or MIDI output - are scheduled in the same way.

The processing and composition within a scene can make use of the structures provided, which allow many types of processing of different events using a simple language and syntax. The system incorporates a range of tools for accessing performance parameters and buffers, basic processing (delays, transposition), MIDI events, scheduling, cueing and pattern recognition. By writing directly in 'C', the event processing and composition can be as complex as necessary. To establish continuity in the evolution of a piece, MIDI files can be written or read as and when necessary. This software environment was originally developed on an Atari ST. The current version runs on a Macintosh.

5 Summary

The meta-trumpet and its repertoire arose from ideas forming slowly and for personal reasons, the common direction of which seemed to be to find a continuity between the natures and actions of a musician, an instrument, a composition and the act and environment of its performance - in the case of the musician, between short- and long-scope thought and activity. Each of these elements has its own richness and conventions, and the system described here is one solution to the question of how to create a space in which it might be possible not only to draw but to work with such continuities.

References

[Impett 1993] Jonathan Impett. A Songline. Sonic Arts Network Journal, London, 1993.

[Impett 1994] Jonathan Impett. Ladder of Escape 7. Attacca Records, Amsterdam, 1994.

[Xenakis 1985] Iannis Xenakis. Arts/Sciences: Alloys. Pendragon Press, New York, 1985.