QuadPan: a system for live spatialization of music.

Isidoro Pérez, Javier Arias, Pablo Fernández, Luciano Pérez.
Laboratorio de Informática y Electrónica Musical
Centro para la Difusión de la Música Contemporánea
Santa Isabel 52, 5º. Madrid 28012. Spain.

QuadPan makes it possible to spatialize four live sound sources in an easy and intuitive way. It is suitable for any composer, even one without experience in electronic music, and it runs on minimal, low-cost, widely available hardware and software. Using MIDI control, the system lets you edit spatial movements and synchronize the sound of the sources (acoustic or electronic instruments, recorded or live) with the programmed movements. An intuitive graphical editor called QuadHC has been developed; it works in real time. You can define trajectories (with their associated durations) for four independent sources. The program can save, edit, print and export sequences of movements. In addition, QuadHC generates MIDI files with the spatialization control data, to be used live by means of any MIDI sequencer. The sequencer, driven by the movement data, controls both the direct and the reverberated sound levels of the quadraphonic system, projecting the virtual position of the sound. Synchronization between the musical instruments and the spatial movements is achieved with a MIDI clock generator. The generator was designed specifically for our system, but it has other applications thanks to its instantaneous control of tempo, free of the abnormal phase drift of other generators. It looks and plays like a musical instrument: it is based on a medieval instrument from the northwest of Spain, the zanfona, and it was built by a luthier specialized in such instruments.

1. Introduction.
This project is the creation of a system for spatializing sound coming from four independent sources in a two-dimensional space that surrounds the audience through 360°. The four sources can be played live or can be prerecorded sounds. Similar projects are being developed around the world, but our proposal offers solutions that we believe are ingenious and entirely new, above all for the user-composer and for the performers. The project has two parts:

a) Software: QuadHC. Amplification systems can produce the sensation of space and movement; this is the process used in stereo recording. We had to create a computer representation and a graphic language that could describe this space in computer terms and then transform it into reverberation and volume data conveying the sensation of localization and movement. We named this editor program "QuadHC". In it, the composer draws each movement and assigns it a duration in conventional musical notation, with a maximum polyphony of four spatialized voices. The first idea on which all the design decisions for the editor were based was ease of use: any composer should be able to use it without any computer knowledge, relying only on common sense and conventional musical writing. The second was versatility: it should be flexible and adaptable to any kind of aesthetic. The third was that the whole setup should be portable and cheap, built from what was already on the market (commercial programs, MIDI, etc.). The fourth was that it could be used as a workstation, with the recording, saving and other capabilities of any editor, and that it could also be used in real time, live. In general, we can say that all these goals have been met.

b) Hardware: Zanfona MIDI Clocks. The second part of the project has a lot to do with the concert itself.
The creation of a system that could be used in live concerts and include acoustic instruments was also a priority of the project. An essential difficulty had to be solved: synchronicity between the instruments and the computer. A musical piece has a different duration every time it is performed, whereas in a computer, as on a tape recorder, it always lasts the same. We had to produce a flexible sequencer and make it behave like another instrument, following the same tempi as the others. It had to be an external
ICMC Proceedings 1996 465 Pérez et al.

object that would make the computer its slave. There were different projects and various ideas; even though the hardware part seemed clear, the form of the instrument was not defined. After several designs, the idea of the zanfona came to us. It was a very good solution: there has always been a strong tradition of building these instruments in the northwest of Spain, and it is already "an instrument", so it does not have to be invented. It is the soul of the project, because it governs the computer and thereby humanizes the rate at which data are emitted. The zanfonist joins the other musicians, follows the indications of the conductor and of the score, and gives the tempo pattern to the computer. This system opens new fields for synchronizing, through a computer, the different media that accompany the music (images, sound, lights, etc.) with a live performance.

2. Description of the spatialization system.

Our system needs only minimal, low-cost hardware for full spatialization, and learning and configuration times have been kept short. The spatialization and sound reinforcement system consists of the following elements:
- 4 loudspeakers with independent amplification.
- 2 mixers with full MIDI control, stereo output, 8 stereo inputs, and 1 auxiliary bus with pre-fader send.
- 2 stereo reverberation units.
- An Apple Macintosh computer with System 6.0.7 or later, including the MIDI Manager driver, and a 13-inch or larger color monitor.
- The QuadHC spatialization editor and a MIDI sequencer program.
- A standard MIDI interface.
- 4 microphones or independent lines, one for each source or instrument.
- A MIDI clock generator: the Zanfona MIDI Clocks.

In this quadraphonic system we can manually control the main fader levels as well as the levels of the sends and returns to the reverberation units.
In this situation, full spatialization of any source can be achieved manually. The computerized system attached to the quadraphonic setup automates it, and allows perfect synchronization between the movements and the sound of the live sources.

3. The QuadHC editor.

All the information handled during editing is shown in a single main window, which can be divided into the following parts:
a. A large central square filling about three quarters of the window. It represents the plane where the audience and the loudspeakers are located; the sound movement trajectories are drawn over it.
b. A set of 7 buttons representing the musical figures, used to express the real duration associated with a drawn trajectory. Tuplets and dots are available too.
c. The trajectories, their durations, and other data relative to each movement, stored in one list per instrument.
d. A menu dedicated to file management, with new, open, close, save and export options.
e. Other menus and fields for adjusting extra or global parameters (MIDI on/off, edit/live mode, etc.).
f. Buttons that help with graphical editing and list management (OK, delete, insert, erase, etc.).

The user can easily reach all of these elements with a light pen or mouse.

4. Internal operation.

The ideas above have been implemented in our software package QuadHC, developed in the HyperCard visual programming environment. Extra modules and extensions dedicated to MIDI

control and file management were developed in the C and Pascal programming languages and compiled together. QuadHC offers two operation modes: real-time editing of movements (editing mode) and manual control of the spatialization (live mode).

5. Editing mode.

The main part of the screen is reserved for a rectangular area representing the physical space where the movements occur. This area can be configured so that its internal layout matches the real space. In it, movement trajectories are drawn as continuous lines. The user can listen to the spatialized sound in real time while drawing a trajectory: as the music plays, he drags the light pen or mouse at the appropriate speed to obtain perfect synchronization with the music, and the movements are stored in a dedicated list. For each new point drawn on the plane, a new position is sent to the sound system in real time by sending MIDI messages to the mixers. Depending on the spatial location (x, y coordinates) and the internal space settings, the MIDI messages control the faders and auxiliary sends of the mixers according to the expression:

Gn(θ, D) = (Dn / D) · cos(θ - θn),   if |θ - θn| < 90°

where Gn is the gain factor of channel n for a virtual sound source located at distance D and angle θ from the listener, and θn and Dn are the angle and distance between loudspeaker n and the listener. We have implemented several variants of this main formula. Why, if it is the best physical approximation to the spatialization phenomenon? Because we try to obtain the most intuitive behavior with respect to the graphical view of the trajectory and its natural meaning for a musician without experience with our electronic system.
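As an illustrative sketch of the panning law above (our own Python, not QuadHC code, which was written in HyperCard, C and Pascal; all names here are ours):

```python
import math

def speaker_gain(theta, dist, theta_n, dist_n):
    """Gain G_n = (D_n / D) * cos(theta - theta_n) for loudspeaker n,
    applied only while the source lies within 90 degrees of the speaker.
    Angles in degrees; distances in any consistent unit."""
    # Normalize the angular difference to the range [-180, 180).
    delta = (theta - theta_n + 180.0) % 360.0 - 180.0
    if abs(delta) >= 90.0 or dist <= 0.0:
        return 0.0
    return (dist_n / dist) * math.cos(math.radians(delta))

# Four speakers at the corners of a square around the listener.
speakers = [(45.0, 1.0), (135.0, 1.0), (-135.0, 1.0), (-45.0, 1.0)]

# A virtual source aimed straight at the first speaker: that channel
# gets full gain and the opposite channels fall silent.
gains = [speaker_gain(45.0, 1.0, a, d) for a, d in speakers]
```

Note that at most two adjacent channels are ever active, since any speaker more than 90° away from the source contributes nothing.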
In particular, to reduce the apparent velocity of trajectories that run along the sides of the square, we multiply by the factor:

factor = cos(angle)^n

We have also experimented with replacing the exponential decay of sound with distance by a 3-step programmable linear decay. This increases the homogeneity of movements along the diagonals of the square, but it alters the natural perception of distance in the drawing space. We think that for people with experience in spatialization systems, the basic formula given by R. Moore [1] is intuitive and satisfactory, but it is not ideal for the inexperienced composer, because it interferes with the work on musical dynamics, and the visual relationship between the size of a trajectory and distances in the real space is not immediate.

The program internally creates four lists (one per instrument). Each line of a list stores the data of one movement:
- the page number and the trajectory order number (in natural ascending order);
- the trajectory of the movement (stored as a list of x, y coordinates);
- the real start time and duration of the trajectory (expressed in ticks);
- the MIDI data density for the trajectory.

All these fields are computed in real time while the user works in editing mode. These data, plus some extra information (associated name, global parameters, etc.), are stored in a text file by the "save" command; the "open" command loads a stored file back into the program lists. The same data are used in the export process, when a MIDI file is generated. The generated MIDI file is of type 1 and contains the sequenced data, fully equivalent to the data sent in real time during editing (if the MIDI send option is active). The MIDI sequencer can divide durations into at most a limited number of fractions (ticks), not as many as we would like.
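The per-movement list line described above might be modeled as follows (a sketch with hypothetical field names of our own; the original stores these lines as HyperCard lists and plain text files):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Movement:
    """One line of a per-instrument movement list (field names are ours)."""
    page: int                      # page number
    order: int                     # trajectory order number, ascending
    points: List[Tuple[int, int]]  # trajectory as (x, y) coordinates
    start_tick: int                # real start time, in sequencer ticks
    duration_ticks: int            # trajectory duration, in ticks
    density: float = 1.0           # MIDI data density for this trajectory

# A short diagonal sweep lasting one quarter note, assuming a sequencer
# resolution of 480 ticks per quarter note.
m = Movement(page=1, order=1, points=[(0, 0), (50, 50), (100, 100)],
             start_tick=0, duration_ticks=480)
```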
Because the tick resolution is limited, the sequencing of the movement data is computed slowly when the file is exported. With complex trajectories (a large number of points) realized in short times, a direct division yields more than one spatial point per temporal tick. In that situation the MIDI sequencer either cannot resolve their temporal order, or the sheer amount of data overloads it. Both problems are solved in two ways. First, the data are decimated homogeneously, storing only one point per tick (done automatically by the program, which samples the trajectory at the appropriate frequency). Second, the user may apply an optional decimation at fixed ratios (for example, eliminating half of the points, and so on). This is controlled with the density parameter specified for each trajectory.
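The automatic decimation step could look like this sketch (ours, not the original implementation): the trajectory is resampled so that at most one point falls on each tick.

```python
def decimate(points, duration_ticks):
    """Keep at most one trajectory point per sequencer tick by sampling
    the point list evenly; the first and last points are preserved."""
    n = len(points)
    if n <= duration_ticks:
        return list(points)          # already sparse enough
    step = (n - 1) / max(duration_ticks - 1, 1)
    return [points[round(i * step)] for i in range(duration_ticks)]

# A 10-point trajectory squeezed into 5 ticks keeps 5 evenly spaced points.
dense = [(x, x) for x in range(10)]
sparse = decimate(dense, 5)
```

The user's optional fixed-ratio decimation (dropping, say, every other point) would then simply be a slice such as `points[::2]`.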

6. Control in live mode.

In this mode we can make natural, on-the-fly changes to the spatial positions. During the concert, independently of the programmed movements that the sequencer is performing, and under human control, we can instantly decide the position of an instrument. This tool is useful for recovering from a bad sequencer state: a correct position is sent for each instrument at that moment, minimizing the error. QuadHC requires the MIDI Manager driver installed on the computer, so it can work in an open MIDI environment, sharing MIDI data with other MIDI applications such as sequencers, MAX, etc. When MIDI sending is activated, the program takes a MIDI port and automatically configures the input and output connections with the driver (the Apple MIDI driver) and the other active clients. At that moment the mixers are set up, configuring channel pans and auxiliary sends. The response speed of the faders and the MIDI receive channel, however, must be configured beforehand, outside the program, with the appropriate applications.

7. Hardware tempo controller.

A computer-stored spatialization score (a MIDI file with control data for the mixers) must run in parallel with the playing of the live musicians. Two possibilities exist: either the computer or the musicians are the masters of the tempo. If the musicians have to play to a computer tempo track, much intimate playing is lost. It was obvious to us that it was better to let the players play freely and to slave the computer to them. Automatic methods of beat estimation, though a reality nowadays [2], always react with some delay to tempo changes and are not reliable enough (particularly in contemporary pieces, where the very concept of tempo may be nearly absent). This situation led us to design a hardware tempo controller, to be considered an instrument in its own right and to be played by an additional musician.
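Since standard MIDI defines 24 timing-clock messages (status byte 0xF8) per quarter note, a crank-driven controller can slave a sequencer simply by emitting one clock byte per sensed optical pulse. A minimal sketch of this mapping (ours; the real device implements it in hardware, as described below):

```python
MIDI_CLOCK = 0xF8        # MIDI real-time Timing Clock status byte
PULSES_PER_TURN = 24     # optical holes per crank turn = one beat,
                         # matching MIDI's 24 clocks per quarter note

def pulses_to_midi(num_pulses):
    """Raw MIDI byte stream for a run of optical pulses: one clock each."""
    return bytes([MIDI_CLOCK] * num_pulses)

def beats_completed(num_pulses):
    """Whole beats (full crank turns) covered by the pulse count."""
    return num_pulses // PULSES_PER_TURN

stream = pulses_to_midi(3)   # three pulses -> three clock bytes
```

Turning the crank faster or slower changes the pulse rate, and therefore the slaved sequencer's tempo, instantaneously and without accumulated phase error.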
Since the work was sponsored by the Diputación de Lugo, it was a natural and very welcome proposal to build a traditional Galician zanfona. The zanfona is an old (medieval) string instrument with a so-called "manubrio", a handle used to rotate an internal wooden wheel. The wheel rubs against a group of strings, producing sound; the pitch is controlled by a keyboard that stops the strings at different points. In building it we simplified several aspects, considering the two main fields of performance separately: ergonomics and visual aesthetics. Ergonomics helps fit the instrument to the needs of its performer, whose posture defines the final arrangement of the various elements so as to offer the best playing possibilities. Regarding visual aesthetics, the main aim was to avoid any unnecessary bulk and achieve a dynamic instrument with a certain stage presence. The rotating movement of the manubrio is perfectly suited to tempo control. Optoelectronics are used to sense the rotation: 24 equally spaced holes produce MIDI clock messages that keep the computer-stored spatialization commands in perfect synchrony with the live musicians. Each turn of the manubrio equals one beat. All the zanfonist has to do is play along with his colleagues, according to the shared feeling of tempo. As a guide to the player, a pair of LEDs helps visualize the bar and/or beat subdivisions; different LED flashing rates can be selected with micro switches. Apart from the optocoupler, the micro switches, the LEDs and the MIDI connector, the whole system fits in a single low-cost standard PLD (programmable logic device). An EP910 (24 macrocells, 40 pins) was chosen instead of the more traditional (but more costly) design of a microcontroller combined with a UART.

8. Acknowledgements.

A. Núñez, director of the LIEM-CDMC. C.
Coster, software development and ideas. C. Arias, J. M. Bal, T. Rodríguez. The Diputación de Lugo. The Instituto de Estética de la UAM.

9. References.

[0] J. Chowning, "The simulation of moving sound sources", JAES, 1971.
[1] F. R. Moore, Elements of Computer Music, Prentice Hall.
[2] Various articles on beat detection and foot-tapping, ICMC'94 Proceedings, Aarhus, Denmark, 1994.