Musical Technical Issues in Using Interactive Instrument Technology with Application to the BioMuse

Atau Tanaka
CCRMA, Department of Music, Stanford University
ICMC Proceedings 1993

Abstract

While the use of interactive interface technology in the performance of real-time computer music raises some technical issues, the development of new musical instruments from such technology raises many more musical issues. This paper begins with a description of the BioMuse, a bioelectrical musical controller, and then addresses technical and musical issues relating to its use in composition and concert performance.

BioMuse Description

The BioMuse is a neural interface/biocontroller developed by BioControl Systems, Palo Alto, CA. It monitors electrical activity in the body (as EEG and EMG) and translates it into MIDI. The basic BioMuse system consists of sensors attached to the body and data processing hardware that transforms the bioelectrical data into MIDI [Knapp & Lusted 1990].

The sensors are medical-grade dry gel electrodes, three per band. By using three electrodes, a band derives a differential electrical signal across the area of the body that the band covers. Bands exist for taking muscle tension data (EMG) and for taking brain data (EEG). An EMG sensor is also used for eye tracking. The BioMuse has the capability to accept eight such signal inputs. In the current paper, discussion will focus on the use of just two arm bands, one on each of the lower arms.

The BioMuse takes an analog voltage from the bands as input to analog-to-digital converters (ADC). This digital voltage representation is analyzed by a Texas Instruments TMS320 series digital signal processor (DSP). It is here that an analysis algorithm can be loaded into the BioMuse. I refer to this stage as biodata preprocessing. Both MIDI and RS422 serial data are output as a result of the analysis.
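The preprocessing chain described above - raw bipolar EMG voltage in, smoothed 7-bit MIDI controller value out - can be sketched as follows. This is a minimal illustration in Python, not the actual TMS320 algorithm; the smoothing coefficient, `full_scale` range, and function names are my own assumptions.

```python
# Hypothetical sketch of the biodata preprocessing stage: rectify the raw
# EMG sample, smooth it with a one-pole low-pass filter, and scale the
# resulting envelope to a 7-bit MIDI controller value (0-127).
# All constants here are illustrative, not taken from the BioMuse firmware.

def make_emg_to_midi(smoothing=0.9, full_scale=1.0):
    """Return a per-sample processor: raw EMG sample -> MIDI controller value."""
    env = 0.0
    def process(sample):
        nonlocal env
        rectified = abs(sample)                              # EMG is bipolar; keep magnitude
        env = smoothing * env + (1 - smoothing) * rectified  # low-pass envelope
        return int(127 * min(env / full_scale, 1.0))         # clamp to 7 bits
    return process

process = make_emg_to_midi()
stream = [process(s) for s in (0.0, 0.5, -0.8, 0.9, 0.1)]  # envelope tracks tension
```

The single `smoothing` parameter embodies the jitter-versus-responsiveness tradeoff discussed later: closer to 1.0 gives a steadier value but a sluggish response to quick gestures.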
In the system described here, I have used the simplest mapping of muscle tension to MIDI: muscle tension from each arm maps directly to one MIDI controller value (7 bit). This MIDI data is then remapped as needed in the Max programming environment [Puckette & Zicarelli 1990]. The processed MIDI data then controls synthesis and signal processing parameters.

In the BioMuse

Much can be done in the BioMuse at the biodata preprocessing level. Control strategies can be implemented here that would replace the MIDI post-processing stage in Max. In fact, some of the first Max patches, like a violin simulation, were made to emulate and replace algorithms that had previously been resident in the BioMuse. By bringing these tasks out of the BioMuse and into the Max environment, the gestural mapping process is brought closer to the compositional level, so as to exist at the same conceptual level as the "score" of a piece.

More could be done to take advantage of the preprocessing computing power resident in the BioMuse itself. Direct analysis of the biosignal for gesture recognition is now being investigated. This kind of task could be accomplished at the Max level as well, either on the MIDI or on the RS422 version of the output data.

I have chosen to handle technical data tasks in the BioMuse and musical data tasks in Max. An example of a musical data task is the arming of triggers to play musical events when a muscle tension threshold is crossed. An example of a technical data task is artifact removal and smoothing of the biosignal. However, even the seemingly most technical tasks involve some musical decisions. Low-pass filters are implemented in the BioMuse to smooth out artifacts in the biosignal. If no filtering is used, there is too much jitter in the resulting data. If too much filtering is used, the apparent response time to a quick muscular gesture suffers noticeably. Finding the right balance is both a technical and a musical decision.
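The "arming of triggers" musical data task mentioned above might look like the following sketch. In Max this would be a small patch; it is expressed here in Python for compactness, and the two threshold values are hypothetical. The trigger fires once when tension crosses a high threshold and re-arms only after tension falls below a lower one, so residual jitter near the threshold cannot re-fire the event.

```python
# Sketch of an armed threshold trigger with hysteresis (illustrative
# thresholds; the paper does not specify the actual values used).

class TensionTrigger:
    def __init__(self, fire_at=100, rearm_at=60):
        self.fire_at = fire_at    # controller value at which the event fires
        self.rearm_at = rearm_at  # value below which the trigger re-arms
        self.armed = True

    def update(self, value):
        """Feed one controller value (0-127); return True if the event fires."""
        if self.armed and value >= self.fire_at:
            self.armed = False    # disarm until tension relaxes
            return True
        if not self.armed and value <= self.rearm_at:
            self.armed = True     # tension has relaxed; re-arm
        return False

trig = TensionTrigger()
fires = [trig.update(v) for v in (10, 105, 110, 90, 50, 120)]
```

Note how the sustained tension at 110 and the partial relaxation to 90 do not re-fire the event; only the drop to 50 re-arms it.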
The Multiplicity of One Dimension

A single arm band mapped to a MIDI continuous controller is essentially a one-dimensional controller. However, there exists a level of hidden complexity in the seemingly one-dimensional arm band. An arm band on the forearm monitors activity of the muscles that control movements of the wrist, hand, and fingers. Physiologically, it can be observed that muscles on a certain part of a limb generally control the actions of one or more members below it. A single arm band on the forearm is, then, a window through which we can view the activities of the wrist, hand, and fingers. To extract detailed information about these members from the forearm signal would require advanced gesture recognition techniques such as the use of neural networks [Lee & Wessel 1992]. In musical practice, it is seen that, even without the use of such analysis techniques, certain musical effects are better realized by gross muscle tensing (like making a fist) and others by more delicate finger actions (like stroking gestures). It is in this way that a single arm band is more interesting than the average one-dimensional controller.

If we consider the performance technique needed to play violin or piano, we find that the degree of control in each arm is multidimensional. Earlier approaches to new interfaces have sought to derive control data at the point where control is manifested [Hong 1991]. These interfaces are essentially fancy measuring devices on the hands and fingers. The approach proposed here derives control data for the same gestures not where control is manifested, but at the point where the biocontrol is generated. In doing this, I claim that we get a step closer to capturing human musical intent - the stated goal of this area of musical research [Lee & Wessel 1992].

This use of a single biosignal to reflect the actions of multiple members offers an elegant alternative to the use of datagloves and the like. These other controllers are for the most part mechanical and thus become devices that the user must learn how to manipulate. A biocontroller, on the other hand, is somehow more natural. It is more like learning how to walk, run, then jump, rather than learning how to drive an automatic and then a stick-shift car. Most new interfaces are also plagued with the problem of not providing tactile feedback to the performer. The BioMuse, as an "invisible" controller, would apparently share this problem.
However, as soon as one tries to maintain a constant muscle tension for a given time, it becomes clear that the human body itself offers plenty of feedback information, usually in the form of fatigue.

New Instruments

Alternative controllers based on existing musical instruments are another area of active investigation, and pose some interesting dilemmas [Cook 1992]. They rely on established performance technique on a traditional instrument. These controllers seek to mimic the traditional instrument on which they are based, but for most performers of the source instrument, the new controller often feels awkward and unnatural. Anyone experienced with MIDI guitars knows this frustration. Just as an electric guitar is a very different instrument from an acoustic guitar, an electronic guitar is not the same instrument as an electric guitar. These emulative controllers are, then, new instruments in disguise. They are appealing to a user because it seems initially that years spent mastering the technique of the source instrument will apply to the electronic version. While this is partially true, the prevailing experience is that existing technique must be adapted considerably to make the controller work effectively. This involves a considerable relearning of the instrument for the performer.

Much research and development effort is spent trying to make these new controllers mimic the source instrument. The end user also spends considerable time and energy relearning his instrument to adapt his technique to the new controller. If the end result is ultimately a new instrument with a new performance technique, it seems that these efforts could be better spent.

Another common goal in the development of new controllers, whether they mimic traditional instruments or not, is to try to minimize the effort and musicianship needed for the realization of complex musical results [Zicarelli 1992]. Considerable time is spent making a new controller "effortless."
Why do we have to make life so hard in order to make music so easy [Ryan 1992]? Given these easily misdirected energies in new instrument development, I sought to simplify my own work with the BioMuse. I wanted to stay close to direct musical output, and to discover, one musical element at a time, the peculiarities of the BioMuse that make it unique as a new instrument. Hence my decision to work with just two arm bands, despite the seductive possibilities of using brain channels and eye tracking.

I made very simple mappings of muscle tension to timbre, pitch, and rhythm, all separately. I tried some basic combinations, and also used the two arm bands together - e.g. with one arm controlling pitch and the other controlling melodic speed. Interestingly enough, it was these simple mappings that proved the most satisfying to play - it was so clear how each inflection affected the resulting music that it really felt as if the sound came from the body.

I found that there is plenty of complexity to master even with one arm band. There are many ways to realize different tension trajectories. I discovered these techniques in an intuitive way. Codification comes later, and with it more complex mappings. In this way the more complex mappings can be developed while maintaining that feeling of closeness and direct bodily control.
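The two-band mapping described above - one arm's tension selecting pitch, the other setting melodic speed - can be sketched roughly as follows. The scale, note range, and timing bounds are entirely my own assumptions; the paper does not specify them.

```python
# Illustrative sketch of the simple two-band mapping: one controller value
# (0-127) chooses a pitch quantized to a scale, the other sets the gap
# between successive notes. All concrete values here are hypothetical.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave

def tension_to_pitch(value, base_note=48, octaves=2):
    """Map a 0-127 controller value to a MIDI note on a C major scale."""
    steps = len(C_MAJOR) * octaves
    index = min(value * steps // 128, steps - 1)
    octave, degree = divmod(index, len(C_MAJOR))
    return base_note + 12 * octave + C_MAJOR[degree]

def tension_to_interval(value, slowest=1.0, fastest=0.1):
    """Map a 0-127 controller value to seconds between successive notes."""
    return slowest + (fastest - slowest) * value / 127.0

note = tension_to_pitch(64)     # e.g. right-arm tension selects pitch
gap = tension_to_interval(127)  # e.g. full left-arm tension: fastest rate
```

Keeping each arm's mapping this legible is precisely what makes the correspondence between inflection and sound easy to perceive in performance.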

Page  126 ï~~Performance Issues When it came time to stage a piece, performance issues came to mind. Eye tracking and brain channels were not practical for public performance - how was the person sitting in the back row of the hall to see which direction my eyeballs were pointed, and how was anyone to know whether I was really in an alpha state or not? Even with the seeming physicality of the arm bands, it must be remembered that the BioMuse reports only muscle tension activity. That is, I could have performed a piece without any movement, but instead just by quietly tensing and relaxing the arms. Although this could be an interesting concept for a future piece, it hardly seemed the thing to do for the concert premier of a new instrument. This is not to say that my work is choreographed. Potential collaboration with dance and movement people are being considered and are replete with possibility. However, my current interest is to approach the BioMuse as a new musical instrument and not as a multipurpose multimedia interface. I sought to find a musical ingredient of movement to incorporate into the piece. A pianist certainly does more than just move his fingers on stage. I found that the biomusical mappings I had created lent themselves to certain motions in order to be realized in a flowing way. Certain movements suggest themselves in the music, and help to realize the intended effect in performance. It must be made clear, though, that the movement itself did not cause the music to happen. Things that look like movement driven actions are in fact not, and must be considered and compensated for by the performer. One can express violence while playing the violin, and the performer can adopt a violent gesture to illicit an aggressive sound. Untempered violence, however, would sooner result in a broken violin than a successful expression of intent. 
Likewise, for percussive sections where I feign the action of striking a drum, the actual motion of striking does not itself trigger the drum sound, but does help to articulate it. In considering articulation, I cannot get carried away with the striking movement - if I were to think only of the movement, my muscle could very well tense in anticipation of the motion and trigger the drum sound early. Likewise, if I forget about the inherent time delay, the drum sound could enter later than my articulated strike. These issues of compensating for articulation could be deemed shortcomings of the technology. I prefer to regard them as standard technique of performing on a particular instrument. Articulating a staccato on a tuba requires some forethought and planning.

The use of movement in a technically nonessential but musically interesting way is found with other instruments. A pianist can make a grand gesture before or after articulating a note. The piano is physically not sensitive to the gesticulation of the performer before or after the striking of the note. However, the execution of this kind of gesture is important in defining the musicality and sound of the pianist. In one sense this is the theatrical aspect of musical performance. In another sense, it is the action that, though ineffective physically, has enhanced the musicality of the resulting sound. When a performer feels comfortable enough with his instrument to inject such elements of bravura to heighten the musical experience, we have attained the height of instrumental performance. I hope that we will be able to attain such fluency with our new electronic instruments.

References

Perry Cook. A Meta-Wind-Instrument Physical Model and a Meta-Controller for Real-Time Performance Control. Proceedings ICMC, San Jose, p. 273, 1992.
Andy Hong. Measuring Cello Performance Articulation. Proceedings ICMC, Montreal, 1991.
R. Benjamin Knapp, Hugh S. Lusted. A Bioelectric Controller for Computer Music Applications. Computer Music Journal, 14(1), p. 42, 1990.
Michael Lee, David Wessel. Connectionist Models for Real-Time Control of Synthesis and Compositional Algorithms. Proceedings ICMC, San Jose, p. 277, 1992.
Miller Puckette, David Zicarelli. MAX - An Interactive Graphic Programming Environment. Opcode Systems, Palo Alto, CA, 1990.
Joel Ryan. Effort and Expression. Proceedings ICMC, San Jose, p. 414, 1992.
David Zicarelli. Music Technology as a Form of Parasite. Proceedings ICMC, San Jose, p. 69, 1992.