BIOTOOLS: INTRODUCING A HARDWARE AND SOFTWARE TOOLKIT FOR FAST IMPLEMENTATION OF BIOSIGNALS FOR MUSICAL APPLICATIONS
Miguel Angel Ortiz Perez
R. Benjamin Knapp
Sonic Arts Research Centre, Queen's University Belfast
Belfast, Northern Ireland BT7 1NN
+44 289 097 4761
mortizperez01@qub.ac.uk

ABSTRACT

In this paper, we present the current state of BioTools, an ongoing project to implement a modular hardware and software toolbox for composers and performers which allows fast deployment of biosignal monitoring and measuring systems for musical applications. We discuss the motivations for this work, and we show two examples of how this set of tools and the associated compositional strategies were used: in the piece Diamair for choir and physiological sensors, and in Out of Time, a project in which BioTools was used to record and analyze biosignals for their later use to inspire and aid composition.

1. INTRODUCTION

Currently, there is an extensive and constantly growing body of research and artistic exploration in the use of biosignals for musical applications. (See the references for a description of what physiological signals are and their relationship to human-computer interaction.) However, as of yet, there is no universally available set of hardware and software tools that gives a wider community of practitioners easy access to composing and performing using physiologically controlled interfaces. Usually, the hardware tools have to be adapted from the medical field, often requiring custom electronics, expensive or electrically unsafe equipment, and specialized analysis algorithms. Thus, using biosignals to control music generally requires a case-by-case methodology, and often involves either a long development period for the composer or the participation of a specialized engineer (or group of engineers) in the creative process.
With the development of BioTools, we attempt to limit this time and effort in order to enable the composer to focus on designing the interaction model, i.e. the actual physical positioning and implementation of the diverse sensors required by the desired piece, and not the low-level electronics.

2. MOTIVATION

In 1965, Alvin Lucier first used brainwaves as the main generative source for the composition and performance of his piece Music for Solo Performer. Since then, the use of biosignals for musical applications has been of great interest to composers and researchers, and in the following years great advances have been made both in the artistic expression related to this medium and in the underlying technologies involved. Several composers, ranging from pioneers Richard Teitelbaum, David Rosenboom and Jacques Vidal to more recent sound artists such as Robert Hamilton, Ken Furudachi and Atau Tanaka, have made great advances in this field. The work of these artists is highly personal and appears to be more characteristic of their individual artistic expression than of a more generalized practice that we could define as biomusic in a broader sense. By developing an accessible toolkit for fast implementation of biointerfaces, we intend to enable a wider community of musicians to work at a higher level towards finding or suggesting a style of idiomatic music written for biosignal interfaces.

3. BIOTOOLS

There are two main tasks we have focused on in the development of BioTools. The first is recording, assessing, analysing and plotting physiological data obtained from naturally experienced and induced emotional states for its later use in composition. This allows physiological data to be used not only as a control layer at performance time, for triggering and controlling sound events or processes, but also for biosignal-informed composition.
Measurements of biosignals through set experiences (performing a particular piece, responding to a questionnaire, watching a succession of images, listening to music or news, etc.) can be used to inform compositional parameters such as musical structure, polyphony, rhythm and harmony. The other purpose of our toolkit is to allow easy implementation of the algorithms required to use biosignals as part of an Integral Music Controller for musical performances. We attempt to address these two distinct tasks with a set of standardized hardware and software modules which allow for a more widespread use of biosignals for both aims.
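As a concrete illustration of how a recorded biosignal curve might inform musical structure, the following sketch segments a normalized arousal curve into alternating "calm" and "tense" sections. This is a hypothetical example, not part of BioTools: the function name, the threshold and the minimum section length are all illustrative assumptions.

```python
# Hypothetical sketch of biosignal-informed structure: split a recorded,
# normalized arousal curve (values 0.0-1.0) into "calm"/"tense" sections
# that could inform the formal layout of a piece. Names and thresholds
# are illustrative assumptions, not BioTools API.

def segment_by_arousal(curve, threshold=0.5, min_len=4):
    """Label each sample 'tense' (>= threshold) or 'calm', group runs of
    equal labels, and merge runs shorter than min_len into the previous
    section. Returns (label, start_index, end_index) tuples."""
    sections = []
    for i, v in enumerate(curve):
        label = "tense" if v >= threshold else "calm"
        if sections and sections[-1][0] == label:
            sections[-1][1].append(i)
        else:
            sections.append([label, [i]])
    merged = []
    for label, idxs in sections:
        if merged and len(idxs) < min_len:
            merged[-1][1].extend(idxs)  # absorb a too-short section
        else:
            merged.append([label, idxs])
    return [(label, idxs[0], idxs[-1]) for label, idxs in merged]

# A short recorded curve: low arousal, a stressed passage, then release.
curve = [0.2, 0.3, 0.2, 0.3, 0.7, 0.8, 0.9, 0.8, 0.4, 0.3, 0.2, 0.2]
print(segment_by_arousal(curve))
# → [('calm', 0, 3), ('tense', 4, 7), ('calm', 8, 11)]
```

Each resulting section could then be assigned, for example, a different polyphonic density or harmonic rhythm.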
Our initial software implementation for BioTools is built upon the Max/MSP platform, due to its widespread use amongst composers and performers. However, we have also begun implementing the data collection and analysis modules in the EyesWeb platform because, as has been pointed out previously, Max/MSP still has problems with scheduling and time-stamping multiple synchronized streams of data. EyesWeb is far superior for this precise timing of real-time events, and its built-in strengths in emotive image analysis and synthesis will benefit the composer as well. Different approaches exist for mapping gestures to sound, and choosing the appropriate mapping strategy is one of the main artistic decisions composers make in their pieces. However, we will not attempt to discuss the extensive field of gesture mapping in this paper. Instead, we focus on the behaviour of biosignals when responding to diverse stimuli, to try to create music which is idiomatic to this type of controller. In doing so, we examine two elements:

1. The types of gestures possible for triggering and controlling individual music events over the course of any given composition.

2. The technical, philosophical and aesthetic connotations related to the use of this type of signal for composition, in a similar manner to how additive synthesis and FFT analysis techniques informed the French musique spectrale school.

4. HARDWARE TOOLKIT (THE NEXT BIOMUSE)

The BioMuse system has evolved over the past 15 years from a high-end research system to a wireless mobile monitoring system [11]. The BioMuse has been redesigned once more to now be a simple toolkit of bands that can be worn on the limbs, chest, or head to measure any of the underlying physiological signals. Each band has the appropriate signal conditioning and protection circuitry necessary for the type of signal being measured.
For example, the headband is specifically designed for measuring EEG and EOG signals. The limb band is designed to measure EMG and GSR signals. The chest band is designed to measure EKG. All of the bands have self-contained dry electrodes with the amplification, adaptation, and protection electronics embedded within the band. The output of these bands can then be plugged into any of the standard wireless transmitter systems, such as the ICubeX or the Arduino Bluetooth. Figure 1 shows parts of the toolkit being used during a rehearsal.

Figure 1. Hardware toolkit modules, GSR sensors and armband connected to an Arduino BT.

5. SOFTWARE MODULES

The software layer we are currently working on consists of a series of Max/MSP abstractions, GUIs (for fast analysis and visualization of data) and their related help files. The modules are implemented as a collection of patches instead of external objects, to allow easy modification and improvement of these implementations by ourselves as well as others.

Figure 2. BioTools' monitor window

5.1. Electromyogram (EMG)

The EMG hardware module measures underlying muscular activity generated by motor neurons. This signal is the most versatile for musical applications because it can be measured above any muscle, including the arm (using an armband) and face (using a headband or glasses), and can be used both for continuous control and state recognition. Thus, it can track not only emotional
information, but can be used in conjunction with more traditional non-physiological sensors to measure any of the physical gestures related to playing musical instruments and other performing arts. As demonstrated by Atau Tanaka and others, the most common placement of EMG sensors for musical practice is on the forearms of the performer. This is a convenient place for the sensors because it allows finger activity to be tracked without an intrusive device, such as gloves, which could directly affect the performance. The software abstraction provides simple envelope following of the overall muscular activity tracked by the sensor, and incorporates dynamic low-pass/high-pass filters and an adaptive smoothing algorithm to address the trade-off between stability of the signal and accurate response to fast gestures. As a sub-group of the EMG module, we are currently working on gesture recognition for specific sets of muscles, in order to assess information related to the specific performance practice of different musical instruments.

5.2. Electrocardiogram (ECG, EKG)

Created by the electrical impulses of the heart as it progresses through the stages of contraction, the EKG is one of the largest bioelectric signals. Figure 3 shows the components of a typical EKG signal. Our abstraction reads this signal and currently measures two key components: the RR interval and the QRS complex. The heart rate is computed directly from the length of the RR interval, while the change in the duration of the RR interval gives the overall heart rate variability (HRV), which has been found to be strongly correlated with emotional stress.

5.3. Galvanic Skin Response (GSR)

GSR refers to the change in skin conductance caused by changes in stress and/or other emotional states. The GSR is extremely sensitive to emotional changes. Both subtle changes in the tonic level of the GSR and dramatic changes in the phasic level can be tracked with this technique.
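One simple way to separate these two components, sketched below, is to treat a moving average of the conductance stream as the tonic level and the residual around it as the phasic activity. This is an illustrative reconstruction under assumed parameters (window size, sample lists), not the actual BioTools abstraction.

```python
# Illustrative sketch (not the BioTools Max/MSP abstraction): separate the
# slow tonic level of a sampled GSR stream from its faster phasic changes
# using a moving-average baseline. The window size is an assumption.

def tonic_phasic(samples, window=8):
    """Return (tonic, phasic): tonic is a trailing moving average of the
    conductance samples; phasic is the residual around that baseline."""
    tonic, phasic = [], []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)            # trailing window start
        baseline = sum(samples[lo:i + 1]) / (i + 1 - lo)
        tonic.append(baseline)
        phasic.append(samples[i] - baseline)   # fast deviations only
    return tonic, phasic

# A constant conductance level yields a flat tonic and zero phasic output.
tonic, phasic = tonic_phasic([2.0] * 6, window=3)
```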
The GSR signal in its raw format is often confusing for musicians who are not familiar with how it works: higher arousal levels (stress, increased involvement) cause the skin resistance to drop, while reduced arousal (relaxation, withdrawal) results in increased resistance. To address this non-intuitive behaviour, our abstraction extracts both tonic and phasic behaviour and inverts the resultant control signals.

6. EXAMPLE PIECES CREATED USING BIOTOOLS

The presented toolbox has recently been employed in the composition of the pieces Diamair and Out of Time. For these compositions, BioTools proved extremely helpful: we were able to focus on the physical implementation (in the case of Diamair) and on the musical content of the pieces.

6.1. Diamair: a piece for choir and IMC

Diamair is a piece for choir and Integral Music Controller inspired by the poem of the same name, often translated as "A Mystery" or "The Song of Amergin", a text contained in the Lebor Gabala Erenn (The Book of Invasions) [1]. For this composition we use the GSR and EMG modules of the IMC in addition to real-time face tracking. The conductor is equipped with EMG sensors on each forearm. The resulting data is used to identify staccato and legato articulations (as well as interpolation between them) in his/her conducting gestures. This information is then used to control the spatial spread of the electronic sound sources and to apply amplitude and frequency envelopes. A group of eight soloists is equipped with GSR sensors. These sensors are placed in custom choir folders that the singers hold in their hands, as shown in Figure 4. This implementation succeeds in being non-intrusive for the singers.

Figure 3. Ideal EKG signal

The QRS complex can give valuable information on the breathing patterns of the performer, making it possible to use breath voluntarily as a direct controller for sound manipulation, as well as to use ancillary breath patterns related to specific instrumental practices (wind instruments and voice).
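The RR-interval computations described in Section 5.2 can be sketched as follows. This is an illustrative reconstruction, not the BioTools abstraction: it assumes R-peaks have already been detected (peak detection is omitted), and it uses the standard deviation of RR intervals (SDNN) as one common HRV measure.

```python
# Sketch of heart-rate and HRV computation from detected R-peak
# timestamps (seconds). Peak detection itself is omitted; function and
# variable names are illustrative assumptions, not BioTools'.

def heart_rate_and_hrv(r_peaks):
    """Return (bpm, sdnn): mean heart rate in beats per minute from the
    RR intervals, and HRV as the standard deviation of RR intervals."""
    # successive differences between R-peak times are the RR intervals
    rr = [b - a for a, b in zip(r_peaks, r_peaks[1:])]
    mean_rr = sum(rr) / len(rr)
    bpm = 60.0 / mean_rr                      # heart rate from RR length
    var = sum((x - mean_rr) ** 2 for x in rr) / len(rr)
    sdnn = var ** 0.5                         # simple HRV measure (SDNN)
    return bpm, sdnn

# Perfectly regular beats one second apart: 60 BPM, zero variability.
bpm, sdnn = heart_rate_and_hrv([0.0, 1.0, 2.0, 3.0])
```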
Figure 4. Hardware implementation of GSR sensors for choir soloists.

The GSR signals from the choir were mapped to a granular synthesis engine to control transposition (specifically, levels of dissonance), number of grains (polyphony) and grain size, in order to shape the materials through involuntary autonomic physiological reactions, creating a direct interface between emotion and sound manipulation. The choir is laid out in two concentric circles with the conductor at the centre, as shown in Figure 5. The inner circle is formed by the eight soloists. The rest of the choir, who are not equipped with sensors, are placed surrounding the audience.

Figure 5. Spatial choir configuration

An imposed challenge for this project was to keep the hierarchical conductor-soloists-choir relationships in their interaction with the electronic sounds. Using the distributed IMC concept to allow all the possible levels of interaction, we distributed the interface (GSR and EMG sensors) between the conductor and choir. The conductor has the capability of controlling the choir through his physical gestures. His control is augmented by the EMG module so that his gestures also remotely control the live electronics. The soloists do not have direct control over their sound manipulations, but rather interact with them through ancillary and induced involuntary autonomic physiological reactions. The remaining choir members, who are below the soloists in the hierarchical tree (conductor-soloists-choir), have no direct interaction with the live electronics, but close a feedback loop: their singing affects the conductor's gestures and the soloists' emotional states. The use of the interface had a major role in the final compositional result. The GSR signals evolve slowly over time, which in initial tests proved to lack dynamic change.
To address these limitations, specific fragments of the piece were written to induce different stress levels in the soloists.

6.2. Out of Time: a physiologically informed soundtrack to the film Out of Tune

Out of Tune is a short film by director and writer Fran Apprich. The work depicts women's exploitation in a world in which girls want to be women. The story is set in a strip club, in reference to Jean-Luc Godard's Vivre sa vie. The collision of a girl backstage with a stripper triggers an unexpected clash of personalities and generations. The music for this film explores this idea of exploitation further by measuring the emotional responses of the actress during the main stripping scene and analyzing those measurements for their later use as a compositional framework for the whole soundtrack. The EKG and GSR modules of BioTools were used to measure, record and plot the actress' stress levels during rehearsals and shooting. The recorded data from the different takes was averaged to find consistent curves in her emotional state changes during acting. As well as the overall plotted curve, we found consistent spikes in her stress levels at specific actions (i.e. an increase in stress seconds before stripping, followed by slow relaxation afterwards). As she played the role of the stripper, subtle changes in her emotional state were identified relating to the different elements of the performance. The soundtrack is composed almost exclusively for an out-of-tune piano; the overall emotional curve measured by the GSR module is used to dictate the form and structure of the piece. Changes in the heart rate variability were found to be associated with more specific actions and were used to organize dynamics, articulations and harmony. A restriction imposed by this fixed medium was the impossibility of using biosignals as a real-time performance tool. The physiological information in this project was instead used to lay out more traditional musical parameters.
In the final result, there is no direct sound generation or manipulation by the biosignals; rather, the recorded data serves as a structural framework for the compositional process. This data was averaged between the different takes and then rendered into form, harmony and rhythmic structures for the composition of the piece. Other elements of the composition, such as melodic outline and style references, are not related to the physiological information recorded from the actress, but derive from the specific requirements of the film's narrative.
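The averaging-across-takes step and the search for consistent stress spikes can be sketched as below. This is a hypothetical illustration of the process described above, not the BioTools implementation; it assumes the takes have already been resampled to equal length, and the spike threshold is an invented parameter.

```python
# Hypothetical sketch of the Out of Time analysis: average stress
# recordings from several takes and locate consistent spikes. Assumes
# equal-length, resampled takes; names and thresholds are illustrative.

def average_takes(takes):
    """Average several equal-length stress recordings sample by sample."""
    n = len(takes)
    length = len(takes[0])
    assert all(len(t) == length for t in takes), "resample takes first"
    return [sum(t[i] for t in takes) / n for i in range(length)]

def find_spikes(curve, factor=1.5):
    """Indices where the averaged curve exceeds factor * its mean, i.e.
    the kind of consistent stress spikes found at specific actions."""
    mean = sum(curve) / len(curve)
    return [i for i, v in enumerate(curve) if v > factor * mean]

# Two takes with a shared stress spike at the final sample.
avg = average_takes([[1.0, 1.0, 1.0, 5.0], [1.0, 1.0, 1.0, 5.0]])
spikes = find_spikes(avg)
```

The averaged curve could then dictate large-scale form, while the spike positions anchor local events such as dynamic or harmonic changes.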
7. CONCLUSION

We have described a new set of tools, BioTools, currently being created for the rapid development of musical applications using physiological sensors. The new hardware sensors enable flexible placement anywhere on the body and measurement of any type of physiological signal. The initial software tools run on the Max/MSP platform because of its widespread use by composers and performers. However, as pointed out previously, time-coding different data streams in Max/MSP for analysis purposes is a complex and time-consuming process, and for this reason we have also begun to implement BioTools on the EyesWeb platform. Additionally, we are looking at implementing the modules in other environments, such as PD, Anvil, and ChucK, to offer more flexibility. The use of BioTools has made the process of creating a piece, Diamair, for Integral Music Control, as well as a piece, Out of Time, using pre-recorded physiological signals, an exercise in composition rather than electrical engineering.

8. REFERENCES

[1] Anonymous. Book of Leinster, Section 1, Folio 12b 40. http://www.ucc.ie/celt/published/G800011A/index.html

Bowler I., Purvis A., Manning P., and Bailey N., "On mapping N articulation onto M synthesiser-control parameters," Proc. Int. Computer Music Conf. (ICMC'90), Glasgow, Scotland, 1990.

Camurri A. et al., "The Premio Paganini project: a multimodal gesture-based approach for explaining emotional processes in music performance," Proceedings of the 7th International Workshop on Gesture in Human-Computer Interaction and Simulation, Lisbon, Portugal, 23-25 May 2007.

Knapp R. B. and Cook P. R., "The Integral Music Controller: Introducing a Direct Emotional Interface to Gestural Control of Sound Synthesis," Proceedings of the International Computer Music Conference (ICMC), Barcelona, Spain, September 4-9, 2005.

Knapp R. B. and Lusted H. S., "A Bioelectric Controller for Computer Music Applications," Computer Music Journal, MIT Press, Vol. 14, No. 1, pp. 42-47, Spring 1990.

Knapp R. B. and Lusted H. S., "Designing a Biocontrol Interface for Commercial and Consumer Mobile Applications: Effective Control within Ergonomic and Usability Constraints," Proceedings of the 11th International Conference on Human Computer Interaction, Las Vegas, NV, July 22-27, 2005.

Lee C. K., Yoo S. K., Park Y. J., Kim N. H., Jeong K. S., and Lee B. C., "Using Neural Network to Recognize Human Emotions from Heart Rate Variability and Skin Resistance," Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, Shanghai, China, September 1-4, 2005.

Lee M. and Wessel D., "Connectionist models for real-time control of synthesis and compositional algorithms," Proceedings of the International Computer Music Conference, San Jose, USA, 1992.

Lusted H. S. and Knapp R. B., "Controlling Computers with Neural Signals," Scientific American, October 1996.

Nagashima Y., "Interactive multi-media performance with bio-sensing and bio-feedback," Proceedings of the New Interfaces for Musical Expression Conference, Montreal, QC, Canada, May 22-24, 2003.

Ortiz Pérez M. A., Knapp R. B. and Alcorn M., "Diamair: Composing for Choir and Integral Music Controller," to be published in the Proceedings of the New Interfaces for Musical Expression 2007 Conference, New York, NY, June 7-9, 2007.

Wanderley M. M. (guest editor), "Mapping Strategies in Real-time Computer Music," Organised Sound, 7(2), August 2002.

Warner D., "Notes from the timbre space," Perspectives of New Music, Vol. 21, No. 1/2 (Autumn 1982 - Summer 1983), pp. 15-22.

http://www.arduino.cc/

http://www.gyoza.com/ate/atau/html/index.html

http://www.infomus.dist.unige.it/eywindex.html

http://www.lovely.com/titles/p1014.html

http://infusionsystems.com/catalog/index.php

Jensenius A. R., Godøy R., and Wanderley M. M., "Developing Tools for Studying Musical Gestures within the Max/MSP/Jitter Environment," Proc. of the 2005 International Computer Music Conference (ICMC2005), Barcelona, Spain, 2005.

Knapp R. B. and Cook P. R., "Creating a Network of Integral Music Controllers," Proceedings of the New Interfaces for Musical Expression (NIME) Conference, IRCAM, Paris, France, June 5-7, 2006.