DO MOBILE PHONES DREAM OF ELECTRIC ORCHESTRAS?

Ge Wang
Stanford University
Center for Computer Research in Music and Acoustics
firstname.lastname@example.org

Georg Essl
Technical University of Berlin
Deutsche Telekom Laboratories
email@example.com

Henri Penttinen
Helsinki University of Technology
Department of Signal Processing and Acoustics
henri.firstname.lastname@example.org

ABSTRACT

MoPhO is the newly established Mobile Phone Orchestra of CCRMA. It is the first repertoire- and ensemble-based mobile phone performance group of its kind. We describe the motivation behind and the making of such an ensemble, as well as the repertoire of MoPhO's first concert, performed in January 2008. The ensemble demonstrates that mobile phone orchestras are interesting technological and artistic platforms for electronic music composition and performance.

1. INTRODUCTION

MoPhO is the Mobile Phone Orchestra of CCRMA, a new repertoire-based ensemble using mobile phones as the primary musical instrument. While mobile phones have been used for artistic expression before, MoPhO is the first (to the best of our knowledge) to approach them from an ensemble/repertoire angle. It employs more than a dozen players and mobile phones, which serve as a compositional and performance platform for an expanding and dedicated repertoire. In this sense, it is the first ensemble of its kind. MoPhO was instantiated in Fall 2007 at Stanford University's Center for Computer Research in Music and Acoustics (CCRMA) and performed its debut concert in January 2008.

Mobile phones are growing in sheer number and computational power. Hyper-ubiquitous and deeply entrenched in the lifestyles of people around the world, they transcend nearly every cultural and economic barrier. Computationally, the mobile phones of today offer speed and storage capabilities comparable to desktop computers from less than ten years ago, rendering them suitable for real-time sound synthesis and other musical applications.
Like traditional acoustic instruments, mobile phones are intimate sound-producing devices. Compared to most instruments, they are rather soft and have somewhat limited acoustic bandwidth. However, mobile phones have the advantages of ubiquity, strength in numbers, and ultramobility, making it feasible to hold jam sessions, rehearsals, and even performances almost anywhere, anytime. A goal of the Mobile Phone Orchestra is to explore these possibilities as a research and music-making body. We investigate the fusion of technological artifact and human musicianship, and provide a new vehicle for experimenting with new music and music-making.

Figure 1. Is this what mobile phones dream of?

We see the mobile phone orchestra idea as matching the idea of a laptop orchestra [24, 18, 25, 9]. The phones as intimate sound sources provide a unique opportunity to explore "mobile electronic chamber music". The Mobile Phone Orchestra presents a well-defined platform of hardware and software configuration and players, enabling composers to craft mobile instruments and write music tailored to such an ensemble. Furthermore, the combination of technology, aesthetics, and instrument building presents a potentially powerful pedagogical opportunity, which compared to laptop orchestras gains the added benefit of extreme mobility.

2. RELATED WORK

Turning mobile devices into musical instruments has already been explored by a number of researchers. Tanaka presented an accelerometer-based, custom-made augmented PDA that could control streaming audio. Geiger designed a touch-screen-based interaction paradigm with integrated synthesis on the mobile device, using a port of Pure Data (PD) for Linux-enabled portable devices like iPaqs [13, 12]. Various GPS-based interactions have also been proposed [19, 23]. Many of these systems used an external computer for sound generation. Using a mobile phone as a physical musical instrument
Figure 2. The Mobile Phone Orchestra performing Drone In/Drone Out by Ge Wang.

has been pioneered by Greg Schiemer in his PocketGamelan instrument. At the same time, there has been an effort to build up ways to allow interactive performance on commodity mobile phones. CaMus is a system that uses the camera of mobile phones for tracking visual references to allow performance. CaMus2 extended this to allow multiple mobile phones to communicate with each other and with a PC via an ad hoc Bluetooth network. In both cases, an external PC was still used to generate the sound. The MobileSTK port of Perry Cook's and Gary Scavone's Synthesis Toolkit (STK) to Symbian OS is the first full parametric synthesis environment available on mobile phones. It was used in combination with accelerometer and magnetometer data in ShaMus to allow purely on-the-phone performance without any laptop. Specifically, the availability of accelerometers in programmable mobile phones like Nokia's N95 or Apple's iPhone has been an enabling technology to more fully consider mobile phones as meta-instruments for gesture-driven music performance. The main idea of the mobile phone as a meta-instrument is to provide an as-generic-as-possible platform on which the composer can craft his or her artistic vision. At the same time, the abilities offered by the phone have to be in a sense stabilized to offer a persistent repertoire for an ensemble.

There is also an earlier body of work using mobile devices as part of artistic performances. In these, mobile phones did not yet play the role of a traditional instrument within a performance ensemble. Golan Levin's DialTones performance is one of the earliest concert concepts which used mobile devices as part of the performance. The concept of the performance is that the audience itself serves as part of the sound source display, and the localization of people in the concert hall is part of the performance.
A precomposed piece is played by calling up various numbers of members of the audience. Visual projections display the spatial patterns made by the currently sounding telephones. The main conceptual use of mobile phones in this concert was passive yet spatial in nature, blurring the boundary between performer and audience.

The art group Ligna and Jens Rohm created an installation performance called "Wählt die Signale" (German for "Dial the signals"). The performance used 144 mobile phones arranged in an installation space. People could call the publicised phone numbers, and the resulting piece would be broadcast over radio. Unlike Levin's piece, the compositional concept is aleatoric, meaning that the randomness of the calling participants is an intended part of the concept.

A performance installation that used mobile technology indirectly, and predates both Levin's and Ligna's work, is Wagenaar's "Kadoum". Here, heart-rate sensors were attached to 24 Australians. The signals were sent via mobile phones to other international locations, where electric-motor-excited water bucket installations would display the activity of the Australians. Here mobile technology was primarily used for remote wireless networking; the mobile devices themselves were not an inherent part of the concept of the piece but rather served as a means of wireless communication.

Wagenaar's piece serves as an example of what we will call "locative music". This is music where distributed location plays a conceptual role in a piece. Some authors think of mobile music making as referring to the mobility
Figure 3. Georg Essl conducts TamaG during rehearsal.

of the performance itself, and not just of the potential of such mobility. This view is reviewed by Gaye et al., who work with the definition "Mobile music is a new field concerned with musical interaction in mobile settings, using portable technology". Atau Tanaka and Lalya Gaye provide some of the more prominent examples of locative music. The term "locative music" is closer to the term "locative media" used by Tanaka and Gemeinboeck in this context.

Gaye's Sonic City used a variety of sensors attached to a special jacket. These sensors pick up environmental information as well as body-related signals, which in turn modify the music heard through headphones by the person wearing the jacket. For example, the jacket would be able to pick up signals such as heart-rate, arm motion, pace, and compass heading. Sensed environmental data included such things as light level, noise level, pollution level, temperature, and electromagnetic activity. As the location and the environment changed, the sonic experience varied with it.

Tanaka explored locative music in various ways. A project called Malleable Mobile Music explored the ability to turn passive networked music sharing into an active endeavor, by providing an interactive music engine with associated rendering capabilities on the mobile devices of participants. Tanaka's installation piece net_derive brought this further into a performance concept. The installation consisted of two settings: one in a gallery with large-scale video projections, and the other mobile, using scarves with mobile phones embedded at both ends. These were handed out to the audience of the performance. Through headphones, participants heard instructions, and the phones took pictures and recorded sounds of the environment the audience explored.
As participants followed or deviated from the instructions, the piece maintained an aleatoric component reminiscent of the Ligna piece discussed earlier. Through GPS and wireless communication, their position and information were traced and displayed in the gallery space, where the visuals and sounds changed with the choices made by the moving audience.

3. THE MOPHO ENSEMBLE

The Mobile Phone Orchestra of CCRMA consists of 16 mobile phones and players and, at this early stage, contains a repertoire of 8 publicly premiered pieces, ranging from scored compositions and sonic sculptures to structured and free improvisations. So far, all pieces have solely used the phones' onboard speakers (and occasional human vocalization) for sound production (no additional amplification or reinforcement), keeping true to our notion of "mobile electronic chamber music" and the potential of ultramobility.

Currently, MoPhO uses Nokia N95 smart phones, though in principle we are open to using any mobile device. It is worth noting the onboard features of the N95 to provide an assessment of the capabilities of today's phones. The N95 offers 1) a five mega-pixel video/still camera, 2) a second front-side camera, 3) a microphone, 4) stereo speakers, 5) a 20-button keypad, 6) a 3-axis accelerometer, 7) Bluetooth,
8) Wi-Fi, 9) a 320x240 resolution color LCD display, and 10) a 330 MHz CPU. In terms of software, the phone runs Symbian OS, with a freely available and extensive C++ software development kit that allows access to all of the above hardware features, along with a compiler, emulator, and optional integrated development environment. MoPhO also employs the Python virtual machine application, which allows us to write audio synthesis engines in C++ and combine them with Python front-end GUIs. The potential of the phone is indeed immense, though in our experience so far, the development process presents unique overheads in terms of licensing, and the general awkwardness naturally associated with a still-maturing platform.

The Mobile Phone Orchestra performed its first public concert on January 11th, 2008 to a packed audience at the CCRMA Stage at Stanford University. It featured the 8 initial pieces of the MoPhO repertoire, all composed especially for the mobile phones.

4. ORIGINAL REPERTOIRE

4.1. Drone In/Drone Out

Drone In/Drone Out (Figure 2; a.k.a. ET:Drone:Home) is a structured improvisation for 8 or (many) more players/phones, composed and programmed by Ge Wang. Based on the laptop orchestra piece Droner by Dan Trueman, Drone In/Drone Out explores both individual and emergent timbres synthesized by the phones and controlled by the human players. The phone generates sound via real-time FM synthesis, and maps the two accelerometer axes to spectral richness (via index of modulation, up/down axis) and subtle detuning of the fundamental frequency (left/right axis). This results in rich, controllable low-frequency interference between partials, and creates a saturating sonic texture that permeates even large performance spaces despite the limited output power of onboard speakers.
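The accelerometer-to-FM mapping described above can be sketched in a few lines. This is an illustrative Python/NumPy reconstruction, not the actual MoPhO code (which runs in C++ on the phone); the function name, tilt ranges, and parameter curves are all assumptions for the sake of the sketch.

```python
import numpy as np

SR = 22050  # a modest sample rate, plausible for a 2008-era phone

def fm_drone(carrier_hz, ratio, tilt_x, tilt_y, dur=1.0):
    """Two-operator FM voice sketched after the mapping in the text:
    the up/down tilt (tilt_y in [-1, 1]) drives the index of
    modulation (spectral richness), while the left/right tilt
    (tilt_x) subtly detunes the fundamental by a few cents."""
    t = np.arange(int(SR * dur)) / SR
    index = 0.5 + 4.5 * (tilt_y + 1) / 2          # richness: 0.5 .. 5.0
    detune = 2 ** (tilt_x * 10 / 1200)            # up to +/- 10 cents
    f_c = carrier_hz * detune                     # detuned carrier
    f_m = f_c * ratio                             # modulator frequency
    mod = index * np.sin(2 * np.pi * f_m * t)     # phase modulation term
    return np.sin(2 * np.pi * f_c * t + mod)

# One droning voice; slight per-player detuning across an ensemble
# produces the low-frequency interference between partials.
buf = fm_drone(220.0, ratio=2.0, tilt_x=0.3, tilt_y=-0.5)
```

Summing several such voices, each with a slightly different `tilt_x`, reproduces in miniature the beating texture the piece relies on.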
Additionally, preprogrammed pitches and modulation ratios (selectable via the phone's number pad) allow the ensemble to move through a variety of harmonies and timbre-scapes, as directed by a human conductor. Furthermore, by experimenting with modulation ratios and spectral richness, the resulting partials can suggest the percept of low fundamental frequencies well beyond the limited bass response of the phone speakers.

Due to the extremely mobile nature of phones, players may be placed almost anywhere throughout the performance area and, furthermore, are able to easily move during a performance and/or even play from the audience. For example, during the MoPhO debut performance at the CCRMA Stage, we began the concert with members of the ensemble sitting, disguised among the audience. The remaining players marched in with phones droning, as the disguised players revealed themselves and moved to surround the audience (resulting in 12 players/phones). A reprise of the piece (Drone Out) closed the concert, exploring additional spatial configurations of phones before players exited the stage or returned to the audience.

Figure 4. Rehearsal of The Saw by Henri Penttinen.

4.2. TamaG

TamaG (Figure 3) by Georg Essl is a piece that explores the boundary of projecting the human onto mobile devices, while at the same time displaying the fact that they are deeply mechanical and artificial. It explores the question: how much control do we have in interacting with these devices, or do the devices themselves at times control us? The piece works with the tension between these positions and crosses the desirable and the alarming, the human voice with mechanical noise. The alarming effect has a social quality and spreads between the performers.
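TamaG's sounding algorithm is a circle map, described in detail below. As a minimal sketch (a textbook standard circle map in Python, under assumed parameters, not the paper's C++ implementation), a small nonlinearity K leaves the map in a near-periodic, pitched regime, while a large K pushes it into the hard-to-control, noisy regime:

```python
import math

def circle_map(omega, K, theta0=0.2, n=8):
    """Iterate the standard circle map
        theta[n+1] = (theta[n] + omega - (K / 2pi) * sin(2pi * theta[n])) mod 1
    and return the phase sequence.  Reading the phase out through a
    sine turns the map into an oscillator; K sets how far the motion
    departs from a plain periodic rotation at rate omega."""
    theta = theta0
    out = []
    for _ in range(n):
        theta = (theta + omega
                 - (K / (2 * math.pi)) * math.sin(2 * math.pi * theta)) % 1.0
        out.append(theta)
    return out

# K = 0: pure rotation, i.e. the periodic, voice-like pitch regime
periodic = circle_map(omega=0.25, K=0.0)
# large K: strongly non-linear, mechanistic-noise regime
noisy = circle_map(omega=0.25, K=5.0)
```

The performer's gesture would, in this sketch, be mapped onto K, gradually dragging the sound from the controllable regime into the stuttering, non-oscillatory one.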
The sounding algorithm is a non-linear map called the circle map [5, 6], which is used in easier-to-control and harder-to-control regimes to evoke the effects of control and desirability on the one hand, and the loss of control and mechanistic function on the other. The first regime consists of single-pitch-like sounds that resemble the human voice. When the non-linearity is gradually increased, the performer enters a non-oscillatory regime and the voice stutters in and out. The second regime is mechanistic noise that can also be controlled, but due to the highly non-linear behavior this control is very difficult.

4.3. The Saw

The piece The Saw (Figure 4) by Henri Penttinen uses mobile phones as key-pitched instruments. In this case, the keys are mapped to the Phrygian mode. Performers are placed in a semi-circle, which allows the conductor to explore a variety of panning effects. By emphasizing simple numeric scores and very direct conducting gestures, the piece is written to be playable by trained musicians and non-trained performers alike.

4.4. The phones and the fury

The phones and the fury by Jeff Cooper and Henri Penttinen is a one-man-DJ-style table-top piece, where multiple phones play the role of individual players performing looped music. The playback rate can be controlled by tilting the devices. By interweaving looped patterns, it references the cross-mixing of a DJ performance. The solo instrument for this piece, the Pocket Shake, was created by Jarno Seppinen and Henri Penttinen. It is a wavetable synthesizer with three sine waves whose frequencies are mapped from the three axes of the accelerometer. Fast movements result in quickly changing timbres and sine sweeps. The piece was played by one person, but more players can easily be introduced.

Piece | # of phones | # of performers | Software | Input types | Conducted?
Drone In/Drone Out | 8 or more | 8 or more (1 each) | C++ and Python | accelerometer & keys | yes
The Saw | 8 | 8 (1 each) | C++ and Python | accelerometer & keys | yes
Circular Transformations | 8 | 8 (1 each) | C++ and Python | accelerometer & keys | yes
TamaG | 5 | 5 (1 each) | C++ and Python | accelerometer & keys | yes
The phones and the fury | 11 | 1-2 | C++ and Python | accelerometer & keys | no
phoning it in | 12 or more | 12 or more (1 each) | built-in mp3 player | keys | no
Chatter | 12 or more | 12 or more (1 each) | C++ and Python | accelerometer & keys | yes
MoPhive Quintet | 5 | 5 (1 each) | built-in recorder | mic & keys | no

Table 1. Details of the repertoire of the first public MoPhO concert.

Figure 5. Adnan's MoPhive Quintet in concert.

4.5. Circular Transformations

Circular Transformations is a collaborative and experimental work by Jonathan Middleton and Henri Penttinen. The piece is composed for a mobile phone ensemble of 5-10 players, and is structured in the same manner as an organum with four clausula sections. The title comes from the circular patterns of a harmonograph set to the ratio 5:3 (a major sixth). From the rotary shapes, Jonathan was able to translate the lines into musical patterns by mapping the actual forms of the lines into number representations. The post-production of the notes and numbers was done in the musicalgorithms software.
The tones were created from a combination of slightly inharmonic FM synthesis and circle map [5, 6] sounds, controlled by a simple sequencer. The piece can be played either with collective synchronization, letting the players control the timbres of their parts, or with a conductor who gives timing cues for each part. The spatialization was formed as a semi-circle, with one bass player at each end and the other players situated in pairs. In addition, the pairing improved interaction between the players for creating different timbres of the same part.

4.6. phoning it in

Chris Warren's phoning it in is a mobile phone "tape piece" performance, where the performers act as diffusers in space. The piece is spatialized, and each phone carries a different component of the composition. By positioning and orienting the phones, the players diffuse the piece through the performance space. The tape composition is tailored specifically to the bandwidth of mobile phone playback, using compression and other techniques to maximize the utility of mobile phones as highly mobile, distributed diffusers.

4.7. The MoPhive Quintet: Cellphone Quartet in C major, op. 24

Adnan Marquez-Borbon's MoPhive Quintet (Figure 5) is a free-form improvisation for four or five players, exploring iterative live sampling via onboard phone microphones and speakers. At any time, players are encouraged to vocalize, capture the sound of other human players or phones, and/or play back a previously recorded clip. As the piece evolves, new vocalizations are intertwined with samples potentially passed from phone to phone via live, on-the-fly recording. This piece is carried out with the default sound recorder software provided with the phone, and compellingly and playfully suggests new group musical possibilities using common phone features.

4.8. Chatter

Ge Wang's Chatter is a conducted improvisation for 12 (or more) players and phones, and employs simple software buffer playback that maps an axis of the phone accelerometer to playback rate. The players are distributed throughout the audience in an effort to immerse the audience in a sonic web of cell phone conversational clouds that range in theme from greetings to weather reports, laughter, and sheer wackiness. The source material consists of short sentences, laughter, and various guttural utterances (courtesy of Georg Essl) that are triggered via the phones' number pads, easily permitting rhythmic interplay between phones (when desired).
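Chatter's accelerometer-to-playback-rate mapping amounts to a simple varispeed resampler. The sketch below is an illustrative Python/NumPy version under assumed conventions (the tilt range and the exponential rate curve are made up for the example), not the actual phone code:

```python
import numpy as np

def varispeed(buffer, tilt):
    """Resample a recorded clip at a rate driven by one accelerometer
    axis: tilt in [-1, 1] maps exponentially to a playback rate in
    [0.5, 2.0], so tilting one way halves the speed (lower pitch) and
    the other way doubles it (higher pitch)."""
    rate = 2.0 ** tilt                          # -1 -> 0.5x, 0 -> 1x, +1 -> 2x
    n_out = int(len(buffer) / rate)             # output length at this rate
    idx = np.linspace(0, len(buffer) - 1, n_out)
    return np.interp(idx, np.arange(len(buffer)), buffer)

clip = np.sin(np.linspace(0, 20 * np.pi, 1000))  # stand-in for a spoken phrase
fast = varispeed(clip, tilt=1.0)                 # double speed, chipmunk-like
slow = varispeed(clip, tilt=-1.0)                # half speed, drawn out
```

Triggering different clips from the number pad while tilting the phone gives exactly the kind of pitched, rhythmic interplay the piece describes.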
Figure 6. Mobile phone music is on the march.

5. CONCLUSION

MoPhO is a mobile phone orchestra that uses programmable commodity mobile phones as its primary means of musical expression. Their computational power allows rich sound synthesis to be performed on the phones on-the-fly, and they offer a diverse set of ways to interact: via hand motion detected from accelerometer data, key input, the built-in camera as a vision-based sensor, and the microphone. The technology is stable enough that one can form a well-defined ensemble and create a persistent repertoire.

In many ways, we see the development of a mobile phone ensemble as a parallel to the emergence of laptop orchestras. Mobile phones, like laptops, form a technological basis that can serve as instruments of new music performance, where the engagement with the programmable device itself constitutes the instrument, fuses the teaching of technology and art, and allows new forms of ensemble expression. Some properties of the mobile phone orchestra are rather distinct from laptop ensembles. The phones are rather easy to transport and set up. Mobile phone performances can easily be moved, performed on-the-go, or spontaneously kick-started. The typical power of the speakers of these devices does allow for a chamber-music quality of performance: strong enough for adequately quiet spaces while preserving the intimate instrumental qualities of these devices.

This ensemble is still in its infancy. The first concert in January 2008 provided credence to the claim that the technology is mature enough to sustain the concept of the ensemble. But there are still many pieces missing. Unlike laptops, there is very limited sound synthesis software available for mobile phones, and there is as yet no user interface that would allow non-programmers to easily set up their own compositions.
There are many open questions about how best to make sensor data mapping, gesture recognition, and sound synthesis available to a non-technical performer. Part of future development will have to be the extension of current software in this direction. On the artistic side, these are of course only the first few steps within this setting. The complexity of pieces is quite open-ended, as location, interconnection, and the mapping of gestures to musical sound can all diversely contribute to mobile phone ensemble play. We also look forward to exploring performances with other instruments, whether acoustic, electric, or otherwise. We believe that this is only the beginning for mobile phone orchestras, and we excitedly look forward to diverse developments of this new emerging medium globally.

6. ACKNOWLEDGEMENTS

This project was possible thanks to the support and enthusiasm of Jyri Huopaniemi of Nokia Research Center, Palo Alto, and thanks to Nokia for providing a sizeable number of mobile phones. Jarno Seppinen provided invaluable input into mobile phone programming during a workshop taught at CCRMA in November 2007. Many thanks to Chryssie Nanou, Artistic Coordinator of CCRMA, for guidance and support throughout the project and for setting up the first concert at CCRMA. Many thanks to Brett Ascarelli and Yungshen Hsiao for documentation in the form of pictures and video footage of rehearsals and concerts, and to Rob Hamilton for his excellent support. Last but never least, hearty thanks to all the great MoPhO performers and co-composers: Steinunn Arnardottir, Mark Branscom, Nick Bryan, Jeff Cooper, Lawrence Fyfe, Gina Gu, Ethan Hartman, Turner Kirk, Adnan Marquez-Borbon, Jonathan Middleton, Diana Siwiak, Kyle Spratt, and Chris Warren.
Figure 7. The original Mobile Phone Orchestra members (left to right): Henri, Georg, Steinunn, Nick, Ge, Diana, Chris, Gina, Ethan, Adnan, Turner, Kyle.

7. REFERENCES

[1] A. Anthony. Harmonograph: A Visual Guide to the Mathematics of Music. Walker & Company, NY, USA, 2003.

[2] F. Behrendt. Handymusik. Klangkunst und 'mobile devices'. Epos, 2005. Available online at: www.epos.uos.de/music/templates/buch.php?id=57.

[3] J. Chowning. The synthesis of complex audio spectra by means of frequency modulation. Journal of the Audio Engineering Society, 21(7):526-534, 1973.

[4] P. Cook and G. Scavone. The Synthesis ToolKit (STK). In Proceedings of the International Computer Music Conference, Beijing, 1999.

[5] G. Essl. Circle maps as simple oscillators for complex behavior: I. Basics. In Proceedings of the International Computer Music Conference (ICMC), New Orleans, USA, November 2006.

[6] G. Essl. Circle maps as simple oscillators for complex behavior: II. Experiments. In Proceedings of the International Conference on Digital Audio Effects (DAFx), Montreal, Canada, September 18-20, 2006.

[7] G. Essl and M. Rohs. Mobile STK for Symbian OS. In Proceedings of the International Computer Music Conference, pages 278-281, New Orleans, Nov. 2006.

[8] G. Essl and M. Rohs. ShaMus - A Sensor-Based Integrated Mobile Phone Instrument. In Proceedings of the International Computer Music Conference (ICMC), Copenhagen, 2007.

[9] R. Fiebrink, G. Wang, and P. R. Cook. Don't forget the laptop: Using native input capabilities for expressive musical control. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), pages 164-167, New York, NY, 2007.

[10] L. Gaye, L. E. Holmquist, F. Behrendt, and A. Tanaka. Mobile music technology: Report on an emerging community. In NIME '06: Proceedings of the 2006 Conference on New Interfaces for Musical Expression, pages 22-25, June 2006.

[11] L. Gaye, R. Mazé, and L. E. Holmquist. Sonic City: The Urban Environment as a Musical Interface. In Proceedings of the International Conference on New Interfaces for Musical Expression, Montreal, Canada, 2003.

[12] G. Geiger. PDa: Real Time Signal Processing and Sound Generation on Handheld Devices. In Proceedings of the International Computer Music Conference, Singapore, 2003.

[13] G. Geiger. Using the Touch Screen as a Controller for Portable Computer Music Instruments. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), Paris, France, 2006.

[14] G. Levin. Dialtones - a telesymphony. www.flong.com/telesymphony, Sept. 2, 2001. Retrieved on April 1, 2007.

[15] J. Middleton and D. Dowd. Web-based algorithmic composition from extramusical resources. Journal of the International Society for the Arts, Sciences and Technology, 41(2), accepted for publication in 2008. URL: http://musicalgorithms.ewu.edu/. Last visited: 14-01-2008.

[16] M. Rohs, G. Essl, and M. Roth. CaMus: Live Music Performance using Camera Phones and Visual Grid Tracking. In Proceedings of the 6th International Conference on New Interfaces for Musical Expression (NIME), pages 31-36, June 2006.

[17] G. Schiemer and M. Havryliv. Pocket Gamelan: Tuneable trajectories for flying sources in Mandala 3 and Mandala 4. In NIME '06: Proceedings of the 2006 Conference on New Interfaces for Musical Expression, pages 37-42, June 2006.

[18] S. Smallwood, D. Trueman, P. R. Cook, and G. Wang. Composing for laptop orchestra. Computer Music Journal, 32(1):9-25, 2008.

[19] S. Strachan, P. Eslambolchilar, R. Murray-Smith, S. Hughes, and S. O'Modhrain. GpsTunes: Controlling Navigation via Audio Feedback. In Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, Salzburg, Austria, September 19-22, 2005.

[20] A. Tanaka. Mobile Music Making. In NIME '04: Proceedings of the 2004 Conference on New Interfaces for Musical Expression, pages 154-156, June 2004.

[21] A. Tanaka and P. Gemeinboeck. A framework for spatial interaction in locative media. In NIME '06: Proceedings of the 2006 Conference on New Interfaces for Musical Expression, pages 26-30, June 2006.

[22] A. Tanaka and P. Gemeinboeck. net_derive. Project web page, 2006.

[23] A. Tanaka, G. Valadon, and C. Berger. Social Mobile Music Navigation using the Compass. In Proceedings of the International Mobile Music Workshop, Amsterdam, May 6-8, 2007.

[24] D. Trueman. Why a laptop orchestra? Organised Sound, 12(2):171-179, 2007.

[25] G. Wang, D. Trueman, S. Smallwood, and P. R. Cook. The laptop orchestra as classroom. Computer Music Journal, 32(1):26-37, 2008.