TACTILE COMPOSITION SYSTEMS FOR COLLABORATIVE FREE SOUND

Dan Livingstone
Computer Music Research
School of Computing, Communications and Electronics, University of Plymouth, Drake Circus, Plymouth PL4 8AA, United Kingdom
dlivingstone@plymouth.ac.uk

Chris O'Shea
Digital Media Artist
School of Computing, Communications and Electronics, University of Plymouth, Drake Circus, Plymouth PL4 8AA, United Kingdom

ABSTRACT

Numerous innovative controllers and collaborative tactile interfaces have been developed for social interaction with sound. This evolving field of interaction design has produced a wide range of compositional models that increasingly mirror the open source methodologies of the systems' creators. The authors consider the software integration of such systems and propose a potential model for free sound composition. We speculate on how these integrative approaches are leading to new compositional frameworks for distributed composition, and provide an overview of how an open source development approach influences the structure, interaction design and compositional output of such systems. The range of related work in this field is considerable; selected examples are considered in terms of interaction models and compositional approaches that offer a free sound or open source model for social collaboration, with the potential for distributed composition.

1. INTRODUCTION

Our discussion focuses on the potential of collaborative tactile interfaces to extend the notion of free sound composition. This is not an exhaustive or comparative survey; instead we have chosen to focus on a small range of tactile or tangible interfaces, each of which offers a different interaction framework for participants to explore. Several of these examples are well documented by the original authors; others are lesser-known systems that offer complementary approaches. In most cases the designers of these systems had specific audiences or interaction methods as a design objective.
We summarise the core features of each and provide a brief analysis of how each interaction model can contribute to a wider knowledge of interaction design for tactile or tangible collaborative composition systems.

2. EXAMPLE SYSTEMS

Each system has a tangible interaction model. Soundgarten [7] offers a toy-like, collective, floor-based building process, combining elements of a single object to enable children to record, modify and arrange samples. ISS Cube [5] functions as a collaborative table-top spatial mixing surface for up to four participants using simple movement of tactile objects; this builds on a more strategic model of play and exchange, with intuitive interaction based on the relative movement and location of small discs, reminiscent of many board games and intuitive to use through collaborative 'positioning' of predefined sound samples. Audiopad [4] is a well-documented work in the audiovisual tactile mixer field, offering real-time visual feedback in addition to a tangible control interface with tactile elements. The fourth example, RGB Player [1], allows manipulation of sequence and pattern through either collaborative or turn-based placing and removal of coloured objects. Block Jam [3] combines an element of building or assembly to construct a sequence, with pattern variation and control of audio flow. A different approach can be seen in the design of ReacTable [2]: in this system, tangible objects, textural qualities, topological markers and simple gestures are combined to trigger or represent different types of synthesis. Each example is intended for a different type of social interaction, for example group discovery, individual or turn-based interaction, collective play and collaboration.
3. INTERACTION MODELS

These examples have interaction models and functionality that can be broadly categorised as Exploratory, Organisational, Sequential and Relational. It is interesting to note that the example aimed at the youngest audience naturally offers personalisation of the soundscape through live sampling, whereas the potentially most compositionally experimental work uses visual metaphor to indicate sound synthesis processes.

3.1. Exploratory model

Soundgarten is "a tangible interface that enables children to record, modify and arrange sound samples in a playful way" [7]. The project is aimed at 4 to 6 year olds, with the objective of supporting early musical education with pre-school children. The interface for Soundgarten resembles a children's toy, where the surface of the garden is the performance stage. The garden has 19 plug holes that allow sound samples to be triggered by plugging in a mushroom-shaped object. The three vertical levels of the garden control the volume of each sample. A microphone, called a shuffle in Soundgarten,
enables a child to record sounds in their environment; by plugging a mushroom object into the shuffle, the recorded sound can then be plugged into the garden. As the microphone is wireless it allows the children to roam around to find interesting sounds to record, rather than being confined to sitting around the garden itself, which would inevitably lead to the recording of sounds already being produced by it. As well as the ability to record sounds, Soundgarten is loaded with a set of predefined sample banks. Each sample mushroom carries an icon on its top to indicate the sound produced. The colour of the icon also denotes the type of sound: blue for environmental sounds, such as wind blowing, an alarm clock or a dog barking, and brown for instruments, such as drums or violin. Soundgarten also enables a set of effects on the sounds associated with each mushroom. Filters such as echo, resonance, reverse playback and pitch shifting up or down can be applied to a sample via attribute objects. These attributes resemble a flower petal or leaf and can be plugged into the top of a mushroom; adding more than one attribute will combine the effects. The designs of the tactile attribute objects in Soundgarten do not seem to correspond to their effect on the sounds, such as echo or pitch increase, but in this case that is not necessarily a problem: given the target audience, a child of this age would not grasp the technical workings of those effects, but would simply need to remember what each attribute object did. Soundgarten aspires to be extendable: "Like Lego, Fisher Technique and other constructive toy systems SOUNDGARTEN provides an open system, which can be expanded indefinitely" [8].
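The attribute-stacking behaviour described above can be sketched as a composable processing chain, where each attribute plugged onto a mushroom adds one more effect. This is an illustrative sketch only; the class and method names are hypothetical, not taken from the actual Soundgarten implementation.

```python
# Hypothetical sketch of Soundgarten-style attribute stacking: each
# attribute object plugged on top of a mushroom adds its effect, so
# several attributes combine rather than replace one another.
# All names are illustrative, not from the real Soundgarten code.

class Mushroom:
    """A tangible sample carrier: holds a sample and stacked attributes."""

    def __init__(self, sample_name):
        self.sample_name = sample_name
        self.attributes = []          # attribute objects plugged on top

    def plug(self, attribute):
        """Plugging an attribute on top of the mushroom adds its effect."""
        self.attributes.append(attribute)
        return self                   # allow chained plugging

    def describe_chain(self):
        """The effective processing chain, in plug order."""
        return [self.sample_name] + list(self.attributes)

slot = Mushroom("dog_bark")
slot.plug("echo").plug("pitch_up")    # effects combine, not exclude
print(slot.describe_chain())          # ['dog_bark', 'echo', 'pitch_up']
```

The point of the sketch is the design choice the paper notes: attributes accumulate, so a child can stack a petal and a leaf on one mushroom and hear both effects at once.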
This may be the case with the production of new sound samples, tangible objects and perhaps a larger playing surface, but the system will not achieve a free open play environment such as Lego offers, due to the structure of the plug holes and the objects themselves. The ability to combine sounds by plugging mushrooms on top of each other would make the system more open, enabling detailed gardens to be built by the children.

3.2. Organisational model

The Interactive Surround Sound (ISS) Cube is a surround sound mixer that allows users to spatially position a sound using tactile objects. The aim of this project is that "users of the system can easily change their mood by recreating their spatial sound scenery. For example, nature sounds can be positioned within the space to create a calm and natural environment" [5]. ISS Cube has four coloured pucks, called carriers, that allow the users to select a predefined sound sample by moving a carrier to the edge of the surface, where a selection menu appears. Once a sample is selected, moving the carrier across the surface spatially positions the sound within a four-speaker set-up. Each corner of the surface represents one of the speakers, so the sounds pan between the speakers based on the position of the carrier relative to each corner. A second type of tactile object, a smaller white puck, controls the volume of each sample: the closer a sample carrier is to the volume puck, the louder it becomes within the space. The focus of ISS Cube is to allow collaborative mixing of sounds in a space; "due to the multiple input devices, the square tabletop display, which enables equal access from all sides, invites collaborative interaction" [5]. As there are only four carrier objects to control samples, only four people can collaborate on the positioning of sounds at a time, with space around the table for spectators.

3.2.1. Audiopad

Audiopad is a tactile interface for musical performance.
The initial aim was to increase the stage presence of laptop-style performers. Audiopad is essentially a mixer, allowing performers to trigger sound samples and control volume and various effects on those samples. Interaction with Audiopad is via a series of pucks, each with a different function. Sample pucks carry sound sample banks: by moving a sample puck over an area of the interface a performer can select a group of samples from the graphical menu, and a selector puck placed near a sample puck brings up a graphical tree menu for choosing a sample. A microphone puck controls the volume of a sample based on the distance between the two. A projected graphical interface provides instant feedback to the performers: graphics are placed over the position of each puck, providing local details about the selected sample, volume, on/off state and applied effects. "Our exploration suggests that this seamless coupling of physical input and graphical output can yield a musical interface that has great flexibility and expressive control" [4]. The level of sound control in Audiopad is based around selecting predefined samples, altering their volume and applying effects. Samples are held in Ableton Live, with control parameters passed to it via MIDI by the tracking interface. Effect filters, such as delay or low pass, are assigned to different groups of samples, so a performer is not free to add every effect to each sample; however, this focused approach leads to a level of intuitive interaction which is highly accessible.

3.3. Sequential model

RGB Player began as a "dynamic physical interface that would allow any everyday object to become a device of interaction" [1]. Through the artist's own interest in creating sound from visuals, RGB Player was an exploration of the reverse of this process. The main compositional feature of RGB Player is the ability to create a patterned sequence by placing objects in a line around the disc.
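The pattern idea can be pictured as a physical loop sequencer: each object occupies a position around the disc, and every revolution replays the objects in order. A minimal sketch, with hypothetical names and angular positions standing in for physical placement (this is not the installation's code):

```python
# Illustrative sketch of RGB Player's pattern idea: objects arranged
# around a rotating disc are triggered in angular order, and the whole
# sequence repeats with every revolution of the disc.

def revolution(placements):
    """Return the trigger order for one revolution of the disc.

    placements: list of (angle_degrees, sample_name) pairs, one per
    object sitting on the disc, in any order.
    """
    # Sorting by angle gives the order in which objects cross the scanner.
    return [name for angle, name in sorted(placements)]

placements = [(270, "piano"), (30, "drum"), (180, "guitar")]
print(revolution(placements))   # ['drum', 'guitar', 'piano']
```

Each pass over the sorted placements corresponds to one full turn of the disc, which is why adding or removing an object immediately changes the looping pattern.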
Drum sequences that increase and decrease can be built up, mixed with fast repeating guitar samples and piano notes, for example. The interface for RGB Player consists of a rotating glass disc with a slit in the surface that scans any objects passing over it. Beside it stands a variety of small colourful children's toys that, when placed on the rotating disc, trigger sound samples as they pass over the scanner. Toys and objects placed onto RGB Player are scanned by an internal webcam, which translates their RGB values into one of six sound samples, from bass to drums and piano, depending on their nearest colours. The distance of the object from the centre of the instrument determines the pitch of the sample played, with a smaller distance producing a higher pitch. The rotating disc in RGB Player serves as a good metaphor for the loop of a sound sample: with each full cycle the composition returns to the starting point to begin the sequence again. The only downside is the inability to stop the disc from rotating, so one eventually becomes dizzy following objects around while trying to generate a pattern.

3.3.1. Block Jam

Block Jam is a musical sequencer that allows players to control the order of sound samples using a series of connected tangible blocks. "Block Jam is not a musical instrument; it is an alternative means of controlling a sequencer. It has no means of continuous control or gesture" [3]. The aim of Block Jam is to create an accessible collaborative musical interface. The player interacts with Block Jam via 26 physical blocks. Each block provides visual feedback via an LED matrix, a push button and a rotating dial-style input. Players initially start with a play block, to which sample blocks can be connected; by placing them side by side they lock into place. The visual feedback on Block Jam displays the state of each block, which indicates the direction of play in the sequence, such as straight, corner (change direction) or gate (rotate direction).
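The block states described above suggest a simple walk over the connected blocks, with each block's state bending the path of playback. The following is a hedged sketch of that idea, not Block Jam's implementation; gate blocks, which rotate the direction of play, are omitted for brevity.

```python
# Hypothetical sketch of a Block Jam style sequence walk: playback steps
# from block to block on a grid, and each block's state steers the path.
# "straight" keeps the current heading; "corner" turns it clockwise.
# Names and grid layout are illustrative only.

DIRS = [(1, 0), (0, 1), (-1, 0), (0, -1)]   # east, south, west, north

def walk(blocks, start, heading=0, max_steps=8):
    """Visit connected blocks from `start`, honouring each block's state.

    blocks: dict mapping (x, y) -> "straight" or "corner".
    Returns the ordered list of visited block positions.
    """
    path, pos = [], start
    for _ in range(max_steps):
        if pos not in blocks:            # ran off the connected assembly
            break
        path.append(pos)
        if blocks[pos] == "corner":
            heading = (heading + 1) % 4  # change direction clockwise
        dx, dy = DIRS[heading]
        pos = (pos[0] + dx, pos[1] + dy)
    return path

layout = {(0, 0): "straight", (1, 0): "corner", (1, 1): "straight"}
print(walk(layout, (0, 0)))   # [(0, 0), (1, 0), (1, 1)]
```

Reassembling the blocks, in this picture, simply rewrites the dictionary, which is why physically rearranging the tangible blocks reorders the sequence.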
The player can select one of three sound sample banks for each block by rotating a finger on the dial interface, with each sample bank containing five sounds. The colour displayed on the block indicates which sample bank is currently active for that block (red, orange or green). It is unclear why these colours were chosen: they match those of traffic signals, which would suggest stop or go actions, but that is not the case here. The speed of the musical sequence in Block Jam is determined by the length of time the player holds the button on the play block before releasing it. Each sample has three variations to match the three possible speeds of play, as opposed to simply speeding up or slowing down the playback rate of one sample.

3.4. Relational model

ReacTable is an instrument for collaborative performance. At the time of writing the system leads the field in the design of tangible objects in relation to the sounds generated. Haptic encoding such as object shape, surface texture and colour has been explored. Surface texture gives users an indication of the timbral properties of a sound: "Noise generators have a completely irregular texture and different types of sanding paper can represent a granular synthesizer" [2]. In earlier versions it is unclear whether surface texture communicates effectively as a method of identification, as a performer would have to at least understand the terminology and process behind each sound type, such as a saw-tooth generator. The ReacTable development team has also experimented with surface materials, such as plastic for synthetic sounds and wood for organic sounds. The team has recently been exploring the use of topological markers to indicate the relationship between object and interaction; in addition, these markers are being used to refine the camera tracking methods for object identification, orientation and relative position.
4. A FREE SOUND MODEL

Each model discussed has core elements that help to define a free sound integrative model. From the Exploratory model, the process of building, reconfiguring and live sampling by participants provides an open, inclusive form of interaction. The Organisational model shows that conventional control mechanisms can be far more intuitive using tactile objects supported by visuals that reinforce interaction and functionality in a combined perceptual interface. The Sequential model offers a tactile method of assembling scored elements with pattern variation: a reconfigurable linear process. The Relational model establishes a potentially more direct kinaesthetic linkage between objects, textures and sound properties, a form of haptic encoding. The distributed model allows for virtual interaction within a shared compositional online space, where participants create spatial and visual relationships while exploring a range of sound juxtapositions that can be added to through file upload and exchange. A free sound approach to the creation of collaborative tactile composition can be described as one which integrates key features of all of the above elements. Common features in this type of system would include not only interaction to influence the spatialisation of predefined samples, but the ability to add new source material through file upload, live sampling or real-time synthesis. Sequencing and flow of sound elements over a
distributed composition also allows different compositional elements to be modified either simultaneously or in direct response to the interaction of other users. This also suggests that an evolutionary or algorithmic approach to generating new composition from shared elements would extend the open nature of such works.

4.1. A Free Sound Approach

To establish this model, a framework needs to be in place for integration between programming software, sound applications [6], visual output and tracking systems. We have implemented one such approach to illustrate an integrative methodology; more recent development of the Sonicforms platform has been motivated by further use of open source libraries, currently in development.

4.2. Sonicforms

We introduce Sonicforms [8], an online open source research platform that promotes a free sound approach to the development of new audio or visual works, mediated through tangible tactile interfaces. The structure of the project allows a range of open source software to be used in the creation of the visual and audio output, whilst the tracking system sends information about each tactile object to the software. Using TEMP, a communication gateway server, messages can be relayed between programs over protocols such as UDP and TCP. The project is open to contribution, enabling other audiovisual artists and developers to implement audiovisual works for tactile collaborative interaction.

5. CONCLUSIONS

In order to implement a successful tactile interface for collaborative composition, it is extremely useful to consider the interaction models evident in related systems. Each system discussed has received highly favourable responses from audiences and participants. These systems function effectively and have been professionally implemented.
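The gateway approach behind the TEMP server described above, in which a tracking system, audio software and visual software exchange messages through a central relay, can be sketched in miniature. The sketch below models only the routing logic; real transport (UDP or TCP sockets, OSC-style message encoding) is omitted, and all names are illustrative assumptions rather than the Sonicforms implementation.

```python
# Hedged sketch of a communication gateway between programs: each
# program registers with the gateway, and any message sent to it is
# relayed to every other registered program. In practice this routing
# would sit on UDP/TCP sockets; here each client just gets an inbox.

class Gateway:
    def __init__(self):
        self.clients = {}                 # program name -> inbox (list)

    def register(self, name):
        """Add a program so it can send and receive messages."""
        self.clients[name] = []

    def send(self, sender, message):
        """Relay `message` from `sender` to all other registered programs."""
        for name, inbox in self.clients.items():
            if name != sender:
                inbox.append((sender, message))

gw = Gateway()
for program in ("tracker", "audio", "visuals"):
    gw.register(program)

# The tracking system reports a tactile object's position; both the
# audio and visual programs receive it, the tracker does not.
gw.send("tracker", {"object": 3, "x": 0.42, "y": 0.17})
print(gw.clients["audio"])    # [('tracker', {'object': 3, 'x': 0.42, 'y': 0.17})]
```

The design point is that neither the audio nor the visual software needs to know about the tracking hardware directly; each only speaks to the gateway, which is what lets different open source tools be swapped in on either side.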
Each example discussed has gone through a process of refinement in terms of interface, interaction mode and sound control; in several cases these are long-term projects supported by an integrated team of researchers and practitioners. When considering a 'free sound' or adaptive compositional approach to tactile interaction, the underpinning technologies can determine the methods available; in some cases the limitations of a specific software-based approach may define the resulting compositional parameters of such systems. By considering the interaction models embedded within each system we have been able to draw on this 'best practice' to identify the core elements of an adaptive or 'free sound' approach. We have also considered the potential software limitations and provided an example system that utilises an integrative software approach, including a summary of software integration to extend the interaction, composition and broadcast potential of tactile compositional environments for collaborative composition. In conclusion, the open source community continues to provide versatile tools, extensions, externals and libraries that support and extend a broad range of approaches; identifying interaction models within tangible collaborative music systems is a very useful methodology for identifying the most effective development route.

6. REFERENCES

[1] Barter, T. RGB Player, exhibited at next2004 (Denmark) and the Royal College of Art Degree Show 2004 (extract from interview by the author).

[2] Kaltenbrunner, M., O'Modhrain, S. and Costanza, E. "Object Design Considerations for Tangible Musical Interfaces", Proceedings of the COST287-ConGAS Symposium on Gesture Interfaces for Multimedia Systems, Leeds (UK), 2003.

[3] Newton-Dunn, H., Nakano, H. and Gibson, J.
"Block Jam: A Tangible Interface for Interactive Music", Proceedings of the Conference on New Interfaces for Musical Expression (NIME-03), Montreal, Canada, 2003.

[4] Patten, J., Recht, B. and Ishii, H. "Audiopad: A Tag-based Interface for Musical Performance", Proceedings of the Conference on New Interfaces for Musical Expression (NIME-02), Dublin, Ireland, 2002.

[5] Quarta, M. ISS Cube, exhibited at Ars Electronica, Cybersonica and the BAFTA Interactive Festival, 2003.

[6] Wright, M., Freed, A. and Momeni, A. "OpenSoundControl: State of the Art 2003", Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME-03), Montreal, Canada, 2003.

[7] Woolf, M. Soundgarten, documentation of exhibited work (extract from interview by the author).

[8] Sonicforms, open source research platform.