An Application of The System, "BodySuit" and "RoboticMusic" - Its Introduction and Aesthetics

Suguru Goto
IRCAM
1, place Igor Stravinsky
75004 Paris, France
Suguru.GotoC@ircam.fr
http://suguru.goto.free.fr

1 Introduction

The system introduced in this demonstration contains both a gesture controller and automated mechanical instruments. In this system, the Data Suit, "BodySuit," controls the percussion robots of "RoboticMusic" in real time. "BodySuit" does not use a hand-held controller; a performer, for example a dancer, wears the suit, and sensors transform the performer's gestures into electronic signals. "RoboticMusic" consists of 5 robots that play different sorts of percussion instruments, and their movements are modeled on the gestures of a percussionist. The idea behind combining "BodySuit" and "RoboticMusic" is that a human body is augmented by electronic signals so that it can play musical instruments interactively. The system was originally conceived for an art project, to realize a performance/musical theater composition. This demonstration introduces the system as well as its possibilities, drawing on my experiences with it in an artistic application.

2 General Description

"BodySuit" was first created by an electronic engineer, Patrice Pierrot, in 1997. Although it was originally conceived to work with "RoboticMusic," it had to wait many years until "RoboticMusic" was ready. Meanwhile, many possibilities of "BodySuit" were explored; for instance, it was used experimentally to control computer-generated sounds and video images (Fig. 1).

Fig. 1: BodySuit can also control sounds and video images in real time.

"RoboticMusic" was created in 2003 (Fig. 2). The original concept and design were mine, and the robots were realized by a humanoid-robot specialist, Fuminori Yamazaki, of the iXs Research Corporation in Japan.
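The sensor-to-signal step described above can be sketched as follows. This is a minimal illustration only: the paper does not specify the BodySuit's sensor hardware, value ranges, or transmission protocol, so the function names, the 10-bit ADC range, and the MIDI-style scaling are all assumptions.

```python
# Hypothetical sketch: turning a raw BodySuit sensor reading (e.g. from a
# bend sensor on a joint) into a normalized control signal. The 0-1023
# range assumes a 10-bit ADC; the real system's ranges are not documented.

def normalize_sensor(raw, raw_min=0, raw_max=1023):
    """Clamp a raw reading and map it to the range 0.0-1.0."""
    raw = max(raw_min, min(raw, raw_max))
    return (raw - raw_min) / (raw_max - raw_min)

def to_control_value(normalized):
    """Scale a normalized signal to a 7-bit, MIDI-style value (0-127)."""
    return int(round(normalized * 127))
```

A downstream mapping layer would then interpret these control values as triggers or continuous parameters for the robots.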
The project is still a work in progress; the goal is eventually to form a robot orchestra.

Fig. 2: There is a special sort of spring in the arm of the robot. At the end of this, it holds a mallet.

A gesture of a performer wearing "BodySuit" is translated into gestures of "RoboticMusic." Instead of producing a computer-generated sound, one can interactively produce an acoustic percussion sound. One of the important elements is the relationship and the communication method explored within this system. One may consider "BodySuit" and "RoboticMusic" as a relationship between a conductor and an orchestra, in which dance-like gestures merely trigger the instruments; in other words, this is an instrument that relies on physical gestures. Another point is the method of translation used by the computer. Signals from "BodySuit" are transformed by a mapping interface and an algorithm in a computer, and are then sent to "RoboticMusic." One gesture may trigger a single attack on a single instrument, but it is also possible to trigger all 5 instruments at the same time. Alternatively, complex musical data that is automatically generated by a computer and reproduced by "RoboticMusic" can be altered by gestures with "BodySuit," which modify the parameters of the algorithm in real time.
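The three mapping strategies just described (one gesture triggering one attack, one gesture triggering all 5 robots, and gestures modifying the parameters of a generative algorithm) can be sketched as a single dispatch function. The event format, parameter names, and robot indexing below are illustrative assumptions, not the system's actual implementation.

```python
# Hypothetical sketch of the mapping layer: a gesture event either
# strikes one robot, strikes all five at once, or alters a parameter
# of a running generative algorithm in real time.

NUM_ROBOTS = 5

def map_gesture(event, algorithm_params):
    """Return the list of robot indices to strike; may update params in place."""
    if event["type"] == "strike":
        # one gesture -> one attack on one instrument
        return [event["robot"]]
    if event["type"] == "tutti":
        # one gesture -> all 5 instruments at the same time
        return list(range(NUM_ROBOTS))
    if event["type"] == "modulate":
        # gesture modifies a parameter of the generative algorithm;
        # no direct attack is triggered
        algorithm_params[event["param"]] = event["value"]
    return []
```

In practice such a function would sit between the sensor input and the robots' control interface, with the "modulate" branch steering whatever generative process feeds "RoboticMusic" its note data.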