IMP-CAM: Improvising with Cellular Automata Music

Wong Kam Wah
School of Creative Media
City University of Hong Kong

Abstract

In this paper we describe a system called IMP-CAM, which supports interaction between dance movement and cellular-automata-generated music. This system is the major component of one of our on-going projects. In this project, we address three major issues: (i) how the control parameters of a cellular automaton affect the quality of the music generated; (ii) how the features of dance movements can be mapped to those control parameters, so that the movements affect the music generation process; and (iii) how to develop a system that captures a dancer's movement and controls the music generated by cellular automata, in real time. Since a dancer's movement is usually affected by the music he or she perceives, we believe that our system defines a new media form: a two-way communication between a live dancer and cellular automata music.

1 Introduction

Cellular automata have been used to model a wide range of scientific phenomena (Codd 1968). A typical 2D cellular automaton consists of a two-dimensional grid of cells, where each cell has a value called its state, usually in the range 0 to N. At each frame, the cells' state values change simultaneously according to a pre-defined rule that determines the new value for each cell. In most cases the rule is local, meaning that it depends only on the value of the cell and its local neighbors. The states of all cells together form the state of the cellular automaton, and the change of states over time gives the cellular automaton its evolution. Figure 1 shows a typical evolution of a 2D cellular automaton over time. The left image shows a (randomly generated) initial state, and the second and third images show later states of the same cellular automaton.
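The synchronous update described above can be sketched as follows. This is a minimal illustration, not the actual IMP-CAM code; for simplicity the rule here sees only the cell's own state and the count of non-zero neighbors (8-neighborhood, with edges wrapping around), which is enough to express Life-family rules.

```cpp
#include <functional>
#include <vector>

// A 2D grid of cell states (0..N-1) and a local update rule that maps
// (current state, number of non-zero neighbors) to the new state.
using Grid = std::vector<std::vector<int>>;
using Rule = std::function<int(int state, int liveNeighbours)>;

// One synchronous step: every cell's new value is computed from the
// current grid before any cell is overwritten.
Grid step(const Grid& grid, const Rule& rule) {
    const int h = (int)grid.size(), w = (int)grid[0].size();
    Grid next(h, std::vector<int>(w, 0));
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            int live = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    if (dy == 0 && dx == 0) continue;
                    // Wrap coordinates at the grid edges (toroidal topology).
                    int ny = (y + dy + h) % h, nx = (x + dx + w) % w;
                    if (grid[ny][nx] != 0) ++live;
                }
            next[y][x] = rule(grid[y][x], live);
        }
    }
    return next;
}
```

With Conway's Life rule plugged in, a horizontal "blinker" of three live cells flips to a vertical one after a single step, which is a convenient sanity check for the update logic.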
Cellular automata have also been used to generate music automatically (Bilotta, Pantano, and Talarico 2000b; Bilotta, Pantano, and Talarico 2000a; Miranda 2001; Miranda 1993).

Figure 1: A typical evolution of a 2D cellular automaton.

Cellular Automata Music creates music by mapping the state of a cellular automaton to musical features such as pitch, volume, and duration. As the state of the cellular automaton changes over time, a music piece is formed. There are many different ways to define this musification mapping. However, once the rule of the cellular automaton and the mapping are defined and remain unchanged, the initial state determines the final music piece that will be generated. In other words, the whole process is deterministic. As a result, music pieces generated by a single musification method tend to be similar to each other. We believe that by adding another level of interactivity on the input parameters of the cellular automaton, we can gain more control over the quality of the music generated.

On the other hand, many research works have studied the expressive power of gesture and movement (in particular, dance movement). For example, MIRALab, the InfoMus Lab and the MEGA project have done excellent research on this topic. However, few of them investigate the interactivity between movement and music. Winkler (1995) has done some initial work in this direction, and his work motivates our idea.

Proceedings ICMC 2004

2 Our approach

In particular, we are interested in adding a level of interactive control over the input parameters of a cellular automaton, driven by a live dancer's improvisation. Improvisation, in dance terminology, means that a dancer listens to a piece of music and dances according to the instant feeling, without any planned movement. This is usually a one-way process: the music affects the dancer, but not vice versa. In our idea, if a dancer's movements can change the cellular automata music, and the generated music in turn influences the dancer's feeling, this forms an endless loop of communication between a live dancer and the cellular automata music. This eventually creates a new form of media art: a two-way communication between a live dancer and cellular automata music.

2.1 The IMP-CAM system

We have developed a system to test our preliminary idea. The system is capable of real-time control of music based on a dancer's movement. Our system contains two modules: a cellular automata module (the CA module) written in C++, and an interactive motion-analysis-and-music-generation module (the IMAAM module), developed using publicly available software called Eyesweb[4]. Eyesweb is a visual programming language similar to MAX/MSP, but it contains several motion analysis objects that MAX/MSP does not. The two modules are connected and communicate with each other using the OpenSoundControl[5] protocol, developed by CNMAT, UC Berkeley. Figure 2 shows a screen snapshot of our system when it is running. The right side of the screen shows the Eyesweb patch that we built (i.e. the IMAAM module). The upper left corner of the screen shows the CA module running. In the CA module we implemented a typical 2D cellular automaton. The initial cell states are generated randomly.
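For reference, an OpenSoundControl message frames a NUL-terminated address pattern and a type-tag string, each padded to a 4-byte boundary, followed by big-endian arguments. The sketch below encodes a message with a single int32 argument; the "/tempo" address is a hypothetical example, not necessarily the address used by IMP-CAM.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Append a string with its NUL terminator, padded with zero bytes
// to the next multiple of 4, as OSC 1.0 framing requires.
static void padString(std::vector<uint8_t>& out, const std::string& s) {
    out.insert(out.end(), s.begin(), s.end());
    out.push_back(0);
    while (out.size() % 4 != 0) out.push_back(0);
}

// Build an OSC message carrying one int32 argument. The type-tag
// string ",i" declares a single int32; the value is big-endian.
std::vector<uint8_t> oscInt32(const std::string& address, int32_t value) {
    std::vector<uint8_t> out;
    padString(out, address);
    padString(out, ",i");
    for (int shift = 24; shift >= 0; shift -= 8)
        out.push_back((uint8_t)((value >> shift) & 0xff));
    return out;
}
```

In practice such a buffer would be sent as a UDP datagram between the two modules; libraries such as CNMAT's OSC implementations handle this framing for you, and this sketch only shows what travels on the wire.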
We implemented different rules for the cellular automata, such as Conway's Life rule and the Cyclic rule[6]. In the current implementation, we follow the music generation method described on the Isle Ex website[7]. At each frame we count the number of non-zero cells and the number of cells whose state values have changed. These two numbers are used to generate the music. We are currently investigating other musification methods.

[4] http://www.eyesweb.org/
[5] http://www.cnmat.berkeley.edu/OpenSoundControl/
[6] http://psoup.math
[7] http://www.

Figure 2: A screen snapshot of the IMP-CAM system.

The CA module also allows changing the tempo of the generated music, as well as "erasing" some cells (i.e. setting their values to zero) interactively. Whenever the state of the cellular automaton changes, the generated music is affected accordingly. These two control parameters, i.e. changing the tempo and "erasing" cells, are connected to and controlled by the real-time movement analysis module (the IMAAM module), as we explain next.

In the IMAAM module, we capture a dancer's movement in real time and perform different motion analyses on the input video stream, using the Eyesweb software. In the current implementation, we perform a quantity-of-motion analysis, which allows us to measure the dancer's speed of movement. The speed of movement is fed back into the CA module to control the tempo of the generated music. We also find the dancer's center position on the stage. The center is fed back to the CA module to "erase" the corresponding cell location in the cellular automaton. As a result, the position of the dancer on the stage changes the state of the cellular automaton, and therefore changes the music generated. Figure 3 shows a conceptual diagram of our system.
Figure 3: Conceptual diagram of the IMP-CAM system. The CA module (written in C++) and the IMAAM module (built using Eyesweb) communicate through the OSC protocol; motion capture input drives the control parameters, and the cellular automata states drive the music output.
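The two feedback mappings from the IMAAM module to the CA module can be sketched as follows. The tempo range, the normalization of both signals to [0,1], and the linear scaling are our illustrative assumptions, not values taken from the actual system.

```cpp
#include <algorithm>
#include <utility>

// Map a normalised quantity-of-motion value in [0,1] linearly onto a
// tempo range in beats per minute. The 60..180 BPM range is an assumption.
int motionToTempo(double qom, int minBpm = 60, int maxBpm = 180) {
    qom = std::clamp(qom, 0.0, 1.0);
    return minBpm + (int)(qom * (maxBpm - minBpm));
}

// Map the dancer's centre on the stage, normalised to [0,1] x [0,1],
// to the (row, col) of the cell to "erase" in a rows x cols grid.
std::pair<int, int> centreToCell(double cx, double cy, int rows, int cols) {
    int col = std::min(cols - 1, (int)(std::clamp(cx, 0.0, 1.0) * cols));
    int row = std::min(rows - 1, (int)(std::clamp(cy, 0.0, 1.0) * rows));
    return {row, col};
}
```

Clamping both inputs keeps noisy motion-analysis values from producing out-of-range tempos or cell indices, which matters when the analysis runs on a live video stream.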

2.2 Other work in progress

This is an on-going project, and we are still in the development phase. Other work involved in this project includes:

How do the control parameters of a cellular automaton affect the quality of the music generated? Most research on Cellular Automata Music focuses on the musification mapping between cellular automata states and musical features. However, little of it examines the relation between the control parameters (such as the initial state, the size of the cellular automaton, the rules, the number of cell states, etc.) and the quality of the music generated. In this project we will focus on this issue.

How can dance movements be mapped to the control parameters of a cellular automaton? We believe that there is a strong relationship between dance movements and musical features. For example, a high-speed movement can usually be associated with a fast but noisy tone, and a jump of the dancer with a short but heavy pitch. These associations can be used to control the cellular automata music. In this version of IMP-CAM, we use only the speed and position of the dancer. We are investigating other possibilities.

Develop a system that can capture a dancer's movement in real time and create cellular automata music. The IMP-CAM system described in this paper is an initial prototype of our target system. We hope that our final system can be used in different applications, including stage performances, education, and interactive installations.

3 Conclusion

We described the IMP-CAM system, which supports interaction between dance movement and cellular-automata-generated music. This system is the major component of one of our on-going projects. Our system differs from other gesture-generated music systems in that its cellular automata module generates music continuously even when no live performer is involved.
When a performer (usually a dancer) is involved, the performer's movement can affect the generated music in some ways, but the CA module still follows its own rule to create music. From this point of view, we classify IMP-CAM as a system in which "a dancer interacts with artificial-intelligence-generated music".

4 Acknowledgement

The work described in this paper was fully supported by a grant from City University of Hong Kong (Project No. 9031005).

References

Bilotta, E., P. Pantano, and V. Talarico (2000a). Music generation through cellular automata: How to give life to strange creatures. In 3rd International Conference on Generative Art.

Bilotta, E., P. Pantano, and V. Talarico (2000b). Synthetic harmonies: An approach to musical semiosis by means of cellular automata. In Artificial Life VII.

Codd, E. F. (1968). Cellular Automata. Academic Press, London (UK).

Miranda, E. R. (1993). Cellular automata music: An interdisciplinary project. Interface 22(1).

Miranda, E. R. (2001). Evolving cellular automata music: From sound synthesis to composition. In Proceedings of the Workshop on Artificial Life Models for Musical Applications.

Winkler, T. (1995). Making motion musical: Gesture mapping strategies for interactive computer music. In Proceedings of the 1995 International Computer Music Conference.