A Toolkit for Interactive Digital Art
Haruhiro Katayose (1)(3), Hirotsugu Shirakabe (2), Tsutomu Kanamori (1), Seiji Inokuchi (1)(2)

(1) L.I.S.T., firstname.lastname@example.org, http://www.sys.wakayama-u.ac.jp/~katayose/
(2) Faculty of Engineering Science, Osaka University
(3) Faculty of Systems Engineering, Wakayama University

Abstract

Various kinds of authorware for multimedia content have recently become available. It is, however, difficult to find environments for creating interactive digital art that involves real-time, gesture-based media control. This paper describes a toolkit for writing interactive digital art with real-time, value-based control of a piece. The toolkit is composed of a compact gesture sensor called ATOM8, a set of templates, and a methodology called HIAT. The toolkit contributed much to improving productivity in our actual artistic activities.

1 Introduction

Multimedia has spread into daily life, and people have more and more opportunities to enjoy creating multimedia content [1]. Various kinds of authorware for such content are now available, yet it is difficult to find environments for creating interactive digital art that involves real-time, gesture-based media control. One essential point of interactive digital art is the design of a hardware configuration that embodies the artistic concept: the composer has to take pains to prepare gesture sensors and presentation equipment. The second point is implementation methodology. The more sensors and equipment an artist can use, the more freedom he gains in writing a piece; at the same time, technical problems arise, and handling heavy sensor-data traffic troubles not only artists but also experienced engineers. This paper describes a toolkit for interactive digital art designed to solve these difficulties.

2 System Overview
The system proposed here consists of the gesture sensor ATOM8 and a framework for writing a piece, called HIAT (Figure 1: System Overview).

2.1 ATOM8

ATOM8 (analog-to-MIDI converter) is a very small MIDI instrument with several analog inputs for many kinds of sensors (Figure 2, Figure 3). It accepts eight transducer signals, amplified to a standardized 0-5 V range, and is only about one cubic inch in size. ATOM8 can control sensors and generate MIDI signals directly. The user can choose a wireless or wired setup according to the artistic requirements, and can set the communication speed of the data output from ATOM8 in steps of 10 ms.

2.2 HIAT Overview

MAX is well known as a useful environment for interactive digital art, but the few constraints it places on programming sometimes make implementation difficult. In particular, when artists want to receive data from various sensors that output continuous values, the heavy data traffic can crash the system unless the program is carefully tuned. To avoid such trouble, the authors prepared a set of templates and a methodology, called HIAT, for writing interactive art with ATOM8 on MAX. The HIAT methodology regulates the usage of the templates but does not constrain the artistic realization. It supports blackboard-style scene transition, easy access to the sensor data, and appropriate control of inner data. The templates are the main template, scene template, sensor template, and monitor template, as shown in Figure 4 (HIAT Templates). We also prepared patches for easy access to presentation facilities such as sound effectors, computer graphics, video, and lighting. Figure 5 shows the main view of a program made with the main template.

Figure 2: ATOM8. Figure 3: Wireless ATOM8 attached to a performer. Figure 4: HIAT Templates.

3 Scene Templates and Methodology

This section shows the basic way to write a piece. The scene template consists of a module that checks the scene's own activation (patcher "activate"), a module for initialization processing (patcher "initialize"), and a module for media control (patcher "control"). The first step in making a simple piece is to open a scene template. When you open one, you are asked to give the scene a name; from then on, the scene is managed by that name. The next step is the concrete implementation of the scene. Place the patcher "sensor" in the scene patcher to enable ATOM8; then open the patcher "control" in the scene patcher and write the scene's function. The principal method is to connect a sensor handle to the presentation facilities, as shown on the left side of Figure 7.

To make a larger piece, simply repeat the above procedure to create plural scenes. Open the main template and place the prepared scenes in it, as shown in Figure 5 (A Sample of the Main Template). You do not have to place the patcher "sensor" in each scene, because the main template already contains one. The basic way to control scene activation is through a global handle: when the global handle holds the string corresponding to a scene's name and its flag is on, the scene is activated; when the flag is off, the scene is deactivated.
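The blackboard-style scene control described above can be summarized in a short sketch. The following Python is purely illustrative and is not part of HIAT (which is a set of MAX patches); all class and variable names here are our own assumptions. A global table maps scene names to on/off flags, each scene checks its own activation before running, and active scenes receive sensor values (ATOM8's 0-5 V inputs arrive as 0-127 MIDI values):

```python
# Illustrative sketch of blackboard-style scene control (names are hypothetical).
blackboard = {}  # global handle: scene name -> activation flag

class Scene:
    def __init__(self, name, control):
        self.name = name          # the scene is managed by this name
        self.control = control    # media-control function (cf. patcher "control")
        self.active = False

    def initialize(self):         # cf. patcher "initialize"
        pass                      # one-time setup when the scene turns on

    def step(self, sensor_values):
        # cf. patcher "activate": check the global handle for our own flag
        flag = blackboard.get(self.name, False)
        if flag and not self.active:
            self.active = True
            self.initialize()
        elif not flag:
            self.active = False
        if self.active:
            self.control(sensor_values)

# Two scenes running in parallel, each mapping a sensor channel to a facility.
log = []
scenes = [
    Scene("music", lambda v: log.append(("volume", v[0]))),
    Scene("light", lambda v: log.append(("dimmer", v[1]))),
]

blackboard["music"] = True        # activation by scene name + flag
blackboard["light"] = True
for values in ([64, 10], [72, 20]):   # successive ATOM8 readings (0-127)
    for s in scenes:
        s.step(values)

blackboard["light"] = False       # flag off -> the scene is deactivated
for s in scenes:
    s.step([80, 30])              # only "music" still responds
```

Turning a flag on or off in the table plays the role of the global handle in the main template, and several scenes may be active in parallel.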
You can run plural scenes in parallel, and you can write complicated conditions that combine sensor data to control a scene.

4 Using HIAT

Using this environment, we wrote "Tikukan no ucyu V", which realizes real-time control of audio and visual media by a shakuhachi player's gestures. Figure 8 shows one scene from "Tikukan no ucyu V", and Figure 5 is the main view of the piece as performed at ICMC'96 in Hong Kong. The control part runs on a single PowerBook with a 100 MHz PowerPC 603e. (When performing a piece with visual control, we have to prepare a desktop computer as well.) We also used this environment for an interactive digital art piece featuring dance, as shown in Figure 9. This project was very big: the total staff numbered more than 30, and it was difficult to arrange trained engineers to cover all of the scenes. Three inexperienced students each took charge of one of the three dancers and wrote the scenes in which their dancers appear. The students were able to write the control programs in one month. Considering that they also had to learn how to use the sound equipment, this shows the effectiveness of the toolkit.

5 Summary

This paper introduced a toolkit for interactive digital art. We confirmed its effectiveness by applying it to actual artistic activities. We would like to write further pieces and to provide the toolkit to artists who have trouble writing interactive art.

References

[1] Camurri, A. 1995. "Interactive Dance/Music Systems." Proc. ICMC, Banff, pp. 245-252.
[2] Katayose, H. 1996. "An Environment for Interactive Art." Proc. ICMC, Hong Kong, pp. 173-176.

Figure 6: Scene Template. Figure 7: Control Patcher in a Scene Template. Figure 8: "Tikukan no ucyu V". Figure 9: "LIFE ON ICE", 1996.