A Graphical User Interface for MIDI Signal Generation and Sound Synthesis

Youichi HORRY
Central Research Lab., HITACHI Ltd.
Kokubunji, Tokyo 185, Japan
email@example.com

Abstract

We propose a graphical user interface for sound synthesis and MIDI signal generation based on arranging objects in three-dimensional space and manipulating them. The movement of each object is controlled by external devices such as a mouse, data gloves, or MIDI instruments. In addition, the objects can move automatically according to specific rules. For MIDI signal generation, the position of each object represents key number, key velocity, and panning. For sound synthesis, the objects represent original waveforms in granular synthesis; the amplitude, pitch, and timing of each waveform are characterized by the positions of the objects. Using our graphical user interface, the complex parameters of music construction can be controlled in real-time, and visually impressive computer graphics animation can be obtained.

1. Introduction

Computer graphics have been used for visualizing sound and music [Pressing et al., 1993], as well as for graphical user interfaces for sound synthesis and composition. Whitney [Whitney, 1980] attempted the fusion of graphics and music by connecting geometrical patterns with harmonization. In the UPIC system, curves in two-dimensional space are used for sound synthesis: the horizontal axis represents time, and the vertical axis represents various parameters such as pitch. Increasing computer power, especially graphics power, makes it possible to generate both high-quality computer graphics animation and music in real-time. On the other hand, many types of algorithmic composition and sound synthesis have been proposed. However, they require an excessive number of parameters, which are too complicated to be handled in real-time.

2. A Graphical User Interface
We therefore propose a new graphical user interface for handling the parameters of MIDI signal generation and sound synthesis. To make the operations of MIDI signal generation and sound synthesis easier, we use objects in three-dimensional space, extending the concept of a graphical score (Figure 1).

Figure 1: Graphical user interface panel.

Music Graphics 276 ICMC Proceedings 1994

Music is constructed by the shape, rotation, and position of each of these objects. The movement of each object is controlled by external devices such as a mouse, data gloves, or MIDI instruments. For example, each object follows the cursor movement dictated by the devices. When MIDI instruments are used, the position is determined by MIDI signals such as key number, key velocity, and key pressure. In addition, the objects can move automatically according to specific rules, such as a random walk or convergence to a point. This graphical user interface is useful for both MIDI signal generation and sound synthesis.

2.1 MIDI signal generation

Figure 2 shows an example of using our graphical user interface for MIDI signal generation. The position of each object represents key velocity, panning, and key number along the X, Y, and Z axes, respectively.

Figure 2: Usage for MIDI signal generation.

The objects are classified into several tracks. Each track has its own shape, color, and MIDI channel (i.e., program number). In addition, associated with each track is a motion rule for the objects. The objects move automatically according to the motion rule of the track they belong to. Simultaneously, a MIDI signal characterized by the position of each object is generated and sent to an external sound module.

For tonal music, the generated MIDI signal is filtered before it is sent to the external sound module via MIDI (Figure 3). Each track has its own filter, which is set by using a key number value entered through the MIDI keyboard. If a new filter is set after a certain time lag, the previous filter setting is canceled; therefore chord progressions can be controlled in real-time. In addition, setting a different filter for each track makes it possible to generate bitonal or polytonal music (Figure 4).

Figure 3: The tonality filter for each track.

Figure 4: Example of MIDI signal generation.
Upper score: C major scale filter. Middle score: G major scale filter. Lower score: Db major scale filter.

The generated MIDI signals can be saved as a standard MIDI file, and can be arranged and modified by other sequencer software.
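The position-to-MIDI mapping and per-track scale filter described above can be sketched as follows. This is a minimal illustration, not the original implementation: the normalized 0–1 coordinate ranges, the function names, and the downward snap-to-scale rule are our assumptions.

```python
# Sketch: map an object's (x, y, z) position in a unit cube to MIDI
# parameters, then snap the note onto a per-track scale filter.
# Axis assignment follows the paper: X -> key velocity, Y -> panning,
# Z -> key number.  Ranges and the snapping rule are assumptions.

C_MAJOR = {0, 2, 4, 5, 7, 9, 11}   # pitch classes of the C major scale

def position_to_midi(x, y, z):
    """Convert normalized coordinates (0.0-1.0) to 7-bit MIDI values."""
    velocity = round(x * 127)   # X axis: key velocity (0-127)
    pan      = round(y * 127)   # Y axis: pan, e.g. controller 10 (0-127)
    note     = round(z * 127)   # Z axis: key number (0-127)
    return note, velocity, pan

def apply_scale_filter(note, scale=C_MAJOR):
    """Lower the note until its pitch class lies on the track's scale."""
    while note % 12 not in scale:
        note -= 1
    return note
```

For example, an object at the top of the Z axis with a centered X position yields the highest key number at medium velocity; a chromatic note such as C#4 (61) is filtered down to C4 (60) by the C major filter.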
2.2 Sound Synthesis

Our graphical user interface is useful not only for MIDI signal generation but also for sound synthesis. For sound synthesis, the objects represent original waveforms in granular synthesis. Granular synthesis is a method of sound synthesis proposed by Xenakis [Xenakis, 1971] and explored by Roads [Roads, 1988]. The main concept is to synthesize sound by arranging short original waveforms in either a pre-determined or random pattern with respect to time while varying their pitch and amplitude.

In our system, the amplitude, pitch, and timing of each waveform are characterized by the positions of the objects (Figure 5). The difference of the objects' shapes represents the variation of the original sound waveform, and the rotational speed of each object represents the modulation of the original sound waveform (Figure 6).

Figure 5: Usage for sound synthesis.

Figure 6: Modulation by rotation. Left: the pre-modified original waveform for granular synthesis. Center: the cosine wave used for modulation; the higher the rotational speed, the higher the frequency of the cosine wave. Right: the modulated sound wave.

Figure 7 shows a result of sound synthesis using our graphical user interface. The generated sound is played using the Audio Library (AL) on an IRIS Indigo2 (Silicon Graphics).

Figure 7: Synthesized wave.

3. Using Virtual Reality Equipment

We extended our system by incorporating equipment used for virtual reality. To control object motion, we use a CyberGlove (Virtual Technologies Inc.) for finger motion capture and a 3Space (Polhemus Inc.) instead of a mouse for wrist position detection. The wrist position detected by the 3Space controls the object position, and the finger motion (grasping or opening) controls the rotational speed of the objects. We use the left hand for these motions, while the right hand is used for MIDI keyboard control. Stereoscopic computer graphics animation is projected on a display via CrystalEyes (StereoGraphics Inc.). The synthesized sound from the IRIS Indigo2 and the MIDI-triggered sound from the sound module are mixed by a MIDI Mixer 7s (Mark of the Unicorn) for three-dimensional sound localization [Bosi, 1990]. By controlling volume and delay, the sound is played through eight speakers stationed at the corners of a cube surrounding the performer (see Figures 8 and 9).
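The granular scheme above — object position controlling amplitude, pitch, and timing, and rotational speed controlling a cosine modulator — can be sketched as follows. This is a minimal pure-Python illustration, not the original system: the sample rate, the sine carrier, the axis-to-parameter assignment, the four-octave pitch range, the grain duration, and the 20 Hz-per-unit modulation scaling are all our assumptions.

```python
import math

SR = 44100  # sample rate in Hz (assumed)

def make_grain(z, rot_speed, dur=0.05):
    """One grain: pitch taken from the object's Z position, multiplied by
    a cosine whose frequency grows with the rotational speed (cf. Figure 6)."""
    n = int(SR * dur)
    freq = 110.0 * 2 ** (z * 4)  # Z in [0, 1] spans four octaves (assumed)
    grain = []
    for i in range(n):
        t = i / SR
        carrier = math.sin(2 * math.pi * freq * t)        # original waveform (assumed sine)
        mod = math.cos(2 * math.pi * rot_speed * 20 * t)  # faster rotation -> higher mod frequency
        grain.append(carrier * mod)
    return grain

def render(objects, total_dur=1.0):
    """Mix grains into one buffer.  Each object is (x, y, z, rot_speed);
    X scales amplitude, Y places the grain in time, Z sets pitch (assumed roles)."""
    out = [0.0] * int(SR * total_dur)
    for x, y, z, rot in objects:
        g = make_grain(z, rot)
        start = int(y * (len(out) - len(g)))  # Y axis: onset time within the buffer
        for i, s in enumerate(g):
            out[start + i] += x * s           # X axis: amplitude, grains sum where they overlap
    return out
```

Moving an object along Z transposes its grain, moving it along Y shifts the grain in time, and spinning it faster raises the modulator frequency, which is the behavior Figure 6 depicts.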
Figure 8: Block diagram incorporating virtual reality equipment.

Figure 9.

4. Realization

Concerts were held featuring our system, with acoustic instruments (instead of a MIDI keyboard) used as input devices connected to a pitch-to-MIDI converter. The generated sound and computer graphics were presented to the audience. The titles and acoustic instruments used are as follows:

L'oiseau d'autrefois: Koto (Japanese 13-string) and Uta (vocal): Tokyo (1992)
L'oiseau vent: Violin and Recorder: Nagoya (1993)
L'oiseau glacee: Piano and Recorder: Tokyo (1993)
Dawn Bird: Koto and Shakuhachi (Husky Bamboo Recorder): New York (1994)

All pieces were composed by Hinoharu Matsumoto (Tokyo National University of Fine Arts and Music).

5. Summary

We propose a graphical user interface for MIDI signal generation and sound synthesis by arranging objects in three-dimensional space and manipulating them. Using our graphical user interface, the complex parameters of sound synthesis can be controlled in real-time, and visually impressive computer graphics animation can be obtained.

References

[Bosi, 1990] Marina Bosi. An Interactive Real-time System for the Control of Sound Localization. Computer Music Journal, 14(4), p. 59. Cambridge, MA: MIT Press, 1990.
[Pressing et al., 1993] Jeff Pressing, Chris Scallan, Neil Dicker. Visualization and Predictive Modelling of Musical Signals using Embedding Techniques. Proceedings of the ICMC, Tokyo, p. 110, 1993.
[Roads, 1988] Curtis Roads. Introduction to Granular Synthesis. Computer Music Journal, 12(2). Cambridge, MA: MIT Press, 1988.
[Whitney, 1980] John Whitney. Digital Harmony: On the Complementarity of Music and Visual Art. McGraw-Hill Inc., 1980.
[Xenakis, 1971] Iannis Xenakis. Formalized Music. Bloomington, IN: Indiana University Press, 1971.