Audio-Visual Instruments in Live Performance
Ross Kirk, Andy Hunt, Richard Orton
Music Technology Group, University of York, YO1 5DD, UK
Email: email@example.com

ABSTRACT: This paper describes how graphical objects can be combined with sonic objects to enable multimedia performance and composition. It describes how these objects, based on the unit generator paradigm, can be integrated with more well-established audio synthesis techniques, and how graphical performance instruments can be constructed within the paradigm.

Introduction

A companion paper in these proceedings (Kirk et al 1995) describes the use of the MIDAS system in extending the concept of the audio unit generator to encompass a unit generator capable of producing graphical output. This enables the addition of image to performance and composition, so that an integrated approach to multimedia performance instruments can be achieved. The reader is referred to that paper for a discussion of the basic mechanisms employed within the operation of the MIDAS system, and the way in which graphical unit generators operate within the MIDAS context.

The graphical application described in the paper referred to above was essentially static: the screen image was used as a 'front panel' control for a synthesis system. Although the image could be changed dynamically as the system ran, the expectation was that, once designed, the graphical output would not be changed significantly at run-time. By contrast, this paper describes the use of essentially the same graphical unit generator processes (ugps) in an application where the image is constantly adapted as the performance proceeds.

Algorithmic Control of Graphical Elements

The images described in this paper are constructed from accumulations and constellations of simple graphical primitives. The examples shown here are based on the use of rectangles, although other primitives such as lines, triangles, circles and so on could have been used.
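The behaviour of such a graphical primitive can be sketched in code. The following is a minimal illustration only, not the MIDAS API: the class name `RectangleUGP`, its input fields, and the use of a Python list as a stand-in display list are all assumptions for the purpose of the example.

```python
from dataclasses import dataclass

@dataclass
class RectangleUGP:
    """Illustrative stand-in for a MIDAS-style rectangle graphical ugp.

    On each control tick it reads its position, size and colour inputs and
    emits one rectangle primitive; an accumulating display list plays the
    role of the screen, so the image is the accumulation of primitives
    emitted over successive ticks.
    """
    x: float = 0.0
    y: float = 0.0
    width: float = 10.0
    height: float = 10.0
    colour: tuple = (255, 255, 255)

    def tick(self, display_list):
        # Emit the primitive described by the current input values.
        display_list.append((self.x, self.y, self.width, self.height, self.colour))

# Usage: two ticks with different position inputs accumulate two primitives.
frame = []
rect = RectangleUGP()
rect.tick(frame)
rect.x, rect.y = 20.0, 30.0
rect.tick(frame)
```

Driving the `x`, `y`, `width` and `height` inputs from other unit generators, rather than setting them by hand, is what turns this into an algorithmic image network.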
Complex images can be generated by manipulating the position, colour, line style, shading and replication of these primitives. The images presented in this paper have been kept simple, so that the basic principles can be illustrated as clearly as possible. Much more complex (and sophisticated) imagery can be produced, and a new discipline of the aesthetics and praxis of composition with image using these techniques needs to be developed.

As a first introduction to algorithmic control of image generation, we consider the use of a single rectangle ugp within an image synthesis network. Figure 1 shows the image produced when the X and Y position inputs of the rectangle are driven by wavetable oscillators. In this (deliberately) simple case, the wavetables contained sinusoids, although there is of course no reason why more complex wavetable content could not be used. The X and Y 'size' inputs of the unit generator are also modulated by the oscillators.

[Figure 1: Wavetable-Driven Graphical Unit Generators]

Similarly, the application could be extended to include several graphical ugps, to produce images based on a constellation of simple primitives. An algorithm in the form of a unit generator network could be
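The Figure 1 network can be sketched as follows. This is an assumed reconstruction, not MIDAS code: the oscillator class, table size, increments, screen dimensions and scaling constants are all illustrative choices.

```python
import math

class WavetableOsc:
    """Wavetable oscillator ugp: steps through a stored table at a fixed
    increment per tick and outputs the current sample."""
    def __init__(self, table, increment):
        self.table = table
        self.increment = increment
        self.phase = 0.0

    def tick(self):
        out = self.table[int(self.phase) % len(self.table)]
        self.phase += self.increment
        return out

# A sinusoidal wavetable, as in the (deliberately) simple Figure 1 case.
TABLE_SIZE = 256
sine = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

# Separate oscillators modulate the rectangle's X/Y position and size.
osc_x = WavetableOsc(sine, increment=1.0)
osc_y = WavetableOsc(sine, increment=1.7)  # differing rates trace Lissajous-like figures
osc_w = WavetableOsc(sine, increment=0.3)

display_list = []
for _ in range(500):
    x = 320 + 200 * osc_x.tick()   # centred on a nominal 640x480 screen
    y = 240 + 150 * osc_y.tick()
    size = 12 + 8 * osc_w.tick()   # 'size' inputs also modulated
    display_list.append((x, y, size, size))
```

Each pass of the loop corresponds to one control tick of the network; the accumulated display list is the trace of rectangles that builds up the image.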
developed to define (and modify) the relative positions of graphical primitives within a constellation, whilst its 'centre of gravity' could be swept over a locus defined by other parts of the algorithmic network.

[Figure 2: Algorithmic UGP Network, in which accumulated data drives the X position and Y position inputs of a rectangle ugp]
[Figure 3: Image Resulting from this Network]

The use of algorithmic unit generator networks to define images is further illustrated by Figures 2 and 3. In this case, the X and Y positions of a single rectangle ugp are driven by separate algorithms which add constrained random numbers to the current accumulated position of the X and Y co-ordinates.

It should be emphasised that much of the visual interest of the images is lost by the need to present static 'snapshots' of the graphical output on the printed page. A considerable part of the aesthetic interest lies in the dynamics of the evolution of the images. This dynamic property could be enhanced by manipulating the history of the evolved image. For instance, images of primitives could be replaced by modified instances (e.g. with a changed colour), or even removed, as a function of the number of iterations or of elapsed time, to create a visual analogue of reverberation.

Audio-Visual Instruments in Live Performance

It was stated in the companion paper that ugps exist which respond to mouse position and MIDI input. This allows performance control to be integrated into the unit generator network. Numeric output data can be derived from these ugps and fed into the inputs of graphical and/or sonic ugps, as required by the instrument designer, so that image and sound can be made responsive to performance gesture.
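The random-walk network of Figures 2 and 3 can be sketched as below. Again this is an illustrative assumption rather than MIDAS code: the function name, step size, screen bounds and the use of a seed parameter (standing in for one possible performance-control input) are all choices made for the example.

```python
import random

def random_walk_positions(n_ticks, step=5.0, seed=None,
                          x0=320.0, y0=240.0,
                          x_max=640.0, y_max=480.0):
    """Figures 2/3 network sketch: constrained random numbers are added to
    accumulated X and Y co-ordinates, which drive a rectangle ugp's
    position inputs.  The seed is exposed as one plausible point at which
    performance control could be attached."""
    rng = random.Random(seed)
    x, y = x0, y0
    positions = []
    for _ in range(n_ticks):
        # Accumulate a constrained random step, clamped to the screen.
        x = min(max(x + rng.uniform(-step, step), 0.0), x_max)
        y = min(max(y + rng.uniform(-step, step), 0.0), y_max)
        positions.append((x, y))
    return positions

# One run of the network: 1000 ticks of accumulated rectangle positions.
path = random_walk_positions(1000, seed=42)
```

Because each output differs from its predecessor by at most the step size, the rectangle wanders continuously rather than jumping, which is what gives the evolving image its coherence.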
Taking the example of the unit generator network shown in Figure 2, performance control could be configured to influence the random number seed, or alternatively the base values of the X and Y positions to which random numbers are added, or perhaps all of these inputs, depending upon the number of gestural performance channels available.

Experience with the use of such a graphical performance instrument does, perhaps encouragingly, exhibit some of the properties familiar to other instrumentalists. Taking the simple example of mouse X and Y position controlling the frequency of the sinusoids used in the network which produced Figure 1, such a diversity of image contour can be built up that the screen rapidly fills with image primitives. This might correspond to the sound produced by an instrumentalist's first approach to a new instrument. With practice, it is possible to localise and sculpt the image generation to form a more coherent pattern. Clearly, practice is needed!

Conclusion

The graphical unit generator has proved itself a flexible (user-configurable) mechanism for the production of multimedia sound/image composition and performance. We now need to turn to the problem of developing the composer's and performer's art made possible by such innovation.

Reference

Kirk, P. R., Whittington, P., Hunt, A. D., Orton, R. (1995). Graphical Control of Unit Generator Processes on the MIDAS System. Proceedings of ICMC, Banff.