An Application of Interactive Computation and the Concrete Situated Approach to Real-time Composition and Performance
Angelo Bello
224 rue de Charenton, 75012 Paris, France
103132.1666@compuserve.com

Abstract

Recent work carried out on a real-time composition system (the UPIC system) lends support to a paradigm shift now manifesting in the computer science and artificial intelligence fields, described as the concrete situated approach and interactive computation. Interactive computation has been described as a technique that supersedes the capabilities of the Turing machine, because Turing machines are incapable of interpreting asynchronous events. The concrete situated approach stresses a shift from a mentalist to an interactionist representation of activity. These approaches are discussed here within the context of computer music composition and performance.

1 Introduction

In the introduction to his 1962 lecture "Time and Being", Martin Heidegger suggested: "Let me give you a little hint on how to listen. The point is not to listen to a series of propositions, but rather to follow the movement of showing." [Heidegger 1972] His suggestion seems appropriate as one confronts the often diametrically opposed worlds of the mediated (sophisticated computer technology) and of the musically immediate (improvisation and interpretation, the world at hand, without barrier).

What is real-time interactive composition? Human interaction with machines for the purpose of making music has been a characteristic of electroacoustic and computer music throughout this century. Certain electro-mechanical instruments, such as the electronic organ, have been more readily accepted by interpreters and creators of contemporary music.
The acceptance of certain electrical instruments into the catalogue of "standard" instruments may be due to the fact that such instruments do not stray far from established paradigms, e.g., the piano keyboard. Leon Theremin's invention of 1928 introduced a radical shift in the approach to musical performance: never before had an instrument been performed without the performer touching the instrument itself, or, for that matter, any dislocated controller connected to the instrument proper. The advent of the general-purpose computer then permitted instruments like the Theremin to be transformed from instruments as such into controllers of other, perhaps more sophisticated, systems of sound synthesis [Chadabe 1983]. Controlling a general-purpose computer for the purpose of creating sound, musical or otherwise, without the aid of metacontrollers such as Theremin-like devices, can present other paradigmatic difficulties. Are we prepared to let a computer perform an entire composition, without the interruption of a human interlocutor?

The Turing machine concept has been used to characterize the operation or execution of particular computer music compositions. For example, Xenakis' pieces Gendy3 and S.709 have been described as "computable music" [Hoffmann 1998], or "Turing music" [Bello 1998]. Inherent in this type of computer composition is the fact that all aspects of the resulting sound depend upon the program code which controls the computer hardware. That is, the computer program may be considered a representation of the composition, and thereby of the sound itself. The entire piece is defined by the program code required to carry out the operation of the computer and its associated equipment.
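As a concrete illustration of such "computable music", here is a loose Python sketch in the spirit of Xenakis' dynamic stochastic synthesis. It is not the actual GENDYN algorithm; the breakpoint count, step size, and clipping rule are invented for the sketch. The point it demonstrates is the text's claim: with a fixed seed, every sample is determined by the program alone, with no performer in the loop.

```python
import random

def stochastic_waveform(n_cycles, n_breakpoints=8, step=0.1, seed=1):
    """Waveform as a polygon whose breakpoint amplitudes random-walk
    from cycle to cycle; the seeded generator makes the result a pure
    function of the program text."""
    rng = random.Random(seed)          # deterministic: the code is the piece
    amps = [0.0] * n_breakpoints
    samples = []
    for _ in range(n_cycles):
        for i in range(n_breakpoints):
            amps[i] += rng.uniform(-step, step)     # random walk per breakpoint
            amps[i] = max(-1.0, min(1.0, amps[i]))  # keep amplitude in [-1, 1]
        samples.extend(amps)                        # emit one polygon cycle
    return samples

piece = stochastic_waveform(n_cycles=100)  # rerunning yields the identical "piece"
```

Rerunning the program reproduces the waveform sample for sample, which is exactly the sense in which such a composition is "defined by the program code".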
2 The UPIC System

The present UPIC system [Marino 1993] is a sophisticated graphical user interface environment, permitting the composer to "draw" on a two-dimensional graphical area displayed on the computer screen, called a "page" in UPIC terminology. The horizontal axis of the page represents time, and the vertical axis denotes frequency or pitch (called the ambitus range). The lines and curves created by the user are then converted to sound by a bank of 64 oscillators, contained in a synthesis unit connected to the host computer via communication cables. Sounds may be sampled into the UPIC system, then processed and transformed by techniques such as amplitude and frequency modulation.

An array of sophisticated editing tools permits the user to craft his or her sonic palette within the UPIC environment. These tools fall into four general categories of waveform specification and modification: the waveform (the sound source itself), the amplitude table, the frequency table, and the envelope table. The UPIC page of lines and curves is the score (the lines and curves are called "arcs"); the sonic structure of these arcs is defined by the four parameters mentioned above. A maximum of 4000 arcs may be drawn on a single UPIC page, and a maximum of 64 oscillators can be played at once to create the sound output (a future version of the UPIC proposes 512 to 1024 oscillators).

A score is "performed" through the use of a vertical line, or cursor, which scans the page of arcs created by the user. The tracing of the cursor may be programmed via a Sequencer option. With the sequencer, the user is able to trace at constant or varying speeds (accelerando or decelerando), as well as in retrograde motion (a reverse playback of the page and its arcs).
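The page-and-cursor model just described can be sketched in miniature. The `Arc` class, its linear frequency contour, and the sample rate below are illustrative assumptions made for the sketch, not the UPIC implementation, which drives a bank of 64 hardware oscillators.

```python
import math

# Sketch of a UPIC-like page: each arc is a frequency contour on the
# time/pitch plane, voiced by one oscillator while the cursor crosses it.
class Arc:
    def __init__(self, start, end, f0, f1):
        self.start, self.end = start, end  # position on the time axis (seconds)
        self.f0, self.f1 = f0, f1          # frequency (Hz) at start and end

    def freq_at(self, t):
        """Linearly interpolated frequency of the arc at cursor time t."""
        if not (self.start <= t <= self.end):
            return None                    # cursor is outside this arc
        u = (t - self.start) / (self.end - self.start)
        return self.f0 + u * (self.f1 - self.f0)

def render_page(arcs, duration, sr=8000):
    """Scan a cursor across the page, summing the active oscillators."""
    phases = [0.0] * len(arcs)
    out = []
    for n in range(int(duration * sr)):
        t = n / sr
        sample = 0.0
        for i, arc in enumerate(arcs):
            f = arc.freq_at(t)
            if f is not None:              # an arc sounds only under the cursor
                phases[i] += 2 * math.pi * f / sr
                sample += math.sin(phases[i])
        out.append(sample)
    return out

# Two "drawn" arcs: a rising glissando and a steady tone entering later.
page = [Arc(0.0, 1.0, 220.0, 440.0), Arc(0.5, 1.0, 330.0, 330.0)]
samples = render_page(page, 1.0)
```

In this model, retrograde playback is simply scanning t from the end of the page back to 0, and varying the cursor speed (accelerando, decelerando) corresponds to reparametrizing t.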
During cursor playback, changes in ambitus (frequency or pitch) and amplitude (of the waveforms) are read by the UPIC synthesizer, and sound output is generated following the contours of the arcs in time. The user may synthesize sounds using an additive process, whereby many arcs are summed within a page, each arc having its own characteristic time-pressure curve. With this method, in order for a temporal transformation in the sound to occur, the user must initiate playback of the arcs on the page, where the cursor traces the arcs' duration and position. The multi-timbral layering of the various arcs, along with their individual characteristics, has created a large catalogue of sounds and textures.

Another option within the UPIC system is the frequency modulation technique available for each of the arcs on a page. Using this option, the composer may frequency modulate any arc on a page by any other arc on the same page, as long as the modulating arc overlaps the temporal span of the carrier arc. Any waveform created by the user may be assigned to the carrier and/or the modulating arc. As a result, many complicated timbres can be created, rich in harmonic content, color, and texture. The modulating and carrier waves may be extracted from sampled sound material, and the FM implementation on the UPIC system also permits the construction of recursive feedback algorithms, in which the modulating waveforms are themselves modulated by the very carrier arcs they are modulating. Complex systems can be organized, where various groups of arcs are associated with one another as modulators and/or carriers. The graphical page on the screen then becomes a type of algorithmic FM space for the organization of various feedback modulator/carrier combinations. The graphics page is then not necessarily a visual representation of a score, nor even the score itself; it becomes, in a sense, an algorithmic scratchpad.
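The recursive modulator/carrier feedback described above can be sketched per sample. The rates, modulation index, and feedback gain below are illustrative values chosen for the sketch, and the two arcs are reduced to fixed frequencies for clarity; on the UPIC the contours would follow the drawn arcs.

```python
import math

def fm_feedback(fc, fm, index, feedback, n_samples, sr=8000):
    """Carrier at fc Hz, modulator at fm Hz; the carrier's previous output
    sample is fed back into the modulator's phase, so each voice modulates
    the other, as in the recursive FM configuration described in the text."""
    carrier_phase = mod_phase = 0.0
    prev = 0.0                # last carrier output, fed back to the modulator
    out = []
    for _ in range(n_samples):
        mod_phase += 2 * math.pi * fm / sr
        mod = math.sin(mod_phase + feedback * prev)    # modulator hears carrier
        carrier_phase += 2 * math.pi * fc / sr
        prev = math.sin(carrier_phase + index * mod)   # carrier hears modulator
        out.append(prev)
    return out

tone = fm_feedback(fc=440.0, fm=110.0, index=2.0, feedback=0.5, n_samples=8000)
```

Even this minimal loop is already nonlinear: the feedback term enriches the spectrum well beyond simple one-way FM, which is the source of the "complicated timbres" the text mentions.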
Many of the control parameters defining the FM algorithms may be adjusted in real time, while the synthesis unit is interpreting the graphical data and the organizational structure of the algorithmic arcs on the page. Using the mouse pointing device, one can access the various control parameters as the sound is synthesized. In this way, the UPIC is transformed from a synthesis unit under the control of a predetermined graphical organization of arcs and lines into a computer music composition instrument, where the composition, as well as the sonic material, is created and/or synthesized in real time by the composer or performer. One can suggest that the system is converted from a mentalist, predefined and predisposed Turing process, in which the graphics page is read by a moving cursor, into an interactive, concrete situated environment, in which the performer of the UPIC system takes on an active role in determining the ultimate piece, the final result.

3 The Concrete Situated Approach and Interactive Computation

In the concrete situated approach to activity [Chapman 1991, Agre 1997], the organization of activity is an emergent quality, the result of interactions with the environment. The result, or the complexity of the result, is much greater than the sum of the complexity of the constituent parts of the domain which make up the environment. The agent in our case is the composer-performer of a real-time computer music composition instrument. One reacts to the environment in the present, without the trappings of a mentalist approach in which causality is not a function of an interaction between agents and the world. The Turing machine paradigm is a mentalist one. It does not characterize human interaction with a problem-solving program under execution on a general-purpose computer: the program "runs", and the answer is "output" to the user.
Newer, more advanced computer systems permit the execution of programs in real time, affording the user the ability to
modify the parameters of an algorithm while the computer is "crunching the numbers". The user is now an active agent in the process of obtaining an answer from the general-purpose computer. The Turing machine concept is thus extended, giving rise to an augmented definition of the processes involved: Interactive Turing Machines [Wegner 1997, 1998]. Interactive Turing Machines are capable of distinguishing asynchronous events, events which are not specified by the algorithm as such but by an interactive user who makes adjustments "in the moment" during the execution of the program.

Figure: A grouping of arcs from a UPIC graphics page created by the author for the performance of the composition Maya. These four groups are part of a larger set of arc groups (the figure is an enlargement of an area of the UPIC graphics page), which contains an additional six groups identical to those shown, for a total of ten (the additional groups extend in a similar manner below those shown). In addition, a smaller group of arcs, positioned lower on the graphics page (lower on the frequency scale), is used to frequency modulate the larger group of which the arcs in the figure are part. These arcs in turn modulate the second group, creating the recursive feedback FM synthesis algorithm.
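The contrast drawn above between a fixed algorithm and an interactive machine can be reduced to a toy Python illustration: a batch function whose entire input exists before it runs, versus a generator acting as a stream transducer, whose next input may be chosen only after its previous output has been observed. The names and the running-sum task are invented for the illustration.

```python
def batch(inputs):
    """Turing-style computation: the whole input is fixed before the run."""
    return sum(inputs)

def interactive():
    """Stream transducer: each .send() delivers an event that may depend
    on outputs already seen, so no pre-written input tape could encode
    the exchange in advance."""
    total = 0
    event = yield total
    while True:
        total += event
        event = yield total

session = interactive()
next(session)          # prime the generator; it yields its initial state, 0
r1 = session.send(3)   # the environment acts, the machine answers 3
r2 = session.send(4)   # a later, asynchronous event; the machine answers 7
```

Both reach the same sum, but only the second admits events chosen in mid-computation, which is the capability the Interactive Turing Machine formalism is meant to capture.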
4 Improvisation

This word is heavy with connotations and implications. To improvise on an instrument in the midst of creation is not a foreign act to most, if not all, composers. To improvise the composition itself may be another matter altogether. And is the word improvisation adequate to describe what in fact transpires during the course of composition? In the case of the UPIC situation described above, a set of etudes, UPIC Etudes 2-8, and the composition Maya have been created in real time. The structure and form of the pieces were not predetermined. The algorithm describing the manner in which the UPIC system was to synthesize the raw sonic material was created prior to the performance of the pieces; however, what is ultimately heard is an emergent result of the interaction between the user of the UPIC and the algorithms which had been set in motion to generate the sound.

5 Conclusion

This paper presented an approach to real-time performance and composition using a computer. New ways of considering problem-solving and computation are manifesting in the computer sciences, and they seem to support the techniques of real-time composition described herein. The significance of these new techniques lies in the manner in which activity is described as interactionist, and causality is the result of the interaction of agents and machines. The Turing machine construct is no longer adequate to model such interactionist activity. Composing can be considered an interactive activity, in which the composition itself becomes an emergent manifestation of that activity.

References

[Agre 1997] Agre, Philip E., Computation and Human Experience, Cambridge, England: Cambridge University Press, 1997.

[Bello 1998] Bello, Angelo, "Emergent Evolution of Sound, Space, and Time and the Tools of Iannis Xenakis", Présences de Iannis Xenakis, International Symposium on Iannis Xenakis, Paris: CDMC/Sorbonne/Radio France, February 1998, forthcoming.
[Chadabe 1983] Chadabe, Joel, "Interactive Composing: An Overview", Computer Music Journal, vol. 7, 1983.

[Chapman 1991] Chapman, David, Vision, Instruction, and Action, Cambridge, MA: MIT Press, 1991.

[Heidegger 1972] Heidegger, Martin, On Time and Being, tr. Joan Stambaugh, New York, NY: Harper & Row, 1972.

[Hoffmann 1998] Hoffmann, Peter, "Evaluating the Dynamic Stochastic Synthesis", Journées d'Informatique Musicale JIM 98, 5-7 May 1998, La Londe-les-Maures, France, textes des communications, pp. F4.1-F4.8.

[Marino 1993] Marino, Gérard, Jean-Michel Raczinski, and Marie-Hélène Serra, "The New UPIC System: Origins and Innovations", Perspectives of New Music, vol. 31 (1993), no. 1, pp. 258-269.

[Wegner 1997] Wegner, Peter, "Why Interaction is More Powerful than Algorithms", Communications of the ACM, vol. 40, no. 5, May 1997, pp. 80-91, http://www.cs.brown.edu/people/pw/ ("The Paradigm Shift from Algorithms to Interaction").

[Wegner 1998] Wegner, Peter, "Interactive Foundations of Computing", Theoretical Computer Science, Feb. 1998.