2.1 Background: Performing Spectrograms

The first machine for reconstructing sound from spectrographic images appears to be the Pattern Playback machine built by speech researcher Franklin S. Cooper at Haskins Laboratories in the late 1940s. In this system, spectrographic sound patterns are hand-copied in white paint onto an acetate belt, which is then conveyed at seven inches per second past a photoelectric sensor. Simultaneously, an intense slit of light from a mercury-arc lamp is focused onto a rapidly rotating "tone wheel." This disk, which has 50 concentric, variably-spaced apertures, admits light at a variety of periodic rates (ranging from 120 to 6000 Hz) onto the belt. Light modulated by the wheel and directed onto the spectrogram belt thus reflects to the photocell only those portions of the light which carry the frequencies corresponding to the painted pattern [2,10]. Signals from the photocell are then amplified and directed to a loudspeaker.

Figure 2. Cooper's 1951 Pattern Playback system. From [10].

Cooper's Pattern Playback machine continued to find use in audio perception studies as late as 1976; the original device, which is still operational, now resides in the Haskins Laboratories Museum in New Haven, Connecticut [10].

Figure 3. Cooper's 1951 machine as seen today. From [10].

A significant limitation of this optomechanical device is that it could only be used, as Cooper's title suggests, for spectrographic playback. With the introduction of real-time digital audio synthesis, two main interface paradigms have arisen to enable live improvisation with spectrographic images: drawing-based and camera-based interfaces. Iannis Xenakis' UPIC system, first realized in 1977, is emblematic of the former. Consisting of a graphics tablet interfaced to an HP computer, the UPIC allowed users to gesturally create, edit and store spectral events with unprecedented precision. By 1988, a version developed by Raczinski, Marino and Serra allowed users to draw and listen to spectrograms simultaneously and in real time [9]. The core UPIC interface concept has been maintained in the popular Metasynth software [12], and extended in my own Yellowtail [7], wherein the user can draw procedurally animated marks into a real-time spectrographic score.

Figure 4. Iannis Xenakis' 1977 UPIC system. From [11].

The use of a camera to interactively 'perform an image' - rather than a single-point cursor - forms the second main interaction paradigm for live spectrographic sequencers. An early real-time implementation was developed by Finnish artist-researcher Erkki Kurenniemi in his 1971 DIMI-O ("Digital Music Instrument, Optical Input") system, which simply treated a live video image as if it were a spectrogram. In this system, a graphical "current time indicator" scanned the live video image from left to right; when this indicator overlapped a sufficiently dark or light video pixel, a synthesizer generated a chromatic tone whose pitch was mapped to the vertical coordinate of the pixel [5]. A modern implementation of this concept can be found in the Additive Synthesis demo patch which ships with Cycling74's Jitter toolkit [4]. The project described in this paper is related to these prior systems, but uses an augmented-reality projection to provide precise visual feedback to the user.

Figure 5. Erkki Kurenniemi's 1971 DIMI-O system. From [6].
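Both the Pattern Playback and the DIMI-O instantiate the same underlying mapping: the image's vertical axis indexes frequency, its horizontal axis is scanned as time, and pixel brightness controls the amplitude of a bank of oscillators. The sketch below is a minimal digital analogue of that mapping, not a reconstruction of either machine; the image resolution, scan duration, and logarithmic spacing of partials are illustrative assumptions (only the 120-6000 Hz range is taken from Cooper's tone wheel).

```python
# Minimal sketch: additively resynthesize a grayscale image as if it
# were a spectrogram. Rows = frequency (top = highest), columns = time.
import numpy as np

def image_to_sound(image, duration_s=4.0, sr=44100,
                   f_low=120.0, f_high=6000.0):
    """Render a 2-D array of brightness values in [0, 1] to a mono
    waveform by summing one sinusoidal partial per image row."""
    n_rows, n_cols = image.shape
    n_samples = int(duration_s * sr)
    t = np.arange(n_samples) / sr

    # One partial per row, spaced logarithmically over the 120-6000 Hz
    # range admitted by Cooper's tone wheel.
    freqs = np.geomspace(f_low, f_high, n_rows)

    # Column index for each output sample: the "current time indicator"
    # sweeping the image from left to right.
    col = np.minimum((t / duration_s * n_cols).astype(int), n_cols - 1)

    out = np.zeros(n_samples)
    for row in range(n_rows):
        amp = image[n_rows - 1 - row, col]  # flip so row 0 = lowest pitch
        out += amp * np.sin(2 * np.pi * freqs[row] * t)
    return out / max(np.max(np.abs(out)), 1e-9)  # normalize to [-1, 1]
```

Kurenniemi's DIMI-O mapping can be approximated within this sketch by first thresholding the image to binary values and quantizing freqs to the nearest chromatic pitch before synthesis.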
3 The Scrapple Instrument

3.1 Overview: The Table is the (Active) Score

The spectrographic performance instrument described in this paper, Scrapple, consists of a Windows PC, custom software, a 2-to-3m long table covered with a dry-erase board (which serves as the primary user interface), and a digital video camera which observes the table from above. Users perform the instrument by drawing or erasing marks on the table's surface.
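The overview above does not specify how marks on the table are digitized; as a hedged sketch only, one plausible front end for such a camera-based score is to threshold a frame of the white table so that dark dry-erase marks become the active cells of a spectrogram-like image, which a routine such as image_to_sound() above could then render. The camera index, threshold value, and score resolution below are illustrative assumptions, not values from the Scrapple system.

```python
# Hedged sketch (not the authors' implementation): digitize a whiteboard
# score from an overhead camera using OpenCV.
import cv2
import numpy as np

def grab_score(camera_index=0, threshold=96, size=(256, 512)):
    """Capture one frame and return a (rows, cols) float array in [0, 1]
    in which 1.0 marks dry-erase ink on the white board."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("could not read from camera")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, (size[1], size[0]))  # dsize is (width, height)
    # Dark marks on a light board: invert while thresholding so that
    # ink becomes bright (active) cells in the score.
    _, marks = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY_INV)
    return marks.astype(np.float32) / 255.0
```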