The Table is The Score: An Augmented-Reality Interface
for Real-Time, Tangible, Spectrographic Performance
Golan Levin
School of Art, Carnegie Mellon University
golan [at] andrew.cmu.edu
Abstract
Real-time performance instruments for creating and sonifying
spectrographic images have generally taken the form of
stylus-based drawing interfaces, or camera-based systems
which treat a live video image as a spectrogram. Drawing-based approaches afford great precision in specifying the
temporal and pitch structures of spectral events, but can be
cumbersome, as they only accept input from a single point;
camera-based approaches offer quick and flexible all-around
image improvisation, but poor compositional precision
because of inadequate visual feedback to the user. In this
paper, I present a camera-based spectrographic performance
instrument which affords both compositional precision and
improvisatory flexibility. This is made possible through an
augmented reality (AR) projection overlaid onto and
carefully aligned with a dry-erase performance surface.
Keywords
Audiovisual performance instrument, augmented reality,
spectrographic performance, graphic sound synthesis.
1 Introduction
Spectrograms, or diagrams which depict the frequency
content of sound over time, are a basic visualization tool in
computer music and acoustics. Ordinarily, spectrograms are
used to analyze pre-existing sounds. Nevertheless, the
concept of a composition and performance tool with a
spectrographic input interface - capable, in theory, of
allowing a musician to construct sound entirely from the
bottom up - is a recurring one in computer music.
Attempts to build interfaces for spectrographic
performance instruments have generally elected to prioritize
either compositional precision (with cursors) or
improvisatory freedom (with cameras). In this paper, I
introduce a solution which I believe offers a good measure
of both. To accomplish this, I use techniques borrowed from
the field of "augmented reality", which Lev Manovich has
defined as the "overlaying of dynamic and context-specific
information over the visual field of a user" [8].
In my system, objects placed on a table are interpreted as
sound-producing marks in an active spectrographic score.
Video projections cast onto this table transform the
instrument into a simple augmented reality, in which the
user's objects are elaborated through colorful and
explanatory graphics. Every point on the table's surface, and
each pixel in the camera's view, corresponds to a unique
time/frequency possibility, and is performable as such.
Figure 1. The Scrapple spectrographic instrument in use. On the
table are a variety of dark rubber and felt objects. The table is also
a dry-erase surface and can be scribbled on with conventional
whiteboard markers. Note the real-time video projection, from
overhead, of various augmented-reality (AR) information layers: a
grid representing subdivisions of time and pitch; a "Current-Time
Indicator," which scans the table lengthwise; and glowing haloes
around the physical objects, indicating successful detection.
2 Background
Various implementations of spectrographic sequencers
have been created over the past 60 years. In this section I
briefly survey a selection of these systems, with an eye
towards better understanding the tradeoff between
compositional precision and real-time instrumentality.