other (and, notably, independent of the graphics frame
rate), and may in fact change as the real-time system
evolves. Coney Island integrates user input from ten MIDI
drum pads, physically-based mechanical simulations, and
three-dimensional geometric models created with
Alias|Wavefront's Maya. The application runs on a four-processor Silicon Graphics Onyx2 with an InfiniteReality2 graphics board.
Graphics and Particle Simulations
The visual space presented in Coney Island includes five
islands floating on top of ocean waves, each of which
contains a mechanical game. The games are similar to
pinball: users apply forces to move particles toward some
goal. Each island consists of a hierarchical geometric
model created in Maya, and a physically based particle
simulation to drive the animation. The particle systems
model the forces applied by the user, collisions between particles and against the three-dimensional geometry, particle mass and radius, gravity, and friction.
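As a rough illustration, the per-particle state and a single force-accumulation step might take the following form in C++; the explicit Euler integrator and the viscous friction term are assumptions for the sketch, not the production code:

    struct Vec3 { float x, y, z; };
    Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    Vec3 operator*(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }

    struct Particle {
        Vec3  position, velocity;
        float mass, radius;        // per-particle mass and radius, as in the text
    };

    const Vec3  kGravity  = {0.0f, -9.8f, 0.0f};
    const float kFriction = 0.2f;  // illustrative viscous damping coefficient

    // One force-accumulation and integration step for a single particle.
    // userForce carries the force applied through the MIDI drum pads.
    void step(Particle& p, Vec3 userForce, float dt) {
        Vec3 force = userForce + p.mass * kGravity
                   + (-kFriction) * p.velocity;    // simple friction model
        Vec3 accel = (1.0f / p.mass) * force;
        p.velocity = p.velocity + dt * accel;      // explicit Euler step
        p.position = p.position + dt * p.velocity;
        // Collisions against other particles and against the island
        // geometry would be resolved here (omitted).
    }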
The differential equations used to compute the physical
simulations require a consistent service rate, which was set
to 20 Hz. Unfortunately, the graphics frame rate is not
predictable; at any given time it falls in the 12-15 Hz range,
depending upon which island is being visited and how
much particle activity there is. Therefore the particle
simulations and OpenGL rendering code are run in distinct
parallel processes.
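A minimal sketch of this decoupling follows, with threads standing in for the separate processes of the original system and all function names assumed for illustration:

    #include <atomic>
    #include <chrono>
    #include <thread>

    std::atomic<bool> running{true};

    void integrateParticles(float dt) { (void)dt; /* 20 Hz physics step */ }
    void drawFrame()                  { /* OpenGL render of the current island */ }

    // Fixed 20 Hz service rate for the differential equations.
    void simulationLoop() {
        const auto step = std::chrono::milliseconds(50);
        auto next = std::chrono::steady_clock::now();
        while (running) {
            integrateParticles(0.05f);       // constant dt keeps the ODEs stable
            next += step;
            std::this_thread::sleep_until(next);
        }
    }

    int main() {
        std::thread sim(simulationLoop);     // simulation runs on its own clock
        for (int i = 0; i < 300; ++i)
            drawFrame();                     // rendering free-runs at 12-15 Hz
        running = false;
        sim.join();
    }

Running the integrator on its own clock keeps the 20 Hz service rate independent of however long any individual frame takes to draw.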
Interactive Presentation and Large-scale Form
The Coney Island experience is organized as a tour of the
islands, with periodic transitions underwater to tour the
debris left behind by the history of gaming on the islands. The
computer graphics camera travels to each region where
visitors spend some time interacting. Although the overall
organization and quality of the presentation are specified by
the environment's designers, many of the details of the
presentation, in particular the camera angles and the order
and timing of events, are controlled by intelligent
algorithms. During the tour, the order in which the islands
are visited is chosen at random, although each island is
visited only once. Upon arrival at an island, the system
becomes sensitive to the level of user activity. If there is
no immediate user input, the game will demonstrate itself
by briefly running automatically. An algorithm chooses
camera positions and camera editing patterns, based upon
which parts of an island are active due to user input. The
camera algorithm is designed to produce results that make
sense cinematically and help explain the operation of the
game mechanisms. After a game has been running for a while, the tour moves on to a new island if user input dies down.
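The camera algorithm itself is not given here; the C++ sketch below only illustrates one plausible rule consistent with the description, cutting to the most active region while enforcing a minimum shot length (the threshold and data layout are invented for illustration):

    #include <vector>

    struct Region { float activity; };  // decaying count of recent user-driven events

    // Pick the camera target: cut to the most active region, but hold each
    // shot for at least three seconds so the editing reads cinematically.
    int chooseCameraTarget(const std::vector<Region>& regions,
                           int current, float& shotTime, float dt) {
        shotTime += dt;
        if (shotTime < 3.0f) return current;          // minimum shot length
        int best = current;
        for (int i = 0; i < (int)regions.size(); ++i)
            if (regions[i].activity > regions[best].activity) best = i;
        if (best != current) shotTime = 0.0f;         // register the cut
        return best;
    }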
A basic feature of the ScoreGraph system is that the
directed graph that organizes an application can be
reconfigured as it runs. New processes can be started,
existing processes may be shut down or reduced in
computational load, and connections between nodes can be
made or broken. In Coney Island this occurs each time an
island is visited. The drivers for the MIDI drum pads are
reconfigured to control a different mechanism. A new
particle system is started and the previously running
simulation is shut down. This provides a smooth scene
change between processes that are essentially separate
applications.
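ScoreGraph's interface is not shown in the text, so the following C++ sketch uses hypothetical node operations purely to illustrate the island-transition reconfiguration described above:

    #include <string>

    struct Node {                      // hypothetical handle on a graph node
        std::string name;
        void start()                { /* launch the node's process */ }
        void stop()                 { /* shut the process down */ }
        void connectTo(Node&)       { /* make a graph edge */ }
        void disconnectFrom(Node&)  { /* break a graph edge */ }
    };

    // On arrival at a new island: swap the running particle simulation and
    // retarget the MIDI drivers, yielding a smooth scene change between
    // what are essentially separate applications.
    void visitIsland(Node& midiDriver, Node& oldSim, Node& newSim) {
        newSim.start();
        midiDriver.disconnectFrom(oldSim);
        midiDriver.connectTo(newSim);
        oldSim.stop();
    }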
4. Coney Island Sound Production
Coney Island includes three classes of interactivity
with sounds:
* action space performance and extended causality;
* active navigation and direct manipulation of
synthesis parameters;
* passive navigation and positional influences upon
auditory space in environmental dynamics.
Action space performance generates sounds from user
actions synchronized to motion-based events
displayed graphically. Players influence sound
production by engaging with motion simulations, an
application of the Generative Mechanism principle
discussed by Choi (2000a). Mechanical friction and particle collisions in the islands control STK physically-based and modal synthesis instruments, creating quasi-realistic friction and collision sounds; the same data simultaneously drives granular synthesis implemented in jMax to produce particles of speech. This palette, ranging from realistic to metaphorical sounds, is a compositional design applied to virtual locations and simulated mechanics.
The Coney Island grand tour brings about transitions
from realistic to metaphorical sounds, realized at the
level of the sound particle. Underwater locations
abandon realism in favor of granular speech
assemblages determined by wave equations stirred up
by percussion pad forces.
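One plausible event-to-sound mapping consistent with this description is sketched below; the event fields, the mapping curves, and the two output hooks (standing in for the STK instruments and the jMax granular patch) are all assumptions:

    #include <algorithm>

    struct CollisionEvent {
        float impulse;   // magnitude of the collision impulse
        float radius;    // radius of the particle involved
    };

    void triggerModalHit(float freq, float amp) { /* to an STK instrument */ }
    void setGrainParams(float rate, float amp)  { /* to the jMax granular patch */ }

    void onCollision(const CollisionEvent& e) {
        // Smaller particles ring higher; harder hits ring louder.
        float freq = 220.0f / std::max(e.radius, 0.01f);
        float amp  = std::min(e.impulse, 1.0f);
        triggerModalHit(freq, amp);                // quasi-realistic impact
        setGrainParams(10.0f + 50.0f * amp, amp);  // densify the speech grains
    }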
Active navigation and direct manipulation of synthesis
parameters occurs in select underwater regions where
a single player may use a joystick to navigate a small
animated submarine. The VR camera follows
automatically. The submarine is constrained to
traverse floating abstract surfaces, and its position on
each surface tunes the sound synthesis parameters by mapping position into a high-dimensional parameter control space (Choi 2000b). In
these regions the particles of speech may be
transformed into intelligible phrases.
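For illustration, the surface position could be mapped into the parameter space by interpolating preset parameter vectors pinned to the surface corners; the bilinear scheme and parameter count below are assumptions, not the method of Choi (2000b):

    #include <array>

    const int N = 8;                        // assumed number of synthesis parameters
    using Params = std::array<float, N>;

    Params lerp(const Params& a, const Params& b, float t) {
        Params out;
        for (int i = 0; i < N; ++i) out[i] = (1 - t) * a[i] + t * b[i];
        return out;
    }

    // (u, v) in [0,1]^2: the submarine's position on the floating surface.
    // p00..p11: preset parameter vectors at the four surface corners.
    Params mapPosition(float u, float v, const Params& p00, const Params& p10,
                       const Params& p01, const Params& p11) {
        return lerp(lerp(p00, p10, u), lerp(p01, p11, u), v);
    }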
Passive navigation with positional influences occurs in
regions where the sounds are determined by dynamics
that are independent of the players' actions, while the
position of the camera determines activation of the
sound sources and spatialization of the resulting
sounds. These sound sources are distributed in a
designated region under the islands, represented
visually as a field of floating historical debris. When
activated by camera proximity, the debris objects emit complete speech excerpts from historical recordings.
Four parallel Spat patches in jMax process four source positions simultaneously to create distance and directional cues. The camera position activates no more than four sources at a time, so that every active source can be assigned to one of the four Spats.
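A sketch of that activation policy, assuming each source carries a position and the four sources nearest the camera claim the four Spat slots (the data layout and distance test are illustrative):

    #include <algorithm>
    #include <vector>

    struct Source { float x, y, z; int spatSlot = -1; };  // -1 means inactive

    float dist2(const Source& s, float cx, float cy, float cz) {
        float dx = s.x - cx, dy = s.y - cy, dz = s.z - cz;
        return dx * dx + dy * dy + dz * dz;
    }

    // Activate at most four sources so every active source fits in a Spat.
    void updateActiveSources(std::vector<Source>& sources,
                             float cx, float cy, float cz) {
        std::vector<int> order(sources.size());
        for (int i = 0; i < (int)order.size(); ++i) order[i] = i;
        std::sort(order.begin(), order.end(), [&](int a, int b) {
            return dist2(sources[a], cx, cy, cz) < dist2(sources[b], cx, cy, cz);
        });
        for (auto& s : sources) s.spatSlot = -1;          // deactivate all
        for (int slot = 0; slot < 4 && slot < (int)order.size(); ++slot)
            sources[order[slot]].spatSlot = slot;         // one Spat per source
    }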