ICMC 2015 - Sept. 25 - Oct. 1, 2015 - CEMI, University of North Texas
Kinesonic Composition as Choreographed Sound:
Composing Gesture in Sensor-Based Music
Aurie Hsu, Steven Kemper
Mason Gross School of the Arts
Rutgers, The State University of New Jersey
aurie.hsu@rutgers.edu, steven.kemper@rutgers.edu
ABSTRACT
Music composition is seldom considered a physical activity
or embodied experience. As current technologies enable the
mapping of movement to musical parameters, the consideration of gesture and movement becomes essential to shaping
the identity of a piece. This paper discusses the concept of
choreographed sound as part of "kinesonic composition," an approach that foregrounds embodied experience and integrates physical and imagined gesture, kinetic and kinesthetic experience, and sonic elements. It also describes the
Remote electroAcoustic Kinesthetic Sensing (RAKS) system,
a wearable wireless sensor interface designed specifically
for belly dance movement. Discussing three recent pieces
that use the RAKS system, the authors outline the bidirectional relationship between movement and music in a kinesonic framework.
1. INTRODUCTION
Kinesonic composition is an approach that foregrounds
physical and imagined gesture as a composable parameter
[1]. As contemporary compositional practice increasingly
integrates movement-based technologies such as sensors,
controllers, and robotics, consideration of gesture becomes
essential to shaping the identity of a piece. By integrating
movement, attention to kinetic and kinesthetic experience,
and sonic elements, composition becomes choreographed
sound. We will discuss the concept of choreographed sound
as part of a kinesonic approach to composition in the context of three pieces developed using the Remote electroAcoustic Kinesthetic Sensing (RAKS) system, a wireless sensor interface designed for belly dance movement [2].
2. RELATED WORK
Gesture studies span many areas of research including music, semiotics, cognitive psychology, embodiment theory,
phenomenology, linguistics, and communication theory.
While there are many contexts for musical gesture, research
typically focuses on the following areas: 1) structures for
musical analysis and interpretation [3], 2) affective communication in performance [4], 3) motion analysis [5], and 4) acquisition of gesture characteristics for human-computer interaction (HCI) controller design [6]. Kinesonic composition synthesizes several of these theoretical and interpretive frameworks from research in musical gesture [7, 4, 8], human-computer interaction (HCI) [9, 10, 11, 6], dance notation [12], embodiment theory [13, 12, 14, 15, 16], and dance scholarship [17, 18].

Copyright: © 2015 Aurie Hsu et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License 3.0 Unported, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
3. CHOREOGRAPHING SOUND
Music composition is seldom considered a physical activity
or embodied experience (for a notable exception, see [19]).
Though composers tend to be stationary when they write
(for example, sitting at a desk), they are in essence organizing or "choreographing" the performer's physical movement. Therefore, thinking about movement can affect compositional decisions. For example, a pianist's ability to
move from plucking high strings inside the piano to playing
keys in the lowest register may contribute to decisions
about pitch, voicing, timbre, and the sequence of sounds. In
pieces using sensor-based interfaces, the composer directly
defines and maps the relationship between movement and
sound, further employing physical gesture as a composable
parameter. By focusing on the mechanics of movement and
its relationship to sound, choreographing sound engages an
embodied perspective that involves tactile, kinetic, and kinesthetic sensory experience. This choreographic perspective,
focusing on the physical dimension of music making, is
valuable when creating sensor-based music.
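The composer-defined mapping between movement and sound described above can be sketched in a few lines. The sensor type, value ranges, and target parameter below are hypothetical illustrations, not details of the RAKS system:

```python
# Minimal sketch: mapping a raw sensor reading to a sound parameter.
# The flex sensor, 10-bit ADC range, and filter-cutoff target are
# assumptions for illustration, not taken from the RAKS system.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale a raw sensor value into a musical range."""
    normalized = (value - in_lo) / (in_hi - in_lo)
    normalized = min(max(normalized, 0.0), 1.0)  # clamp to [0, 1]
    return out_lo + normalized * (out_hi - out_lo)

# A mid-range flex-sensor reading (512 on a 0-1023 ADC) mapped to
# a filter cutoff between 200 Hz and 2000 Hz:
cutoff_hz = scale(512, 0, 1023, 200.0, 2000.0)
```

Choosing the input and output ranges, and which parameter a given movement drives, is precisely where physical gesture becomes a composable parameter.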
3.1 Characterizing Kinesonic Composition
We describe a reconfigurable gesture as a single physical
gesture that can output varying sonic results. In sensor-based music, a movement can map to any sound. For example, moving a hand can trigger sound file playback, activate
strikers on a robotic instrument, or control filter parameters
in an electroacoustic texture. This versatile relationship between reconfigurable gesture and sound forms a basis for
research in HCI interfaces and music [9, 10, 6]. The possibilities for mapping a reconfigurable gesture to musical parameters introduce several significant compositional considerations. The composer can vary the function of a physical