Proceedings ICMC|SMC|2014, 14-20 September 2014, Athens, Greece
Towards Touch Screen Live Instruments with Less Risk: A Gestural Approach
Edward Jangwon Lee
Graduate School of Culture Technology
KAIST
[email protected]
ABSTRACT
Although touch screen interfaces such as smartphones and
tablet PCs have become an important part of our lives and
are used in almost every situation, they face difficulties
in live musical performance, despite the numerous benefits
they can offer musically. Among these difficulties, we identify
and focus on two: the visual dedication that interaction
requires, and the nonetheless high risk of making mistakes.
We design a simple musical interface aiming to alleviate both
problems. To reduce visual dedication, we employ larger
on-screen controls. To reduce the risk of mistakes, we
choose a gestural approach and incorporate plucking gestures,
which require users to pull and release a touch after it is
initiated. The interface is qualitatively tested, focusing
on playability, visual dedication, and risk of making mistakes.
While playability and risk received positive feedback,
reducing visual dedication received only partial agreement and
seems to require further investigation. Although the interface
is still immature and too simple to be used on stage, we
believe that identifying and solving the problems that touch
screens face in live situations is meaningful and valuable
to discuss.
1. INTRODUCTION
The introduction of touch screen interfaces such as smartphones
and tablet PCs, along with their numerous novel, fast,
and accurate sensors, has changed our lives in ways we had
never imagined. These new interfaces seem capable of almost
anything, and applications exist for both casual and
professional use, making smartphones an indispensable part of
our lives.
Many researchers and artists have seen great live-music
possibilities in touch interfaces, and many results can be
found throughout the music computing literature. Together
with protocols such as OpenSound Control (OSC) [1], touch
interfaces can be connected to a network and serve as
low-latency control surfaces using software such as
Control [2], enabling composition and performance in
previously unimagined ways.
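To illustrate the kind of message such a control surface exchanges, the sketch below hand-encodes a touch position as a binary OSC message following the OSC 1.0 encoding rules (NUL-padded address and type-tag strings, big-endian float32 arguments). This is only an illustrative sketch: the address `/touch/xy` is a hypothetical name, and in practice a library such as python-osc would handle the encoding.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """NUL-terminate and pad a byte string to a multiple of 4 bytes,
    as the OSC 1.0 encoding requires (1 to 4 NULs are appended)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode one OSC message whose arguments are all float32."""
    type_tags = "," + "f" * len(floats)          # e.g. ",ff" for two floats
    packet = osc_pad(address.encode()) + osc_pad(type_tags.encode())
    for f in floats:
        packet += struct.pack(">f", f)           # big-endian float32
    return packet

# A normalized touch position, ready to be sent over UDP to a synth engine
# (the address pattern /touch/xy is an assumption for this sketch).
msg = osc_message("/touch/xy", 0.25, 0.75)
```

Sending `msg` over a UDP socket to a synthesis environment such as Pure Data or SuperCollider is then a single `sendto` call.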
Copyright: © 2014 Edward Jangwon Lee et al. This is
an open-access article distributed under the terms of the
Creative Commons Attribution 3.0 Unported License, which
permits unrestricted use, distribution, and reproduction in
any medium, provided the original author and source are
credited.
Woon Seung Yeo
Division of Digital Media
Ewha Womans University
[email protected]
However, compared to the appealing features and creative
opportunities touch screen devices can offer, these devices
have not gained much popularity as instruments in live,
on-stage situations. We believe that identifying the musical
obstacles touch screen devices face, and designing digital
musical interfaces that can overcome those obstacles, will
further promote their usability on stage. One major obstacle
is the risk of making mistakes: touch screens are highly
prone to accidental touches, which cannot be tolerated during
live performances. A gestural approach can relieve this
problem, since gestures that trigger interactions too easily
may be the main cause of accidental input. Incorporating
plucking gestures [3], which add a marginal cost of
interaction while offering additional sound parameters, might
be a possible remedy. Section 1.1 discusses the difficulties
touch screens face in becoming reliable on-stage instruments.
While many other issues may exist, this paper identifies
and discusses a number of these obstacles, and presents a
simple digital musical interface for user testing. Although
this interface is still too simple for serious live
situations, we hope this work provides a starting point for
discussing and solving the problems that keep touch screen
devices from being chosen for live performance.
1.1 Touch Screens and Live Performances
Despite the great possibilities that touch screen devices
offer, such as networking and versatile user interface
programming, why are they not widely used in live
performances? Among numerous possible reasons, we present a
few that suit our research. First, touch screens mostly
require heavy visual dedication, unlike traditional
instruments. Geiger (2006) states that throughout the history
of instruments, only a few have relied on visual feedback [4].
Moreover, in collaborative ensemble situations, the performer
must interact with other players and possibly the audience,
making visual dedication to interfaces even more costly. Upon
this reasoning, Walther et al. (2013) devised a MIDI
controller based on swipe gestures using the whole screen as
a single canvas, thereby reducing the visual effort of finding
the exact position to touch [5]. Another good example
addressing visual dedication problems is CarPlay 1, which
includes sev-
1 CarPlay by Apple (http://www.apple.com/ios/carplay)
addresses visual dedication problems by employing voice and
in-built car controls to manipulate touch screen smartphones
while driving.