Interactive Music for Instrumented Dancing Shoes

Joseph Paradiso, Kai-Yuh Hsiao, Eric Hu
Responsive Environments Group
MIT Media Laboratory
20 Ames St. E15-325
Cambridge, MA 02139 USA
+1 617 253 8988
joep@media.mit.edu, kshiao@mit.edu, human@media.mit.edu

ABSTRACT

We have designed and built a pair of sneakers that each sense 16 different tactile and free-gesture parameters. These include continuous pressure at 3 points in the forward sole, dynamic pressure at the heel, bidirectional bend of the sole, height above instrumented portions of the floor, 3-axis orientation about the Earth's magnetic field, 2-axis gravitational tilt and low-G acceleration, 3-axis shock, angular rate about the vertical, and translational position via a sonar transponder. Both shoes transfer these parameters to a base station across an RF link at 50 Hz state updates. As they are powered by a local battery, there are no tethers or wires running off the shoe. A PC monitors the data streaming off both shoes and translates it into real-time interactive music. The shoe design is introduced, and the interactive music mappings that we have developed for dance performances are discussed.

1) Introduction

A trained dancer is capable of expressing highly dexterous control of his or her body during a performance. Rather than have the dancer follow music that has been prerecorded or performed, the field of interactive dance explores having the dancer generate or modify their own accompaniment, blurring the distinction between musician and dancer. This is an active field, with various types of sensing techniques being used to detect the dancer's movements. Perhaps the earliest such attempt was Theremin's Terpsitone [1], an application of capacitive sensing dating back to circa 1930. Most modern approaches use video tracking (e.g., [2]), which is beginning to work reliably enough for live performances. Although modern vision systems can begin to distinguish and follow individual parts of the body [3], a dancer is able to exert an enormous amount of control at the feet, which video systems are often unable to see or lack enough bandwidth to adequately track. Some efforts have employed sensing floors (e.g., [4]) to capture a dancer's foot pressure and position, but these only work when the foot is down and when the dancer is on the sensing surface. To obtain more detail, it becomes imperative to instrument the foot itself. The music field already has a history of electronic tap shoes (e.g., [5]), although these usually carry only simple piezo pickups at the toe and heel; capturing other degrees of freedom has been largely ignored. Outside of dance and music, various techniques have been used to acquire foot data, including fine-grained pressure measurements for shoe designers [6], mobile systems that warn patients with neuropathies at the sole [7], simple pressure-sensing shoes for golfers [8], pressure-measuring overshoes for virtual reality immersion [9], and foot-mounted inertial sensors for jogging pedometers [10]. The system that we've built blends many aspects of these approaches, measuring 16 tactile and free-gesture parameters at each foot and wirelessly transmitting real-time updates directly from each shoe.
" r - w,- -ilrb) en r C~ ~81~PA M Figure 1: Layout of shoe and sensor system 2) Sensor Hardware Our shoe design has evolved over the past couple of years, from a conceptual study [I1], to a prototype built into a Capezio Dansneaker [12], then to the current device based around a Nike jogging sneaker [13]. Fig. 1 shows a schematic of our present instrumented shoe, Fig. 2 shows a block diagram of the installed system, and Fig. 3 shows a photograph of the final device, with and without protective Lucite covers over the electronics. Tactile parameters are measured in a sensor-laden insole, depicted as a dotted line in Fig. 1. This insole uses force-sensitive resistors (FSR's) to measure continuous pressure at three points around the toes, including downward pressure at the left and right segments and pressure against the top of the shoe during pointing. ICMC Proceedings 1999 - 453 -

Figure 2: Diagram of Interactive Footwear System (top-level block diagram: left and right shoes linked by RF to master and slave base stations, each with packet recognition, line conversion, and sonar control/timing; a PAN oscillator; and an RS-232 link to the host)

A strip of PVDF piezoelectric foil at the rear of the sole measures the dynamic pressure at the heel, and a pair of back-to-back resistive bend sensors in the middle of the sole measures the sole's bi-directional bend. An electric field pickup antenna at the bottom of the insole capacitively couples into electric field signals transmitted from the stage, allowing the shoe's height to be determined [14] from the received signal strength.

The remainder of the sensor suite is located on a circuit card affixed to a metal mount on the side of the shoe, as depicted in Figs. 1 and 3 and seen close-up in Fig. 4. These include a vertical rate gyro that directly responds to twists and spins, a 2-axis low-G accelerometer that picks up tilt and general foot dynamics, a 3-axis magnetometer that gives orientation with respect to the local Earth's magnetic field, and a 3-axis, high-G piezoelectric accelerometer that gives directional response to rapid kicks and jumps. A 40 kHz sonar receiver is also mounted on this card; it receives pings from up to 4 ultrasound sources that can be located at different positions around the stage to track the dancer's translational position. The sonars ping every 100 msec and determine distance out to roughly 30 feet with a circa 1" resolution.

Figure 3: Photograph of instrumented shoes

Figure 4: Close-up of shoe electronics card

All data is 8-bit digitized by an onboard PIC16C711 microcomputer, then serialized and relayed via a small, low-power transmitter module to a base station located up to 100 meters away, depending on the local RF and antenna environment. Different frequencies are used for each shoe, which communicates to a corresponding base station, transferring 8-bit updates of all sensor values 50 times per second. A 9-Volt alkaline battery is also mounted on each shoe, providing enough power for circa 5 hours of use at the 50 mA draw. More details on the shoe sensor and electronics systems can be found in [13,15].
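The exact over-the-air packet format is not given here, only that each shoe streams 8-bit updates of all sixteen sensor values fifty times per second. As a minimal sketch of what the receiving side of such a link could look like, assuming a hypothetical frame of one sync byte followed by the sixteen data bytes (the sync value, field order, and names are illustrative, not the format actually used):

    // Sketch of decoding one shoe's 50 Hz sensor frame on the host.
    // ASSUMED framing: a 0xFF sync byte followed by 16 unsigned 8-bit sensor
    // values; the real packet layout is not documented in this paper.
    #include <array>
    #include <cstdint>
    #include <optional>

    struct ShoeFrame {
        uint8_t frontLeftPressure, frontRightPressure, toeTopPressure;  // insole FSRs
        uint8_t heelPvdf, soleBend, efieldHeight;
        uint8_t compassX, compassY, compassZ;                           // magnetometer
        uint8_t tiltPitch, tiltRoll;                                    // low-G accelerometer
        uint8_t shockX, shockY, shockZ;                                 // high-G accelerometer
        uint8_t spinGyro, sonarRange;
    };

    std::optional<ShoeFrame> decodeFrame(const std::array<uint8_t, 17>& raw)
    {
        if (raw[0] != 0xFF) return std::nullopt;   // lost sync; caller should resynchronize
        ShoeFrame f;
        f.frontLeftPressure  = raw[1];
        f.frontRightPressure = raw[2];
        f.toeTopPressure     = raw[3];
        f.heelPvdf     = raw[4];
        f.soleBend     = raw[5];
        f.efieldHeight = raw[6];
        f.compassX = raw[7];   f.compassY = raw[8];   f.compassZ = raw[9];
        f.tiltPitch = raw[10]; f.tiltRoll = raw[11];
        f.shockX = raw[12];    f.shockY = raw[13];    f.shockZ = raw[14];
        f.spinGyro   = raw[15];
        f.sonarRange = raw[16];  // scaling of this byte to physical distance is not specified here
        return f;
    }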

Fig. 5 shows some sample data acquired from the shoe system over intervals of about 2 seconds. At left is the pressure sensor data for a complete typical step, where one can see the dynamic PVDF signal at the heel starting the motion, then subtle bend in the sole, and concluding with pressure at the toes. At right are the tilt accelerometers, one shock sensor, and the gyro signals for the foot twisting about, jumping, and landing, which is nicely seen in the shock signal.

Figure 5: Sample data from shoe sensors (left column: front-left pressure, front-right pressure, rear PVDF, bend; right column: gyro, X shock, pitch, roll)

3) Dance Applications

This shoe has been used in a variety of dance applications for performers at varying levels of expertise. All software mappings employed ROGUS [16], a C++ MIDI library written at the MIT Media Laboratory.

Fig. 6 shows our first performance, at the MIT Media Lab's Wearable Computing Fashion Show in October 1997. Here, sounds triggered by the shoe (notes on the pressure sensors, transients on the shock sensor and gyro) augmented a background dance sequence, while other sensors gave additional effects (e.g., bend produced transposition, the compass panned sounds around, etc.). Only one shoe was used at the time, and our dancer quickly learned to control this limited musical palette.

Figure 6: Early shoe system at its MIT debut in 1997

After building a pair of new shoes, as seen in Fig. 3, we produced another demonstration piece that dispensed with the sequence and enabled the dancer to launch and modify a variety of continuous sounds, using many of the sensor systems simultaneously to explore multiparameter continuous control. More details on this mapping can be found in [13].
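How such continuous mappings are coded is not detailed here, so purely as a sketch of the idea (the MIDI-output routine, controller numbers, channel, and smoothing constant below are placeholders rather than the ROGUS calls actually used), the two tilt axes of one shoe could each be smoothed and streamed as MIDI continuous controllers at the 50 Hz update rate:

    // Sketch: streaming two tilt axes as MIDI continuous controllers with
    // simple exponential smoothing.  sendControlChange(), the controller
    // numbers, and the channel are assumed placeholders, not ROGUS calls.
    #include <cstdint>

    extern void sendControlChange(int channel, int controller, int value);  // placeholder MIDI output

    struct SmoothedController {
        int controller;         // MIDI CC number to drive
        double smoothed = 0.0;  // filter state
        double alpha = 0.3;     // smoothing per 50 Hz update (assumed)

        void update(uint8_t raw, int channel)
        {
            smoothed += alpha * (raw - smoothed);        // low-pass the 8-bit sensor value
            int value = static_cast<int>(smoothed) / 2;  // 0..255 -> 0..127
            sendControlChange(channel, controller, value);
        }
    };

    // Called once per sensor frame for one shoe.
    void mapTiltToEffects(uint8_t tiltPitch, uint8_t tiltRoll)
    {
        static SmoothedController pitchControl{74};  // e.g., filter cutoff (assumed CC)
        static SmoothedController rollControl{91};   // e.g., reverb depth (assumed CC)
        pitchControl.update(tiltPitch, 1);
        rollControl.update(tiltRoll, 1);
    }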
Our next piece was produced for a gymnast, who performed with the shoes at the Wearable Computing Fashion Show at NIKOGRAF in Tokyo during November 1998. This mapping was subsequently reworked with our choreography collaborator Byron Suber of Cornell University (seen with the shoes in Fig. 7) to produce a demonstration that was performed live at the 1999 International Dance and Technology Conference [13]. As it was a rich mapping that directly reflected the movement of the dancer in interesting ways, it was very expressive, and it has been used with various other artists, including a live improvising mime, who performed with the system for several days at the 1999 Tokyo Toy Fair.

Figure 7: Byron Suber testing the shoes at MIT

In this mapping, the right/left toes and heels produced various melodic tones in an assigned harmony; pressure sensor response from both feet must be present for these tones, thus ensuring that both are on the ground. Bend of the sole transposed these toe melodies by an octave, up or down depending on the bend direction, and pressure at the top toe sensors triggered cymbal crashes. The gyro picked up twirls, launching a cascading glissando (and a burst of white noise for very fast right-spins). The shock sensors launched orchestra hits, and the left foot's shock also turned off all notes and changed the harmony played by the toes. Forward tilt launched "sparkling" notes for the left foot and a digital pad sound for the right foot, while sideways tilt would adjust the octave ranges of sounds controlled by the corresponding front tilt. As the shoes approached the single sonar pinger used in this mapping, a cymbal/snare rhythm would start, growing louder with the dancer's approach. If the dancer stepped on the electric field transmitter, all sounds would stop, and a droning chord would fade up, increasing in volume as the foot was lifted away from the electrode (a different sound on each foot), with the chord voicing changing as the foot was rotated (as derived from the compass signal).
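The thresholds and note assignments behind this mapping are not published, so the fragment below is only a minimal sketch of its core rules (the both-feet-down gate and the bend-direction octave transposition), with assumed threshold values and placeholder MIDI and harmony routines rather than the actual performance code:

    // Sketch of the gating logic described above: melodic tones sound only
    // when both feet show forefoot pressure, and the sole's bend transposes
    // them by an octave.  Thresholds, note choices, and the MIDI/harmony
    // routines are assumed placeholders.
    #include <cstdint>

    extern void sendNoteOn(int channel, int note, int velocity);  // placeholder MIDI output
    extern int  nextMelodicNote();        // placeholder: next tone in the current harmony

    struct FootState {              // subset of one shoe's 50 Hz frame
        uint8_t forePressure;       // forefoot FSR
        uint8_t toeTopPressure;     // FSR against the top of the shoe
        uint8_t soleBend;           // bend sensor, ~128 assumed to mean flat
    };

    const uint8_t kPressureOn = 60;   // assumed threshold for "foot is planted"
    const uint8_t kBendUp     = 160;  // assumed reading for a clear upward flex
    const uint8_t kBendDown   = 96;   // assumed reading for a clear downward flex

    void handleStep(const FootState& left, const FootState& right)
    {
        static bool rightToeWasDown = false;

        // Melodic tones require pressure from both feet, ensuring the dancer is grounded.
        bool grounded = left.forePressure > kPressureOn && right.forePressure > kPressureOn;

        int transpose = 0;                                 // octave shift from sole bend
        if (right.soleBend > kBendUp)        transpose = +12;
        else if (right.soleBend < kBendDown) transpose = -12;

        // Trigger a tone on the rising edge of the right forefoot pressure.
        bool rightToeDown = right.forePressure > kPressureOn;
        if (grounded && rightToeDown && !rightToeWasDown)
            sendNoteOn(1, nextMelodicNote() + transpose, right.forePressure / 2);
        rightToeWasDown = rightToeDown;

        // The toe-top cymbal crash and the shock-triggered hits would follow
        // the same rising-edge pattern on their respective sensor channels.
    }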

We are now completing a new, complex mapping for a performance/demonstration at the American Dance Festival, with Byron Suber and music by David Borden, also of Cornell. The stage setup used in this performance is shown in Fig. 8.

Figure 8: Stage Layout for ADF Performance (regions A11, B12, A13, A31, B32, A33)

There will be two dancers in this performance, each with one active sensor shoe. The dancers will be able to select either a sequence or a looped music sample to play in the background by stepping into the A or B regions of Fig. 8 and tapping their foot so as to push the toe-top sensor. Regions A31 and A13 will start a musical sample playing; there are five samples of different musical excerpts in all (30-second loops), and these are selected by the region and range at which the trigger occurred. Toe-tapping in the B regions starts a MIDI sequence, each of which has 3 parts that the dancers can add and subtract, again by toe-tapping appropriately in the B regions. The pressure sensors in the sole will play notes or sounds appropriate to the current background, and these (together with the backing sequence, if appropriate) will transpose up and down with bend of the shoe. When a shoe is lifted and tilted, continuous audio effects will be proportionally added to a voice in the sequence or to the music sample (e.g., filtering, flanging, reverb, crossfading, or vibrato, depending on what is currently playing). Both angles of tilt are used, allowing one shoe to control the mix of two different effects. Different music samples and sequences cannot coexist; triggering one supersedes whatever is currently playing.

If a dancer is in zones A11 or A33, the sole's pressure sensors will play differently pitched speech phrases pronounced by a computer-generated voice, allowing the dancer to put musical sentences together by their movement. These phonemes are also transposed by the sole's bend, and effects are similarly introduced with tilt. Throughout the dance, different sonic events are tied to the shock accelerometer signals (e.g., jumps, leaps) and rate gyro, as in the mapping described earlier. When one of the dancers walks onto the electric field transmitter, they are able to pitch-bend the background music sample or tempo-shift the current music sequence by moving their active shoe up and down (effects will still be produced with tilt). The other shoe will have the same effect as in the previous mapping, turning off the background and producing orientation-dependent drones.

4) Conclusions

Future work will explore using new digital wireless networking standards (such as Bluetooth or IEEE 802.11), allowing many more sensor cards (e.g., instrumented shoes) to be accommodated. Use of a switching regulator will also enhance battery lifetime, and better integration of the electronics into the shoe will make the unit less obtrusive to the performer, although our performers are generally able to tolerate the current mounting. Future work will also benefit from combining the sensor shoes with other systems that detect upper-body motion, enabling the dancer to be totally immersed in an interactive environment. By running gesture-recognition algorithms on the real-time data, the mapping system can adapt to individual dancers and better learn their technique, useful in both interactive performance and training procedures.
5) Acknowledgements

We thank many of our Media Lab research colleagues, namely Ari Benbasat, Ari Adler, Josh Strickon, Andy Wilson, Chris Sae-Hau, Kaijen Hsiao, and Zoe Teegarden. We also thank Byron Suber and David Borden of Cornell University for their enthusiastic artistic collaboration. We appreciate the support of the Things That Think Consortium and other sponsors of the MIT Media Laboratory.

6) References

1. Mason, C.P., "Terpsitone: A New Electronic Novelty," Radio Craft (December 1936), p. 335.
2. Zacks, R., "Dances with Machines," Technology Review, May/June 1999.
3. Paradiso, J., Sparacino, F., "Optical tracking for music and dance performance," in Optical 3-D Measurement Techniques IV, A. Gruen and H. Kahmen, eds., Herbert Wichmann Verlag, Heidelberg, Germany, 1997, pp. 11-18.
4. Paradiso, J. et al. (1997). "The Magic Carpet: Physical Sensing for Immersive Environments." In Proc. of the CHI '97 Conf. on Human Factors in Computing Systems, Extended Abstracts (pp. 277-278). New York: ACM Press.
5. di Perna, A. (1988). "Tapping into MIDI." Keyboard Magazine, July 1988, p. 27.
6. Cavanagh, P.R., F.G. Hewitt Jr., J.E. Perry (1992). "In-shoe plantar pressure measurement: a review." The Foot, 2(4), 1992, pp. 185-194.
7. See: www.clevemed.com
8. See: www.pro-balance.com
9. Choi, I., C. Ricci (1997). "Foot-mounted gesture detection and its application in virtual environments." 1997 IEEE International Conference on Systems, Man, and Cybernetics: Computational Cybernetics and Simulation, Vol. 5, 12-15 Oct. 1997, pp. 4248-4253.
10. Hutchings, L.J. (1998). "System and Method for Measuring Movement of Objects." US Patent No. 5724265, March 3, 1998.
11. Paradiso, J., E. Hu (1997). "Expressive Footwear for Computer-Augmented Dance Performance." Proc. of the First International Symposium on Wearable Computers, Cambridge, MA, IEEE Computer Society Press, Oct. 13-14, 1997, pp. 165-166.
12. Paradiso, J., E. Hu, K.Y. Hsiao (1998). "Instrumented Footwear for Interactive Dance." Proc. of the XII Colloquium on Musical Informatics, Gorizia, Italy, September 24-26, 1998, pp. 89-92.
13. Paradiso, J., Hu, E., Hsiao, K.Y. (1999). "The Cybershoe: A Wireless Multisensor Interface for a Dancer's Feet." Proc. of International Dance and Technology 99 (IDAT99), Tempe, AZ, Feb. 26-28, 1999.
14. Paradiso, J., Gershenfeld, N. (1997). "Musical Applications of Electric Field Sensing." Computer Music Journal, 21(3), 1997, pp. 69-89.
15. Hu, E. (1999). "Applications of Expressive Footwear," MS Thesis, MIT Department of Electrical Engineering and Computer Science, Cambridge, MA, 1999.
16. Denckla, B. and P. Pelletier (1996). "The technical documentation for 'Rogus McBogus', a MIDI library." http://theremin.media.mit.edu/rogus/