The Use of Active Tactile and Force Feedback in Timbre Controlling Electronic Instruments

Bert Bongers
Royal Conservatory, Sonology Department, The Hague, Netherlands
email@example.com

Abstract

This paper focuses on the human interface through which we control synthesizers. With the first modular synthesizers it became possible, for the first time in history, to separate the interface from the actual sound source. It therefore makes sense to design new interfaces, new instruments, unbounded by the sound source. Due to this separation, however, the connection between the sound and the feel of the instrument was lost. Our goal is to restore that relationship and, moreover, to make this haptic feedback programmable.

1 Introduction

The keyboard, having proven a versatile interface on many instruments over the centuries, has become the de facto standard through which most functions of a synthesizer are controlled. The pitch and modulation wheels found on most of today's commercial synthesizers are poor controls compared to the features added to the keyboard in historical examples, like the "pitch-band" on the Ondes Martenot [Ruschkowski, 1990] or the timbre controllers on Hugh le Caine's Electronic Sackbut [Young, 1984]. The latter instrument incorporated touch sensitivity on the keys, a concept later explored in more depth by Robert Moog [Moog, 1987]. Likewise, the "aftertouch" on modern synthesizers can only be called rudimentary compared to these earlier attempts to make the keyboard more sensitive. Other instrument forms, based on traditional (acoustic) instruments, have also been used; their advantage is that already acquired musical skills can be exploited. But all of these instrument forms have their own idiosyncrasies, imposed by the inherent process of sound generation. It is also possible to design entirely new instrument forms, and in fact it is a very obvious thing to do.
The electronic sound source does not impose anything on the form of the interface. The whole realm of electronic sensors (and actuators) is available, ranging from ordinary switches and pots to accelerometers, pressure pads, mercury switches, position detectors and the like. This offers the opportunity to design an instrument that takes the human being as its starting point, instead of being based on the sound source's peculiarities. Strangely enough, the first electronic musical instruments, like the Theremin and the Trautonium built in the beginning of this century, already employed new interface forms. Well-known examples of recent approaches are Don Buchla's Thunder, Michel Waisvisz's Hands, and the various systems based on cameras or gloves which directly measure body movements. But inventing a new instrument form is not only obvious, it is also very difficult, because there are no guidelines for finding an appropriate shape. One way to start, which led to the instruments mentioned above, is to determine which body movements to build on; among the most intuitive movements are the ones that have proven to work on traditional instruments. Using tools like STEIM's SensorLab or the Doepfer1 Voltage-to-MIDI converter, it is then trivial to translate the electric signals from the sensors which track these physical quantities into MIDI. When traditional instrument forms are applied to control electronic sound sources, however, much of the original sensitivity is lost. The control most players of acoustic instruments have over their sound (especially its timbre) can make many electronic musicians jealous (or afraid, depending on the artist's nature). Our goal is to design an instrument that is both intuitive (also for novice users) and profound in its control over as many parameters of the sound as possible.

2 Instruments built at Sonology

Over the last few years, various instruments have been built at Sonology.
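The sensor-to-MIDI translation step mentioned above can be sketched in a few lines. This is an illustrative Python sketch, not the SensorLab's or Doepfer converter's actual behaviour: it assumes a 10-bit analog reading (0-1023) and wraps the scaled value in a MIDI Control Change message; the controller number and channel are arbitrary.

```python
def sensor_to_cc(raw, cc_number=1, channel=0, adc_max=1023):
    """Scale a raw sensor reading to a 3-byte MIDI Control Change message."""
    value = round(raw / adc_max * 127)   # 10-bit reading -> 7-bit MIDI value
    value = max(0, min(127, value))      # clamp to the legal MIDI range
    status = 0xB0 | (channel & 0x0F)     # Control Change status byte
    return bytes([status, cc_number & 0x7F, value])

# A half-depressed pressure pad (reading 512) becomes roughly mid-range:
msg = sensor_to_cc(512)
```

Any of the sensors listed above (pots, pressure pads, position detectors) reduces to the same pattern: read a voltage, scale it, emit a MIDI message.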
Some are based on (the exploration of) movements and gestures, others are complete instruments that one can pick up and play with.

ICMC Proceedings 1994 171 Interactive Performance

2.1 The Web

One of these alternate controllers is The Web, an instrument form devised by STEIM's artistic director (and guest researcher at Sonology) Michel Waisvisz. It resembles the shape of a spider's web, though slightly bigger in scale (diam. 120 cm), and consists of six radials and two circles in a frame. In 24 of the resulting string-parts, the physical tension caused by the player is measured by in-house designed sensors (based on a moving encapsulated Neodymium magnet and a Hall-effect sensor). The wires are actually harp-strings (the F), introducing a musical feel already. The player has continuous control over 24 parameters of the sound, making it a very good timbre controller with which to mould, as it were, the sound. As in traditional instruments, these parameters are linked (in a fixed configuration, due to the web structure). A more detailed discussion of Michel Waisvisz's seminal ideas behind this instrument can be found in [Krefeld, 1990].

2.2 The MIDI-conductor

The MIDI-conductor, a more generic version of The Hands, was devised by Michel Waisvisz and developed in co-operation with STEIM. These controllers (six pairs were built) are used by many people in different ways; most notable in this context is Sonologist Edwin van der Heide, who uses the instrument to control the pitch and timbre of his real-time DSP-implemented harmonizer.

2.3 Glove controllers

Another instrument is a glove equipped with bend sensors on the fingers and ultrasonic position sensors covering two axes in space. The glove (or rather, the gauntlet) is a customized Mattel PowerGlove [Gold, 1992], connected to the SensorLab in order to read the bend sensors on the fingers continuously (the Mattel electronics cannot do that, and hence this limitation persists in the Doepfer MOGLI or the GoldBrick2). At this stage, our glove gives the player continuous control over seven parameters which are only slightly linked. Sonologist Wart Wamsteker uses the glove to control filters and effect processors (harmonizer and reverb) through which a signal is fed back. Various performances with the instrument have proven that it gives very intuitive, sensitive and powerful control over the sound.
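The fixed linkage that the web structure imposes on The Web's 24 parameters (contrasted with the glove's only slightly linked seven) can be sketched as a simple coupling model: pulling one string-part also raises the measured tension of its neighbours. The ring topology and the 0.5 coupling factor below are illustrative assumptions, not measurements of the actual instrument.

```python
def coupled_tensions(pull, n=24, coupling=0.5):
    """Map applied tensions {string index: 0..1} to n coupled sensor values."""
    values = [0.0] * n
    for i, t in pull.items():
        values[i] += t
        values[(i - 1) % n] += coupling * t   # neighbouring string-parts
        values[(i + 1) % n] += coupling * t   # also feel part of the pull
    return [min(1.0, v) for v in values]      # clamp to sensor range

# Pulling string 3 alone also moves the sensors on strings 2 and 4:
v = coupled_tensions({3: 0.8})
```

Such a fixed relationship is exactly what makes the instrument feel coherent; making it programmable is the subject of section 3.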
A more elegant pair of gloves has been built for Walter Fabeck, to be used as a virtual keyboard. Further development of the system took place at STEIM; Walter had a light-emitting template keyboard built at home in London, and currently calls the instrument the "Chromasone". Yet another use of the SonoloGloves has been devised by Sonology's Artistic Director, Clarence Barlow: through gesture-recognition software it is possible to control a computer running speech synthesis software. By implementing the deaf-and-dumb alphabet, it would be possible to read sign language, though this concept has been explored before with the Talking Gloves [Aukstakalnis & Blatner, 1993] and IRCAM's Dada Glove [Chabot, 1993]. The glove as a timbre controller approximates the ultimate form, which might be something like clay (SonoloPutty), with programmable relationships between the parameters.

2.4 The 3-DOF pedal

An A4-sized keyboard consisting of 122 keys (microswitches) has been built for Sonologist Harry Fortuin, to play microtonal tunings. The keyboard is completed with two pedals to control sound parameters such as velocity. These custom-built pedals each have three degrees of freedom (3-DOF: pitch, yaw and roll, i.e. the rotations in the three planes of three-dimensional space), to give the player profound foot control over the sound. The importance of the often overlooked possibilities of foot control has been emphasized by Bill Buxton [Buxton, 1986].

2.5 Myoelectrical Control

The possibilities of using the electric signals emitted by contracted muscles (electromyographic signals, EMG) are being researched. Existing systems such as the BioMuse [Tanaka, 1993] [Gibbs, 1993] and the WaveRider of Jonathan Purcell and Bruce Ackerman are promising. These systems also read brainwaves (EEG) or skin resistance (another system, Psychic Lab's IBVA, does only that), but our first interest is muscle tension because it is (for our Western minds) easier to control.
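A common way to turn the raw EMG signal discussed above into a usable control value is to rectify it and smooth it into an amplitude envelope that tracks muscle tension. The following Python sketch is illustrative only; the window length is an assumption, and real systems typically do this stage in analog hardware.

```python
def emg_envelope(samples, window=8):
    """Full-wave rectify an EMG sample stream and return its moving-average envelope."""
    rectified = [abs(s) for s in samples]   # muscle activity regardless of polarity
    env = []
    for i in range(len(rectified)):
        chunk = rectified[max(0, i - window + 1): i + 1]
        env.append(sum(chunk) / len(chunk))  # average over the trailing window
    return env

# A sustained contraction (alternating-polarity activity) yields a steady envelope,
# which could then be scaled to a sound parameter:
env = emg_envelope([1, -1, 1, -1])
```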
Sonologist Erik Stalenhoef wants to be able to glue up to eight electrodes on his body to control sound. Initial experiments, carried out with a commercially available EMG measuring device and the SensorLab, were promising. In co-operation with STEIM we will develop an electrically isolated eight-channel amplifier/integrator to hook the sensors up to the SensorLab.

3 Tactile and force feedback

Another very important thing that has been lost in the transition from acoustical to electronic musical instruments is tactile and force feedback. The relationship between sound and feel is distorted; any sound selected on your DX7 will have the same keyboard action. Haptic cues, however, are very important, as information on sound parameters can be conveyed from the sound source to the player. Haptic and kinaesthetic perception consists of two parts which work closely together: the tactile part (the mechanoreceptors in the skin) and the proprioceptive part (force sensed in one's joints and muscles). Tactile and proprioceptive cues are known to be very fast [Puckette, Settel 1993] and can enhance the perception of the related sound. A lot of research on this matter is being carried out in the Virtual Reality industry [Rheingold, 1991] [Kalawsky, 1993]; this knowledge, as well as the developed technology, can be used for musical applications. In spite of the oxymoronic and overpretentious name, VR is very much about human-machine interfacing, and one could regard a musical instrument as the most exacting interface.

3.1 Touch

Our cutaneous sensitivity is mediated by four different mechanoreceptors in the skin. They have either punctate sensitivity (the fast-adapting Meissner corpuscles and the slower-adapting Merkel disks) or diffuse sensitivity (the rapidly adapting Pacinian corpuscles and the slowly adapting Ruffini cells). Furthermore, movement of hair is sensed by hair root plexuses, and the skin is sensitive to temperature and pain. The most sensitive parts of the body are the skin of the fingertips and the lips. Many acoustical instruments provide the player with pitch information through vibrating strings (e.g. through the cellist's left-hand fingertips) or the vibration of a reed (e.g. through the oboist's lips), which makes it easier to intonate [Chafe 1993]. Some preliminary experiments with piezos have been done to investigate the use of active vibrotactile feedback, e.g. to provide the player of an electronic musical instrument with tactile information on the played pitch.

3.2 Proprioception

Proprioception, or kinaesthesia, is our awareness of the movement and position of parts of the body. We sense force applied to the body in the joints, muscles and tendons. This can be important for the reproduction of gestures.
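The vibrotactile pitch feedback explored with the piezos in 3.1 requires a mapping from the played note to a drive frequency the skin can actually feel. One possible sketch, with illustrative band limits chosen around the skin's vibrotactile range (fingertip sensitivity peaks near 250 Hz), folds the note's frequency by octaves into that band so that pitch class is preserved:

```python
def pitch_to_tactile_hz(midi_note, lo=25.0, hi=400.0):
    """Map a MIDI note to a piezo drive frequency inside the tactile band."""
    f = 440.0 * 2 ** ((midi_note - 69) / 12)   # equal-tempered frequency
    while f > hi:
        f /= 2                                  # fold down an octave
    while f < lo:
        f *= 2                                  # fold up an octave
    return f

# A' (MIDI 69, 440 Hz) lies above the band and folds down to 220 Hz:
drive = pitch_to_tactile_hz(69)
```

Octave folding is only one of many possible relationships; which mapping feels most natural is exactly the kind of question the programmability of this feedback lets us study.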
The use of active proprioceptive feedback has been explored at ACROE (Grenoble) [Cadoz, Lisowski and Florens, 1990] and at CCRMA (Stanford) [Gillespie, 1992], oddly enough again on keyboards.

3.3 Active Feedback

Bringing active tactile/force feedback into the instruments will provide the player with information about the sounds not only by hearing them, but also by touching them. The feedback will be programmable, which introduces a new problem: what should the relationship be? To explore that, we have done experiments with motors and solenoids, as well as with Muscle Wire3, an electrically controllable shape memory alloy. We used Max to control these actuators through a Doepfer MIDI-to-voltage converter. The Muscle Wires are very appropriately named: they are very thin (typ. Ø 100 µm) and perform in a very organic way, shortening in length when electricity is supplied. We will build these wires into the glove to control the flexibility of the fingers, thus providing the instrument with force feedback. The relationship between sound and feel could be designed in such a way that, for example, the parameters of the sound (e.g. its attack) dictate the amount of effort the player needs to bend his fingers. Based on the same technology are the Tactors4, little plates that fit on the fingertips and transform an electric signal into a haptic cue. We can use them for tactile feedback, for example to make the interruption of a light beam (triggering a sound) tangible. Another use is to let the player feel (not only hear) the movement through a space as detected by ultrasonic distance sensors. Anyone who has ever tried to reproduce a gesture in space knows how difficult a task it is if no tactile or (external) proprioceptive information is supplied.

4 Conclusions

Tactile and force feedback is very important in the process of controlling sound.
When using active feedback, an intuitive relationship between the sound and the feel, as perceived by the player, should be provided. It is most obvious to relate the feel of a sound-producing gesture to the envelope of that sound, as is the case in acoustical instruments. Though it is already difficult to devise new instrument forms, it is important to incorporate tactile and force feedback at that same stage, because it can make the instrument more intuitive to use. I would call this tangible sound.

5 Acknowledgements

The Web and the 3-DOF pedal were built by mechanical engineer Theo Borsboom, as were some parts of the other instruments. The gesture-recognition software for the glove was written by Jos Mulder. The ability to virtually see electrons, as possessed by our chief engineer Jo Scherpenisse, helped me through a lot of electronic problems. The EMG electrodes and measuring device were kindly provided by Piet Luitwieler of Vickers Medical Equipment BV. Tim Perkis and Scott Gresham-Lancaster provided me with some PowerGloves, on which I did some serious surgery in order to rip their sensors out (the gloves that is, not these friends!). Scott also suggested that I use Muscle Wire. This paper would not have happened at all without the encouragement and support of Paul Berg.

Notes

1 Doepfer Musik Elektronik GmbH. Dieter Doepfer is the designer of many MIDI-to-the-real-world (and vice versa) kits, obtainable through Ocean MIDI-Musiksysteme, Offenbach, Germany.
2 The GoldBrick(TM) is an interface box made by Transfinite Systems Co. (Cambridge, MA) that translates the Nintendo data format into RS-232; there is a Max object that reads the glove.
3 Muscle Wire is a registered trademark of Mondo-tronics (San Anselmo, CA). [Gilbertson, 1992]
4 The Tactors are made by the TiNi Alloy Company (San Leandro, CA).

6 References

[Aukstakalnis & Blatner, 1993]: Steve Aukstakalnis and David Blatner. Silicon Mirage: The Art and Science of Virtual Reality. Peachpit Press, Inc., Berkeley, CA, pp. 166-167, 1992.
[Buxton, 1986]: William Buxton. There's More to Interaction Than Meets the Eye: Some Issues in Manual Input. In: Ronald M. Baecker and William A. S. Buxton, Readings in Human-Computer Interaction: A Multidisciplinary Approach. Morgan Kaufmann Publishers, Inc., San Mateo, CA, pp. 366-375, 1987.
[Cadoz, Lisowski and Florens, 1990]: Claude Cadoz, Leszek Lisowski and Jean-Loup Florens. Modular Feedback Keyboard. ICMC Proceedings 1990, pp. 379-382, 1990.
[Chabot, 1993]: Xavier Chabot. To Listen and To See: Making and Using Electronic Instruments. Leonardo Music Journal, Vol. 3, pp. 11-16, 1993.
[Chafe, 1993]: Chris Chafe. Tactile Audio Feedback. ICMC Proceedings 1993, Tokyo, pp. 76-79, 1993.
[Gibbs, 1993]: W. Wayt Gibbs. Body English. Scientific American, August 1993, pp. 94-96.
[Gilbertson, 1992]: Roger G. Gilbertson. Muscle Wires Project Book. Mondo-tronics, San Anselmo, CA, 1992.
[Gillespie, 1992]: Brent Gillespie. The Touchback Keyboard. ICMC Proceedings 1992, San Jose, CA, pp. 447-448, 1992.
[Gold, 1992]: Rich Gold. It's Play Time. In: Linda Jacobson (Ed.), CyberArts: Exploring Art and Technology. Miller Freeman Inc., New York, pp. 196-202, 1992.
[Kalawsky, 1993]: Roy S. Kalawsky. The Science of Virtual Reality and Virtual Environments. Addison-Wesley, 1993.
[Krefeld, 1990]: Volker Krefeld. The Hand in The Web: An Interview with Michel Waisvisz. Computer Music Journal, Vol. 14, No. 2, pp. 28-33, 1990.
[Moog, 1987]: Robert Moog. Position and Force Sensors and Their Application to Keyboards and Related Control Devices. Proceedings of the AES 5th International Conference, pp. 173-181, 1987.
[Puckette, Settel 1993]: Miller Puckette and Zack Settel. Nonobvious Roles for Electronics in Performance Enhancement. ICMC Proceedings 1993, Tokyo, pp. 134-137, 1993.
[Rheingold, 1991]: Howard Rheingold. Virtual Reality. Mandarin Paperbacks, Great Britain, 1991.
[Ruschkowski, 1990]: André Ruschkowski. Soundscapes. Lied der Zeit Musikverlag, Berlin, pp. 27-30, 1990.
[Tanaka, 1993]: Atau Tanaka. Musical Technical Issues in Using Interactive Instrument Technology with Application to the BioMuse. ICMC Proceedings 1993, Tokyo, pp. 124-126, 1993.
[Young, 1984]: Gayle Young. Hugh le Caine's 1948 Sackbut Synthesizer. ICMC Proceedings 1984, Paris, pp. 203-212, 1984.