AN OVERVIEW OF THE RESEARCH APPROACHES ON MUSICAL PERFORMANCE ROBOTS
Jorge Solis, Atsuo Takanishi
Waseda University, Mechanical Engineering Department, Tokyo, Japan
E-mail: firstname.lastname@example.org

ABSTRACT

The relationship between humans and music has a long history dating from antiquity. Music constitutes an important means of communication in everyday life. Since the golden era of automata, music has also served as a means for understanding the human itself; in particular, the human organs required for playing musical instruments were mechanically simulated. Nowadays, thanks to advances in robot technology, research on Musical Performance Robots (MPR) aims at a better understanding of human motor control, as well as of how robots can convey expressive information to their human partners. Additionally, research on MPR is promoting the introduction of novel ways of musical expression that have been hidden behind the rubric of musical intuition. In this paper, an introduction to the research trends on MPR is presented, and examples of each trend are described.

1. INTRODUCTION

The relationship between humans and music has a long history dating from antiquity, during which poetry, dancing and music were inseparable and constituted an important means of communication in everyday life. During the golden era of automata (17th and 18th centuries), music also served as a means to understand how the human brain is able to coordinate multiple degrees of freedom in order to play musical instruments. In particular, some researchers were interested in studying how humans are able to play musical instruments such as the flute. As an example, the mechanical flute player developed by Vaucanson was used as a means for understanding the human breathing mechanism. Vaucanson presented "The Flute Player" to the Academy of Science in 1738 [1].
For this occasion he wrote a lengthy report carefully describing how his flutist could play exactly like a living person. This first automaton was a life-size figure capable of playing the flute, with a repertoire of twelve pieces that included "Le Rossignol" by Blavet. The technical details of its operation would take too long to recount; we therefore give only a broad outline of the mechanism [1]. The driving movement is composed of nine bellows, arranged in groups of three and connected to each other using levers and cords (Figure 1). Each group of three bellows sends air into a pipe leading to one of the compartments of a small tank arranged in the chest of the automaton. A cylinder, made up of points and bridges, is placed above the six lower bellows. Above the cylinder there is a keyboard provided with fifteen levers. Each lever carries at one end a nozzle that is raised as the points and bridges pass; the other end pulls on wires communicating with the moving parts of the automaton (fingers, lips, tongue, etc.).

Figure 1. Mechanism detail of "The Flute Player," the automaton developed by Vaucanson in order to understand the human respiratory system.

More recently, several researchers have developed different musical performance robots. The first attempt at developing an anthropomorphic musical robot was made at Waseda University in 1984: the WABOT-2 was able to play a concert organ [3]. Then, in 1985, the WASUBOT, also built at Waseda, could read a musical score and play a repertoire of 16 tunes on a keyboard instrument. Prof. Kato argued that an artistic activity such as playing a keyboard instrument would require human-like intelligence and dexterity.
After the development of WABOT-2, several researchers have presented different kinds of robots able to play conventional musical instruments (Figure 2). The anthropomorphic flutist robot, developed at Waseda University since 1990, has focused on understanding human motor control, facilitating communication between humans and robots in musical terms, and proposing novel applications for humanoid robots [4-6].

Figure 2. Examples of Musical Performance Robots: WABOT-2 (Waseda Univ., Japan) [3], Flutist Robot WF-4RII (Waseda Univ., Japan) [2], the Saxophone Robot (Hosei Univ., Japan) [8], the Violinist Robot (Univ. of Electro-Communications, Japan) [10], the Violinist Robot (Ryukoku Univ., Japan) [16], the Trumpet Robot (Toyota Co., Japan) [11] (http://www.toyota.co.jp/en/special/robot/), and the Percussionist Robot (Georgia Tech., USA) [9].

As a result of this research, the flutist robot is able to perform nearly as well as an intermediate flutist, as well as to interact with musical partners (e.g. playing in a duet, teaching basic skills to beginners, etc.). This year, we have developed the Waseda Flutist Robot No. 4 Refined III (WF-4RIII), which has a total of 43 DOFs designed to nearly emulate the physiology and anatomy of the human organs required to play the flute [4]. In particular, the simulated organs are: lips, lungs, neck, arms, oral cavity, tongue, throat, fingers and eyes. The Musician Robot (MUBOT), developed at the University of Electro-Communications in 1989, was designed to automatically play a violin or cello [7]. MUBOT was developed on the premise that music should be played by the robot without remodeling the musical instrument at all. This violin performance robot was developed to act both as an entertainment robot and as a performance robot, as an approach for studying robotics and musical engineering. Takashima, at Hosei University, has been developing various music performance robots able to play wind instruments such as the saxophone, trumpet, trombone and shakuhachi (a traditional Japanese bamboo flute) [8]. In particular, the saxophone-playing robot has been developed under the condition that the instrument played by the robot should not be changed or remodeled at all.
The robot is composed of an artificial mouth, a fingering mechanism and an air supply system. Due to the complexity of replicating the motion of human fingers, the fingering mechanism is composed of twenty-three fingers, so that each finger can press one key of the saxophone. Weinberg and Driscoll at the Georgia Institute of Technology have presented a percussionist robot that is able to interact with musical partners [9]. The robot, named HAILE, has two arms that are controlled to hit a drum. Both arms can strike different locations on the drum to change the pitch, and the velocity of the arm motion is controlled to change the volume of the produced sound. The right arm is driven by a solenoid actuator to produce fast hits, and the left arm by a linear motor to produce powerful hits. Shimojo has been developing a violin-playing robot at the University of Electro-Communications [10], composed of a commercial 7-DOF manipulator that holds the bow and a fingering mechanism with 2 DOFs. A bowing holder was designed and attached to the end-effector of the multi-link manipulator. More recently, companies such as Toyota have been introducing musical performance robots, such as the trumpet-playing robot, as novel forms of entertainment and of assistance for elderly care [11]. For this purpose, Toyota has developed artificial lips that move like human lips, and human-like hands that enable the robot to play the trumpet as humans do. The robot is also able to walk while performing. Other researchers have been developing automatic musical instruments from the musical engineering point of view. Sobh has also carried out research on developing robot musicians [12]; however, his approach is based on remodeling the musical instruments, in order to open new domains of research for musical engineering. His robot musician band is composed of the following instruments: drum, percussion, electric guitar, violin, cello, and several kinds of guitars, all automatically controlled to perform a score. Alford has developed a musician robot for playing the theremin, an electronic musical instrument that is played without physical contact [13]. Hayashi, from the Kyushu Institute of Technology, presented an automatic piano that is able to produce soft tones [14]. This automatic piano employs feedback control of the movement of the actuator used to strike each key. Singer et al. introduced the LEMUR musical robots [15]. LEMUR is a Brooklyn-based group of artists and technologists who create robotic musical instruments.

2. RESEARCH TRENDS ON MUSICAL PERFORMANCE ROBOTS

In recent decades, research on musical performance robots from the engineering point of view has particularly intensified. Thanks to advances in computers, electronics, sound processing and artificial intelligence, several examples of musical performance robots have been developed around the world. Most of them are designed to display the high level of motor skill required to perform on musical instruments. The research on musical performance robots therefore does not aim merely at developing sound-making mechanical devices that automatically play musical instruments; rather, several robotics researchers are focused on developing robots that display both human-like dexterity and some kind of intelligence in order to play musical instruments. Thanks to the possibility of programming a musical performance robot to produce a live performance by mechanical means, we can study different aspects related to robotic engineering, music engineering, human motor control, etc.
Thus, research on musical performance robots may open new ways of doing research, as well as create new ways of producing music. In particular, by analyzing current research trends on musical performance robots, we may observe the following research approaches:

1. Human-Robot Interaction: musical performance robots aim at understanding how humans can communicate at the emotional level of perception (in musical terms). In particular, some researchers have been interested in analyzing human musical expression to extract musical parameters that can then be applied to the robot's performance to express emotions and ideas. Examples of this approach are: the violinist robot developed by Shibuya [16], which is designed to reproduce an expressive performance by taking into account KANSEI (sensitivity-related) information; and the Waseda Flutist Robot WF-4RIII, developed by the authors [17], which has been designed to automatically generate an expressive performance by modeling a professional flutist using neural networks.

2. Human Motor Control: musical performance robots can be designed to nearly reproduce the anatomy and physiology of the human organs required for playing musical instruments. From this approach, the robot can be used as a benchmark for better understanding how humans are able to synchronize multiple degrees of freedom. In particular, some researchers have focused on developing anthropomorphic robots able to nearly imitate human playing. Examples of this approach are: the Waseda Flutist Robot WF-4RIII [4], which is designed to reproduce the human respiratory system, upper limbs and head; the percussionist robot HAILE [9], which reproduces the motion of the human forearm to hit the drum at different locations and velocities; and the Violin Musician Robot [10], which reproduces the motion of the human arm in order to hold the bow and control its trajectory to produce the sound.

3. Art/Entertainment: musical performance robots have contributed to finding new ways of musical expression that may have been hidden behind the rubric of "musical intuition". In this approach, basically two modes of musical expression can be found: passive and active performance. The first is realized by programming the robot simply to perform a musical melody with a human partner; in this case, the human's performance is actually synchronized to the robot's. Several examples of this mode can be found in the literature. The second mode is characterized by programming the robot to interpret gestures from a human partner, which are then translated by the robot into musical parameters (e.g. tempo, volume, etc.). Examples of this approach are: the percussionist robots developed by Suguro at IRCAM [18], which are controlled by recognizing human gestures using a data-suit; the Waseda Flutist Robot WF-4RIII, which is able to change some musical parameters in real time depending on inputs coming from an array of sensors attached to a professional flutist; and the percussionist robot HAILE [9], which is able to listen to players, analyze their music and use the results of this analysis to add improvisations.
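The gesture-driven "active performance" mode described above amounts to a real-time mapping from gesture features to musical parameters. The following Python sketch is purely illustrative: the feature names (hand height, motion speed), their normalized ranges, and the linear mapping are assumptions for the sake of the example, not taken from any of the systems cited here.

```python
# Hypothetical sketch of an "active performance" mapping: a human partner's
# gesture features, normalized to [0, 1], are translated into musical
# parameters (tempo in BPM, volume in [0, 1]) that a performance robot
# could apply in real time.

def gesture_to_musical_params(hand_height, motion_speed,
                              tempo_range=(60, 180),
                              volume_range=(0.2, 1.0)):
    """Map normalized gesture features to (tempo, volume)."""
    # Clamp inputs so noisy sensor readings stay within the valid range.
    hand_height = min(max(hand_height, 0.0), 1.0)
    motion_speed = min(max(motion_speed, 0.0), 1.0)
    # Linear interpolation within each musical parameter's range.
    tempo = tempo_range[0] + motion_speed * (tempo_range[1] - tempo_range[0])
    volume = volume_range[0] + hand_height * (volume_range[1] - volume_range[0])
    return tempo, volume

# Example: a fast, high gesture yields a quick, loud performance.
tempo, volume = gesture_to_musical_params(hand_height=0.9, motion_speed=0.75)
print(round(tempo), round(volume, 2))  # 150 0.92
```

In a real system such a mapping would be driven by a sensor stream (e.g. a data-suit or sensor array) and smoothed over time; the linear form here simply makes the gesture-to-parameter translation explicit.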
4. Education: musical performance robots can also be used as musical education tools to transfer skills to beginners. In this approach, the advantage of designing human-like robots is that they may help students understand more easily the skills required to play a musical instrument. An example of this approach is the Waseda Flutist Robot, which is able to analyze the musical parameters of a beginner flutist's performance in order to provide advice [6].

3. CONCLUSIONS

In this paper, an overview of the research on musical performance robots has been presented, and the research trends in this field have been pointed out. Basically, musical performance robots have been used for facilitating the study of human-robot interaction, better understanding human motor control, and introducing novel ways of musical expression. Examples of each research approach were given to simplify its understanding. The future of research on musical performance robots seems quite promising: new ways of musical art can be conceived, more advanced cognitive functionalities can be developed to enable robots to interact with human players, etc.

4. REFERENCES

[1] Doyon, A., Jacques Vaucanson: mécanicien de génie. PUF, 1966.
[2] Solis, J., Suefuji, K., Chida, K., Taniguchi, K., Takanishi, A., "The mechanical improvements of the anthropomorphic flutist robot WF-4RII to increase the sound clarity and to enhance the interactivity with humans," in Proc. of the 16th CISM-IFToMM Symposium on Robot Design, Dynamics, and Control, pp. 247-254, 2006.
[3] Kato, I., Ohteru, S., Kobayashi, H., Shirai, K., Uchiyama, A., "Information-power machine with senses and limbs," in Proc. of the CISM-IFToMM Symposium on Theory and Practice of Robots and Manipulators, pp. 12-24, 1973.
[4] Solis, J., Chida, K., Suefuji, K., Takanishi, A., "The development of the anthropomorphic flutist robot at Waseda University," International Journal of Humanoid Robotics, Vol. 3(2), pp. 127-151, 2006.
[5] Solis, J., Chida, K., Suefuji, K., Taniguchi, K., Hashimoto, S.M., Takanishi, A., "Imitation of human flute playing by the anthropomorphic flutist robot WF-4RII," Computer Music Journal, Vol. 30(4), 2006.
[6] Solis, J., Suefuji, K., Taniguchi, K., Takanishi, A., "Towards an autonomous musical teaching system from the Waseda Flutist Robot to flutist beginners," in Proc. of the IEEE/RSJ Int. Conference on Intelligent Robots and Systems - Workshop: Musical Performance Robots and Its Applications, pp. 24-29, 2006.
[7] Kajitani, M., "Development of musician robots," Journal of Robotics and Mechatronics, Vol. 1, pp. 254-255, 1989.
[8] Takashima, S., Miyawaki, T., "Control of an automatic performance robot of saxophone: performance control using standard MIDI files," in Proc. of the IEEE/RSJ Int. Conference on Intelligent Robots and Systems - Workshop: Musical Performance Robots and Its Applications, pp. 30-35, 2006.
[9] Weinberg, G., Driscoll, S., "The perceptual robotic percussionist: new developments in form, mechanics, perception and interaction design," in Proc. of the ACM/IEEE Int. Conference on Human-Robot Interaction, pp. 97-104, 2007.
[10] Kuwabara, H., Seki, H., Sasada, Y., Aiguo, M., Shimojo, M., "The development of a violin musician robot," in Proc. of the IEEE/RSJ Int. Conference on Intelligent Robots and Systems - Workshop: Musical Performance Robots and Its Applications, pp. 18-23, 2006.
[11] http://www.toyota.co.jp/en/special/robot/
[12] Sobh, T.M., Wang, B., Coble, K.W., "Experimental robot musicians," Journal of Intelligent and Robotic Systems, Vol. 38, pp. 197-212, 2003.
[13] Alford, A., Northup, S., Kamakura, K., Chan, K.W., Barile, J., "Music playing robot," in Proc. of the International Conference on Field and Service Robots, pp. 174-178, 1999.
[14] Hayashi, E., "Development of an automatic piano that produces the appropriate touch for the accurate expression of a soft tone," in Proc. of the IEEE/RSJ Int. Conference on Intelligent Robots and Systems - Workshop: Musical Performance Robots and Its Applications, pp. 7-12, 2006.
[15] Singer, E., Feddersen, J., Redmon, C., Bowen, B., "LEMUR's musical robots," in Proc. of the Conference on New Interfaces for Musical Expression, pp. 183-184, 2004.
[16] Shibuya, K., "Analysis of human KANSEI and development of a violin playing robot," in Proc. of the IEEE/RSJ Int. Conference on Intelligent Robots and Systems - Workshop: Musical Performance Robots and Its Applications, pp. 13-17, 2006.
[17] Solis, J., Suefuji, K., Taniguchi, K., Ninomiya, T., Maeda, M., Takanishi, A., "Implementation of expressive performance rules on the WF-4RIII by modeling a professional flutist performance using NN," in Proc. of the IEEE/RAS Int. Conference on Robotics and Automation, pp. 2552-2557, 2007.
[18] Goto, S., "The case study of an application of the system 'BodySuit' and 'RobotMusic': its introduction and aesthetics," in Proc. of the Int. Conference on New Interfaces for Musical Expression, pp. 292-295, 2006.