Page 106

An Improvisational Accompaniment System Observing Performer's Musical Gesture

Yushi Aono†, Haruhiro Katayose††, Seiji Inokuchi†
† Dept. of Systems Engineering, Osaka University
†† Laboratories of Image Information Science and Technology
Tel: +81-6-850-6373 / E-mail:

ABSTRACT: In improvisational music, human accompanists extract patterns from the features of a soloist's playing and use them effectively in many real-time performance situations. The authors have developed a computer accompaniment system that models this ability of human accompanists. The system can draw out patterns from a soloist's musical gestures (combinations of chord, rhythm, phrase, and structural contour) and generate real-time accompaniment by using and reconstructing memorized patterns.

1. Introduction
Many interactive computer music systems have been developed in various directions. However, most of them ultimately fall into two major streams. The first group aims at composing computer music through the collaboration of a human player and a computer (Rowe R.). The second group aims at accompanying a human soloist as faithfully as possible using scores (Dannenberg R.)(Vercoe B.). The system we developed models another situation, such as music production in a small band. Only the human soloist acts as the composing or performing leader. The accompanists take his intention into consideration and try to make a more suitable response. This situation requires a characteristic interaction between the soloist and the accompanists.

2. Interaction and Patterns
In the aforementioned situation, a major aim of improvisation is to compose complex or well-formed pieces of music as the performance progresses. However, the members have only scattered information about the music they are going to perform, compared with playing from a score. So the musicians must interact in order to possess a common intention.

[Fig.1: the soloist displays initial patterns; the machine gives an initial response; interaction continues through compromising, following, going against, and trying or tricking.]
Through interaction, they try to fuse their performance and their understanding. In short, a better result requires closer interaction and a common understanding of the music being played. (See Fig.1)

The question now arises: what is the scheme of the human musicians' interaction? They interact smoothly and quickly by exchanging sounds and notes among the members. They listen carefully to the others' performances and detect various pieces of musical information, meaning musical primitives (such as a cluster of notes or a beat accent) and musical structures

(Fig.1) Fusion of understanding between musicians with passage

106 ICMC PROCEEDINGS 1995

(such as the start and end of a phrase). However, they do not use this information directly. They always put it into more abstract patterns, almost unconsciously. They memorize the assembly of these patterns as a piece of music, and they schedule the next musical tasks from an accumulation of patterns.

For the musicians, this interaction demands special abilities, including the abstraction of many details into patterns and the utilization of those patterns. The same abilities are needed for a computer accompanist, and the system is designed along this concept. (See Fig.2)

(Fig.2) A Stream of Recognition and Creation of Music [figure: musical gesture recognition and extraction; memorizing pieces by abstract patterns; source estimation; practice judgement; musical rules and constraints; background knowledge; creation]

3. Pattern Extraction and Utilization
In this system, musical gestures provide the connection between the raw notes and the patterns. Musical gestures mean chords, rhythms, phrases, and so on. The system automatically recognizes close notes as a chord, a relatively high velocity as a rhythm accent, and sequential notes as a phrase. The system constantly searches for closures in the musical gestures. A long interval between two notes, a sudden high velocity, bars, beats, and other elements strengthen the likelihood of a closure (Murao T.). The system groups the musical gestures between two closures as a pattern and memorizes it. The patterns previously cut off and memorized are compared with the initial part of a newly entered pattern to calculate similarity. If it is a known pattern, the system anticipates the next events. This anticipation enables the system to obtain many pieces of information, such as the position of the next space given to the accompanists (a phrase interval for an ornamental fill-in or a tutti), the position of scene changes, and so on. In short, the system can schedule future musical plans by using these patterns.
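The closure-based segmentation and prefix-matching anticipation described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the Note class, the thresholds GAP_SEC and ACCENT_VEL, the 0.8 match threshold, and the pitch-only similarity measure are all hypothetical simplifications of the paper's heuristics.

```python
# Sketch (hypothetical, not the authors' code): closure detection and
# pattern anticipation over a stream of MIDI-like note events.
from dataclasses import dataclass

@dataclass
class Note:
    onset: float   # onset time in seconds
    pitch: int     # MIDI note number
    velocity: int  # 0-127

GAP_SEC = 0.6      # a long inter-onset interval suggests a closure (assumed value)
ACCENT_VEL = 100   # a sudden high velocity suggests a closure (assumed value)

def is_closure(prev: Note, cur: Note) -> bool:
    """Heuristic closure test between two consecutive notes."""
    return (cur.onset - prev.onset) > GAP_SEC or cur.velocity >= ACCENT_VEL

def segment_patterns(notes):
    """Group a non-empty note stream into patterns delimited by closures."""
    patterns, current = [], [notes[0]]
    for prev, cur in zip(notes, notes[1:]):
        if is_closure(prev, cur):
            patterns.append(current)
            current = []
        current.append(cur)
    patterns.append(current)
    return patterns

def prefix_similarity(memorized, prefix):
    """Fraction of the entered prefix whose pitches match a memorized pattern."""
    n = min(len(memorized), len(prefix))
    if n == 0:
        return 0.0
    hits = sum(m.pitch == p.pitch for m, p in zip(memorized[:n], prefix[:n]))
    return hits / len(prefix)

def anticipate(memory, prefix, threshold=0.8):
    """If the prefix matches a memorized pattern, predict its continuation."""
    best, best_sim = None, 0.0
    for pattern in memory:
        sim = prefix_similarity(pattern, prefix)
        if sim > best_sim:
            best, best_sim = pattern, sim
    if best is not None and best_sim >= threshold and len(best) > len(prefix):
        return best[len(prefix):]   # the anticipated next events
    return None
```

With a memorized pattern whose pitches are [60, 62, 64], an incoming prefix [60, 62] matches it and `anticipate` returns the continuation (pitch 64), which is the information the system would use to schedule a fill-in or tutti before the soloist finishes the phrase.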
4. Summary
The proposed system shows that using patterns and predicting the next events is effective for real-time accompaniment. The system can accompany smoothly by comprehending the player's intention in the middle of his input. In comparison, simple accompaniment systems, which generate performance data only after a certain amount of the input note sequence has been obtained, cannot guarantee a truly real-time response.

References
(Rowe R.) Rowe R.: "Interactive Music Systems", MIT Press.
(Dannenberg R.) Dannenberg R.: "An On-Line Algorithm for Real-Time Accompaniment", Proc. ICMC, pp. 193-198 (1984).
(Vercoe B.) Vercoe B.: "The Synthetic Performer in the Context of Live Performance", Proc. ICMC, pp. 199-200 (1984).
(Murao T.) Murao T.: "Identifying Structural Tones Through An Objective Assessment of Closure Points", Proc. Summer Symposium of Japan Music and Computer Science Society, pp. 67-72 (1992).