Exemplar-based learning in an adaptive optical music recognition system

Ichiro Fujinaga
Peabody Conservatory of Music, Johns Hopkins University
1 E. Mt. Vernon Place, Baltimore, Maryland 21202, U.S.A.
ich@peabody.jhu.edu

Abstract

The learning process of an adaptive optical music recognition (AOMR) system is described here. By combining a k-nearest neighbor classifier and a genetic algorithm, the system can learn to recognize new music symbols and handwritten music notation, and it continually improves its accuracy in recognizing these objects. Given the wide range of music notation styles, these are essential characteristics of a music recognizer.

1. Introduction

The learning process of the adaptive optical music recognition system described here is part of a computer program for the recognition of musical notation. The strength of this system is its ability to learn new music symbols and handwritten notation. It also continually improves its accuracy in recognizing these objects by adjusting internal parameters. Given the wide range of music notation styles, these are essential characteristics of a music recognizer.

The present implementation of the adaptive system is built on an exemplar-based incremental learning system that classifies an unknown object by comparing it to the known examples already stored in its database. The entire process is based on two simple yet powerful algorithms: the k-nearest neighbor (k-NN) classifier and genetic algorithms. Using these algorithms, the system is designed to increase its accuracy over time as more data are processed.

2. Exemplar-based categorization

The exemplar-based model of categorization, which is analogous to the idea of "learning by examples," identifies objects by their similarity to one or more of the stored examples. This model is one of the fundamental models in human learning theory [Brooks, 1978; Hintzman, 1986; Medin and Schaffer, 1978; Reed, 1972], and many successful applications in machine learning have been reported [Aha, et al., 1991; Cost and Salzberg, 1993].

3. K-nearest neighbor classifier

The exemplar-based model can be implemented on a computer by the nearest-neighbor algorithm [Cover and Hart, 1967], a classification scheme that determines the class of a given sample from its features. Features are measurable attributes of an object that can characterize it differently from other objects. Currently, the system uses features of music symbols such as width, height, area, and central moments [Fujinaga, et al., 1991]. The combination of these features in a feature space yields a feature vector for each symbol. Distances between the feature vectors of an unclassified sample and previously classified samples are calculated; the class represented by the closest neighbor is then assigned to the unclassified sample. In the k-NN classifier, the class represented by the majority among the k nearest neighbors is assigned to the unknown symbol. "The nearest neighbor algorithm is one of the simplest learning methods known, and yet no other algorithm has been shown to outperform it consistently" [Cost and Salzberg, 1993, p. 76]. Aside from its simplicity and intuitive appeal, the classifier can easily be modified into a learning system.

4. Incremental learning

By continually adding the new samples that it "encounters" to its database, the k-NN classifier improves its performance, thus learning to identify the symbols more accurately.
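As an illustration of the scheme described above, the following minimal sketch implements a k-NN classifier that learns incrementally by storing every labelled sample it encounters. It is written in Python, which is not the language of the original system, and the feature values and class labels are invented for the example:

    import math
    from collections import Counter

    class IncrementalKNN:
        """Exemplar-based classifier: stores every labelled sample it
        encounters and classifies by majority vote among the k nearest."""

        def __init__(self, k=3):
            self.k = k
            self.exemplars = []   # list of (feature_vector, class_label)

        def add_exemplar(self, features, label):
            # Incremental learning: each newly confirmed symbol simply
            # becomes another stored example.
            self.exemplars.append((features, label))

        def classify(self, features):
            # Euclidean distance from the unknown sample to every exemplar.
            dists = sorted(
                (math.dist(features, f), label) for f, label in self.exemplars
            )
            votes = Counter(label for _, label in dists[: self.k])
            return votes.most_common(1)[0][0]

    # Hypothetical feature vectors (width, height, area) for music symbols.
    knn = IncrementalKNN(k=3)
    knn.add_exemplar([4.0, 12.0, 30.0], "quarter-rest")
    knn.add_exemplar([9.0, 9.0, 60.0], "notehead")
    knn.add_exemplar([8.5, 9.5, 58.0], "notehead")
    print(knn.classify([8.8, 9.2, 59.0]))   # -> "notehead"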
The recognition can be further enhanced by modifying the feature space or, equivalently, changing the weights in the distance measure. A commonly used weighted Euclidean metric between two vectors X and Y in an N-dimensional feature space is defined as:

    d = \sqrt{\sum_{i=1}^{N} w_i (X_i - Y_i)^2}

By changing the weights w_i, the shape of the feature space can be changed (see Figure 1).
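To make the effect of the weights concrete, the following sketch computes the weighted Euclidean metric defined above and shows how rescaling one axis changes which neighbor is nearest. The coordinates are hypothetical, chosen to mirror the situation in Figure 1, not taken from the paper:

    import math

    def weighted_distance(x, y, w):
        """Weighted Euclidean metric: d = sqrt(sum_i w_i * (x_i - y_i)^2)."""
        return math.sqrt(sum(wi * (xi - yi) ** 2
                             for wi, xi, yi in zip(w, x, y)))

    X  = (2.0, 2.0)   # unknown object
    Y2 = (2.5, 3.5)   # neighbor in class Y
    X1 = (4.0, 2.2)   # neighbor in class X

    # Equal weights: Y2 is the nearest neighbor of X.
    w = (1.0, 1.0)
    print(weighted_distance(X, Y2, w) < weighted_distance(X, X1, w))  # True

    # Stretching the vertical axis (larger w_2) penalizes Y2's vertical
    # offset, so X1 becomes the nearest neighbor instead.
    w = (1.0, 4.0)
    print(weighted_distance(X, X1, w) < weighted_distance(X, Y2, w))  # True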

Although the problem is simple for a two-dimensional case, i.e., using two features, when many features (up to 20) are used, the problem of determining the set of weights that results in the optimal recognition rate becomes extremely complicated. Since no known deterministic method for finding an optimal solution exists, some other technique is needed to address this problem.

[Figure 1. Unknown object X and its neighbors in feature spaces. By changing the weights in the distance metric, the shape of the feature space can be modified. In a), the nearest neighbor of X is Y2. In b), the nearest neighbor is X1, where the vertical axis has been scaled.]

5. Genetic algorithms

Genetic algorithms (GA) [Holland, 1975] are often used when an exhaustive search of the solution space is impossible or prohibitively expensive. The sets of weights are converted to "genes," and those that yield high recognition rates are made to survive in this pseudo-biological environment. Briefly, the initial environment is randomly populated. Through the process of selection, fit individuals (those that perform well) are mated to produce offspring, which will hopefully outperform their parents. Although GA do not guarantee the optimal solution, near-optimal results can be obtained relatively quickly, and preliminary experiments with the system have shown dramatic improvements in the recognition rate. By using GA from the beginning of the learning process, a set of good genes, i.e., sets of weights, is saved so that it can be used as the starting point for future selection processes. (A minimal sketch of this weight-optimization loop is given after the references.)

6. Conclusions

The exemplar-based model is an attractive and useful solution for the classification stage of AOMR. It also offers a promising alternative approach for many other types of pattern recognition, categorization, and learning tasks in music and other fields.

References

[Aha, et al., 1991] D. Aha, D. Kibler, and M. Albert. Instance-based learning algorithms. Machine Learning, 6: 37-66, 1991.

[Brooks, 1978] L. R. Brooks. Non-analytic concept formation and memory for instances. In E. Rosch and B. Lloyd, eds., Cognition and Categorization. Hillsdale, N.J.: Erlbaum, 1978.

[Cost and Salzberg, 1993] S. Cost and S. Salzberg. A weighted nearest neighbor algorithm for learning with symbolic features. Machine Learning, 10: 57-78, 1993.

[Cover and Hart, 1967] T. Cover and P. Hart. Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1): 21-27, 1967.

[Fujinaga, et al., 1991] I. Fujinaga, B. Alphonce, B. Pennycook, and K. Hogan. Computer recognition of musical notation. Proceedings of the International Computer Music Conference, 66-73, 1991.

[Hintzman, 1986] D. L. Hintzman. 'Schema abstraction' in a multiple-trace memory model. Psychological Review, 93: 411-428, 1986.

[Holland, 1975] J. H. Holland. Adaptation in Natural and Artificial Systems. Ann Arbor: University of Michigan Press, 1975 (reprinted by Cambridge, Mass.: MIT Press, 1992).

[Medin and Schaffer, 1978] D. L. Medin and M. M. Schaffer. Context theory of classification learning. Psychological Review, 85: 207-238, 1978.

[Reed, 1972] S. K. Reed. Pattern recognition and categorization. Cognitive Psychology, 3: 383-407, 1972.
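The following minimal sketch illustrates the weight-optimization loop described in section 5: weight vectors act as genes, fitter individuals are selected and mated, and the best weights are retained as a starting point for later runs. It is written in Python, and the toy fitness function is invented; in the actual system the fitness would be the recognition rate obtained by running the k-NN classifier with the candidate weights over labelled data:

    import random

    N_FEATURES = 5
    POP_SIZE   = 20

    def fitness(weights):
        # Stand-in for the real fitness (the recognition rate of the
        # k-NN classifier under these weights); this toy version simply
        # peaks when every weight equals 0.5.
        return -sum((w - 0.5) ** 2 for w in weights)

    def crossover(a, b):
        # One-point crossover of two weight vectors ("genes").
        cut = random.randrange(1, N_FEATURES)
        return a[:cut] + b[cut:]

    def mutate(gene, rate=0.1):
        # Small random perturbation of some weights.
        return [w + random.gauss(0, 0.1) if random.random() < rate else w
                for w in gene]

    # Randomly populate the initial environment with weight vectors.
    population = [[random.random() for _ in range(N_FEATURES)]
                  for _ in range(POP_SIZE)]

    for generation in range(50):
        # Selection: the fitter half survives as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[: POP_SIZE // 2]
        # Mating: offspring replace the less fit half.
        offspring = [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(POP_SIZE - len(parents))]
        population = parents + offspring

    # Near-optimal weights, saved as a seed for future selection runs.
    best = max(population, key=fitness)
    print(best)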