University of Leeds Interdisciplinary Centre for Scientific Research in Music

Kia Ng
Interdisciplinary Centre for Scientific Research in Music (ICSRiM), School of Computing & School of Music, University of Leeds, Leeds LS2 9JT, UK
email: kia@kcng.org
web: www.kcng.org

Abstract

This report presents a brief background of the University of Leeds Interdisciplinary Centre for Scientific Research in Music (ICSRiM, www.ICSRiM.org.uk) and discusses various ongoing research projects, including optical music recognition and restoration, interactive multimedia performance with virtual and augmented musical interfaces, and upcoming projects exploring the relationships between physical gesture and musical gesture. It also describes the related programmes of study (taught and research) available at the University, and the newly completed building for the School of Music and the Centre.

1 Introduction

In 1987, the School of Music established an Electronic Studio to provide undergraduate and postgraduate programmes and research in electronic and computer music. Research and development in this area has been steadily increasing across the University, leading to direct collaboration between artists and scientists within and outside the School. This formed the basis of the Interdisciplinary Centre for Scientific Research in Music (ICSRiM), founded in 1999. ICSRiM currently involves members of the Schools of Computing, Music, Electronic and Electrical Engineering, Mathematics, Earth Sciences, Sport and Exercise Sciences, Psychology, and Physics & Astronomy, together with external members from other academic institutions, freelance artists and industrial collaborators. We are keen to establish new contacts in any relevant discipline to add to the growing list of national, international, academic and commercial collaborators. For further information about the Centre, please see the Centre website at www.icsrim.org.uk or contact info@icsrim.org.uk.
2 New Building

A new physical centre was completed in the Music School building in January 2003, with major funding support from HEFCE (the Higher Education Funding Council for England), SRIF (the Science Research Investment Fund) and the University. Figure 1 presents some views of the new building, which is linked to the Clothworkers Centenary Concert Hall (CCCH). Figure 2 shows some interior views of the building, presenting:
* the foyer which interlinks the old and new parts of the building, joining the new building with the Clothworkers Centenary Concert Hall (top photo of Fig. 2); and
* the multimedia laboratory on Level 1 of the new building (bottom photo of Fig. 2).
Together with the Multimedia Laboratory, three ICSRiM laboratories and a machine room occupy Level 1 of the new building, and five Electronic Studios are located in the basement.

Proceedings ICMC 2004

Figure 2. CCCH foyer (top) and Multimedia Laboratory (bottom).

3 Research Themes

ICSRiM provides a venue for research and development in a wide range of interdisciplinary research areas, including:
* Analysis, encoding and transcription of musical information
* Creative human-computer interaction, interactive gesture music, virtual and augmented instruments
* Music psychology and its technological applications
This section presents various ongoing projects, current activities and future plans of ICSRiM.

3.1 Music Imaging

This group of projects focuses on the translation of paper-based music scores (printed music scores and handwritten music manuscripts) into machine-readable representations. Applications include digital archiving, reprinting, restoration, preservation, distribution, digital libraries and others (Ng 2002, Ng 2003). The main project is supported by the UK Arts and Humanities Research Board (AHRB).

Interactive MUSICNETWORK

We are currently involved in a comparative study of available commercial OMR systems. The issues of Optical Music Recognition (OMR, or Music OCR) are complex: informative and consistent evaluation of OMR systems is non-trivial owing to the complexity and diverse range of music notation. In order to provide a coherent review, a ground-truth dataset is vital, among many other necessities (e.g. precise terminology and an assessment method). Ideally this dataset should include a wide range of sample music scores from all major music publishers, containing a representative range of fonts, layout styles, page sizes and many other variations. These issues are currently being investigated by a new initiative called Interactive MUSICNETWORK, supported by the European Commission, with a specialist working group on Music Imaging. For more information about the project, and to join the network, please see the project website at www.interactivemusicnetwork.org.
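To illustrate the kind of assessment method such a review requires, the following is a minimal sketch (not the working group's actual methodology, which this report does not specify) of symbol-level OMR evaluation: the recognised output is compared against a ground-truth symbol list and precision, recall and F-measure are reported. All symbol labels and data below are hypothetical.

```python
# Symbol-level OMR evaluation sketch: compare a recognised symbol list
# against a ground-truth list and report precision/recall/F-measure.
# Illustrative only; real evaluations must also handle symbol position,
# ordering and structural context, which this sketch ignores.
from collections import Counter

def evaluate_omr(ground_truth, recognised):
    """Count matches by (symbol type, value) label using multiset intersection."""
    gt, rec = Counter(ground_truth), Counter(recognised)
    matched = sum((gt & rec).values())  # symbols found in both lists
    precision = matched / sum(rec.values()) if rec else 0.0
    recall = matched / sum(gt.values()) if gt else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical example: one missed note, one spurious rest.
truth = [("note", "C4"), ("note", "E4"), ("note", "G4"), ("clef", "G")]
output = [("note", "C4"), ("note", "E4"), ("clef", "G"), ("rest", "q")]
p, r, f = evaluate_omr(truth, output)
print(p, r, f)  # 0.75 0.75 0.75
```

Even this simple label-counting view shows why a shared ground-truth dataset matters: without agreed symbol categories and counting rules, precision and recall figures from different OMR systems cannot be compared.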
Besides music imaging, the MUSICNETWORK is exploring many other related topics, including music notation, standards, education, protection, accessibility and distribution, bringing together the music industry, content providers and research institutions to explore the potential of new technologies, tools, formats and models with interactive multimedia technologies.

3.2 Music via Motion (MvM)

MvM is a research framework that can be described as Interactive Music (ISIDM 2001, Wanderley 2000, 2002). The main feature of these projects is the interactive control of multimedia events, for example musical sound, using camera-tracking technologies or physical sensors, with mapping strategies that create relevant output according to the detected input (visual, gestural or other forms of activity). Currently ongoing MvM-related projects include:
* A motion- and colour-sensitive video tracking system. The system uses live video and can be viewed as an example of a virtual instrument: an instrument without physical constraints, with dynamic and flexible configuration and mapping. In this case, the whole body of the user/player could be an instrument, and hence dance was naturally one of the first application domains. The system was successfully utilised in a collaborative project, using the framework to integrate choreography, costume design, sound design and composition (Ng 2002a, Ng 2002b).
* A multimedia face tracker. A real-time face-tracking system detects and tracks features of the face (eyes, mouth) to control synthesis parameters, for example using the opening distance of the mouth to influence pitch (Ng and Scott 2002).
* An augmented drum with flex sensors. Exploiting a familiar instrument interface, the augmented-drum system uses standard drum brushes with embedded electronic sensors to provide additional controls to the performer (Ng et al. 2002).
Further details and a short video sequence of an improvisation using the augmented drum can be found at www.leeds.ac.uk/icsrim/mvm/maxis02.htm. We are currently developing a wireless version of the augmented drum. Besides trans-domain mapping from motion to audio output, we are exploring the mapping of motion to graphics and other multimedia outputs. Current MvM collaborations

include De Montfort University and the Liverpool Institute for Performing Arts, supported by Yorkshire Arts (Landy, Jamieson and Ng 2003).

Figure 3: An MvM public performance.

ICSRiM is also participating in a new initiative supported by the ESF (European Science Foundation) EC COST-TIST (www.cordis.lu/cost/src/tisthome.htm) named ConGAS (Gesture Controlled Audio Systems). Further information about this project can be found online at www.cost287.org.

3.3 Expressive Performance

We are also interested in studying the gestures of musical instrumental playing, investigating the relationship between body gesture and expressivity, and between physical gesture and musical gesture. New collaborations include an "expressive piano" project, which analyses datasets of experienced musicians' natural upper-body movements (head, shoulders, elbows, wrists), captured with a 3D infra-red tracking system, with respect to expressive timing and dynamics captured using a Yamaha Disklavier.

3.4 Virtual Ballet

We are developing and integrating a group of projects to provide a virtual choreography system, taking a dance score as input and rendering (as 3D animation) virtual ballet dancers (Neagle, Ng and Ruddle 2002) under the influence of the learnt expressive model. Figure 4 illustrates the main sections of the overall plan, and Figure 5 presents a reconstruction of 3D motion data captured using a 12-camera motion tracking system (VICON).

3.5 Holo-Stage

This project plans to augment the stage with 3D computer graphics which can be influenced and animated by the mapping module (from MvM). It intends to use photo-realistic 3D models of real environments (see Figure 6) to transform the stage, augmenting and interacting with the performance, mixing the performer, instrument and environment (Ng et al. 2002b). Since the whole scene is represented by graphical primitives, the surfaces can be easily modified and animated.
Operations such as transformation, translation and many other functions are possible. Physically demanding or impossible scenarios can be virtually created, projected onto the stage and dynamically altered by the mapping module depending on the sensor input.
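The mapping module shared by MvM and the Holo-Stage translates detected input (camera tracking, flex sensors, face features) into output parameters such as pitch or graphics properties. As a minimal illustrative sketch of such trans-domain mapping (the function name, input feature and parameter ranges below are assumptions, not the project's published interface), a clamped linear mapping is one of the simplest strategies:

```python
# Minimal trans-domain mapping sketch: a normalised sensor value is
# clamped and mapped linearly onto an output parameter range.
# Illustrative only; the MvM mapping module itself is not described
# in enough detail in this report to reproduce.

def linear_map(value, in_lo, in_hi, out_lo, out_hi):
    """Clamp value to [in_lo, in_hi], then map linearly to [out_lo, out_hi]."""
    value = max(in_lo, min(in_hi, value))
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

# Hypothetical mappings: one motion feature drives both sound and graphics.
motion_intensity = 0.5  # e.g. normalised frame-difference energy, 0..1
pitch = round(linear_map(motion_intensity, 0.0, 1.0, 48, 84))    # MIDI note
brightness = linear_map(motion_intensity, 0.0, 1.0, 0.0, 255.0)  # stage graphics
print(pitch, brightness)  # 66 127.5
```

The same input feature can feed several such mappings at once, which is what lets one gesture simultaneously alter the projected 3D scene and the musical output.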
Figure 4: Virtual choreography (notation/dance score, machine representation, expressive model, 3D avatar/character, ballet simulation in a 3D virtual environment, with voice or other forms of interaction/input control and HCI/interaction).

Figure 6: 3D wire-frame model generated from 3D data captured by a laser range-finder (top), and 3D photo-realistic model, textured with digital images captured at the real scene (bottom).
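The expressive-performance study in Section 3.3 pairs upper-body motion capture with expressive timing and dynamics from a Disklavier. As a purely illustrative sketch (the data, feature names and analysis step below are invented; the study's actual method is not described in this report), a simple first analysis is to correlate a per-note motion feature with MIDI velocity:

```python
# Illustrative analysis sketch for an "expressive piano"-style study:
# correlate a per-note motion feature (e.g. head-movement speed) with
# note dynamics (MIDI velocity). All data below is made up.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

head_speed = [0.12, 0.30, 0.25, 0.45, 0.60]  # hypothetical per-note feature
velocity = [52, 70, 66, 88, 101]             # hypothetical MIDI velocities
r = pearson(head_speed, velocity)
print(round(r, 3))  # strongly positive for this made-up data
```

A learnt model of such motion-expression relationships is what the Virtual Ballet work (Section 3.4) refers to as the expressive model influencing the rendered dancers.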

4 Programmes

Currently available programmes at the University which relate to the research interests of the Centre include:
* BA/BSc (Hons) Music Multimedia and Electronics (MME)
* Postgraduate Diploma (PGDip): Music Technology and Computer Music
* Master of Music (MMus): Music Technology and Computer Music
* Higher research degrees (MA, MSc, MPhil and PhD), and other joint honours taught programmes
The MME programme leads to either a BA or a BSc, depending on the focus of the optional modules and the topic of the final-year project. Besides traditional fundamental courses from both the School of Music and the School of Electronic and Electrical Engineering, there are special cross-disciplinary modules, at every level, jointly offered by members of both Schools and focusing on interdisciplinary topics including audio-visual processing, communications, and music and multimedia technologies. Further information about this programme can be found at http://www.leeds.ac.uk/mme.

In addition to four Electronic Studios and a Recording Studio (in the basement of the new building), a Multimedia Laboratory has been designed to support the learning and teaching activities of the new MME programme (see Figure 2). Larger classes can be held in the main lecture theatre, with smaller hands-on and/or practical sessions in the Studios.

5 Recent Activities and Summary

International events hosted by ICSRiM in 2003 include:
* MAXIS 2003: 2nd International Festival & Symposium of Sound and Experimental Music, 10-13 April 2003 (www.maxis.org.uk)
* WEDELMUSIC 2003: 3rd International Conference on Web Delivering of Music, 15-17 September 2003 (www.wedelmusic.org/wedelmusic2003/)
* MUSICNETWORK Open Workshop 2003, 17-18 September 2003
In 2004, ICSRiM is hosting the AISB 2004 Convention: Motion, Emotion and Cognition, 29 March - 1 April 2004 (www.leeds.ac.uk/aisb), and the COST287-ConGAS Symposium on Gesture Interfaces for Multimedia Systems (www.leeds.ac.uk/aisb/gims).

In summary, this report has presented the newly completed building for ICSRiM and the School of Music at the University of Leeds, briefly described various research projects, outlined the taught and research programmes currently available at the University, and noted many other activities related to ICSRiM.

6 Acknowledgments

Thanks to funding support from EC IST FP5, MUSICNETWORK, Arts Council England, COST287-ConGAS, AHRB, HEFCE and EPSRC. Thanks to all ICSRiM members, research partners and collaborators for their kind support and collaboration.

References

ISIDM. 2001. ICMC/EMF Working Group on Interactive Systems and Instrument Design in Music mailing list, K. C. Ng and M. M. Wanderley, http://www.jiscmail.ac.uk/lists/isidm.html, created Jan 2001. ISIDM website: http://www.igmusic.org

Ng, K. C. 2004. "Music via Motion (MvM): Trans-Domain Mapping of Motion and Sound for Interactive Performances." In G. Johannsen (ed.), Proceedings of the IEEE, Special Issue on Engineering and Music - Supervisory Control and Auditory Communication, 92(4): 645-655.

Landy, L., E. Jamieson, and K. Ng. 2003. "In Transit or Realising One's Aesthetic when the Technology Finally Catches up." Proc. of the International Festival & Symposium of Sound and Experimental Music.

Neagle, R. J., K. C. Ng, and R. A. Ruddle. 2002. "Notation and 3D Animation of Dance Movement." In Proceedings of the International Computer Music Conference (ICMC2002), pp. 459-462.

Ng, K. C. 2002. "Music Manuscript Tracing." In D. Blostein and Y.-B. Kwon (eds.), Graphics Recognition: Algorithms and Applications, pp. 322-334. Springer-Verlag.

Ng, K. C. 2002a. "Interactive Gesture Music Performance Interface." Proc. of the International Conference on New Interfaces for Musical Expression (NIME 2002).

Ng, K. C. 2002b.
"Sensing and Mapping for Interactive Performers." Organised Sound, 7: 191-200.

Ng, K. C., I. Symonds, J. Scott, J. Cook, A. Bohn, and R. Sage. 2002. "Music via Motion: An Interactive Multimedia Installation with Video and Sensor." Proc. of MAXIS: A Festival of Sound and Experimental Music.

Ng, K. C., and J. Scott. 2002. "A Distributed Real-Time System for Interactive Music Mapping with Multiple Inputs." Proceedings of the International Computer Music Conference (ICMC2002), pp. 534-537.

Ng, K. C., V. Sequeira, E. Bovisio, N. Johnson, D. Cooper, J. G. M. Goncalves, and D. Hogg. 2002b. "Playing on a Holo-stage: Towards the Interaction between Real and Virtual Performers Object Interaction." In C. Beardon and L. Malmborg (eds.), Digital Creativity: A Reader, pp. 241-249. Swets & Zeitlinger.

Ng, K. C. 2004 (to appear). "Optical Music Analysis for Printed Music Score and Handwritten Music Manuscript." In S. E. George (ed.), Visual Perception of Music Notation: On-line and Off-line Recognition. Idea Group Inc.

Wanderley, M. M., and M. Battier (eds.). 2000. Trends in Gestural Control of Music. Ircam - Centre Pompidou.

Wanderley, M. M. 2002. "Report on the ICMC/EMF Working Group on Interactive Systems and Instrument Design in Music - ISIDM." Array, International Computer Music Association.