New Method for the Directional Representation of Musical Instruments in Auralizations

Felipe Otondo, Jens Holger Rindel
Ørsted-DTU, Acoustic Technology, Technical University of Denmark
email: {fo,jhr}@oersted.dtu.dk

Abstract

The representation in auralizations of sound sources that vary their directional pattern in time is introduced. Musical instruments are used as a reference for the discussion of the traditional representations, which assume fixed directional characteristics. A new method for representing the spatial sound contributions in time is proposed, using multi-channel recordings and several virtual sources in room auralizations. Possible developments of the proposed recording/reproduction method are described.

1 Introduction

The term "auralization" was coined by analogy with visualization: it names the process of rendering (imaginary) sound fields audible. The main objective of a room auralization is to simulate as accurately as possible the binaural listening experience at a certain location within a modeled space (Kleiner, Dalenback, Svensson 1993), (Odeon 2002). An important factor to take into consideration in an auralization is the directional characteristics of the sound source. Musical instruments have a complex directivity pattern, which generates a particular acoustic behavior in a room. The aim of this investigation is to take a closer look at their directivity in the case of a real performance and to provide a better representation of this behavior in room auralizations.

2 Directional characteristics of musical instruments

The sound produced by musical instruments involves many different acoustic features that are related to intrinsic characteristics of the instrument. One of these features is the directional characteristic, or directivity, which is the way in which the sound of the instrument is radiated in different directions at different frequencies. The directivity of a musical instrument is affected by the different notes played on the instrument (Meyer 1978), the different performing intensities (Rossing 1990) and the different playing techniques. These changes differ between the families of musical instruments, owing to the complexity of the musical instrument as a multi-resonating system (Fletcher, Rossing 1998). An example of the measured directional characteristics of four isolated notes played on a Spanish guitar in the 500 Hz octave band can be seen in Figure 1. In this case the notes were played within two octaves by the performer, who tried to maintain the same intensity.

3 Musical instruments as sound sources in auralizations

When musical instruments are used as sound sources for auralizations, it is important to take their radiation characteristics into consideration in order to have a representation of the sound source in the room model. As already mentioned, musical instruments are sound sources with a complex directivity that cannot be easily described in a real performance situation. If a fixed directivity pattern per octave band were assumed, as in the case of a loudspeaker, the result would be rather poor and inaccurate. The changes of directivity in time would be ignored, and the consequence would be a wrong directional pattern, with the level at certain frequencies of the spectrum either emphasized or diminished. Perceptually this deteriorates the listening experience because of the added colorations.
A more accurate representation, containing the changes of the source directivity in time, is therefore necessary.

4 Improvement of the representation of spatial sound

A better representation of the spatial sonic characteristics of a musical instrument in a room auralization, or of any source that changes its directivity in time, can be obtained by capturing several samples of the sound field created by the source and using them afterwards in the reproduction process. One method of achieving this is to make simultaneous anechoic recordings of the musical instrument with microphones surrounding the source, in order to capture the sound radiated in different directions.
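As a rough illustration of the bookkeeping this involves, the following sketch (Python, run here on synthetic noise standing in for an anechoic take) splits a simultaneous 4-channel recording into one mono track per microphone, keeping the azimuth each channel was recorded at. The channel ordering and the 0, 90, 180 and 270 degree azimuths are assumptions made for the example, not part of the published recording setup.

# Minimal sketch: split a simultaneous 4-channel anechoic recording into
# one mono track per microphone, keeping the azimuth each channel was
# recorded at. Channel order and azimuths are illustrative assumptions.
import numpy as np

MIC_AZIMUTHS_DEG = (0, 90, 180, 270)  # assumed microphone placement around the source

def split_channels(recording: np.ndarray) -> dict[int, np.ndarray]:
    """Return {azimuth_deg: mono_signal} for a (samples, 4) recording."""
    if recording.ndim != 2 or recording.shape[1] != len(MIC_AZIMUTHS_DEG):
        raise ValueError("expected a (samples, 4) multi-channel recording")
    return {az: recording[:, ch] for ch, az in enumerate(MIC_AZIMUTHS_DEG)}

if __name__ == "__main__":
    fs = 48_000
    # Stand-in for a one-second anechoic 4-track take.
    take = np.random.default_rng(0).standard_normal((fs, 4))
    for az, mono in split_channels(take).items():
        print(f"{az:3d} deg: {mono.shape[0]} samples")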

Figure 1. Polar diagram of the directivity of a Spanish guitar in the 500 Hz octave band for four isolated notes within two octaves (6th string E2, 2nd string B3, 4th string G3, 1st string E4). The notes were played by a performer trying to keep the same level of intensity. The position of the guitar was vertical, as in a normal classical music performance. The dynamic range is plotted from 20 to 60 dB.
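A pattern like the one in Figure 1 can be thought of as the octave-band level of the instrument's signal measured at each azimuth around it. The sketch below is a minimal, hypothetical version of that computation (Python with NumPy/SciPy, demonstrated on synthetic noise): the fourth-order Butterworth band filter, the band edges at fc/sqrt(2) and fc*sqrt(2), and the arbitrary level reference are generic signal-processing choices, not the procedure actually used for Figure 1.

# Minimal sketch: octave-band level per azimuth from anechoic signals,
# e.g. as input for a polar directivity plot. Filter design and level
# reference are illustrative choices, not the authors' measurement chain.
import numpy as np
from scipy.signal import butter, sosfilt

def octave_band_level_db(signal: np.ndarray, fs: float, fc: float = 500.0) -> float:
    """RMS level in dB (arbitrary reference) of the octave band centred at fc."""
    lo, hi = fc / np.sqrt(2.0), fc * np.sqrt(2.0)      # octave band edges
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    band = sosfilt(sos, signal)
    rms = np.sqrt(np.mean(band ** 2))
    return 20.0 * np.log10(max(rms, 1e-12))            # guard against log(0)

def polar_pattern(signals_by_azimuth: dict[int, np.ndarray], fs: float,
                  fc: float = 500.0) -> dict[int, float]:
    """Octave-band level for each azimuth, ready for a polar plot."""
    return {az: octave_band_level_db(x, fs, fc)
            for az, x in sorted(signals_by_azimuth.items())}

if __name__ == "__main__":
    fs = 48_000
    rng = np.random.default_rng(1)
    # Synthetic stand-in: noise whose level varies smoothly with azimuth.
    demo = {az: (1.0 + 0.5 * np.cos(np.radians(az))) * rng.standard_normal(fs)
            for az in range(0, 360, 45)}
    for az, level in polar_pattern(demo, fs).items():
        print(f"{az:3d} deg: {level:6.1f} dB")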

An example of an anechoic 4-track recording of a musical instrument can be seen in Figure 2, where four microphones are located around the source. After the multi-channel recording of the instrument has been made, each of the recordings registered by the microphones is played back by a particular virtual source in the auralization, according to its original position in the recording setup. This can be done easily in the simulation program by defining sources that have a neutral (omnidirectional) directivity pattern within a discrete span of radiation. In the case of the 4-track recording of Figure 2, each source span corresponds to a quarter of a sphere. Figure 3 shows a room acoustic simulation for the example of Figure 2, in which an auralization with four virtual sources has been made, each source with an omnidirectional characteristic within a span of a quarter of a sphere and radiating in the direction of 0, 90, 180 and 270 degrees. The new source (consisting of the four virtual sources together) radiates in a distinctive way in each of the four directions, following the changes in level, movements, asymmetries and orientation of the original source that were recorded by the individual microphones.

Figure 2. Setup for a 4-track anechoic recording of a source.

Figure 3. Room acoustic simulation with one listener and four virtual sources (left). View from above of the room showing the four virtual sources, each pointing in a different direction with a discrete radiation pattern of a quarter of a sphere (right).
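One way to picture the reproduction side is as four wedge-shaped radiation regions, one per recorded track, each centred on the direction of the microphone it mirrors. The sketch below expresses that mapping in code under the quarter-sphere assumption of Figure 3 (a 90-degree azimuth wedge per source, facing 0, 90, 180 and 270 degrees); the data structure and the coverage test are illustrative and do not correspond to how the simulation software defines its directivity spans.

# Minimal sketch: one virtual source per recorded track, each radiating
# omnidirectionally within a quarter-sphere wedge facing 0, 90, 180 or
# 270 degrees, as in the example of Figure 3. Illustrative only.
from dataclasses import dataclass

@dataclass
class VirtualSource:
    track: int         # index of the recorded channel this source plays back
    facing_deg: float  # azimuth the quarter-sphere wedge is centred on

    def covers(self, azimuth_deg: float) -> bool:
        """True if a radiation direction falls inside this source's wedge."""
        diff = (azimuth_deg - self.facing_deg + 180.0) % 360.0 - 180.0
        return abs(diff) <= 45.0

# One virtual source per recorded channel, facing the microphone it mirrors.
SOURCES = [VirtualSource(track=i, facing_deg=az)
           for i, az in enumerate((0.0, 90.0, 180.0, 270.0))]

def source_for_direction(azimuth_deg: float) -> VirtualSource:
    """Pick the (first) virtual source whose wedge covers a given azimuth."""
    return next(s for s in SOURCES if s.covers(azimuth_deg))

if __name__ == "__main__":
    for az in (10.0, 100.0, 200.0, 315.0):
        s = source_for_direction(az)
        print(f"azimuth {az:5.1f} deg -> track {s.track} (facing {s.facing_deg:.0f} deg)")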

5 Concluding remarks

The directivity of musical instruments in a real performance situation is an issue that needs a better representation in room auralizations, since the existing representation assumes only a fixed directivity per octave band. The proposed method offers an alternative directivity representation without the need for any directional data of the instruments. Further developments of the system will aim at optimizing the recording setup by considering the influence of the number and position of the microphones and their perceptual consequences in the room auralizations.

Binaural virtual reality systems for room models usually lack a sound source definition and have problems with the spatial representation of the source (Kleiner, Dalenback, Svensson 1993). The use of a different kind of representation of the source's directional characteristics in these systems, such as the one proposed in this work, can help to make the spatial representation of a sound source in sound demonstrations more reliable and to avoid sound colorations.

The directivity of sound sources in movement (such as a saxophone player or an actor moving during a real performance) cannot be represented with the fixed directivity representations available nowadays. An implementation of this work could help to make representations of a live performance situation more reliable.

The use of headphones in virtual reality systems limits the possibilities of sound reproduction and is one of the reasons for the lack of commercial success of such systems. The use of multi-channel loudspeaker reproduction systems for multi-track recordings could be a further development of this project, considering a crosstalk cancellation system or some other filtering technique to avoid destructive sound interference (Gardner 1998).

6 Acknowledgments

The work reported in this article has been financed by the European Community project MOSART (Music Orchestration Systems in Algorithmic Research and Technology), HPRN-CT-2000-00115.

References

Kleiner, M., Dalenback, B. I., and Svensson, P. 1993. "Auralization - An Overview." Journal of the Audio Engineering Society 41(11): 861-874.

ODEON. 2002. "Odeon Room Acoustics Software." http://www.dat.dtu.dk~odeon

Meyer, J. 1978. Acoustics and the Performance of Music. Frankfurt: Verlag Das Musikinstrument, pp. 75-102.

Rossing, T. D. 1990. The Science of Sound, 2nd ed. Addison-Wesley, chapter 11, pp. 208-209.

Fletcher, N. H., and Rossing, T. D. 1998. The Physics of Musical Instruments, 2nd ed. New York: Springer-Verlag, Parts III, IV and V.

Kleiner, M., Dalenback, B. I., and Svensson, P. 1993. "Audibility of Changes in Geometric Shape, Source Directivity, and Absorptive Treatment - Experiments in Auralization." Journal of the Audio Engineering Society 41(11): 905-913.

Gardner, W. 1998. 3-D Audio Using Loudspeakers. Boston: Kluwer Academic Publishers.