EXPLORING MUSICAL MAPPINGS AND GENERATING ACCOMPANIMENT WITH CHAOTIC SYSTEMS

E. Cupellini
Department of Mathematics, University of Torino, Italy

C. Rizzuti*, E. Bilotta*, P. Pantano†
* Department of Linguistics, † Department of Mathematics, University of Calabria, Italy

M. Wozniewski, J. R. Cooperstock
Centre for Interdisciplinary Research in Music Media and Technology, McGill University, Canada

ABSTRACT

Chaotic systems can exhibit a complex evolution that is well suited to musical control. By modelling one such system, known as Chua's circuit, we discovered a number of chaotic attractors that oscillate in a rhythmic fashion, similar to the traditional wave oscillators used in sound synthesis. This led to an exploration of the sonification of the system and an investigation of musical mappings such as filter control, spatial rendering, and note generation. We also developed a method of controlling musical accompaniment, in which the chaotic system generates melodies that conform to a key and scale specified by the user. Finally, we designed an immersive virtual interface that allows a performer to play along with the accompaniment while controlling various high-level parameters.

1. INTRODUCTION

The virtualization of musical interfaces has led to many exciting forms of artistic expression, where novel input techniques are used to control audio synthesis and processing. Many interfaces, however, end up being too simplistic to allow for virtuosic playability, or they become too complicated and nondeterministic to learn. The primary challenge is thus to define the most effective mappings of control parameters to musical effects. Many researchers argue that these mappings need to be complex and multimodal in order to provide for artistic expression. Wessel and Wright [16], for instance, suggest that simple controls should be mapped to high-level musical behaviours such as navigation through a timbre space or multidimensional sound synthesis. Hunt and Kirk [9] likewise suggest that holistic control of a parameter space is better than sequential control using one-to-one mappings. These styles of interaction mean that a performer will explore the effects of manipulating several parameters at once rather than explicitly learning several individual mappings. The performer can practice and improve his or her technique, leading to the potential of virtuosity with the interface. Thus, rather than mapping a controller to a simple musical effect (e.g., a knob that controls the frequency of an oscillator), interfaces should exhibit complex behaviours that need to be learned and practiced by their performers.

Our objective is thus to create mappings that control high-level processes; in particular, we consider the generation of musical accompaniment. Several prior works have investigated the automatic generation of real-time accompaniment [11]. The task of such systems is to detect pitches, transpositions, and other variations of a performance, then provide additional content that enhances or extends the musical complexity. Typically, Bayesian networks, Markov chains, or other probabilistic methods are used to generate events correlated with the material being played. However, in our case, we are interested in the accompaniment problem both as a means of exploring the use of chaotic systems for musical creation and as a way of gaining understanding of the evolution of chaotic systems. Perhaps surprisingly, this direction is particularly well suited to expressive interaction.
Chua's circuit [3, 4, 5], for example, exhibits chaotic behaviour that is complex yet controllable with only a few parameters. By modelling this system, we have found that the circuit's output has a tendency to oscillate in a rhythmic fashion, lending itself quite well to musical control. We explore various mappings of the resulting signals, such as a direct mapping of chaotic signals into sound, and the use of the signals to drive higher-level algorithmic processes such as musical note generation. For the latter case, we also develop the Circle of Fifths interface, which allows a performer to control and constrain the notes being played. The result is a musical accompaniment system that follows the key of the user, generating pseudo-random melodies according to the scale and range specified. This system is prototyped in an immersive 3-D environment, providing an engaging experience for anyone interested in exploring music and learning about chaotic behaviour.

2. CHUA'S CIRCUIT

In Chaos Theory, we consider non-linear dynamical systems that evolve in a seemingly random fashion from their initial conditions, even though their behaviour is deterministic and well structured. Unlike random series, these systems always evolve in the same way for a given set of initial conditions and parameter values. This is, of course, impossible to witness in real-world settings, because we cannot have full knowledge of the system state, but it is theoretically possible in digital simulations.

Chaotic systems can thus be simulated repeatedly to obtain the same complex data series, lending themselves well to use in control systems. Chaotic systems exhibit both a complex behaviour and stable dynamics in a well-defined region of space known as an attractor: the complexity of the behaviour is not in opposition with the stability of the system. Geometrically, an attractor can be a point, a circular orbit such as a limit cycle, or even a fractal-structured shape called a strange attractor. The oscillatory behaviour is quite appropriate for musical applications, allowing for the use of chaotic oscillators instead of traditional shapes such as sine waves or sawtooth waves. Furthermore, we suggest that the behaviour of such oscillators provides an aesthetic that may seem more natural to users.

Chua's oscillator, in particular, is a canonical system for research in chaos, since it can be realized in a real-world setting as a simple electronic circuit. The system has three degrees of freedom and is described by three state equations whose variables correspond to the magnitudes of physical forces. However, we use a dimensionless model to simulate it, with six parameters (α, β, γ, a, b, k) corresponding to the ratios between two or more physical components. Every set of parameters in the dimensionless model corresponds to more than one set of parameters for the physical model. The k parameter determines how 'fast' the real circuit is in comparison with the dimensionless system. The state equations for the dimensionless model are:

dx/dt = k α [y - x - f(x)]
dy/dt = k (x - y + z)
dz/dt = k (-βy - γz)

where:

f(x) = bx + (1/2)(a - b)(|x + 1| - |x - 1|)

Solving this system of differential equations (we use a 4th-order Runge-Kutta integration method) allows us to simulate the evolution of Chua's oscillator in three-dimensional phase space, made up of the variables x, y, and z. The qualitative behaviour of the oscillation is controlled by the six input parameters mentioned earlier. By changing the set of control parameters, the system can either converge to a fixed point, show a limit cycle, or result in a strange attractor. Some sets of control parameters are, however, physically impossible, so the system is unstable and the numeric simulation diverges to infinity.

As the control parameters and the initial values change, the dimensionless equations produce a large variety of strange attractors of different shapes and sizes. Figure 1 shows a graphical rendering of the double scroll attractor. In previous work, we have shown 150 different attractors [1], each of which provides an interesting control source for musical purposes. However, these attractors can be grouped into 19 families, each corresponding to a separate region in the five-dimensional (α, β, γ, a, b) space. To keep the control system manageable, we constrain the operational range of our six control parameters to the 19 possible families, offering a stable and expressive input space for experimenting with chaotic systems. The output space is three-dimensional, encouraging the use of technologies that can render this system in 3-D. It should be noted that although visual renderings are an obvious choice, the repetitive oscillatory nature of the output implies that a sonic rendering may also prove useful in analysis.

Figure 1. Three-dimensional rendering of Chua's double scroll attractor in phase space, with parameters α=9.3515908493, β=14.7903198054, γ=0.0160739649, a=-1.1384111956, b=-0.7224511209, k=1.
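To make the simulation concrete, the following is a minimal sketch of how the dimensionless equations above might be integrated with a 4th-order Runge-Kutta scheme. It is written in Python with NumPy (our own choice; the paper does not specify an implementation language), and the step size and number of samples are illustrative assumptions. The parameter values are those quoted for the double scroll attractor in Figure 1.

```python
import numpy as np

def chua_deriv(state, alpha, beta, gamma, a, b, k):
    """Right-hand side of the dimensionless Chua equations."""
    x, y, z = state
    # Piecewise-linear nonlinearity f(x) = bx + 0.5(a - b)(|x + 1| - |x - 1|)
    f_x = b * x + 0.5 * (a - b) * (abs(x + 1) - abs(x - 1))
    dx = k * alpha * (y - x - f_x)
    dy = k * (x - y + z)
    dz = k * (-beta * y - gamma * z)
    return np.array([dx, dy, dz])

def simulate(params, state0=(0.1, 0.0, 0.0), dt=0.01, n=20000):
    """Integrate with a 4th-order Runge-Kutta scheme; returns an (n, 3) trajectory."""
    state = np.array(state0, dtype=float)
    out = np.empty((n, 3))
    for i in range(n):
        k1 = chua_deriv(state, *params)
        k2 = chua_deriv(state + 0.5 * dt * k1, *params)
        k3 = chua_deriv(state + 0.5 * dt * k2, *params)
        k4 = chua_deriv(state + dt * k3, *params)
        state = state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        out[i] = state
    return out

# Double scroll attractor parameters (alpha, beta, gamma, a, b, k) from Figure 1.
double_scroll = (9.3515908493, 14.7903198054, 0.0160739649,
                 -1.1384111956, -0.7224511209, 1.0)
trajectory = simulate(double_scroll)
```

Because the same parameters and initial conditions always reproduce the same trajectory, the series can be regenerated deterministically, which is what makes it usable as a control signal.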

3. CHUA-BASED MAPPINGS

When considering the mapping of a chaotic system, the goal is to extract meaningful features of the signals in order to generate interesting musical effects. For Chua's circuit, this involves an exploration of the 19 families of stable attractors, each of which has three correlated oscillations (x, y, z) that can be used as control signals. Figure 2 shows some typical signals, corresponding to a two-second time series of the z coordinate for a particular attractor. Some of our previous work explained the theory of sonifying these time series by mapping mathematical structures to audio events [13]. In the remainder of this section, we expand the possible mapping strategies and consider whether these signals are appropriate for direct synthesis, or for the control of filter parameters and spatial effects. Finally, we describe a method that allows for the extraction of musical notes from these signals. This last mapping strategy is one that we feel to be quite powerful for musical purposes, and it is used later in the design of the Circle of Fifths interface.

3.1. Direct Mapping

Two approaches are possible for a direct mapping of Chua's time series to sonic effect [2]. The first is to apply the chaotic system as one component of a physical instrument's simulation. Most commonly, the oscillator would be used as a non-linear excitation source coupled with a linear resonator. The second approach is to apply the chaotic system directly as a signal generator, thus exploring the potential musical properties of the circuit itself.

In the early 1990s, Xavier Rodet used Chua's circuit in a sound synthesis process based on physical modelling. He found that chaotic signals can exhibit rich harmonic structure, and can also behave largely as noisy signals [14]. The case when both occur simultaneously is an important feature, since traditional musical instruments often exhibit a mixture of harmonic and noisy components, in a way that is relatively difficult to model with standard synthesis techniques. Furthermore, the noise exhibited by chaotic systems tends to be distinct from white noise, which has a broadband spectrum. Instead, the spectrum of chaotic signals resembles a 1/f energy distribution [2], which has a more natural sound to most listeners.

Recognizing the simultaneous presence of sinusoidal components and noise in chaotic signals, researchers have tried to incorporate Chua's circuit in musical performance. Choi, for example, noted that there is often a perceived pitch "attributable to a centered energy concentration around specific frequency regions" [2], and proposed a simple approach to change that pitch by modifying the integration step, hence varying the sampling rate of the system and producing a corresponding change in the frequency of the oscillator. We have tried a similar approach, but instead manipulate the k variable of the state equations defined above. Our experiments have shown that this changes the speed of the oscillator while avoiding the errors associated with a large integration step.

Other examples include Mayer-Kress et al. [10], who used Chua's circuit as a sound generator for both pitched and noisy sounds. They note how Chua's circuit provides filter-like effects when parameters change such that the pitch remains constant but spectral energy is redistributed. This is interesting from a musical perspective because harmonic sounds can change without needing a separate filter module in the sound-processing pipeline. The generated tones naturally exhibit tremolo or vibrato effects, or slow evolutions in timbre.

3.2. Filter-based Mappings

Another possible use of the Chua oscillator is in the realm of digital signal processing, where chaotic modulation is applied to an external signal (e.g., a signal from a musical instrument or microphone). The modulation can be obtained by a convolution of the two signals, by using the chaotic signal as a frequency modulator for FM synthesis, or simply by multiplying the signals together (ring modulation).
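As a rough illustration of the last of these options, the sketch below ring-modulates an external signal with one coordinate of the simulated oscillator. It builds on the simulation sketch at the end of Section 2; the resampling and normalization steps are our own assumptions rather than a prescribed method.

```python
import numpy as np

def ring_modulate(input_signal, chua_series):
    """Multiply an external audio signal by a normalized chaotic signal."""
    # Stretch (or shrink) the chaotic series to the length of the input signal.
    t_in = np.linspace(0.0, 1.0, len(input_signal))
    t_chua = np.linspace(0.0, 1.0, len(chua_series))
    modulator = np.interp(t_in, t_chua, chua_series)
    # Normalize the modulator to [-1, 1] so the output level stays bounded.
    peak = np.max(np.abs(modulator))
    if peak > 0:
        modulator = modulator / peak
    return input_signal * modulator

# Example: modulate one second of a 440 Hz tone with the x coordinate.
sr = 44100
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
wet = ring_modulate(tone, trajectory[:, 0])
```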
The rate of oscillation can be controlled by changing the integration time-step, by changing the value of the k parameter, or in more complicated ways by acting on the other five input parameters. If the oscillator is slowed to frequencies in the sub-audio range (0-20 Hz), it can even be used as a low frequency oscillator (LFO) and mapped to effects such as filter cutoff, gain, arpeggiation speed, or any other traditional synthesis/filtering parameter. The interesting aspect of this kind of mapping is that chaotic behaviour produces slight irregularities in the rhythmic patterns, which might feel more natural than the repetitive character of typical sinusoidal, sawtooth, or square-wave oscillators. Furthermore, it may be possible to generate multiple effects with one oscillator, including flanging, phasing, tremolo, and variable-frequency band-pass filtering.

3.3. Spatial Mappings

Given that Chua's circuit has a three-dimensional output parameter space, it seems natural to map that output to coordinates in 3-D Cartesian space. The system's evolution can hence be modelled as the trajectory of a point moving through 3-D space, and by using a spatialization engine, we can add audio cues to a sound source to make it appear at that location. This can be accomplished with amplitude panning when using a loudspeaker array [8, 12], or HRTF filtering when using a binaural display with headphones [7].

3.4. Musical Note Generation

In contrast to directly sonifying the oscillation of Chua's circuit, one can use a codification process that periodically samples the system for note values. Here we describe one particular method that allows a user to specify the key and the type of musical scale from which these values are chosen. The resulting melody can thus be used as musical accompaniment that conforms to a particular tonality.

When sampling the system, we must choose a quantization that produces harmonically related notes instead of arbitrary pitches. Our choice is to consider the piano keyboard, since it is one of the most widely used instruments for learning music and highly representative of our musical culture. The division of keys supports western musical practice, where pitch is half-step quantized (i.e., an octave divided into 12 notes) and the diatonic scale (7 notes of the chromatic scale) is easily identified.

Figure 2. Sample oscillations from various attractor families (Families 04, 05, 06, 07, 18, and 19, each shown with its parameter set). Only the oscillation in the z-axis is shown.

Our learned pitch alphabet is structured in a hierarchy in which the lowest level contains only the chromatic notes, followed in the next levels by the diatonic scales, the triads, and finally the root of the scale at the highest level of abstraction [6]. However, we avoid using the full set of chromatic notes and the root note alone, because they do not convey significant musical structure. Instead, our note extraction algorithm focuses only on diatonic scales and triads.

As a first step, a possible range of notes is defined by specifying a central pitch and the upper and lower boundaries (e.g., one octave above and below the middle C of a piano). The type of musical scale also needs to be chosen, such as the diatonic (seven pitches per octave), the pentatonic (five notes per octave), or a scale based on triads (the first, third, and fifth degrees of a diatonic scale). After specifying these parameters, we can consider S_h as the ordered set of h possible notes that can be generated by the process:

S_h = {note_1, note_2, ..., note_h}

We calculate the minimum value m and maximum value M of the time series associated with each of the three axes of the Chua circuit. As a result, we obtain [m_x, M_x] as the range of oscillation in x(t), and likewise for y(t) and z(t). These three ranges can be thought of as the three dimensions of a parallelepiped in which the system's evolution is constrained. The length of an edge (M - m) can then be divided into h segments, where the length of one segment, l, is given by an equation of the form:

l_x = (M_x - m_x) / h

In this way, each of the three time series x(t), y(t), and z(t) can be associated with an integer n_x(t), n_y(t), n_z(t), where n(t) ∈ [1, h]. This results in a quantization of the series to fit the number of possible notes. In the case of the x-axis, the segment number can be found by:

n_x(t) = 1 + floor[(x(t) - m_x) / l_x]
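The quantization just described can be summarized in a short sketch (again in Python, continuing from the earlier simulation sketch). Building the register S_h from a MIDI root note and a list of scale intervals, and the specific helper names used here, are our own illustrative assumptions.

```python
import numpy as np

def build_register(root_midi=60, intervals=(0, 4, 7), octaves=(-1, 0, 1)):
    """Build the ordered register S_h, e.g. C-major triads around middle C."""
    return sorted(root_midi + 12 * o + i for o in octaves for i in intervals)

def quantize(series, register):
    """Map a chaotic time series to indices n(t) in [1, h] and then to notes."""
    h = len(register)
    m, M = series.min(), series.max()
    seg = (M - m) / h                      # segment length l = (M - m) / h
    n = 1 + np.floor((series - m) / seg).astype(int)
    n = np.clip(n, 1, h)                   # guard the sample where series == M
    return [register[i - 1] for i in n]

register = build_register()                # S_h: C-major triad over three octaves
notes = quantize(trajectory[:, 0], register)

# A note event is emitted only when n(t) changes from one sample to the next.
events = [p for prev, p in zip(notes, notes[1:]) if p != prev]
```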

Notes are generated by the system in two ways. In the simple case, they are played every time the system evolution results in a different n(t) value. Since the system oscillates at a certain frequency, these notes naturally form a rhythmic melody, where the user controls the tempo with the parameter k. The other case that causes note generation is when a user imposes a sampling of the system, either in a periodic fashion, using perhaps a MIDI device, or triggered by a traditional instrument (coupled with a pitch-tracking algorithm). The user-triggered note value causes the oscillator to restart from its initial conditions, and the system begins to generate accompanying notes related to the user's pitch. Specifically, the note inserted is assumed to be the root of a musical mode (a mode is a scale that starts on a specific degree of the diatonic scale; the Ionian mode, for example, is built from the tonic, or first degree of the key, while the Aeolian mode starts on the submediant, or sixth degree), and the generated melody will start on that note and continue with new n(t) values.

An interesting feature of the accompaniment that follows is that the key is maintained no matter what notes the user plays. Furthermore, the pattern will always be deterministic, since the evolution of the chaotic system is restarted with every event. Figure 3 shows the difference between the user's input and the system-generated notes, which always follow the same musical structure. In this case, the key of S_h is C-major, and a triad-based scale is selected. Thus, all notes generated by the system belong to the C-major triad, starting from the root note specified by the user. If the user accidentally (or perhaps intentionally) plays a pitch that does not match the musical scale, it will be transposed to the closest value in S_h.

Figure 3. In this example, the S_h register has been set to the C-major key and a scale of triads. With each user event, the oscillator begins a new evolution and generates notes from S_h, starting from the pitch closest to the one played by the user. If the user inserts a note outside of the target scale, it is transposed to match the closest value in the register. In this manner, accompaniments are created according to a fixed key and scale.

While this interaction is occurring, the user can dynamically change the contents of S_h in several ways. In particular, the central pitch can be changed, and a different scale can be selected. For example, a pedal can be pressed, indicating that the next note played by the user will change the reference key of the generated accompaniment. This replaces the contents of S_h with notes related to that new key. Another control can be provided to switch between different scales. Thus the user can change the level of accompaniment, ranging from simple arpeggiated triads to notes from the full diatonic scale. An example of such a control system is presented in Section 5.
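The sketch below illustrates one possible reading of this interaction logic, reusing the simulation and quantization sketches given earlier: a user event snaps the played pitch to the closest note in S_h, the oscillator is restarted from its initial conditions, and the generated notes are drawn from S_h starting on the snapped pitch. The rotation-within-the-register step and the use of MIDI note numbers are our own assumptions, not the paper's implementation.

```python
def snap_to_register(pitch, register):
    """Transpose an out-of-scale pitch to the closest note in S_h."""
    return min(register, key=lambda note: abs(note - pitch))

def accompany(user_pitch, register, params, length=4000):
    """Restart the oscillator and generate a melody from S_h rooted on the user's pitch."""
    root_index = register.index(snap_to_register(user_pitch, register))
    traj = simulate(params, state0=(0.1, 0.0, 0.0), n=length)   # deterministic restart
    melody = quantize(traj[:, 0], register)
    # Rotate within S_h so the first generated note is the user's (snapped) pitch,
    # while every generated note still belongs to the register.
    h = len(register)
    shift = root_index - register.index(melody[0])
    return [register[(register.index(p) + shift) % h] for p in melody]

# Example: the user plays a B (MIDI 71), which is snapped to C (72) in a C-major register.
line = accompany(71, register, double_scroll)
```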
4. AUDIOSCAPE: A 3-D PLATFORM FOR EXPLORING MUSIC & CHAOS

As mentioned in Section 3.3, the output parameter space of Chua's circuit is well suited to representation in 3-D space. This permits a straightforward graphical rendering that accompanies the audio generated by the Chua-based mappings. The addition of visual feedback may help performers understand the current state of the system, and associate certain trajectories with memorable sonic effects.

The Audioscape project (www.audioscape.org) provides a platform that allows for this type of visualization. Being heavily geared towards the organization of sound and signal processing in a virtual 3-D environment, Audioscape also allows for the placement of sound sources, virtual microphones, and signal processors at specific coordinates in 3-D space [17, 18]. Furthermore, mapping patches can be designed to create dynamic control of the objects in the scene, allowing sounds to change position according to the output of Chua's oscillator. We have recently added several features to support the exploration of chaotic systems using Audioscape. One example is the soundLine structure seen in Figure 4, which allows for the visualization of trajectories in 3-D space.

A soundLine can be thought of as a fixed-length FIFO queue of connected vertices, which can be fed by the output of a generator such as the Chua circuit. New points are always added to the front of the line until a maximum length is reached, after which the oldest vertex is pushed off with each new addition. The visual effect is similar to the snake-like trajectory found in many variants of the classic Snake game (known as "Nibbles" on early MS Windows, and originally as the 1977 arcade game "Hustle"). A soundNode, which represents the position of a sound or processing unit in 3-D space, can be attached to a soundLine so that it travels along its trajectory. This allows a sound source to be spatialized using the Audioscape engine. Users listening to the scene with headphones or a multi-channel loudspeaker array will hear the sound at a certain location, depending on their placement relative to the node on the trajectory. However, other interesting possibilities arise when we consider that the 3-D position can also be mapped to higher-level sound parameters. For example, the height of the node could represent the frequency or brightness of a sound.

Figure 4. A soundLine is used to visualize the evolution of a Chua attractor, seen here as a spiralling trajectory in the middle of the image. Also visible are target areas for the Circle of Fifths interface, described in Section 5.

5. THE CIRCLE OF FIFTHS

We have seen that Chua's circuit provides interesting mapping possibilities with rich behaviours. We would like to exploit such richness for the purpose of musical expression and exploration, although constrained to operate in accordance with musical theory. Most western musical instruments are designed to operate on the chromatic scale, where octaves are divided into twelve notes. It is therefore useful to impose similar constraints when designing new computer music interfaces, facilitating the performance of traditional repertoire. As discussed in Section 3.4, notes can be generated from chaotic signals, with extra controls to specify the root note and the scale from which notes are chosen. In this section, we describe an interface that provides these controls, while also considering that modulation should be supported. Several musical styles, jazz for example, make broad use of jumps between keys. We thus seek an easy way to transpose melodies, similar to the way a capo or slide is used with a guitar. The ability to choose musical modes, such as changing between major and minor keys, is also needed to conform with the musical repertoire.

The resulting system, which we call the Circle of Fifths interface, allows performers to shape and transpose the melodies that our chaotic system generates. The interface is organized in a doughnut shape, as shown in Figures 4 and 5, with a radial segregation into twelve circular nodes that are a perfect fifth apart. We assume control input in a two-dimensional plane, allowing for the use of standard controllers such as joysticks, tablets, and computer mice. These devices control a cursor that can move between each of the twelve nodes. However, by moving one's entire body instead of a small device, more kinesthetic feedback is available, providing a stronger conceptual model of the layout. This feedback allows a performer to return to various positions without relying on visual information, similar to how musicians remember hand and finger positions without looking at their instruments. We achieve such interaction capability by implementing the Circle of Fifths interface within our Audioscape platform. This positions the user in an immersive environment, where we monitor position offsets from a central sweet spot using a motion tracking system, thus allowing the user's body position to control the cursor. Figure 6 shows an example where a user is tracked by the system, allowing for simultaneous control of accompaniment while playing an instrument.

Figure 5. The imaginary geometrical space known in music theory as the Circle of Fifths. Letters refer to the key names, each of which is one perfect fifth apart.

Figure 6. Playing the guitar with the Circle of Fifths interface. The gray box beside the performer is the Polhemus Liberty, a 3-D tracking device that reports the position of the guitar to the system.

5.1. Tonality & Modulation

As seen earlier in Figure 5, each of the 12 nodes in the circle has an associated note value. When the user (or cursor) is within one of these nodes, all musical accompaniment is generated in that key. This is because the S_h register described in Section 3.4 has been filled with only those notes. When the cursor moves to another node, the register changes, and the generated music is effectively transposed to a new key.

Figure 7 shows an actual melody produced by the system. In the first staff, the user has selected the Aeolian mode by standing in the C-major node and playing an 'A'.

Figure 7. Transposition of a produced melody while the Aeolian mode is specified. First staff: the user stands in the C-major node. Second staff: the user has moved to the F-major node, and the resulting melody is transposed.

Before the second staff, the user moves to the adjacent F-major node without playing any additional notes. The system continues to play in the Aeolian mode, but now only notes in the F-major key are generated. This is heard as a modulation in the musical accompaniment, since the entire melody has been transposed to a new key. This example also shows how a simple chord progression (I-IV) can be realized with the interface. In fact, by selecting the right scale, the user can move across adjacent nodes without generating a modulation effect. This demonstrates that the Circle of Fifths is more useful than a simple key-increment strategy. With our interface we can often reach the most similar key with minimal movement, and in practice the most common modulations are between perfect fifths or fourths (corresponding to moving to an adjacent node in the circle).

5.2. Modal Scales

To extend the possibilities of musical exploration, a single node of the Circle of Fifths interface is divided into regions that correspond to the different scales mentioned in Section 3.4. We divide a node such that the outside is the most simple while the centre exhibits the greatest complexity, as seen in Figure 8. It has been shown that tension and physical effort help performers establish a causal relationship between their actions and the resulting audible effect [15]. This organization of scales results in a correlation of tension with the complexity of the musical accompaniment, since triggering the centre of a node requires the greatest accuracy, and hence the most physical effort, from the performer. The notes generated there are more varied, containing all possibilities in the diatonic scale. Conversely, the outer edge of a node requires less accuracy, since the target area is larger, and the complexity of the music produced is diminished, since the outer scales contain fewer notes. In this manner, the interface tends to be more expressive and offers the potential for virtuosic control.

Figure 8 shows how the scales are changed when a major mode has been selected. On the outer edge of the node, only notes from the triad are generated. Moving the cursor midway towards the centre results in the generation of notes from the pentatonic scale. Finally, in the centre of the node, all notes from the diatonic scale are made available. This allows the performer to choose the level of accompaniment generated by the system, and to constrain chaotic note generation according to musical practice.

Figure 8. The choice of the scale according to the cursor position inside a node (for a major third degree). From the centre outward: full diatonic scale; I, II, III, V, VI, VII; I, II, III, V, VI; I, III, V, VI; I, III, V; no notes produced (outside the node).
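As a rough sketch of how a 2-D cursor position might select both a key and a scale (our own reconstruction under assumed geometry, not the paper's code), the angular position around the circle picks one of the twelve nodes, and the distance from that node's centre picks the scale, from a triad at the edge to the full diatonic scale at the centre:

```python
import math

# Keys around the circle of fifths, one perfect fifth apart (clockwise from the top).
CIRCLE = ["C", "G", "D", "A", "E", "B", "F#", "C#", "G#", "D#", "A#", "F"]

# Scale choices from the outer edge of a node to its centre (three levels, as in Sec. 5.2).
SCALES = [(0, 4, 7),                 # outer edge: triad (I, III, V)
          (0, 2, 4, 7, 9),           # midway: major pentatonic
          (0, 2, 4, 5, 7, 9, 11)]    # centre: full diatonic scale

RING_RADIUS = 4.0    # distance of node centres from the doughnut's centre (assumed)
NODE_RADIUS = 1.0    # radius of each circular node (assumed)

def select_key_and_scale(x, y):
    """Map a 2-D cursor position (relative to the doughnut centre) to (key, scale)."""
    best = None
    for i, key in enumerate(CIRCLE):
        angle = 2 * math.pi * i / 12
        cx, cy = RING_RADIUS * math.sin(angle), RING_RADIUS * math.cos(angle)
        dist = math.hypot(x - cx, y - cy)
        if best is None or dist < best[0]:
            best = (dist, key)
    dist, key = best
    if dist > NODE_RADIUS:
        return key, None                       # outside every node: no notes produced
    depth = 1.0 - dist / NODE_RADIUS           # 0 at the node's edge, 1 at its centre
    return key, SCALES[min(2, int(depth * 3))]

# Example: a cursor standing squarely on the C node selects the full diatonic scale.
print(select_key_and_scale(0.0, 4.0))          # ('C', (0, 2, 4, 5, 7, 9, 11))
```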
6. DISCUSSION & FUTURE DIRECTIONS

We have considered several mapping possibilities for Chua's circuit and shown how it can be employed in a musical context, from direct synthesis to note generation. The richness of behaviours offered by the system cannot be attained with linear dynamical systems or random series. The Circle of Fifths interface results in an immersive control paradigm that constrains musical accompaniment to operate within a controlled condition adhering to western musical theory and practice. Hence, performers can achieve a certain level of interplay with the system as they perform along with their traditional musical instruments. The interface that we have created can in fact be used to transpose any musical material, without necessarily using the chaotic note generation component. This allows for an interactive musical experience that is more engaging than playing over repetitive loops, pre-recorded accompaniment, or other forms of background musical material.

Finally, the Circle of Fifths interface has significant pedagogical value. It can serve as an embodied platform on which to learn music theory, while improving memory of pitch ratios through a clever mental organization. It can also be considered a practice tool for skilled musicians, and a means of experimenting with new kinds of musical grammars, for example by choosing to move in uncommon directions within the circle.

The immersive and physical nature of the interaction also alleviates some of the cognitive load imposed on a performer. By capitalizing on body motion, users may employ a variety of feedback modalities, with a balance of information between the visual, auditory, haptic, and kinesthetic channels.

Future directions include providing the system with polyphonic MIDI input control, so that the user is not constrained to play the root of a mode in order to obtain a coherent accompaniment, but can play a chord instead. Another extension would be to apply different mappings to each output parameter of Chua's oscillator. For example, one axis could select pitches and harmonic content, while another could be used to modify dynamics or temporal features. Combined with a rich mapping methodology, many such novel interfaces and musical experiences can be created.

7. ACKNOWLEDGEMENTS

This work is the first result of the MASC (Music And Sound from Chaos) project, an exchange of knowledge and technology between McGill University and the University of Calabria. This collaboration was made possible by a Canada-Italy research exchange, supported by the Ministère du Développement économique, de l'Innovation et de l'Exportation (MDEIE) in Canada, and the Ministero degli Affari Esteri, Direzione Generale Promozione e Cooperazione Culturale (DGPC) in Italy. The authors would further like to acknowledge the generous support of NSERC and the Canada Council for the Arts, who have funded the Audioscape research project through the New Media Initiative.

8. REFERENCES

[1] Bilotta, E., Pantano, P. and Stranges, F. "A gallery of Chua attractors: Part III", International Journal of Bifurcation and Chaos, 17(3):657-734, 2007.

[2] Choi, I. "Interactive exploration of a chaotic oscillator for generating musical signals in real-time concert performance", Journal of the Franklin Institute, 331B(6):785-818, 1994.

[3] Chua, L. O. "Global unfolding of Chua's circuit", IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, pages 704-734, 1993.

[4] Chua, L. O., Wu, C. W., Huang, A. and Zhong, G. Q. "A universal circuit for studying and generating chaos. I. Routes to chaos", IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications, 40:732-744, 1993.

[5] Chua, L. O., Wu, C. W., Huang, A. and Zhong, G. Q. "A universal circuit for studying and generating chaos. II. Strange attractors", IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications, 40:745-761, 1993.

[6] Deutsch, D. The Psychology of Music. Second edition, Academic Press, London, 1999.

[7] Gardner, W. G. and Martin, K. D. "HRTF measurements of a KEMAR", Journal of the Acoustical Society of America, 97(6):3907-3908, 1995.

[8] Gerzon, M. "Periphony: With-height sound reproduction", Journal of the Audio Engineering Society, 21(1):2-10, 1973.

[9] Hunt, A. and Kirk, R. "Mapping strategies for musical performance", in M. Wanderley and M. Battier, editors, Trends in Gestural Control of Music. IRCAM - Centre Pompidou, 2000.

[10] Mayer-Kress, G., Choi, I., Weber, N., Barger, R. and Hubler, A. "Musical signals from Chua's circuit", IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, 40(10):688-695, 1993.

[11] Papadopoulos, G. and Wiggins, G. "AI methods for algorithmic composition: A survey, a critical view and future prospects", in G. Wiggins, editor, AISB Symposium on Musical Creativity, Edinburgh, UK, 1999.

[12] Pulkki, V. "Virtual sound source positioning using vector base amplitude panning", Journal of the Audio Engineering Society, 45(6):456-466, 1997.
[13] Rizzuti, C. "Mapping chaotic dynamical systems into timbre evolution", in Proceedings of the Sound and Music Computing Conference (SMC), pages 22-29, Lefkada, Greece, 2007.

[14] Rodet, X. "Stability/instability of periodic solutions and chaos in physical models of musical instruments", in Proceedings of the 1994 International Computer Music Conference (ICMC), Aarhus, Denmark, 1994.

[15] Vertegaal, R., Ungvary, T. and Kieslinger, M. "Towards a musician's cockpit: Transducers, feedback and musical function", in Proceedings of the International Computer Music Conference (ICMC), 1996.

[16] Wessel, D. and Wright, M. "Problems and prospects for intimate musical control of computers", in Workshop at the Conference on New Interfaces for Musical Expression (NIME), 2001.

[17] Wozniewski, M., Settel, Z. and Cooperstock, J. R. "A paradigm for physical interaction with sound in 3-D audio space", in Proceedings of the International Computer Music Conference (ICMC), 2006.

[18] Wozniewski, M., Settel, Z. and Cooperstock, J. R. "User-specific audio rendering and steerable sound for distributed virtual environments", in Proceedings of the International Conference on Auditory Displays (ICAD), 2007.