Proceedings ICMC|SMC|2014, 14-20 September 2014, Athens, Greece

fundamental to the Modality project, as "performance practice", "control strategies" and even "software paradigms" were highly ambiguous and interpreted in different ways. Further, it turned out to be a learning process not only to listen to other people's opinions but also to take them into account during software design and implementation. On a third level of interpretation, the term Modality influences the structure of the meetings. Reflecting the diversity of the participants, most of the meetings comprised a broad spectrum of activities, namely (a) developer phases in which the Modality Toolkit was implemented, (b) public workshops disseminating knowledge about the Modality Toolkit, and (c) concerts in which participants performed with their custom instruments.

2.1 Related work

The Modality Toolkit stands in the tradition of a line of related systems dedicated to control data flow and filtering. In particular, it is informed by systems like OSCulator [1], STEIM's junXion [2], the Digital Orchestra Toolkit [3] and SC's own multiton pattern implementations.

OSCulator
OSCulator is an OS X GUI-based application aimed at connecting devices and routing messages between them. It supports multiple protocols such as MIDI, HID, OSC and TUIO, and can create complex responses to incoming events, including scaling values, splitting events, merging events, storing values for later use, enabling or disabling actions, and toggling global presets.

junXion
junXion is a "[...] data routing application that can process [hardware] 'sensors' [...] using conditional processing and remapping" [2]. It is a stand-alone program placed between the control input layer and the synthesis layer. The roots of its development lie in the advanced sensor input and data manipulation features of the pioneering live sampling software LiSa [2].² In junXion, data flow is organised in patches with an input-action-output logic.
Inputs can come from as many as eight different types of data sources. Actions process that data by means of user-definable behaviours such as switching or toggling, but also differentiation or complex activity measurement, and can incorporate other incoming data through conditional statements. Output can be generated and sent in various formats to listening programs.

Digital Orchestra Toolkit
The Digital Orchestra Toolkit [4] was created as part of the Digital Orchestra project around "[...] a number of paradigms for the design, creation and performance of digital musical instruments in the context of a long-term interdisciplinary, collaborative environment. Issues related to mapping strategies, notation, the relationship of physical and musical gestures, robustness, responsiveness, and haptic feedback arose during the course of the project." [5]. The toolkit consists of a number of Max/MSP objects implementing data acquisition and processing for various hardware devices and protocols.

² As of today, LiSa's sampling engine is no longer being developed, as many software synthesizers are available to replace its functionality. Similarly, STEIM's groundbreaking sensor and interfacing technologies have become readily available through a host of affordable controllers and DIY kits, e.g. those based around the Arduino platform.

Multiton design patterns in SC
SuperCollider has flexible proxy objects for tasks, patterns, sound processes, and functions, which allow replacing the proxy's object while it is in use. (Modality follows these, e.g. in the MKtl(<name>) access scheme.) Named variants of these classes, like Tdef, Pdef, Ndef, MIDIdef or OSCdef, follow the multiton pattern by creating named instances only and keeping them in a global dictionary. Calling the constructor, e.g. Ndef(\a), returns an existing instance by that name or, if none is found, creates it. Supplying a second argument, Ndef(\a, { LFSaw.ar }), replaces the proxy's current object with the new one given.
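The multiton pattern behind Ndef and its siblings can be sketched outside SuperCollider as well. Below is a minimal Python analogue; the class name Proxy and its attributes are illustrative assumptions, not part of SC or Modality:

```python
class Proxy:
    """A minimal multiton: named instances live in a class-level registry."""
    _all = {}  # global dictionary of named instances

    def __new__(cls, name, source=None):
        # Return the existing instance for this name, or create one.
        proxy = cls._all.get(name)
        if proxy is None:
            proxy = super().__new__(cls)
            proxy.name = name
            proxy.source = None
            cls._all[name] = proxy
        # A second argument replaces the proxy's current object,
        # mirroring Ndef(\a, { LFSaw.ar }).
        if source is not None:
            proxy.source = source
        return proxy

a = Proxy('a')                      # created on first access
b = Proxy('a', lambda: 'sawtooth')  # same instance, source replaced
assert a is b and a.source is not None
```

Because lookup and creation share one constructor call, user code never has to track whether a named proxy already exists.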
This is very useful in live coding situations, where remembering name-function pairs is much easier than doing full variable administration by hand.

3. THE MODALITY MEETINGS

To illustrate the Modality way as described in Section 1, this section reports on the outcomes of and discussions within the four Modality meetings held so far.

October 2010, BEK, Bergen
Initiated by Jeff Carey and Bjørnar Habbestad, several experts and sound artists met to discuss shared ideas about modal control in performance and rehearsal situations. The attendees soon agreed that easy access to and outlining of modal control structures is of great interest to all. First sketches for uniform access were made based on the then already existing JITMIDIKtl quark,³ creating a more uniform access scheme to controllers in the Ktl quark.

May 2011, STEIM, Amsterdam
Discussions revealed the need for users to abstract from hardware dependencies and to flexibly route and filter incoming data. A new SC quark was initiated and the group started implementing two sets of functionalities: MKtl objects were intended to connect MIDI and HID hardware devices. They stored the capabilities of each device in a configuration file. Instead of assigning functions to hardware specifics, we considered controllers as a combination of controller elements, which were given human-readable short names for semantically simple access (e.g., 'sl1' instead of MIDI channel 0, cc 14). This scheme was considered extensible to OpenSoundControl, serial ports, and other hardware interfaces, in order to have a uniform workflow abstracted away from the actual backend.

³ A quark is an extension library in SuperCollider parlance.
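The idea of addressing controller elements by semantic short names rather than protocol specifics can be sketched as follows. This Python sketch is an illustration only: the dictionary contents and the functions bind/incoming are invented for this example and are not Modality's actual description files or API:

```python
# Hypothetical device description: semantic element names mapped to
# protocol-level addresses (here, (kind, channel, number) tuples).
faderbox = {
    'sl1': ('midi_cc', 0, 14),    # slider 1 -> MIDI channel 0, cc 14
    'sl2': ('midi_cc', 0, 15),    # slider 2
    'bt1': ('midi_note', 0, 36),  # button 1 -> note number 36
}

handlers = {}

def bind(element, func):
    """Attach an action to a semantic element name; the backend
    address is looked up only here, never in user code."""
    handlers[faderbox[element]] = func

def incoming(address, value):
    """Dispatch a raw protocol event to the bound handler, if any."""
    if address in handlers:
        handlers[address](value)

bind('sl1', lambda v: print('slider 1:', v))
incoming(('midi_cc', 0, 14), 64)   # prints "slider 1: 64"
```

Swapping the description dictionary for another device (or another backend such as OSC) leaves the user's bindings untouched, which is the uniform-workflow goal described above.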