5. SUMMARY AND FUTURE WORK
We have demonstrated an application and toolkit designed
to facilitate the rapid creation of graphical interfaces for
music and media applications. Motivated by the absence of
a complete system for interactively prototyping expressive
software interfaces, Argos stands as an application and
toolkit that leverages the multi-touch interaction paradigm
to empower musical users and developers.
In the future, we plan to integrate the ability to bind
physical controls and fiducials to the surface of Argos
tabletop-based interfaces, using a method similar to the one
presented by Fiebrink et al. in [2].
In addition to continuing code-level optimizations, we
plan to conduct extensive usability and accessibility tests
beyond our preliminary evaluations, especially in the
musical pedagogy and interface design domains. Work to
extend the widget library with physics-based controls,
menus, and other experimental controls is ongoing. Based
on feedback from previous evaluations, two-way OSC
communication is currently being implemented.
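To illustrate what travels over such an OSC link, the sketch below hand-packs a single OSC 1.0 message (NUL-padded address pattern, type tag string, big-endian float argument) following the protocol of Wright and Freed [12]. It is illustrative only: a real build would use an OSC library rather than packing bytes by hand, and the function name and address pattern here are hypothetical, not part of the Argos API.

```cpp
// Minimal sketch of OSC 1.0 message encoding -- the wire format used
// for communication between an interface and a sound engine.
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Append a string plus NUL padding to the next 4-byte boundary,
// as required by the OSC 1.0 specification.
static void padString(std::vector<uint8_t>& buf, const std::string& s) {
    buf.insert(buf.end(), s.begin(), s.end());
    buf.push_back(0);                        // terminating NUL
    while (buf.size() % 4 != 0) buf.push_back(0);
}

// Append a 32-bit IEEE float argument in big-endian byte order.
static void appendFloat(std::vector<uint8_t>& buf, float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    for (int shift = 24; shift >= 0; shift -= 8)
        buf.push_back(static_cast<uint8_t>(bits >> shift));
}

// Build an OSC message carrying one float, e.g. a widget's value.
// (Hypothetical helper; address pattern is an example only.)
std::vector<uint8_t> encodeOscFloat(const std::string& address, float value) {
    std::vector<uint8_t> buf;
    padString(buf, address);   // e.g. "/fader/1" -> 12 bytes with padding
    padString(buf, ",f");      // type tag string -> 4 bytes
    appendFloat(buf, value);   // 4-byte big-endian float
    return buf;
}
```

Two-way communication then amounts to sending such packets over UDP while also listening for incoming messages (e.g. a synthesis engine updating a widget's state), decoding them by the inverse of the layout above.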
A stable pre-release version of Argos is available as
C++ source and a compiled binary in a Google Code SVN
repository at http://code.google.com/p/ofxargos/
6. ACKNOWLEDGEMENTS
This application was originally a product of the 2009
Google Summer of Code program. Many thanks to Seth
Sandler for his helpful comments and ideas about
multi-touch interaction design.
7. REFERENCES
[1] Davidson, P. and Han, J. Synthesis and Control
on Large Scale Multi-Touch Sensing Displays. in
Proceedings of the International Conference on
New Interfaces for Musical Expression. 2006.
Paris, France.
[2] Fiebrink, R., et al. Dynamic Mapping of Physical
Controls for Tabletop Groupware. in Human
Factors in Computing Systems. 2009. Boston,
MA.
[3] Hansen, T.E., et al. PyMT: A Post-WIMP
Multi-Touch User Interface Toolkit. in International
Conference on Interactive Tabletops and
Surfaces. 2009. Alberta, Canada.
[4] Hochenbaum, J. and Vallis, O. Bricktable: A
Musical Tangible Multi-Touch Interface. in
Proceedings of the Berlin Open Conference.
2009. Berlin, Germany.
[5] Jorda, S., et al. The reacTable. in Proceedings of
the International Computer Music Conference.
2004. Barcelona, Spain.
[6] Kaltenbrunner, M., et al. TUIO: A Protocol for
Table-Top Tangible User Interfaces. in
Proceedings of the International Workshop on
Gesture in Human-Computer Interaction and
Simulation. 2005. Berder Island, France.
[7] Kellum, G. and Crevoisier, A. A Flexible
Mapping Editor for Multi-Touch Musical
Instruments. in Proceedings of the International
Conference on New Interfaces for Musical
Expression. 2009. Pittsburgh, PA.
[8] Patten, J., et al. Interaction Techniques for
Musical Performance with Tabletop Tangible
Interfaces. in Proceedings of the Conference on
Advances in Computer Entertainment
Technology. 2006. Hollywood, California.
[9] Ramanahally, P., et al. Sparsh-UI: A Multi-Touch
Framework for Collaboration and Modular
Gesture Recognition. in Proceedings of the World
Conference on Innovative VR. 2008. Brussels,
Belgium.
[10] Wang, G. and Cook, P. ChucK: A Concurrent,
On-the-fly, Audio Programming Language. in
Proceedings of the International Computer Music
Conference. 2003. Singapore.
[11] Wang, G., et al. Building Collaborative Graphical
Interfaces in the Audicle. in Proceedings of the
International Conference on New Interfaces for
Musical Expression. 2006. Paris, France.
[12] Wright, M. and Freed, A. Open Sound Control: A
New Protocol for Communicating with Sound
Synthesizers. in Proceedings of the International
Computer Music Conference. 1997. Thessaloniki,
Greece.