Medialogy: a bridge between technology and creativity based on the Aalborg model
Rolf Nordahl, Stefania Serafin
Medialogy, Aalborg University Copenhagen
www.medialogi.com

Abstract

This paper describes Medialogy, a new education under development at Aalborg University in Denmark, whose goal is to combine technology and creativity in new media. Music technology plays an important role in this education.

1 Introduction

The purpose of the Medialogy course is to educate diploma engineers with a solid foundation in areas within new media, comprising both tools and communication protocols. Medialogy develops and combines areas and topics within the humanities, computer science and technology that conventional standards have so far kept, or are keeping, apart. With the main aim of bridging creativity and technology, the education is directed towards professions that address the present and future wishes and needs of society, culture, industry, groups or individuals for equipment, hardware and software.

One of the main subjects belonging to the Medialogy education is music technology. Throughout the curriculum, students learn the theory and applications of digital sound, both for musical purposes and in connection with other media such as computer animation, computer vision systems and interactive new media in general.

2 What is Medialogy

Medialogy was established at Aalborg University in 2002. The goal of the Medialogy education is to create a bridge between current technologies and creative applications, and in doing so to break down conventional barriers between the humanities and the sciences. The education is founded on the belief that real-life creative teams need people with a strong intuitive understanding of technology. Currently, Medialogy enrols students who hold a short degree as Designer of New Media or an equivalent education.
The goal of Medialogy is to support three different approaches:

1. The technical approach: creating tools and new applications in the field of integrated media.
2. The design approach: learning technical issues opens up new fields of application.
3. Strategic development of existing technologies and research.

To encourage learning through projects, collaborations between students and partners from other fields, such as art schools and more conventional companies, are endorsed.

The Medialogy education consists of 4 semesters for a Bachelor degree, with 4 additional semesters to obtain a Master degree. During the first semester, students are introduced to the Aalborg way of studying, and they learn the basics of software technology, human perception and how to simulate it with computer models. The second semester is dedicated to the study of sensor technology, music synthesis and motion synthesis. Students learn how to integrate sensors into everyday objects using microcontrollers, and how to use such sensors to drive sound synthesis models and/or computer-animated images, supporting projects that come close to real-life applications. Furthermore, to give a firm foundation in where technology, media and the needs of society stand today, students are taught sociological perspectives and given a historical overview of man's ever-evolving use of, and dependence on, media. The third semester is dedicated to the design of visual effects and the study of screen media.

Proceedings ICMC 2004
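The second-semester chain of sensors driving sound synthesis can be sketched in a few lines of Python. This is an illustrative assumption, not actual course material: a raw ADC reading from a microcontroller is scaled to an oscillator frequency, which then drives a block-based sine oscillator.

```python
import math

def sensor_to_freq(raw, lo=110.0, hi=880.0, adc_max=1023):
    """Map a raw ADC reading (0..adc_max) to a frequency in Hz.
    The exponential mapping makes the sensor feel linear in pitch."""
    t = max(0, min(raw, adc_max)) / adc_max
    return lo * (hi / lo) ** t

def sine_block(freq, n=64, sr=44100, phase=0.0):
    """Generate n samples of a sine at freq; return samples and the
    running phase, so successive blocks join without clicks."""
    inc = 2 * math.pi * freq / sr
    samples = [math.sin(phase + i * inc) for i in range(n)]
    return samples, phase + n * inc

# A mid-range sensor reading drives the oscillator for one block.
freq = sensor_to_freq(512)
block, phase = sine_block(freq)
```

In a real installation the blocks would be streamed to the audio device while new sensor readings continuously update `freq`.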
The goal of the fourth semester is to put together the knowledge acquired in the previous semesters to produce an interactive product.

3 The Aalborg model

Aalborg University was established in 1974 and founded on the problem-based learning (PBL) approach. Through an active implementation of the pedagogical principles of the Aalborg Model, Medialogy students in each semester make projects that come close to real-life problem solving. Where traditional engineering approaches aim at finding a solution to a problem through professional skills, PBL regards problem solving as a tool for the learning process; the primary aim is to learn through actively dealing with problems, and the problem is a means of encouraging this process (Kolmos, Fink, and Krogh 2004).

The key element of PBL is working on projects through teamwork. During each semester students must carry out a major project, in a team of approximately 5-6 people, which constitutes the core of the semester. Early in the semester a project is defined, closely related to the theme of the semester. Different courses are offered with the aim of supporting the work on the project, i.e., providing the knowledge necessary for the project. The core of each project consists of problem analysis, problem definition, problem solving, and documentation in the form of a report.

The Medialogy education strongly follows the Aalborg model. Each semester is a 20-week programme with its own theme. The theme of the first semester is Human senses and computer perception. Students learn how vision, audition and the other senses are processed by the human system, how they can be exploited for modern media, and how they can be imitated by technical systems such as robots and computer-controlled vision systems. The theme of the second semester is Interface design and sound effects.
In this semester, students achieve a better understanding of the modeling and perception of the human senses. Focus is put on tactile and auditory information. In the project work, students learn how to combine input devices in the creation of new tangible objects which can be used to control auditory and/or visual information.

The third semester theme is Animated environments and visual effects. Students learn how to relate technical knowledge to humanistic knowledge within available screen media and interactive communication interfaces.

The theme of the last semester is Computer games and interactive systems. Students learn how new media technology is evolving, and they learn how to build an immersive system which combines knowledge achieved during the previous semesters. After completing the Bachelor education, students can continue to a Master education which lasts two years. During the Master education, students can choose to specialize in one or more topics related to Medialogy.

4 Music technology in Medialogy

One of the main subjects belonging to the Medialogy education is music technology. As with the other fields, music technology combines the technical aspects of music production and synthesis with the creative aspects concerning the use of music in new media technology. Projects which combine music technology with digital video are also interesting for the students.

During the first semester, students learn principles of sound perception and basic techniques of digital sound synthesis. During their project work, they investigate the relationship between physical and perceptual characteristics of sounds. As an example of a project, they record, analyze and synthesize annoying sounds in a class called automatic perception, where basic signal processing techniques for both audio and vision are learnt.
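One simple physical characteristic that could be correlated with perceived annoyance is the spectral centroid, the amplitude-weighted mean frequency of a sound. The sketch below is an illustrative assumption, not the actual class exercise; it computes the centroid with a naive DFT using only the standard library.

```python
import math

def spectral_centroid(signal, sr=44100):
    """Spectral centroid in Hz via a naive DFT: the magnitude-weighted
    mean of the bin frequencies, a common correlate of 'sharpness'."""
    n = len(signal)
    num = den = 0.0
    for k in range(1, n // 2):
        re = sum(signal[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = sum(signal[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        mag = math.hypot(re, im)
        num += (k * sr / n) * mag
        den += mag
    return num / den if den else 0.0

# A pure 1 kHz tone (an exact bin at this sr and length) has its
# centroid at 1 kHz.
sr, n = 8000, 256
tone = [math.sin(2 * math.pi * 1000 * i / sr) for i in range(n)]
centroid = spectral_centroid(tone, sr)
```

In a project, the centroid of each recorded annoying sound would be compared against listener ratings to test for a correlation.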
From the perceptual side, they investigate whether there exists a correlation between the physical characteristics of annoying sounds and the way they are perceived. Another approach is to combine audio and visuals in the same project. This can be done by understanding basic sound and visual parameters, and creating a mapping between the two which is scientifically solid and interesting from an artistic point of view.

In the second semester, a course on sound design is offered, which covers most aspects of digital sound synthesis and processing, and the role of sound in new media technology. The first time this course was offered, students decided to design a soundtrack for their animation project. In the last two semesters, the knowledge of sound design and interactive acoustics is brought to a higher level, and combined more closely with real-time screen media, virtual reality and interactive spaces.

5 Examples of Medialogy projects

5.1 Audio-visual mapping

In this project students have used Symbolic Pointillistic Images (Krüger, Lappe, and Wörgötter 2004) to encourage people to create sound. In an installation, the sound produced by
the person is recorded and sound features are extracted. These features are then used to alter a symbolic pointillistic image. The altered visual feedback is observed by the person, who can then adjust her/his attempts to produce sound. In this way an efferent/afferent loop is closed, which has been shown to encourage people to explore their sound-making capabilities. The condensed representation of the visual attributes in our primitives has proven essential, since it allows the extracted sound features to be mapped to meaningful visual features. In psychophysical experiments we showed that the audiovisual loop encourages humans to explore their voice and sound-making capabilities.

Figure 1: Audio-visual installation. A pointillistic image is modified in real time using sound parameters. The figure shows the image before sound mapping.

Figure 2: The image after sound mapping.

5.2 Detection of audio-visual asynchrony in digital applications

In this project, earlier research carried out with analog equipment is compared to new findings obtained using digital equipment. Results show that previous research in this area can be improved upon by using digital equipment.

5.3 Soundtrack for computer animation

Other projects concerned the design of a soundtrack for a computer animation. In particular, a group of students worked together with an Estonian orchestra, re-editing their recordings and processing them for surround sound.

5.4 Perception and modeling of annoying sounds

Another group of students is investigating the relationship between physical and perceptual characteristics of annoying sounds.

5.5 Soundtrack for interactive movies

Another group of students designed an interactive soundtrack to be used together with an interactive movie. Interactive movies are becoming popular also in home entertainment systems. A challenge in designing an interactive soundtrack is that the reaction of the audience to the movie is not known a priori, so the sound designer needs to avoid repetitiveness in the ambient sounds and must take care of all the possible sound events which can occur in the environment, according to the reaction of the audience.

5.6 Body synthesizer

As a last example, a group of students used the human body as a controller for computer-generated sounds. Different motions and positions of a moving person are mapped to different parameters of a sound effect processor. For example, amplitude is controlled by motion of the body up and down, and panning by motion left and right.
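The body-synthesizer mapping just described (vertical position to amplitude, horizontal position to panning) can be sketched as follows. The frame size, value ranges and the equal-power pan law are illustrative assumptions, not the students' actual implementation.

```python
import math

def body_to_sound(x, y, width=640, height=480):
    """Map a tracked body position (in pixels) to stereo gains:
    vertical motion controls amplitude, horizontal motion controls
    panning. Frame size and pan law are illustrative assumptions."""
    # Clamp the position into the camera frame.
    x = max(0, min(x, width))
    y = max(0, min(y, height))
    amplitude = 1.0 - y / height   # top of the frame = loudest
    pan = x / width                # 0.0 = hard left, 1.0 = hard right
    # Equal-power panning keeps perceived loudness roughly constant
    # as the performer moves sideways.
    left = amplitude * math.cos(pan * math.pi / 2)
    right = amplitude * math.sin(pan * math.pi / 2)
    return left, right

# Standing at the centre of the frame yields equal left/right gains.
left, right = body_to_sound(320, 240)
```

The two gains would then scale the output of the sound effect processor's left and right channels on every tracking update.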
6 Research activities

6.1 Benogo

Benogo 1 is a European consortium whose goal is to develop and explore new recording and visualization technologies enabling people to experience presence at real and possibly known places. The experience is based on true-to-life visual and auditory sensory information presented in real time. While the goal of the project is to recreate photorealistic spaces, from the auditory point of view it is not evident that realistic soundscapes produce the highest sense of place. Although it is well known that sound enhances the sense of presence, i.e., the sense of being in a specific place, quantitative results on sound quantity and quality and specific design patterns are not yet available. Sound designers in the Benogo project are creating tools which allow these questions to be answered.

6.2 Physically based models

Stefania Serafin is pursuing her research on physics-based models of musical instruments and everyday sounds, which she started as a PhD student at Stanford University. In particular, she is working on the following projects:

1. Use of physical models in audio-visual simulations and for sensorial substitution, with Federico Avanzini from the University of Padova and Davide Rocchesso from the University of Verona.
2. The virtual violin project and controllers for friction-driven instruments, with Diana Young from the MIT Media Lab.
3. Physical models in interactive acoustics, with Juraj Kojs. The use of physical models to extend the possibilities offered by traditional instruments is studied and applied in interactive performances.

References

Kolmos, A., F. K. Fink, and L. Krogh (2004). The Aalborg PBL Model: Progress, Diversity and Challenges. Aalborg University Press, February. ISBN 87-7307-700-3.

Krüger, N., M. Lappe, and F. Wörgötter (2004). Biologically motivated multi-modal processing of visual primitives. Accepted for The Interdisciplinary Journal of Artificial Intelligence and the Simulation of Behaviour 1(5).
1 www.benogo.dk