Animation - Music - Interaction - Interpretation
In the Electric Garden of SIGGRAPH 97
Virtual musicians conducted by a real conductor, on a stage of real/non-real space.
You may come, wear a Data Dress Suit (DDS), take a baton, and see how our musicians react to your touch.
Move around in a virtual space and experience the acoustics of a real space. Have a look at the musicians and the architecture.
We of the Digital Interactive Virtual Acoustics group will be presenting:
Musicians play their instruments with realistic hand movements. The movements are determined by an inverse kinematics calculation from the desired musical event. For example, a guitarist will move his hands according to the chords found in a MIDI file.
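The idea can be illustrated with a minimal sketch, not the actual implementation: a planar two-joint arm solved analytically, where the target point would come from the musical event (for a guitarist, e.g. the fretboard position of a note in the MIDI file). The link lengths and target coordinates here are made-up illustration values.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Analytic inverse kinematics for a planar two-joint arm.

    Returns (shoulder, elbow) joint angles in radians so that an arm
    with upper-arm length l1 and forearm length l2 reaches (x, y).
    In the real system the target would be derived from the desired
    musical event, e.g. a chord position read from a MIDI file.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle; clamp against rounding
    # error when the target lies at the edge of the reachable range.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to the target, corrected for the
    # bend of the elbow.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Applying forward kinematics to the returned angles reproduces the target point, which is a convenient way to check the solution.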
Acoustics are simulated in real time by calculating the sound at the spot where the listener is and then projecting that space to the listener through headphones. Combined with the visual input, this creates a strong illusion of reality.
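A much-simplified sketch of the principle (assuming a bare distance model, far cruder than a real auralization system): for each ear, the sound from a source arrives with a delay proportional to distance and a gain that falls off with distance. The ear offset and sample rate below are assumed illustration values.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, air at room temperature
EAR_OFFSET = 0.09        # m from head centre, rough average

def binaural_params(src_x, src_y, sample_rate=44100.0):
    """Per-ear delay (samples) and gain for a point source.

    The listener sits at the origin facing +y, ears on the x axis.
    Pure free-field model: delay = distance / c, gain = 1 / distance.
    A real system would add room reflections and head filtering.
    """
    out = {}
    for ear, ex in (("left", -EAR_OFFSET), ("right", EAR_OFFSET)):
        d = math.hypot(src_x - ex, src_y)
        out[ear] = (d / SPEED_OF_SOUND * sample_rate,  # arrival delay
                    1.0 / max(d, EAR_OFFSET))          # attenuation
    return out
```

Even this toy model yields the interaural time and level differences the brain uses for localization: a source to the listener's right reaches the right ear earlier and louder.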
Interaction with the musicians is possible by conducting the music as a maestro would. Artificial neural networks, trained with actual conductor movements, are used to analyze the meaning of the gestures.
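As a toy sketch of this kind of gesture analysis (not the trained networks used in the system), a small one-hidden-layer network can be fitted by backpropagation to label baton-movement features. The features and labels below are invented for illustration, e.g. (vertical velocity, acceleration) classified as "beat" vs. "no beat".

```python
import math
import random

def train_gesture_net(samples, labels, hidden=4, epochs=2000, lr=0.5):
    """Tiny one-hidden-layer sigmoid network trained by backprop.

    `samples` are small feature vectors (hypothetical baton features),
    `labels` are 0/1 targets.  Returns a predict(features) function
    giving the network's output in (0, 1).
    """
    random.seed(0)
    n_in = len(samples[0]) + 1                       # +1 bias input
    w1 = [[random.uniform(-1, 1) for _ in range(n_in)]
          for _ in range(hidden)]
    w2 = [random.uniform(-1, 1) for _ in range(hidden + 1)]  # +1 bias
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))

    def forward(feats):
        x = list(feats) + [1.0]
        h = [sig(sum(w * xi for w, xi in zip(row, x)))
             for row in w1] + [1.0]
        return x, h, sig(sum(w * hi for w, hi in zip(w2, h)))

    for _ in range(epochs):
        for feats, t in zip(samples, labels):
            x, h, y = forward(feats)
            dy = (y - t) * y * (1.0 - y)             # output error term
            # Hidden weights first (they use the pre-update w2).
            for j in range(hidden):
                dh = dy * w2[j] * h[j] * (1.0 - h[j])
                for i in range(n_in):
                    w1[j][i] -= lr * dh * x[i]
            for j, hj in enumerate(h):
                w2[j] -= lr * dy * hj

    return lambda feats: forward(feats)[2]
```

On a toy dataset where a sharp downward stroke means "beat", the trained predictor separates the two gesture classes; the real system maps richer movement data to richer conducting meanings.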
This page is maintained by Tommi Ilmonen, E-mail: Tommi.Ilmonen(at)hut.fi.
This page was last updated 30.9.1999.