
Ménagerie imaginaire

2007 
Our work leading to this piece began with the search for a better way to interact with electronic sound during performance. Starting from the current trend towards performing on stage with laptop computers, we have conceived a radically different arrangement in which performer motion is not only unrestricted but serves a principal role in the performance interface. We abandon conventional electronic audio devices and use a physically modelled virtual simulation as the performance medium. This allows natural gestures to control sophisticated audio processing while traditional musical instruments are still played. In our approach, motions and sounds are captured and modelled within a virtual 3D environment. A combination of microphones and sensors lets multiple performers input their audio into the scene and steer their signals with precision, enabling interaction with digital audio effects located throughout the virtual world. For example, when a performer wishes to send sound through a reverberator, they simply point their instrument toward the reverb unit in the 3D space. The performers may also travel through different sonic regions that contain varying types of musical accompaniment and effects. The scene is rendered graphically in real time, allowing the audience to watch the performance on a large screen situated above the performers on stage. The audience viewpoint provides a subjective rendering, with proper spatialization of all virtual audio sources. Live video from webcams provides textures that are mapped onto 3D avatars, allowing the audience to see close-up views of the performers.
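
To illustrate the kind of direction-based routing described above, the sketch below shows one plausible way a performer's signal might be sent to a virtual effect unit according to how closely the instrument points at it in the 3D scene. This is a minimal illustration with hypothetical names and parameters (e.g. routing_gain, beam_width_deg), not the authors' actual implementation.

    # Minimal sketch of direction-based signal routing (hypothetical names,
    # not the authors' implementation): the send level to an effect unit
    # depends on the angle between the instrument's aim and the effect.

    import math

    def normalize(v):
        """Return v scaled to unit length (a zero vector is returned as-is)."""
        n = math.sqrt(sum(c * c for c in v))
        return v if n == 0 else tuple(c / n for c in v)

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def routing_gain(performer_pos, aim_dir, effect_pos, beam_width_deg=45.0):
        """Gain in [0, 1] for sending a performer's signal to one effect unit.

        The gain is 1 when the instrument points straight at the effect and
        falls to 0 at the edge of a conical 'beam' of the given width.
        """
        to_effect = normalize(tuple(e - p for e, p in zip(effect_pos, performer_pos)))
        aim = normalize(aim_dir)
        cos_angle = max(-1.0, min(1.0, dot(aim, to_effect)))
        angle = math.degrees(math.acos(cos_angle))
        return max(0.0, 1.0 - angle / beam_width_deg)

    # Example: a performer at the origin aims roughly at a reverb unit placed
    # at (4, 0, 1) in the virtual scene, so most of the signal is routed there.
    gain = routing_gain(performer_pos=(0.0, 0.0, 0.0),
                        aim_dir=(1.0, 0.0, 0.2),
                        effect_pos=(4.0, 0.0, 1.0))
    print(f"send level to reverb: {gain:.2f}")

In a full system of this kind, the same geometric test would presumably run once per effect unit in the scene, and the resulting gains would drive per-effect send levels in the audio engine, while the audience-view renderer spatializes each virtual source from its position in the scene.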