Controlling Physically Based Virtual Musical Instruments Using The Gloves
2014
In this paper we propose an empirical method for developing mapping strategies between a gestural interface (the Gloves) and physically based sound synthesis models. An experiment was conducted to investigate which gestures listeners associate with sounds synthesised using physical models, corresponding to three categories of sound: sustained, iterative and impulsive. The results show that listeners perform similar gestures when controlling sounds within each category. We used these gestures to create the mapping strategy between the Gloves and the physically based synthesis engine.