Approaches to visualising the spatial position of 'sound-objects'

2016 
In this paper we present the rationale and design for two systems (developed by the Integra Lab research group at Birmingham Conservatoire) implementing a common approach to interactive visualisation of the spatial position of 'sound-objects'. The first system forms part of the AHRC-funded project 'Transforming Transformation: 3D Models for Interactive Sound Design', which entails the development of a new interaction model for audio processing whereby sound can be manipulated through grasp as if it were an invisible 3D object. The second system concerns the spatial manipulation of 'beatboxer' vocal sound through already-learned physical movement using handheld mobile devices. In both cases a means of visualising the spatial position of multiple sound sources within a 3D 'stereo image' is central to the system design, so a common model for this task was developed. This paper describes the ways in which sound and spatial information are implemented to meet the practical demands of these systems, whilst relating this to the wider context of extant and potential future methods for spatial audio visualisation.
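
To make the shared visualisation task concrete, the sketch below shows one minimal way a 'sound-object' position might be represented and converted for on-screen rendering. This is an illustrative assumption, not the paper's actual model: the class name, listener-relative spherical coordinates (azimuth, elevation, distance), and the Cartesian conversion are all hypothetical choices made here for clarity.

```python
import math
from dataclasses import dataclass


@dataclass
class SoundObject:
    """Hypothetical sound source positioned in a 3D 'stereo image'.

    Position is stored relative to the listener: azimuth and elevation
    in degrees, distance in arbitrary units (assumed convention).
    """
    name: str
    azimuth: float    # degrees; 0 = straight ahead, positive = to the right
    elevation: float  # degrees; 0 = ear level, positive = above
    distance: float   # radial distance from the listener

    def to_cartesian(self) -> tuple[float, float, float]:
        """Convert to (x, y, z) for placing the object in a 3D scene."""
        az = math.radians(self.azimuth)
        el = math.radians(self.elevation)
        x = self.distance * math.cos(el) * math.sin(az)  # left/right
        y = self.distance * math.sin(el)                 # up/down
        z = self.distance * math.cos(el) * math.cos(az)  # front/back
        return (x, y, z)


if __name__ == "__main__":
    # Example scene with two sources, e.g. a vocal layer and a beat layer.
    scene = [
        SoundObject("vocal", azimuth=-30.0, elevation=0.0, distance=1.0),
        SoundObject("beat", azimuth=45.0, elevation=10.0, distance=2.0),
    ]
    for obj in scene:
        print(obj.name, obj.to_cartesian())
```

A representation of this kind would let both systems (gestural grasp and mobile-device control) update the same object positions while a single rendering layer draws them within the 3D 'stereo image'.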