Interactive Sound Rendering on Mobile Devices using Ray-Parameterized Reverberation Filters

2018 
We present a new sound rendering pipeline that generates plausible sound propagation effects for interactive dynamic scenes. Our approach combines ray-tracing-based sound propagation with reverberation filters, using robust automatic reverb parameter estimation driven by impulse responses computed at a low sampling rate. We propose a unified spherical harmonic representation of directional sound in both the propagation and auralization modules, and use this formulation to perform a constant number of convolution operations for any number of sound sources while rendering spatial audio. Compared to previous geometric acoustic methods, we achieve a speedup of over an order of magnitude while delivering audio quality similar to high-quality convolution rendering algorithms. As a result, our approach is the first capable of rendering plausible dynamic sound propagation effects on commodity smartphones.
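The key efficiency idea in the abstract, rendering spatial audio with a constant number of convolutions regardless of source count, can be sketched as follows: each source signal is encoded into a shared set of spherical harmonic (ambisonic) channels according to its direction, and the reverberation filter is then convolved once per channel rather than once per source. This is a minimal first-order illustration, not the paper's implementation; the function names, the SH normalization convention, and the single shared reverb filter are all assumptions for illustration.

```python
import numpy as np

def sh_coeffs_order1(direction):
    # Real first-order spherical harmonics in ACN order (W, Y, Z, X).
    # The normalization here is a simplifying assumption for illustration.
    x, y, z = direction / np.linalg.norm(direction)
    return np.array([1.0, y, z, x])

def render(sources, reverb_ir):
    """Encode all sources into 4 shared SH channels, then convolve
    each channel once with the reverb filter. The convolution count
    is fixed (4 here), independent of the number of sources."""
    n = max(len(sig) for sig, _ in sources)
    channels = np.zeros((4, n))
    for sig, direction in sources:
        # Weight the source signal by its directional SH coefficients
        # and accumulate into the shared channels.
        channels[:, :len(sig)] += np.outer(sh_coeffs_order1(direction), sig)
    # One convolution per SH channel, not per source.
    return np.array([np.convolve(ch, reverb_ir) for ch in channels])

# Hypothetical usage: two sources from different directions, one short IR.
sources = [
    (np.ones(4), np.array([1.0, 0.0, 0.0])),
    (np.ones(4), np.array([0.0, 1.0, 0.0])),
]
out = render(sources, reverb_ir=np.array([1.0, 0.5]))
# out has shape (4, 5): 4 SH channels, each convolved exactly once.
```

Adding more sources only adds cheap weighted accumulations; the expensive convolutions stay fixed at the number of SH channels, which is what makes this formulation attractive on mobile hardware.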