A theoretical analysis of a buffer frame size conversion algorithm for audio applications ensuring minimum latency

2011 
In implementing digital signal processing (DSP) algorithms for real-time audio applications, one frequently faces incompatibilities between the hardware buffer length (the internal buffer of a professional sound card) and the software buffer size imposed by the underlying algorithm (due to, e.g., multirate or FFT constraints). This mismatch is resolved by a suitable frame size conversion algorithm, which inevitably introduces delay. In this context, this paper presents a buffering scheme together with a theoretical proof of its minimum-delay property. Examples derived from issues frequently encountered in DSP applications are reported.
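To illustrate the problem the paper addresses, the following is a minimal sketch (not the paper's proven scheme) of a FIFO-based frame size converter in C. All names and sizes are hypothetical: a 64-sample hardware buffer feeding an algorithm that works on 100-sample frames. Preloading the output FIFO with one algorithm frame of zeros is a safe choice that provably avoids underruns, though not necessarily the minimum delay; establishing that minimum is the paper's contribution.

#include <string.h>

/* Hypothetical sizes for illustration only. */
#define HW_LEN   64                    /* hardware (sound card) buffer length */
#define ALG_LEN  100                   /* algorithm frame size (e.g. FFT hop) */
#define FIFO_LEN (HW_LEN + ALG_LEN)    /* worst-case occupancy of either FIFO */

typedef struct {
    float buf[FIFO_LEN];
    int   fill;                        /* number of samples currently buffered */
} fifo_t;

static void fifo_push(fifo_t *f, const float *x, int n)
{
    memcpy(f->buf + f->fill, x, (size_t)n * sizeof(float));
    f->fill += n;
}

static void fifo_pop(fifo_t *f, float *y, int n)
{
    memcpy(y, f->buf, (size_t)n * sizeof(float));
    f->fill -= n;
    memmove(f->buf, f->buf + n, (size_t)f->fill * sizeof(float));
}

/* Placeholder DSP algorithm working on ALG_LEN-sample frames
 * (identity processing here; any frame-based algorithm fits). */
static void process_frame(const float *in, float *out)
{
    memcpy(out, in, ALG_LEN * sizeof(float));
}

static fifo_t in_fifo, out_fifo;

/* Preloading ALG_LEN zeros guarantees the output side never underruns:
 * before each output pop the FIFO holds ALG_LEN + HW_LEN - r samples,
 * where r < ALG_LEN is the input residue, hence always more than HW_LEN.
 * This adds ALG_LEN samples of delay, a safe but not necessarily minimal
 * amount. */
void converter_init(void)
{
    memset(&in_fifo,  0, sizeof in_fifo);
    memset(&out_fifo, 0, sizeof out_fifo);
    out_fifo.fill = ALG_LEN;
}

/* Called by the audio driver once per hardware buffer. */
void audio_callback(const float *hw_in, float *hw_out)
{
    float frame_in[ALG_LEN], frame_out[ALG_LEN];

    fifo_push(&in_fifo, hw_in, HW_LEN);

    /* Run the algorithm whenever a full frame has accumulated. */
    while (in_fifo.fill >= ALG_LEN) {
        fifo_pop(&in_fifo, frame_in, ALG_LEN);
        process_frame(frame_in, frame_out);
        fifo_push(&out_fifo, frame_out, ALG_LEN);
    }

    fifo_pop(&out_fifo, hw_out, HW_LEN);
}

With this preload the converter adds ALG_LEN samples of delay on top of the inherent hardware buffering, regardless of how HW_LEN and ALG_LEN relate; the paper's analysis concerns the smallest delay any such conversion scheme can achieve.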