A Comparative Study of Blending Algorithms for Realtime Panoramic Video Stitching.

2016 
Panoramic video stitching consists of two major steps: remapping each candidate video stream to its final position and compositing the streams to generate a seamless result. Given videos captured with cameras in fixed relative positions, the remapping step can be performed directly with a precomputed look-up table. The greater challenge lies in the more time-consuming composition step. Real-world applications typically perform composition by blending, so the performance of the whole system largely depends on the efficiency of the blending algorithm. In this paper, we provide an in-depth analysis of several state-of-the-art image blending techniques applied to realtime panoramic video stitching, which enables near-immediate broadcast. Test videos were captured under various conditions and stitched using the different blending methods. Both computational efficiency and the quality of the composited results were evaluated. Source code and test videos are publicly available.
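To make the two-step pipeline described above concrete, the following is a minimal sketch (not the authors' code) of remapping a frame with precomputed look-up tables via OpenCV's cv::remap and compositing two overlapping warped frames with simple feather (linear) blending; all function and variable names here are illustrative assumptions.

```cpp
#include <opencv2/opencv.hpp>

// Remap one camera's frame into panorama coordinates using precomputed
// look-up tables (mapX, mapY), as in the first step of the pipeline.
cv::Mat remapToPanorama(const cv::Mat& frame,
                        const cv::Mat& mapX, const cv::Mat& mapY) {
    cv::Mat warped;
    cv::remap(frame, warped, mapX, mapY, cv::INTER_LINEAR,
              cv::BORDER_CONSTANT, cv::Scalar());
    return warped;
}

// Feather-blend two warped frames using per-pixel weights in [0, 1]
// (CV_32FC1). weightA + weightB is assumed to equal 1 in the overlap.
cv::Mat featherBlend(const cv::Mat& warpedA, const cv::Mat& warpedB,
                     const cv::Mat& weightA, const cv::Mat& weightB) {
    cv::Mat a, b, wa, wb, blended;
    warpedA.convertTo(a, CV_32FC3);
    warpedB.convertTo(b, CV_32FC3);
    cv::cvtColor(weightA, wa, cv::COLOR_GRAY2BGR);  // replicate weight per channel
    cv::cvtColor(weightB, wb, cv::COLOR_GRAY2BGR);
    blended = a.mul(wa) + b.mul(wb);                // weighted per-pixel sum
    blended.convertTo(blended, CV_8UC3);
    return blended;
}
```

Feather blending stands in here only as the simplest compositing choice; the paper compares it against other blending algorithms for quality and runtime.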