Long-term correlation tracking via spatial–temporal context

2019 
In this paper, we address the problem of long-term visual tracking when the target undergoes challenging conditions such as occlusion, out-of-view motion, and scale changes. We employ two discriminative correlation filters (DCFs) to achieve long-term object tracking: a spatial–temporal context correlation filter is learned for translation estimation, and a scale DCF, centered on the estimated target position, is learned to estimate the scale from the most confident results. In addition, we propose an efficient model-update and re-detection activation strategy that avoids unrecoverable drift caused by noisy updates and achieves robust long-term tracking when tracking failures occur. We evaluate our algorithm on the OTB benchmark datasets; both qualitative and quantitative results on challenging sequences demonstrate that the proposed algorithm performs favorably against several state-of-the-art DCF methods, including methods that follow the deep learning paradigm.
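The abstract outlines a DCF pipeline built from a translation filter, a scale filter, and a confidence-gated model update that triggers re-detection when tracking becomes unreliable. The following is a minimal single-channel, MOSSE-style correlation filter sketch of that confidence-gated loop, written under assumed simplifications: grayscale patches, peak-to-sidelobe ratio (PSR) as the confidence measure, and a hypothetical `CONF_THRESH` for activating re-detection. It is not the authors' implementation and omits the spatial–temporal context features, the dedicated scale filter, and the re-detector itself.

```python
import numpy as np

def gaussian_label(shape, sigma=2.0):
    """Desired correlation output: a Gaussian peaked at the patch center."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    d2 = (ys - h // 2) ** 2 + (xs - w // 2) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

class CorrelationFilter:
    """Single-channel DCF (MOSSE-style) learned and updated in the Fourier domain."""
    def __init__(self, lam=1e-3, lr=0.02):
        self.lam, self.lr = lam, lr        # regularization weight, learning rate
        self.num = self.den = self.Y = None

    def init(self, patch):
        X = np.fft.fft2(patch)
        self.Y = np.fft.fft2(gaussian_label(patch.shape))
        self.num = self.Y * np.conj(X)
        self.den = X * np.conj(X) + self.lam

    def update(self, patch):
        """Running-average model update; called only when the detection is confident."""
        X = np.fft.fft2(patch)
        self.num = (1 - self.lr) * self.num + self.lr * self.Y * np.conj(X)
        self.den = (1 - self.lr) * self.den + self.lr * (X * np.conj(X) + self.lam)

    def detect(self, patch):
        """Return the (dy, dx) displacement of the response peak and a PSR confidence."""
        resp = np.real(np.fft.ifft2((self.num / self.den) * np.fft.fft2(patch)))
        py, px = np.unravel_index(resp.argmax(), resp.shape)
        h, w = resp.shape
        # PSR: peak height relative to the sidelobe mean/std (peak neighborhood excluded)
        side = resp.copy()
        side[max(0, py - 5):py + 6, max(0, px - 5):px + 6] = np.nan
        psr = (resp[py, px] - np.nanmean(side)) / (np.nanstd(side) + 1e-8)
        return (py - h // 2, px - w // 2), psr

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame0 = rng.random((64, 64))          # stand-in for the initial target patch
    tracker = CorrelationFilter()
    tracker.init(frame0)
    # Simulate the target moving 3 px down and 5 px right (circular shift for the demo).
    frame1 = np.roll(np.roll(frame0, 3, axis=0), 5, axis=1)
    (dy, dx), psr = tracker.detect(frame1)
    CONF_THRESH = 5.0                      # hypothetical re-detection activation threshold
    if psr >= CONF_THRESH:
        tracker.update(frame1)             # confident result: update the appearance model
    else:
        print("low confidence -> activate re-detection module")
    print(f"estimated shift: ({dy}, {dx}), PSR: {psr:.1f}")
```

In the paper's terms, the scale estimate would come from a separate scale DCF evaluated at the estimated position, and the re-detection branch would search for the target globally before tracking resumes; both are left out of this sketch.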