Visual tracking in continuous appearance space via sparse coding

2012 
The particle filter is the most widely used framework for object tracking. Despite its advantages in handling complex cases, its discretization of the object appearance space makes it difficult to search for the solution efficiently, and the number of particles is greatly limited by computational cost, especially for time-consuming object representations such as sparse representation. In this paper, we propose a novel tracking method in which the appearance space is relaxed to be continuous, so that the solution can be searched efficiently via iterative sparse coding. Like the particle filter, our method can be combined with many generic tracking methods; as a typical case, we adopt the l1 tracker and demonstrate that our method improves both its efficiency and accuracy compared with the particle-filter-based version. Another advantage of our method is that it handles dynamic changes of object appearance by adaptively updating the object template model with the learned dictionary, while avoiding drift by using the representation error for supervision. Our method thus performs more robustly than previous methods in dynamic scenes with gradual changes. Both qualitative and quantitative evaluations demonstrate the efficiency and robustness of the proposed method.
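To make the core operation concrete, the following is a minimal sketch of sparse coding an appearance vector over a template dictionary, the building block the abstract refers to. It is an illustration only, not the authors' exact algorithm: it solves the standard lasso problem min_x 0.5*||y - Dx||^2 + lam*||x||_1 with ISTA (iterative shrinkage-thresholding), and the dictionary, dimensions, and regularization weight are assumptions chosen for the toy example.

```python
import numpy as np

def soft_threshold(v, t):
    """Element-wise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_code(y, D, lam=0.05, n_iters=300):
    """Solve min_x 0.5*||y - D x||^2 + lam*||x||_1 by ISTA iterations."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the smooth part
    x = np.zeros(D.shape[1])
    for _ in range(n_iters):
        grad = D.T @ (D @ x - y)           # gradient of the least-squares term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy setup: 8 normalized template atoms, a 32-dim appearance vector that is
# a sparse combination of two atoms (hypothetical numbers for illustration).
rng = np.random.default_rng(0)
D = rng.standard_normal((32, 8))
D /= np.linalg.norm(D, axis=0)
y = D @ np.array([1.0, 0, 0, 0, -0.5, 0, 0, 0])

x = sparse_code(y, D)
residual = np.linalg.norm(y - D @ x)       # the representation error; the paper
                                           # uses such an error to supervise
                                           # template updates and avoid drift
```

A small representation error signals that the current templates still explain the observed appearance well, which is the kind of check the abstract describes for deciding when a dictionary update is trustworthy.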