Robust leader tracking from an unmanned ground vehicle

2013 
While many leader-follower technologies for robotic mules have been developed in recent years, the problem of reliably tracking and re-acquiring a human leader through cluttered environments remains an obstacle to the widespread acceptance of these systems. Recent approaches to leader tracking rely on leader-worn equipment, such as radio transmitters or special clothing, that may be damaged, hidden from view, or lost, as well as on specialized sensing hardware such as high-resolution LIDAR. We present a vision-based approach for robustly tracking a leader using a simple monocular camera. The proposed method requires no modification to the leader's equipment, nor any specialized sensors on board the host platform. The system learns a discriminative model of the leader's appearance to robustly track him or her through long occlusions, changing lighting conditions, and cluttered environments. We demonstrate the system's tracking capabilities on publicly available benchmark datasets, as well as in representative scenarios captured using a small unmanned ground vehicle (SUGV).
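To make the idea of a discriminative appearance model concrete, the sketch below shows one common realization of tracking-by-detection: an online linear classifier is trained to separate patches of the target from background patches, and each new frame is searched locally for the highest-scoring candidate box. This is only an illustrative sketch, not the paper's implementation; the color-histogram features, the SGD-trained linear classifier, the local search window, and all function and class names here are assumptions introduced for the example.

```python
# Illustrative sketch of tracking-by-detection with an online discriminative
# appearance model (NOT the paper's method). Frames are assumed to be
# H x W x 3 numpy arrays; features are coarse color histograms; the
# classifier is a linear model updated online with SGD.
import numpy as np
from sklearn.linear_model import SGDClassifier

def patch_histogram(frame, box, bins=8):
    """Normalized color histogram of the patch inside box = (x, y, w, h)."""
    x, y, w, h = box
    patch = frame[y:y + h, x:x + w]
    hist, _ = np.histogramdd(patch.reshape(-1, 3),
                             bins=(bins, bins, bins),
                             range=((0, 256),) * 3)
    hist = hist.ravel()
    return hist / (hist.sum() + 1e-9)

class DiscriminativeTracker:
    def __init__(self):
        self.clf = SGDClassifier()          # linear classifier, online updates
        self.box = None

    def init(self, frame, box, n_neg=20, seed=0):
        """Learn leader vs. background from one labelled frame."""
        self.box = box
        rng = np.random.default_rng(seed)
        H, W = frame.shape[:2]
        _, _, w, h = box
        X, y = [patch_histogram(frame, box)], [1]
        for _ in range(n_neg):              # random background patches as negatives
            nx = int(rng.integers(0, max(1, W - w)))
            ny = int(rng.integers(0, max(1, H - h)))
            X.append(patch_histogram(frame, (nx, ny, w, h)))
            y.append(0)
        self.clf.partial_fit(np.array(X), np.array(y), classes=[0, 1])

    def track(self, frame, search=16, step=4):
        """Score candidate boxes near the last position; keep the best one."""
        x, y, w, h = self.box
        H, W = frame.shape[:2]
        best, best_score = self.box, -np.inf
        for dx in range(-search, search + 1, step):
            for dy in range(-search, search + 1, step):
                cx = int(np.clip(x + dx, 0, W - w))
                cy = int(np.clip(y + dy, 0, H - h))
                score = self.clf.decision_function(
                    patch_histogram(frame, (cx, cy, w, h))[None, :])[0]
                if score > best_score:
                    best, best_score = (cx, cy, w, h), score
        self.box = best
        # Online update: reinforce the new positive and add one background negative.
        neg = ((x + 2 * w) % max(1, W - w), y, w, h)
        X = np.array([patch_histogram(frame, best), patch_histogram(frame, neg)])
        self.clf.partial_fit(X, np.array([1, 0]))
        return best
```

In a full system of the kind the abstract describes, this per-frame update is what allows the model to adapt to lighting changes and to re-score candidates for re-acquisition after an occlusion, since the classifier encodes what distinguishes the leader from the surrounding clutter rather than a fixed template.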