Omni-Directional Vehicle Platform Development for Visual SLAM Construction

2021 
This research demonstrated visual SLAM (Simultaneous Localization and Mapping) construction using only encoder-based localization on an omni-directional vehicle platform equipped with a commercial Kinect 3D camera. The map is stitched together from local point clouds captured by recording video of the surrounding area without a-priori knowledge, relying solely on the encoder position computed by an on-board embedded system and on the relative distance between each object and the camera. The vehicle uses PID feedback control to improve its positional-encoding accuracy by minimizing wheel tracking error. With full access to all system data through the ROS (Robot Operating System) framework, this pilot study showed that visual SLAM can be reconstructed from minimal position data, without a GPS or gyroscope on the vehicle. The in-house omni-directional platform also supports Wi-Fi remote control and real-time data collection over Wi-Fi, with the visual SLAM reconstructed off-line.
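The two on-board pieces the abstract describes, a per-wheel PID velocity loop to reduce encoder tracking error and encoder-only dead reckoning, can be sketched as follows. This is a minimal illustration assuming a three-wheel omni-directional layout with wheels at 0°, 120°, and 240°; the gains, wheel geometry, and all names are hypothetical, not the authors' actual firmware.

```python
import math

class WheelPID:
    """PID velocity loop for one wheel (gains are illustrative placeholders)."""
    def __init__(self, kp=1.0, ki=0.1, kd=0.01):
        self.kp, self.ki, self.kd = kp, ki, kd
        self._integral = 0.0
        self._prev_err = 0.0

    def command(self, target_speed, measured_speed, dt):
        err = target_speed - measured_speed
        self._integral += err * dt
        deriv = (err - self._prev_err) / dt
        self._prev_err = err
        return self.kp * err + self.ki * self._integral + self.kd * deriv

# Assumed geometry: wheels tangentially mounted at these angles, distance R
# from the chassis center.
WHEEL_ANGLES = [0.0, 2.0 * math.pi / 3.0, 4.0 * math.pi / 3.0]

def body_velocity(wheel_speeds, R=0.15):
    """Recover (vx, vy, omega) in the body frame from three wheel rim speeds."""
    vx = (2.0 / 3.0) * sum(-math.sin(a) * s
                           for a, s in zip(WHEEL_ANGLES, wheel_speeds))
    vy = (2.0 / 3.0) * sum(math.cos(a) * s
                           for a, s in zip(WHEEL_ANGLES, wheel_speeds))
    omega = sum(wheel_speeds) / (3.0 * R)
    return vx, vy, omega

def integrate_pose(pose, wheel_speeds, dt, R=0.15):
    """Dead-reckon the world-frame pose (x, y, theta) over one time step
    from encoder-derived wheel speeds alone (no GPS, no gyroscope)."""
    x, y, theta = pose
    vx, vy, omega = body_velocity(wheel_speeds, R)
    # Rotate the body-frame velocity into the world frame, then integrate.
    x += (vx * math.cos(theta) - vy * math.sin(theta)) * dt
    y += (vx * math.sin(theta) + vy * math.cos(theta)) * dt
    theta += omega * dt
    return (x, y, theta)
```

A pose trajectory accumulated this way would supply the camera positions used to stitch the local point clouds into the map; the PID loop keeps the measured wheel speeds close to their targets so that this dead reckoning drifts as little as possible.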