A Real-Time Pose Estimation Algorithm Based on FPGA and Sensor Fusion

2018 
Combining the measurements of different sensors is a crucial step towards better precision in pose estimation. Sensor fusion is an effective state estimation method used in several disciplines; here it is realized with a Kalman filter. With sensor fusion, the information from the sensors and the characteristics of each sensor can be used together to improve the estimate and decrease the uncertainty of the measured variables. This paper presents a real-time pose estimation algorithm that uses sensor fusion of visual odometry (optical flow), Inertial Measurement Unit (IMU), and Global Positioning System (GPS) measurements. The IMU contains a calibrated three-degrees-of-freedom (3DoF) accelerometer and a 3DoF gyroscope. A Kalman filter is used to fuse the measurements of the different sensors. The algorithm is implemented in MATLAB and on a low-cost Zynq Z-7010 Field-Programmable Gate Array (FPGA) using the ZYBO development board, and is capable of real-time pose estimation with sensor fusion.
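As a rough illustration of the fusion step described in the abstract, the following MATLAB sketch shows a minimal one-dimensional linear Kalman filter in which an IMU-driven motion model is corrected by position fixes (e.g. from GPS or visual odometry). The state model, noise covariances, and sample values are illustrative assumptions and are not taken from the paper.

```matlab
% Minimal 1-D linear Kalman filter sketch (hypothetical, not the authors' code):
% a constant-velocity model driven by an IMU acceleration sample is corrected
% by noisy position fixes (stand-ins for GPS / visual odometry measurements).
dt = 0.01;                          % assumed sample period [s]
F  = [1 dt; 0 1];                   % state transition for [position; velocity]
B  = [0.5*dt^2; dt];                % control matrix for acceleration input
H  = [1 0];                         % only position is measured
Q  = 1e-3 * eye(2);                 % assumed process noise covariance
R  = 0.5;                           % assumed measurement noise variance

xt = [0; 1];                        % simulated "true" state (for the demo only)
x  = [0; 0];  P = eye(2);           % initial estimate and covariance

for k = 1:200
    a  = 0.1;                       % stand-in for an IMU acceleration sample
    xt = F*xt + B*a;                % propagate the simulated true state
    z  = H*xt + sqrt(R)*randn;      % noisy position fix (GPS / odometry stand-in)

    % Prediction step: propagate the estimate with the motion model
    x = F*x + B*a;
    P = F*P*F' + Q;

    % Update (correction) step: blend in the position measurement
    K = P*H' / (H*P*H' + R);        % Kalman gain
    x = x + K*(z - H*x);
    P = (eye(2) - K*H)*P;
end
disp(x)                             % final fused estimate of [position; velocity]
```

The same predict/update structure extends to the full pose case by enlarging the state vector and measurement matrices; the gain computation is the part typically mapped to fixed-point arithmetic on the FPGA.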