Infrared and Camera Fusion Sensor for Indoor Positioning

2019 
An indoor positioning fusion sensor composed of a set of five infrared detectors and one camera is presented in this work. The position of the target is obtained by hyperbolic trilateration from phase-difference measurements with the infrared sensor and by a homography with the camera. The two estimates are then fused with a maximum likelihood estimator. A model is proposed for the infrared and camera observation variances and for their propagation to the covariance matrix of the fusion estimate. The system shows centimeter-level accuracy in infrared multipath-free conditions and good agreement with the model. Real measurements are conducted to assess sensor performance and are compared with Monte Carlo simulations for model validation. The evaluation of the fusion sensor performance focuses especially on its dependence on camera resolution, tested at three resolution levels.
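For intuition on the fusion step described above, the following is a minimal sketch of maximum likelihood (inverse-covariance weighted) fusion of two independent Gaussian position estimates, one from the infrared trilateration and one from the camera homography. The function name, variable names, and covariance values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fuse_ml(p_ir, cov_ir, p_cam, cov_cam):
    """Maximum-likelihood fusion of two independent Gaussian position estimates.

    Fused covariance:  Sigma_f = (Sigma_ir^-1 + Sigma_cam^-1)^-1
    Fused position:    p_f     = Sigma_f @ (Sigma_ir^-1 @ p_ir + Sigma_cam^-1 @ p_cam)
    """
    info_ir = np.linalg.inv(cov_ir)    # information (inverse covariance) of the IR estimate
    info_cam = np.linalg.inv(cov_cam)  # information of the camera estimate
    cov_f = np.linalg.inv(info_ir + info_cam)
    p_f = cov_f @ (info_ir @ p_ir + info_cam @ p_cam)
    return p_f, cov_f

# Illustrative example (placeholder values, not measurements from the paper):
p_ir = np.array([1.20, 2.35])                # IR trilateration estimate [m]
cov_ir = np.diag([0.03**2, 0.05**2])         # assumed IR observation variances [m^2]
p_cam = np.array([1.24, 2.31])               # camera homography estimate [m]
cov_cam = np.diag([0.02**2, 0.08**2])        # assumed camera observation variances [m^2]

p_f, cov_f = fuse_ml(p_ir, cov_ir, p_cam, cov_cam)
print("fused position:", p_f)
print("fused std dev :", np.sqrt(np.diag(cov_f)))
```

Because the fused covariance is the inverse of the summed information matrices, the combined estimate is never less precise than the better of the two sensors along any axis, which is the behavior the paper's variance-propagation model characterizes.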