Long term and robust 6DoF motion tracking for highly dynamic stereo endoscopy videos.

2021 
Abstract
Real-time augmented reality (AR) for minimally invasive surgery without extra tracking devices is valuable yet challenging, especially in dynamic surgical environments. Respiration, cardiac motion and operative tools induce multiple independent motions among the target organs, which must often be captured by a moving, manually positioned endoscope. A 6DoF motion tracking method is therefore proposed that combines recent 2D target tracking with the non-linear pose optimization and tracking-loss recovery techniques of SLAM, and can be embedded into such an AR system. Specifically, the SiamMask deep-learning-based target tracker is incorporated to roughly exclude motion distractions and to enable frame matching; its light computational cost allows the proposed method to run in real time. As in ORB-SLAM, a global map and a set of keyframes are maintained for pose optimization and tracking-loss recovery. The stereo matching and frame matching methods are improved, and a new reference-frame selection strategy is introduced to make the initial motion estimate for each arriving frame as accurate as possible. Experiments are conducted on both a clinical laparoscopic partial nephrectomy dataset and an ex-vivo porcine kidney dataset. The results show that the proposed method is more robust and accurate than ORB-SLAM2 in the presence of motion distractions or motion blur; heavy smoke, however, remains a major factor reducing tracking accuracy.
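The abstract's core idea of "roughly excluding motion distractions" can be illustrated with a minimal sketch: use the binary mask produced by a 2D tracker such as SiamMask to discard feature keypoints that fall on a moving distractor (e.g. a surgical tool) before they reach pose estimation. The function name, array shapes, and the toy mask below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def filter_keypoints_by_mask(keypoints, distractor_mask):
    """Keep only keypoints lying outside the distractor mask.

    keypoints:       (N, 2) float array of (x, y) pixel coordinates.
    distractor_mask: (H, W) boolean array, True where the 2D tracker
                     segmented a moving distractor (illustrative stand-in
                     for a SiamMask output).
    """
    h, w = distractor_mask.shape
    xy = np.round(keypoints).astype(int)
    # Points outside the image bounds are discarded along with masked ones.
    in_bounds = (
        (xy[:, 0] >= 0) & (xy[:, 0] < w) &
        (xy[:, 1] >= 0) & (xy[:, 1] < h)
    )
    keep = np.zeros(len(keypoints), dtype=bool)
    idx = np.flatnonzero(in_bounds)
    keep[idx] = ~distractor_mask[xy[idx, 1], xy[idx, 0]]
    return keypoints[keep]

# Toy example: a 10x10 frame whose left half is covered by a distractor.
mask = np.zeros((10, 10), dtype=bool)
mask[:, :5] = True
pts = np.array([[2.0, 3.0],    # on the distractor -> dropped
                [7.0, 3.0],    # on static tissue  -> kept
                [12.0, 1.0]])  # out of bounds     -> dropped
print(filter_keypoints_by_mask(pts, mask))  # [[7. 3.]]
```

In a full pipeline, only the surviving keypoints would be passed to stereo matching and pose optimization, so the distractor's independent motion does not corrupt the camera motion estimate.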