Toward real-time 3D ultrasound registration-based visual servoing for interventional navigation

2016 
While intraoperative imaging is commonly used to guide surgical interventions, automatic robotic support for image-guided navigation has not yet been established in clinical routine. In this paper, we propose a novel visual servoing framework that combines, for the first time, full image-based 3D ultrasound registration with a real-time servo-control scheme. Paired with multi-modal fusion to a pre-interventional plan such as an annotated needle insertion path, it allows tracking a target anatomy, continuously updating the plan as the target moves, and keeping a needle guide aligned for accurate manual insertion. The presented system comprises a motorized 3D ultrasound transducer mounted on a force-controlled robot and a GPU-based image processing toolkit. The tracking accuracy of our framework is validated on a geometric agar/gelatin phantom using a second robot, achieving average positioning errors of 0.42–0.44 mm. With total compounding and registration runtimes of around 550 ms, real-time performance comes within reach. We also present initial results on a spine phantom, demonstrating the feasibility of our system for lumbar spine injections.
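The abstract describes a closed loop of volume compounding, 3D-3D registration, plan update, and robot motion. The following minimal Python sketch illustrates how such a registration-based visual servoing cycle could be structured; all function names (acquire_sweep, compound_volume, register_volumes, send_pose), the transform conventions, and the loop rate are hypothetical placeholders and not the authors' actual implementation or API.

```python
# Hypothetical sketch of a registration-based visual-servoing cycle.
# Every function body below is a stand-in; the real system uses a GPU
# image-processing toolkit and a force-controlled robot.

import numpy as np

def acquire_sweep():
    """Placeholder: raw frames from the motorized 3D ultrasound transducer."""
    return np.random.rand(32, 128, 128)

def compound_volume(sweep):
    """Placeholder: GPU compounding of a sweep into a 3D volume
    (part of the roughly 550 ms per-cycle budget reported above)."""
    return sweep

def register_volumes(live_vol, reference_vol):
    """Placeholder: full image-based 3D-3D registration.
    Returns a 4x4 rigid transform mapping the reference frame into the live frame."""
    return np.eye(4)

def send_pose(T_world_plan):
    """Placeholder: command the robot so the needle guide stays aligned with the plan."""
    pass

# Pre-interventional data: reference volume and annotated insertion path.
reference_volume = np.random.rand(32, 128, 128)
T_ref_plan = np.eye(4)        # planned needle pose in the reference-volume frame
T_world_probe = np.eye(4)     # probe pose from robot kinematics (assumed known)

while True:                   # servo loop; ~550 ms/cycle would give roughly 2 Hz
    live_volume = compound_volume(acquire_sweep())
    # Registration recovers target motion relative to the pre-interventional plan.
    T_live_ref = register_volumes(live_volume, reference_volume)
    # Continuously updated needle-guide pose in world coordinates.
    T_world_plan = T_world_probe @ T_live_ref @ T_ref_plan
    send_pose(T_world_plan)
    break                     # single iteration in this sketch
```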