Wearable AR and 3D Ultrasound: Towards a Novel Way to Guide Surgical Dissections

2021 
Nowadays, ultrasound (US) is increasingly being chosen as the imaging modality for both diagnostic and interventional applications, owing to its favorable characteristics in terms of safety, small footprint, and low cost. The combination of this imaging modality with wearable augmented reality (AR) systems, such as head-mounted displays (HMDs), emerges as a breakthrough technological solution, as it allows for hands-free interaction with the augmented scene, an essential requirement for the execution of high-precision manual tasks, such as in surgery. In this study, we propose the integration of an AR navigation system (HMD plus a dedicated platform) with a 3D US imaging system to guide a dissection task that requires maintaining safety margins with respect to unexposed anatomical or pathological structures. For this purpose, a standard scalpel was sensorized to provide real-time feedback on the position of the instrument during the execution of the task. The accuracy of the system was quantitatively assessed in two experimental studies: a targeting experiment, which revealed a median error of 2.53 mm in estimating the scalpel-to-target distance, and a preliminary user study simulating a dissection task that requires reaching a predefined distance from an occult lesion. The results of the second experiment showed that the system can guide a dissection task with a mean accuracy of 0.65 mm and a mean angular error between the ideal and actual cutting planes of 2.07°. These results encourage further studies to fully exploit the potential of wearable AR and intraoperative US imaging to accurately guide deep surgical tasks, for example the excision of non-palpable breast tumors with optimal margin clearance.
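The two accuracy metrics reported above can be illustrated with a small geometry sketch. This is not the authors' implementation; it is a minimal example, assuming the tracked scalpel tip and the target are given as 3D points in a common reference frame, and that each cutting plane is described by its unit normal.

```python
import numpy as np

def tip_to_target_distance(tip_pos, target_pos):
    """Euclidean distance (e.g. in mm) between the tracked scalpel tip
    and the target point, both expressed in the same reference frame."""
    return float(np.linalg.norm(np.asarray(tip_pos, float) - np.asarray(target_pos, float)))

def plane_angle_deg(normal_a, normal_b):
    """Angle in degrees between two planes, given their normals.
    The absolute value makes anti-parallel normals describe the same
    plane, so the result is always in [0, 90] degrees."""
    a = np.asarray(normal_a, float)
    b = np.asarray(normal_b, float)
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    cos_angle = np.clip(abs(np.dot(a, b)), 0.0, 1.0)
    return float(np.degrees(np.arccos(cos_angle)))

# Illustrative values only (not from the study):
d = tip_to_target_distance([0.0, 0.0, 0.0], [3.0, 4.0, 0.0])   # 5.0 mm
ang = plane_angle_deg([0.0, 0.0, 1.0], [0.0, 1.0, 1.0])        # 45.0 degrees
```

In a navigation pipeline of this kind, the distance would be recomputed at every tracking update and compared against the planned safety margin, while the plane angle would compare the planned cutting plane against the one actually traced by the instrument.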