Surgical Tools Detection Based on Modulated Anchoring Network in Laparoscopic Videos

2020 
Minimally invasive surgery, such as laparoscopic surgery, is an active area of clinical research because it causes less pain and allows faster recovery. Detecting surgical tools with accurate spatial locations in surgical videos not only helps ensure patient safety by reducing the incidence of complications, but also supports assessment of surgeon performance. In this paper, we propose a novel Modulated Anchoring Network for detecting laparoscopic surgical tools, built on Faster R-CNN, which inherits the merits of two-stage approaches while maintaining efficiency comparable to state-of-the-art one-stage methods. Since objects such as surgical instruments with wide aspect ratios are difficult to recognize, we develop a novel training scheme, named modulated anchoring, that explicitly predicts arbitrary anchor shapes for objects of interest. To take the relationships between different tools into account, we further embed a relation module in our network. We evaluate our method on an existing dataset (m2cai16-tool-locations) and a new private dataset (AJU-Set), both collected from cholecystectomy surgical videos in hospital and annotated with spatial bounding boxes for seven surgical tools. Our detector achieves detection accuracies of 69.6% and 76.5% on the two datasets, outperforming other recently proposed architectures. We further demonstrate the utility of our method by analyzing tool usage patterns, economy of movement, and dexterity of operation to assess surgical quality.
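The abstract describes predicting arbitrary anchor shapes instead of relying on a fixed set of aspect ratios. The sketch below is not the authors' code; it is a minimal illustration of one way such an anchor-shape prediction branch could look, assuming a Guided-Anchoring-style exponential parameterisation. The names `AnchorShapeHead`, `base_stride`, and `sigma` are hypothetical.

```python
# Minimal sketch (not the paper's implementation) of an anchor-shape prediction
# head in the spirit of "modulated anchoring": a small conv branch predicts a
# width/height offset per feature-map location, so anchors can take the very
# wide, thin shapes typical of laparoscopic instruments.
# Assumption: exp() parameterisation as in Guided Anchoring; names are illustrative.
import torch
import torch.nn as nn


class AnchorShapeHead(nn.Module):
    """Predicts an anchor (w, h) for every spatial position of a feature map."""

    def __init__(self, in_channels: int = 256, base_stride: int = 16, sigma: float = 8.0):
        super().__init__()
        self.base_stride = base_stride  # stride of the feature map w.r.t. the input image
        self.sigma = sigma              # scale factor keeping the predicted offsets well-behaved
        self.shape_conv = nn.Conv2d(in_channels, 2, kernel_size=1)  # outputs (dw, dh) per location

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        # feat: (N, C, H, W) feature map from the backbone
        dwdh = self.shape_conv(feat)  # (N, 2, H, W)
        # Exponential mapping lets the network express arbitrary anchor shapes.
        wh = self.base_stride * self.sigma * torch.exp(dwdh)
        return wh  # predicted anchor (w, h) at each spatial cell


if __name__ == "__main__":
    head = AnchorShapeHead()
    dummy = torch.randn(1, 256, 38, 50)  # e.g. a conv feature map of an ~600x800 image
    print(head(dummy).shape)             # torch.Size([1, 2, 38, 50])
```

In a two-stage detector such as Faster R-CNN, the predicted (w, h) would replace the hand-designed anchor set fed to the region proposal network; the relation module mentioned in the abstract would then operate on the pooled proposal features downstream.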