
Application and evaluation of surgical tool and tool tip recognition based on Convolutional Neural Network in multiple endoscopic surgical scenarios

Dynamic Manuscript · Published in Surgical Endoscopy

Abstract

Background

In recent years, computer-assisted intervention and robot-assisted surgery have received increasing attention, and the demand for real-time identification and tracking of surgical tools and tool tips continues to grow. A number of studies have addressed surgical tool tracking and identification, but their dataset sizes, sensitivity/precision, and response times have been limited. In this work, we developed an automated method based on a Convolutional Neural Network (CNN) and the You Only Look Once (YOLO) v3 algorithm to locate and identify surgical tools and tool tips across five different surgical scenarios.
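YOLO-family detectors such as YOLOv3 predict each bounding box as a normalized (center-x, center-y, width, height) tuple relative to the image size. A minimal sketch of converting such a prediction to absolute pixel corners (a hypothetical helper for illustration, not the paper's code) might look like:

```python
def yolo_to_corners(cx, cy, w, h, img_w, img_h):
    """Convert a YOLO-style normalized (center-x, center-y, width, height)
    box into absolute (x1, y1, x2, y2) pixel corner coordinates."""
    x1 = (cx - w / 2) * img_w
    y1 = (cy - h / 2) * img_h
    x2 = (cx + w / 2) * img_w
    y2 = (cy + h / 2) * img_h
    return x1, y1, x2, y2

# A centered box covering half the width and height of a 640x480 frame:
box = yolo_to_corners(0.5, 0.5, 0.5, 0.5, 640, 480)
```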

Materials and methods

An object detection algorithm was applied to identify and locate surgical tools and tool tips. DarkNet-19 was used as the backbone network, and a modified YOLOv3 was applied for detection. We included a series of 181 endoscopy videos covering five different surgical scenarios: pancreatic surgery, thyroid surgery, colon surgery, gastric surgery, and external scenes. A total of 25,333 images containing 94,463 targets were collected. The training and test sets were split in a ratio of 2.5:1. The datasets are openly available in the Kaggle database.
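The 2.5:1 train/test split described above can be sketched as follows (a minimal illustration assuming a simple seeded random shuffle; the authors' actual partitioning procedure is not specified):

```python
import random

def split_dataset(items, train_ratio=2.5, test_ratio=1.0, seed=42):
    """Shuffle the items and split them into train/test sets at the
    stated train_ratio:test_ratio proportion (here 2.5:1)."""
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    n_train = round(len(shuffled) * train_ratio / (train_ratio + test_ratio))
    return shuffled[:n_train], shuffled[n_train:]

# With the 25,333 images reported in the paper, a 2.5:1 split yields
# 18,095 training images and 7,238 test images.
train, test = split_dataset(list(range(25333)))
```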

Results

Under an Intersection over Union (IoU) threshold of 0.5, the overall sensitivity and precision of the model were 93.02% and 89.61% for tool recognition and 87.05% and 83.57% for tool tip recognition, respectively. The model achieved its highest tool and tool tip recognition sensitivity and precision in external scenes. Among the four internal surgical scenes, the network performed better in pancreatic and colon surgeries and worse in gastric and thyroid surgeries.
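The evaluation above can be sketched in a few lines: a detection counts as a true positive when its IoU with a ground-truth box reaches the 0.5 threshold, and sensitivity and precision then follow from the TP/FN/FP counts (a hypothetical illustration, not the authors' evaluation code):

```python
def iou(a, b):
    """Intersection over Union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def sensitivity_precision(match_ious, n_gt, n_pred, thresh=0.5):
    """match_ious: best-match IoU for each predicted box.
    A prediction is a true positive when its IoU >= thresh.
    Sensitivity = TP / (TP + FN) = TP / n_gt;
    precision   = TP / (TP + FP) = TP / n_pred."""
    tp = sum(1 for m in match_ious if m >= thresh)
    return tp / n_gt, tp / n_pred

# Two overlapping unit-offset boxes share 1 of 7 units of union area:
overlap = iou((0, 0, 2, 2), (1, 1, 3, 3))
# Three predictions against four ground-truth boxes, two above threshold:
sens, prec = sensitivity_precision([0.9, 0.6, 0.3], n_gt=4, n_pred=3)
```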

Conclusion

We developed a surgical tool and tool tip recognition model based on CNN and YOLOv3. Validation of our model demonstrated satisfactory precision, accuracy, and robustness across different surgical scenes.




Acknowledgements

This work was supported by grant to Surong Hua from the National High Level Hospital Clinical Research Funding (No. 2022-PUMCH-A-052) and by grant to Zhihong Wang from the College Student Innovation Training Program of Peking Union Medical College, Beijing (2022zzglc06059). Algorithm operation in the work was supported by the high-performance computing platform of Peking Union Medical College Hospital (PUMCH). We thank all colleagues from the Department of General Surgery at PUMCH for their generous help in providing surgery recordings. We thank Mr. Jiqiang Yue, chief engineer of Hangzhou Kangji Medical Instrument Co., Ltd, for his support and guidance in our algorithms and technologies. We also thank all colleagues and friends for their critical discussions and valuable support of our work.

Funding

This study was funded by the National High Level Hospital Clinical Research Funding (No. 2022-PUMCH-A-052) and College Student Innovation Training Program of Peking Union Medical College, Beijing (2022zzglc06059).

Author information

Corresponding authors

Correspondence to Surong Hua or Huizhen Wang.

Ethics declarations

Disclosures

Lu Ping, Zhihong Wang, Jingjing Yao, Junyi Gao, Sen Yang, Jiayi Li, Jile Shi, Wenming Wu, Surong Hua, and Huizhen Wang have no conflicts of interest or financial ties to disclose.

Ethical approval

This study was performed in line with the principles of the Declaration of Helsinki and approval was granted by the Institutional Review Board of Peking Union Medical College Hospital (No. S-K1901).

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (MP4 22,165 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Ping, L., Wang, Z., Yao, J. et al. Application and evaluation of surgical tool and tool tip recognition based on Convolutional Neural Network in multiple endoscopic surgical scenarios. Surg Endosc 37, 7376–7384 (2023). https://doi.org/10.1007/s00464-023-10323-3

