
JRM Vol.34 No.6 pp. 1245-1252 (2022)
doi: 10.20965/jrm.2022.p1245

Paper:

Real-Time Suture Thread Detection with an Image Classifier

Kyotaro Horio*1, Kanako Harada*1, Jun Muto*2, Hirofumi Nakatomi*3, Nobuhito Saito*3, Akio Morita*4, Eiju Watanabe*5, and Mamoru Mitsuishi*1

*1The University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

*2Fujita Health University Hospital
1-98 Dengakugakubo, Kutsukake-cho, Toyoake, Aichi 470-1192, Japan

*3The University of Tokyo Hospital
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8655, Japan

*4Nippon Medical School
1-1-5 Sendagi, Bunkyo-ku, Tokyo 113-8602, Japan

*5Jichi Medical School Hospital
3311-1 Yakushiji, Shimotsuke-shi, Tochigi 329-0498, Japan

Received:
May 27, 2022
Accepted:
September 23, 2022
Published:
December 20, 2022
Keywords:
surgical robot, deep learning, object detection
Abstract
[Figure: Result of thread detection]

Micro-anastomosis is considered a difficult task even for skilled surgeons, and our group has developed a surgical robotic system to assist them. Going further, detecting surgically relevant objects in the microscopic view is indispensable for automating or semi-automating the system. This paper proposes a novel surgical thread detector inspired by an automatic crack detection method. The proposed method achieved a Dice score of 76.30% and an intersection over union (IoU) of 66.08% at 34.50 fps.
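For readers unfamiliar with the reported metrics, the following is a minimal sketch (not the authors' code) of how the Dice score and intersection over union (IoU) are computed from a predicted binary mask and a ground-truth binary mask, here represented as flat 0/1 lists:

```python
def dice_and_iou(pred, gt):
    """Dice score and IoU for two equal-length binary masks (0/1 values)."""
    intersection = sum(p and g for p, g in zip(pred, gt))
    union = sum(p or g for p, g in zip(pred, gt))
    # Dice = 2|A∩B| / (|A| + |B|); IoU = |A∩B| / |A∪B|
    dice = 2.0 * intersection / (sum(pred) + sum(gt))
    iou = intersection / union
    return dice, iou

# Toy example: 2 pixels predicted thread, 1 of which matches ground truth.
dice, iou = dice_and_iou([1, 1, 0, 0], [1, 0, 0, 0])
# dice ≈ 0.667, iou = 0.5
```

Note that Dice and IoU are monotonically related (Dice = 2·IoU / (1 + IoU)), so the paper's 76.30% Dice and 66.08% IoU are two views of the same overlap quality.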

Cite this article as:
K. Horio, K. Harada, J. Muto, H. Nakatomi, N. Saito, A. Morita, E. Watanabe, and M. Mitsuishi, “Real-Time Suture Thread Detection with an Image Classifier,” J. Robot. Mechatron., Vol.34, No.6, pp. 1245-1252, 2022.


Last updated on Feb. 01, 2023