
J. Robot. Mechatron. Vol.36 No.3, pp. 694-703, 2024
doi: 10.20965/jrm.2024.p0694

Paper:

Use of Mixed Reality in Attachment of Surgical Site Measurement Robot to Surgical Bed

Miho Asano*1, Yoshito Yamada*2, Takahiro Kunii*3, Masanao Koeda*4, and Hiroshi Noborio*2

*1Department of Life Design, Osaka International College
6-21-57 Tohdacho, Moriguchi, Osaka 570-8555, Japan

*2Department of Computer Science, Osaka Electro-Communication University
1130-70 Kiyotaki, Shijonawate, Osaka 575-0063, Japan

*3Kashina System Co., Ltd.
116-22 Hiratacho, Hikone, Shiga 522-0041, Japan

*4Department of Human Information Engineering, Okayama Prefectural University
111 Kuboki, Soja, Okayama 719-1197, Japan

Received: August 30, 2023
Accepted: February 19, 2024
Published: June 20, 2024

Keywords: depth–depth matching algorithm, depth image camera, surgical navigation, digital imaging and communication in medicine, mixed reality
Abstract

Recently, we observed that the digital potential function defined by the difference between the real and virtual organ depth images is globally stable at the pose where the real and virtual livers coincide. This global stability is then exploited to overlay the virtual liver on the real one. In this study, we consider attaching a robotic mechanical system that measures the depth images of real organs to the surgical bed. In general, virtual organs reconstructed from CT or MRI show the positions and postures of blood-vessel groups and malignant tumors, and if these can be presented to the physician during surgery, the physician can operate while confirming their positions in real time. Although this robotic mechanical system is designed such that the camera can be raised or lowered as necessary to avoid interfering with the movements of the doctor, assistant, or nurse during surgery, it may still shift owing to contact with the hands or head of the doctor or nurse. In this study, an experiment was conducted in which the surgical measurement robotic mechanical system was first constructed in a VR environment, and an actual robot was then installed using this as a model. In the experiment, a video image of a virtual object was superimposed on that of a real object to confirm whether the surgical robotic mechanical system could accurately measure the surgical site.
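The depth–depth matching idea summarized above can be sketched in code. The following is a minimal illustrative example, not the authors' implementation: the potential is taken as the sum of squared differences between real and virtual depth images over their common valid region, and steepest descent is approximated by a greedy search over integer image translations. The renderer (`render_virtual`, a simple image shift standing in for re-rendering the virtual organ at a new pose) and the translation-only pose space are simplifying assumptions.

```python
import numpy as np

def potential(d_real, d_virt):
    # Digital potential function: sum of squared depth differences over
    # pixels where both depth images report valid (nonzero) depth.
    mask = (d_real > 0) & (d_virt > 0)
    diff = d_real[mask] - d_virt[mask]
    return float(np.sum(diff ** 2))

def render_virtual(depth_template, tx, ty):
    # Hypothetical stand-in for a renderer: shift the virtual organ's
    # depth template by an integer (tx, ty) translation.
    return np.roll(np.roll(depth_template, ty, axis=0), tx, axis=1)

def steepest_descent(d_real, depth_template, steps=50):
    # Greedy descent over integer translations: at each step, move to
    # whichever 4-neighbor pose lowers the potential, stopping at a
    # minimum. With a globally stable potential, this minimum is the
    # pose where real and virtual depth images coincide.
    tx, ty = 0, 0
    for _ in range(steps):
        best = (potential(d_real, render_virtual(depth_template, tx, ty)), tx, ty)
        for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            e = potential(d_real, render_virtual(depth_template, tx + dx, ty + dy))
            if e < best[0]:
                best = (e, tx + dx, ty + dy)
        if (best[1], best[2]) == (tx, ty):
            break  # no neighbor improves: minimum reached
        tx, ty = best[1], best[2]
    return tx, ty
```

As a usage example, a cone-shaped synthetic depth image shifted by a few pixels is recovered exactly by the descent, since the potential vanishes only at the coincident pose. A full 6-DoF version would replace the translation search with a gradient step over position and orientation.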

A robot simply attached to the surgical bed using a jig

Cite this article as:
M. Asano, Y. Yamada, T. Kunii, M. Koeda, and H. Noborio, “Use of Mixed Reality in Attachment of Surgical Site Measurement Robot to Surgical Bed,” J. Robot. Mechatron., Vol.36 No.3, pp. 694-703, 2024.
References
[1] N. Matsumoto, J. Hong, M. Hashizume, and S. Komune, “A minimally invasive registration method using Surface Template-Assisted Marker Positioning (STAMP) for image-guided otologic surgery,” Otolaryngology – Head and Neck Surgery, Vol.140, No.1, pp. 96-102, 2009. https://doi.org/10.1016/j.otohns.2008.10.005
[2] J. Hong and M. Hashizume, “An effective point-based registration tool for surgical navigation,” Surgical Endoscopy, Vol.24, No.4, pp. 944-948, 2010. https://doi.org/10.1007/s00464-009-0568-2
[3] Y. Mise et al., “Virtual liver resection: Computer-assisted operation planning using a three-dimensional liver representation,” J. of Hepato-Biliary-Pancreatic Sciences, Vol.20, No.2, pp. 157-164, 2013. https://doi.org/10.1007/s00534-012-0574-y
[4] S. Ieiri et al., “Augmented reality navigation system for laparoscopic splenectomy in children based on preoperative CT images using optical tracking device,” Pediatric Surgery Int., Vol.28, No.4, pp. 341-346, 2012. https://doi.org/10.1007/s00383-011-3034-x
[5] N. Mahmud, J. Cohen, K. Tsourides, and T. M. Berzin, “Computer vision and augmented reality in gastrointestinal endoscopy,” Gastroenterology Report, Vol.3, No.3, pp. 179-184, 2015. https://doi.org/10.1093/gastro/gov027
[6] P. Pessaux et al., “Towards cybernetic surgery: Robotic and augmented reality-assisted liver segmentectomy,” Langenbeck’s Archives of Surgery, Vol.400, No.3, pp. 381-385, 2015. https://doi.org/10.1007/s00423-014-1256-9
[7] S. Satou et al., “Initial experience of intraoperative three-dimensional navigation for liver resection using real-time virtual sonography,” Surgery, Vol.155, No.2, pp. 255-262, 2014. https://doi.org/10.1016/j.surg.2013.08.009
[8] H. Nishino et al., “Real-time navigation for liver surgery using projection mapping with indocyanine green fluorescence: Development of the novel medical imaging projection system,” Annals of Surgery, Vol.267, No.6, pp. 1134-1140, 2018. https://doi.org/10.1097/SLA.0000000000002172
[9] M. Doi et al., “Knife tip position estimation using multiple markers for liver surgery support,” Proc. of the 6th Int. Conf. on Advanced Mechatronics (ICAM2015), pp. 74-75, 2015. https://doi.org/10.1299/jsmeicam.2015.6.74
[10] D. Yano, M. Koeda, K. Onishi, and H. Noborio, “Development of a surgical knife attachment with proximity indicators,” A. Marcus and W. Wang (Eds.), “Design, User Experience, and Usability: Designing Pleasurable Experiences,” pp. 608-618, Springer, 2017. https://doi.org/10.1007/978-3-319-58637-3_48
[11] M. Koeda, D. Yano, M. Doi, K. Onishi, and H. Noborio, “Calibration of surgical knife-tip position with marker-based optical tracking camera and precise evaluation of its measurement accuracy,” J. of Bioinformatics and Neuroscience, Vol.4, No.1, pp. 155-159, 2018.
[12] H. Noborio et al., “Motion transcription algorithm by matching corresponding depth image and Z-buffer,” Proc. of the 10th Anniversary Asian Conf. on Computer Aided Surgery, pp. 60-61, 2014.
[13] M. Asano et al., “Stability maintenance of depth-depth matching of steepest descent method using an incision shape of an occluded organ,” M. Kurosu (Ed.), “Human-Computer Interaction: Human Values and Quality of Life,” pp. 539-555, Springer, 2020. https://doi.org/10.1007/978-3-030-49065-2_38
[14] M. Asano et al., “Convergence stability of depth-depth-matching-based steepest descent method in simulated liver surgery,” Int. J. of Pharma Medicine and Biological Sciences, Vol.10, No.2, pp. 60-67, 2021. https://doi.org/10.18178/ijpmbs.10.2.60-67
[15] H. Noborio, K. Onishi, M. Koeda, K. Watanabe, and M. Asano, “Depth–depth matching of virtual and real images for a surgical navigation system,” Int. J. of Pharma Medicine and Biological Sciences, Vol.10, No.2, pp. 40-48, 2021. https://doi.org/10.18178/ijpmbs.10.2.40-48
[16] H. Yoshida, F. Ujibe, and H. Noborio, “On the force/shape reappearance of MSD rheology model calibrated by force/shape sequence,” The Japanese J. for Medical Virtual Reality, Vol.5, No.1, pp. 40-49, 2007 (in Japanese). https://doi.org/10.7876/jmvr.5.40
[17] R. Nogami, R. Enoki, and H. Noborio, “Deformation properties of three kinds of MSD models of rheology object calibrated by randomized algorithm,” Trans. of the Virtual Reality Society of Japan, Vol.8, No.3, pp. 271-278, 2003. https://doi.org/10.18974/tvrsj.8.3_271
[18] H. Noborio et al., “Fast surgical algorithm for cutting with liver standard triangulation language format using Z-buffers in graphics processing unit,” M. G. Fujie (Ed.), “Computer Aided Surgery,” pp. 127-140, Springer, 2016. https://doi.org/10.1007/978-4-431-55810-1_11
[19] K. Onishi et al., “Virtual liver surgical simulator by using Z-buffer for object deformation,” Proc. of the 9th Int. Conf. on Learning and Collaboration Technologies (UAHCI 2015), Part 3, pp. 345-351, 2015. https://doi.org/10.1007/978-3-319-20684-4_34
[20] M. Nonaka et al., “Multi-camera coordinate calibration and accuracy evaluation for robot control,” M. Kurosu (Ed.), “Human-Computer Interaction. Recognition and Interaction Technologies,” pp. 506-523, Springer, 2019. https://doi.org/10.1007/978-3-030-22643-5_40
[21] M. Nonaka et al., “A useful robotic-mechanical system for measuring a surgical area without obstructing surgical operations by some surgeon,” M. Kurosu (Ed.), “Human-Computer Interaction. Interaction in Context,” pp. 43-52, Springer, 2018. https://doi.org/10.1007/978-3-319-91244-8_4
[22] M. Nonaka, K. Watanabe, H. Noborio, M. Kayaki, and K. Mizushino, “Capturing a surgical area using multiple depth cameras mounted on a robotic mechanical system,” A. Marcus and W. Wang (Eds.), “Design, User Experience, and Usability: Designing Pleasurable Experiences,” pp. 540-555, Springer, 2017. https://doi.org/10.1007/978-3-319-58637-3_42
[23] S. Numata, M. Koeda, K. Onishi, K. Watanabe, and H. Noborio, “Performance and accuracy analysis of 3D model tracking for liver surgery,” M. Kurosu (Ed.), “Human-Computer Interaction: Recognition and Interaction Technologies,” pp. 524-533, Springer, 2019. https://doi.org/10.1007/978-3-030-22643-5_41
[24] T. Nakamura and H. Kimura, “A shape measurement system by a fast depth measurement device,” Conf. Proc. the Japan Society of Naval Architects and Ocean Engineers, Vol.24, pp. 199-204, 2017 (in Japanese). https://doi.org/10.14856/conf.24.0_199

Last updated on Oct. 19, 2024