
JACIII Vol.27 No.6 pp. 1216-1229
doi: 10.20965/jaciii.2023.p1216
(2023)

Research Paper:

An Automatic and Robust Visual SLAM Method for Intra-Abdominal Environment Reconstruction

Guodong Wei*1, Weili Shi*1,*2, Guanyuan Feng*1,*2, Yu Ao*1,*2, Yu Miao*1,*2, Wei He*1,*2, Tao Chen*3, Yao Wang*4, Bai Ji*5, and Zhengang Jiang*1,*2,†

*1School of Computer Science and Technology, Changchun University of Science and Technology
No.7089 Weixing Road, Chaoyang District, Changchun, Jilin 130022, China

*2Zhongshan Institute, Changchun University of Science and Technology
No.16 Huizhan East Road, Torch Development Zone, Zhongshan, Guangdong 528437, China

*3Department of General Surgery, Nanfang Hospital, Southern Medical University
No.1023 Shatai South Road, Baiyun District, Guangzhou, Guangdong 510515, China

*4Department of General Surgery, Zhongshan City People’s Hospital
No.2 Sunwen East Road, Central City District, Zhongshan, Guangdong 528403, China

*5Department of Hepatobiliary and Pancreatic Surgery, The First Hospital of Jilin University
No.71 Xinmin Street, Chaoyang District, Changchun, Jilin 130012, China

†Corresponding author

Received: July 27, 2023
Accepted: August 28, 2023
Published: November 20, 2023
Keywords: stereo laparoscope, 3D reconstruction, stereo matching, feature tracking, kernel correlation filter
Abstract

Three-dimensional (3D) surface reconstruction is used to address the narrow field of view in laparoscopy. It can provide surgeons or computer-assisted surgery systems with a real-time, complete view of the internal abdominal anatomy. However, rapid changes in image depth, low texture, and specular reflection pose challenges for the reconstruction, and it is difficult to complete the reconstruction process stably with a feature-based simultaneous localization and mapping (SLAM) method. This paper proposes a robust laparoscopic 3D surface reconstruction method based on SLAM, which automatically selects appropriate stereo matching parameters and robustly finds matching point pairs for laparoscope motion estimation. The changing trend of the disparity maps is used to predict the stereo matching parameters and thereby improve disparity map quality. Feature patch extraction and tracking replace feature point extraction and matching in motion estimation, which reduces the tracking failures and interruptions that occur in feature-based SLAM. The proposed feature patch matching method is also well suited to parallel computing, which improves its computing speed. Evaluation results on public in vivo and ex vivo porcine abdominal video data demonstrate the efficiency and robustness of our 3D surface reconstruction approach.
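The following is a minimal, illustrative sketch (not the authors' implementation) of the two ideas summarized in the abstract, using OpenCV components as stand-ins: the disparity search range for the next frame is predicted from the statistics of the previous frame's disparity map, and feature patches are tracked with kernelized correlation filter (KCF) trackers instead of being re-extracted and re-matched as feature points. It assumes rectified stereo laparoscope frames, the opencv-contrib-python tracking module for cv2.TrackerKCF_create, and illustrative parameter values throughout.

```python
# Hedged sketch only: adaptive stereo matching plus KCF patch tracking.
# Stand-ins: cv2.StereoSGBM for stereo matching, cv2.TrackerKCF for patch
# tracking. Frame sources, patch locations, and parameters are illustrative.
import cv2
import numpy as np

def build_matcher(min_disp, num_disp, block_size=5):
    """Create an SGBM matcher for rectified stereo laparoscope frames."""
    return cv2.StereoSGBM_create(
        minDisparity=min_disp,
        numDisparities=num_disp,           # must be a multiple of 16
        blockSize=block_size,
        P1=8 * 3 * block_size ** 2,
        P2=32 * 3 * block_size ** 2,
        uniquenessRatio=10,
        speckleWindowSize=100,
        speckleRange=2,
    )

def predict_disparity_range(prev_disp, margin=16):
    """Predict the next frame's disparity search range from the previous
    disparity map (valid pixels only), a rough analogue of predicting stereo
    matching parameters from the disparity trend."""
    valid = prev_disp[prev_disp > 0]
    if valid.size == 0:
        return 0, 128                       # fall back to a wide default range
    lo = int(np.percentile(valid, 2)) - margin
    hi = int(np.percentile(valid, 98)) + margin
    num = max(16, int(np.ceil((hi - lo) / 16.0)) * 16)
    return max(0, lo), num

def init_patch_tracker(frame, bbox):
    """Start a KCF tracker on one feature patch (bbox = x, y, w, h)."""
    trk = cv2.TrackerKCF_create()           # requires opencv-contrib-python
    trk.init(frame, bbox)
    return trk

def track_patches(trackers, frame):
    """Update each (patch_id, tracker) pair; return surviving patches."""
    results = []
    for pid, trk in trackers:
        ok, bbox = trk.update(frame)
        if ok:
            results.append((pid, bbox))
    return results

def process_frame(left, right, prev_disp, trackers):
    """One illustrative per-frame step: adaptive disparity map + patch tracks."""
    min_disp, num_disp = predict_disparity_range(prev_disp)
    matcher = build_matcher(min_disp, num_disp)
    gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    disp = matcher.compute(gray_l, gray_r).astype(np.float32) / 16.0
    tracked = track_patches(trackers, left)  # patch centers feed pose estimation
    return disp, tracked
```

Because each patch tracker is updated independently of the others, the per-patch updates can be dispatched in parallel, which mirrors the parallel-computing property noted in the abstract.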

Cite this article as:
G. Wei, W. Shi, G. Feng, Y. Ao, Y. Miao, W. He, T. Chen, Y. Wang, B. Ji, and Z. Jiang, “An Automatic and Robust Visual SLAM Method for Intra-Abdominal Environment Reconstruction,” J. Adv. Comput. Intell. Intell. Inform., Vol.27 No.6, pp. 1216-1229, 2023.
