Review:
Introduction to Simultaneous Localization and Mapping
Takashi Tsubouchi
Faculty of System and Information Engineering, University of Tsukuba
1-1-1 Tennodai, Tsukuba, Ibaraki 305-8577, Japan
Simultaneous localization and mapping (SLAM) forms the core of the technology that supports mobile robots. With SLAM, as a robot moves through a real environment, information about the real world is imported to a computer on the robot via its sensors, and the robot's physical location and a map of its surroundings are created. SLAM is a major topic in mobile robot research. Although the information is derived from real space, it is handled in a formulation, supported by a mathematical description, that is based on probability theory. This concept therefore contributes not only to research and development on mobile robots, but also to training in mathematics and computer implementation, aimed mainly at position estimation and map creation for mobile robots. This article focuses on SLAM technology, including a brief overview of its history, insights from the author, and, finally, an introduction to a specific example in which the author was involved.
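To make the probabilistic formulation mentioned above concrete, the following is a minimal sketch of how the SLAM problem is commonly posed in the literature as posterior estimation. The notation (pose x, map m, observations z, controls u) follows a widely used convention and is an assumption of this sketch, not a definition taken from this article.

```latex
% Online SLAM: estimate the current robot pose x_t and the map m
% from all observations z_{1:t} and control inputs u_{1:t}.
p(x_t, m \mid z_{1:t}, u_{1:t})

% Full SLAM: estimate the entire trajectory x_{0:t} and the map m.
% Under the usual Markov assumptions, this posterior factors into a
% motion model p(x_k | x_{k-1}, u_k) and an observation model p(z_k | x_k, m):
p(x_{0:t}, m \mid z_{1:t}, u_{1:t})
  \propto p(x_0) \prod_{k=1}^{t} p(x_k \mid x_{k-1}, u_k)\, p(z_k \mid x_k, m)
```

Filter-based approaches (e.g., EKF-SLAM, particle filters) track the online posterior recursively, while graph-based approaches optimize over the full-SLAM factorization.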
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.