JRM Vol.23 No.2 pp. 292-301
doi: 10.20965/jrm.2011.p0292


3D Terrain Reconstruction by Small Unmanned Aerial Vehicle Using SIFT-Based Monocular SLAM

Taro Suzuki*, Yoshiharu Amano*, Takumi Hashizume*,
and Shinji Suzuki**

*Research Institute for Science and Engineering, Waseda University, 17 Kikui-cho, Shinjuku-ku, Tokyo 162-0044, Japan

**Department of Aeronautics and Astronautics, School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

Received: October 6, 2010
Accepted: February 7, 2011
Published: April 20, 2011
Keywords: SLAM, SIFT, UAV, 3D reconstruction

This paper describes a Simultaneous Localization And Mapping (SLAM) algorithm using a monocular camera for a small Unmanned Aerial Vehicle (UAV). Small UAVs have attracted attention as an effective means of collecting aerial information. However, practical applications remain few because a small UAV's limited payload precludes conventional 3D measurement equipment. We propose an extended Kalman filter SLAM that estimates UAV position and attitude and constructs 3D terrain maps using only a small monocular camera. 3D measurement is performed by triangulating Scale-Invariant Feature Transform (SIFT) features extracted from the captured images. Field-experiment results show that our proposal effectively estimates the position and attitude of the UAV and constructs a 3D terrain map.
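To illustrate the triangulation step mentioned in the abstract, the sketch below recovers a 3D point from a matched feature (such as a SIFT correspondence) observed from two camera poses. This is a generic linear (DLT) triangulation in NumPy, not the paper's implementation; the camera intrinsics, baseline, and point coordinates are invented for the example.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one matched feature.

    P1, P2: 3x4 camera projection matrices for two UAV poses.
    x1, x2: (u, v) pixel coordinates of the match in each image.
    Returns the estimated 3D point in world coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical setup: pinhole intrinsics K, two poses 2 m apart,
# observing the point (1, 2, 10) without noise.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])              # first pose at origin
P2 = K @ np.hstack([np.eye(3), np.array([[-2.0], [0.0], [0.0]])])  # 2 m baseline
X_true = np.array([1.0, 2.0, 10.0, 1.0])
x1 = P1 @ X_true; x1 = x1[:2] / x1[2]
x2 = P2 @ X_true; x2 = x2[:2] / x2[2]
X_est = triangulate_point(P1, P2, x1, x2)
print(X_est)  # recovers approximately [1. 2. 10.]
```

In a SLAM pipeline, the projection matrices come from the filter's pose estimates, so triangulation accuracy depends directly on how well the EKF tracks the UAV's position and attitude.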

Cite this article as:
T. Suzuki, Y. Amano, T. Hashizume, and S. Suzuki, “3D Terrain Reconstruction by Small Unmanned Aerial Vehicle Using SIFT-Based Monocular SLAM,” J. Robot. Mechatron., Vol.23, No.2, pp. 292-301, 2011.
References:
  [1] M. Nagai et al., “UAV-Borne 3-D Mapping System by Multisensor Integration,” IEEE Trans. on Geoscience and Remote Sensing, Vol.47, Issue 3, pp. 701-708, 2009.
  [2] D. Törnqvist, G. Conte, R. Karlsson, T. B. Schön, and F. Gustafsson, “Utilizing model structure for efficient simultaneous localization and mapping for a UAV application,” Proc. of the IEEE Aerospace Conf., 2008.
  [3] J. Artieda, J. M. Sebastian, P. Campoy, J. F. Correa, I. F. Mondragón, C. Martínez, and M. Olivares, “Visual 3-D SLAM from UAVs,” J. of Intelligent and Robotic Systems, Vol.55, No.4-5, pp. 299-321, 2009.
  [4] J. H. Kim and S. Sukkarieh, “Airborne simultaneous localisation and map building,” Proc. of the IEEE Int. Conf. on Robotics and Automation, pp. 406-411, 2003.
  [5] H. Eisenbeiss and L. Zhang, “Comparison of DSMs generated from mini UAV imagery and terrestrial laser scanner in a cultural heritage application,” Int. Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol.XXXVI-Part5, pp. 90-96, 2006.
  [6] M. Bryson and S. Sukkarieh, “Building a Robust Implementation of Bearing-Only Inertial SLAM for a UAV,” J. of Field Robotics, Vol.24, Issue 1-2, pp. 113-143, 2007.
  [7] S. Thrun, W. Burgard, and D. Fox, “Probabilistic Robotics,” The MIT Press, 2005.
  [8] H. Durrant-Whyte et al., “Simultaneous Localisation and Mapping (SLAM): Part I The Essential Algorithms,” IEEE Robotics and Automation Magazine, Vol.13, No.2, pp. 99-110, 2006.
  [9] T. Bailey and H. Durrant-Whyte, “Simultaneous localization and mapping (SLAM): Part II,” IEEE Robotics and Automation Magazine, Vol.13, No.3, pp. 108-117, Sep., 2006.
  [10] A. Davison et al., “MonoSLAM: Real-Time Single Camera SLAM,” IEEE Trans. on Pattern Analysis and Machine Intelligence, pp. 1052-1067, 2007.
  [11] B. Williams et al., “A comparison of loop closing techniques in monocular SLAM,” Robotics and Autonomous Systems, Vol.57, Issue 12, pp. 1187-1197, 2009.
  [12] A. Tsourdos, N. Aouf, V. Sazdovski, and B. White, “Low altitude airborne SLAM with INS aided vision system,” Proc. of the AIAA Guidance, Navigation and Control Conf. and Exhibit, 2007.
  [13] D. G. Lowe, “Distinctive Image Features from Scale-Invariant Keypoints,” Int. J. of Computer Vision, pp. 91-110, 2004.
  [14] R. Hirokawa, “Guidance and Control System Design and Evaluation for a Small UAV,” 2nd Int. Symposium on Innovative Aerial/Space Flyer Systems, pp. 105-108, 2005.
  [15] Z. Zhang, “Flexible Camera Calibration By Viewing a Plane From Unknown Orientations,” Int. Conf. on Computer Vision, pp. 666-673, 1999.
  [16] P. H. S. Torr, “Bayesian model estimation and selection for epipolar geometry and generic manifold fitting,” Int. J. of Computer Vision, Vol.50, pp. 35-61, 2002.
  [17] R. Hirokawa, N. Kajiwara, and T. Suzuki, “Low-Cost Miniature GPS/INS for Small UAVs with Reduced Order Kalman Filter,” Proc. of the Int. Symposium on GPS/GNSS 2008, pp. 211-220, 2008.
