
JACIII Vol.15 No.8 pp. 1167-1174
doi: 10.20965/jaciii.2011.p1167
(2011)

Paper:

Development of an Intelligent Simulator with SLAM Functions for Visual Autonomous Landing on Small Celestial Bodies

Cedric Cocaud* and Takashi Kubota**

*Department of Electrical Engineering, University of Tokyo, ISAS campus 3-1-1, Yoshinodai, Chuo-ku, Sagamihara, Kanagawa 252-5210, Japan

**Institute of Space and Astronautical Science (JAXA/ISAS), ISAS campus 3-1-1, Yoshinodai, Chuo-ku, Sagamihara, Kanagawa 252-5210, Japan

Received:
May 16, 2011
Accepted:
August 1, 2011
Published:
October 20, 2011
Keywords:
simulator, computer graphics, SLAM, visual navigation, small body exploration
Abstract
As space agencies currently look to Near Earth Asteroids as the next step on their exploration roadmaps, upcoming missions will require high-precision autonomous landing control schemes. In this paper, an intelligent simulator is proposed to reproduce all of the visual and dynamic aspects required to test an autonomous Simultaneous Localization and Mapping (SLAM) system. The proposed simulator provides position and attitude information to a spacecraft during its approach, descent, and landing phases toward the surface of an asteroid or other small celestial body. Because the SLAM system makes use of navigation cameras and a range sensor moving with the spacecraft as it approaches the surface, the simulator is also developed to reproduce a fully integrated 3D environment, using computer graphics technology that mimics the noise, image detail, and real-time performance of the navigation cameras and the range sensor. This paper describes the architecture and capabilities of the developed simulator and of the SLAM system for which it is designed. The developed simulator is evaluated using the specifications of the onboard sensors of the Hayabusa spacecraft, launched by JAXA/ISAS toward the asteroid Itokawa in 2003.
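To make the sensor-emulation idea concrete, the sketch below shows how noisy navigation-camera feature observations and a range reading might be synthesized for a descending spacecraft, i.e., the kind of inputs such a simulator feeds to a SLAM filter. This is a minimal illustration, not the authors' implementation: the function names and all parameter values (focal length, noise levels) are assumptions, not the Hayabusa sensor specifications used in the paper.

import numpy as np

rng = np.random.default_rng(0)

# Assumed, purely illustrative sensor parameters (not the Hayabusa specs).
FOCAL_PX = 1200.0          # pinhole camera focal length, in pixels
PIXEL_NOISE_SIGMA = 0.7    # feature-localization noise, in pixels
RANGE_NOISE_SIGMA = 0.05   # range-sensor noise, in meters

def project_landmarks(landmarks_cam):
    """Project 3D landmarks (camera frame, z > 0) to noisy pixel coordinates."""
    uv = FOCAL_PX * landmarks_cam[:, :2] / landmarks_cam[:, 2:3]
    return uv + rng.normal(0.0, PIXEL_NOISE_SIGMA, uv.shape)

def range_measurement(true_range_m):
    """Noisy line-of-sight range reading, as from a laser range finder."""
    return true_range_m + rng.normal(0.0, RANGE_NOISE_SIGMA)

# Example: three surface landmarks seen from about 50 m altitude.
landmarks = np.array([[2.0, -1.0, 50.0], [-3.5, 0.5, 50.2], [1.0, 4.0, 49.8]])
print(project_landmarks(landmarks))  # noisy (u, v) pixel observations
print(range_measurement(50.0))       # noisy range reading, in meters

In the actual simulator described here, such observations would come from features detected in images rendered from the 3D asteroid model, rather than from hand-placed landmark coordinates as in this sketch.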
Cite this article as:
C. Cocaud and T. Kubota, “Development of an Intelligent Simulator with SLAM Functions for Visual Autonomous Landing on Small Celestial Bodies,” J. Adv. Comput. Intell. Intell. Inform., Vol.15 No.8, pp. 1167-1174, 2011.
References
  [1] J. S. Lewis, “Mining the Sky: Untold Riches from the Asteroids, Comets, and Planets,” Perseus Publishing, 1997.
  [2] C. Cocaud, “Autonomous Tasks Allocation and Path Generation of UAV’s,” Dept. of Mech. Eng., Univ. of Ottawa, 2006.
  [3] A. Ansar and Y. Cheng, “An Analysis of Spacecraft Localization from Descent Image Data for Pinpoint Landing on Mars and Other Cratered Bodies,” Photogrammetric Engineering & Remote Sensing, Vol.71, No.10, pp. 1197-1204, 2005.
  [4] T. Misu, K. Hashimoto, and K. Ninomiya, “Optical Guidance for Autonomous Landing of Spacecraft,” IEEE Trans. on Aerospace and Electronic Systems, Vol.35, No.2, pp. 459-473, 1999.
  [5] M. Maruya, H. Ohyama, M. Uo, N. Muranaka, H. Morita, T. Kubota, T. Hashimoto, J. Saito, and J. Kawaguchi, “Navigation Shape and Surface Topography Model of Itokawa,” AIAA/AAS Astrodynamics Specialist Conf. and Exhibit, Vol.8, pp. 21-24, 2006.
  [6] M. McCrum, S. Parkes, I. Martin, and M. Dunstan, “Mars Visual Simulation for ExoMars Navigation Algorithm Validation,” Proc. of i-SAIRAS, pp. 283-290, 2010.
  [7] S. Williams, S. Remy, and A. M. Howard, “3D Simulations for Testing and Validating Robotic-Driven Applications for Exploring Lunar Poles,” AIAA Infotech at Aerospace Conf., pp. 1-9, 2010.
  [8] J. Artieda, J. M. Sebastian, P. Campoy, J. F. Correa, I. F. Mondragón, C. Martínez, and M. Olivares, “Visual 3-D SLAM from UAVs,” J. Intell. Robot. Syst., Vol.55, pp. 299-321, 2009.
  [9] R. Munguia and A. Grau, “Monocular SLAM for Visual Odometry,” Proc. of IEEE Int. Symp. on Intelligent Signal Processing, pp. 1-6, 2007.
  [10] M. Montemerlo, “FastSLAM: A Factored Solution to the Simultaneous Localization and Mapping Problem With Unknown Data Association,” Ph.D. Thesis, Carnegie Mellon University, 2003.
  [11] G. Casella and C. P. Robert, “Rao-Blackwellisation of sampling schemes,” Biometrika, Vol.83, No.1, pp. 81-94, 1996.
  [12] A. Doucet, N. de Freitas, K. Murphy, and S. Russell, “Rao-Blackwellised particle filtering for dynamic Bayesian networks,” Proc. of 16th Conf. on Uncertainty in AI, pp. 176-183, 2000.
  [13] K. Murphy, “Bayesian map learning in dynamic environments,” Neural Info. Proc. Sys., pp. 1015-1021, 1999.
  [14] R. Hartley and A. Zisserman, “Multiple View Geometry in Computer Vision,” 2nd Edition, Cambridge University Press, 2003.
  [15] T. K. Marks, A. Howard, M. Bajracharya, G. W. Cottrell, and L. H. Matthies, “Gamma-SLAM: Visual SLAM in Unstructured Environments Using Variance Grid Maps,” J. of Field Robotics, Vol.26, No.1, pp. 26-51, 2009.
  [16] N. Fairfield, G. A. Kantor, and D. Wettergreen, “Real-Time SLAM with Octree Evidence Grids for Exploration in Underwater Tunnels,” J. of Field Robotics, Vol.24, No.1/2, pp. 3-21, 2007.
  [17] K. Konolige and M. Agrawal, “FrameSLAM: From Bundle Adjustment to Real-Time Visual Mapping,” IEEE Trans. on Robotics, Vol.24, No.5, pp. 1066-1077, 2008.
  [18] A. J. Davison, “Real-Time Simultaneous Localisation and Mapping with a Single Camera,” Proc. of the 9th ICCV, pp. 1-8, 2003.
  [19] S. Se, D. Lowe, and J. Little, “Mobile Robot Localization and Mapping with Uncertainty using Scale-Invariant Visual Landmarks,” Int. J. of Robotics Research, Vol.21, pp. 735-758, 2002.
  [20] A. I. Mourikis, N. Trawny, S. I. Roumeliotis, A. E. Johnson, A. Ansar, and L. Matthies, “Vision-Aided Inertial Navigation for Spacecraft Entry, Descent, and Landing,” IEEE Trans. on Robotics, Vol.25, No.2, pp. 264-280, 2009.
  [21] T. Kubota, T. Hashimoto, S. Sawai, J. Kawaguchi, K. Ninomiya, M. Uo, and K. Baba, “An autonomous navigation and guidance system for MUSES-C asteroid landing,” Acta Astronautica, Vol.52, pp. 125-131, 2003.
  [22] H. Morita, K. Shirakawa, T. Hashimoto, T. Kubota, and J. Kawaguchi, “Hayabusa Descent Navigation based on Accurate Landmark Tracking Scheme,” The J. of Space Technology and Science, Vol.22, No.1 (2006 Spring), pp. 21-31, 2007.
  [23] M. Uo, K. Shirakawa, T. Hashimoto, T. Kubota, and J. Kawaguchi, “Hayabusa Touching-Down to Itokawa – Autonomous Guidance and Navigation,” The J. of Space Technology and Science, Vol.22, No.1 (2006 Spring), pp. 32-41, 2007.
  [24] C. Cocaud and T. Kubota, “Probabilistic Motion Estimation for Near Real-Time Navigation and Landing on Small Celestial Bodies,” Proc. of 22nd Int. Symp. on Space Flight Dynamics, pp. 1-13, 2011.
  [25] H. Bay, A. Ess, T. Tuytelaars, and L. V. Gool, “SURF: Speeded Up Robust Features,” Computer Vision and Image Understanding (CVIU), Vol.110, No.3, pp. 346-359, 2008.
