
J. Robot. Mechatron. Vol.30, No.3, pp. 373-379, 2018
doi: 10.20965/jrm.2018.p0373

Paper:

Integrated Navigation for Autonomous Drone in GPS and GPS-Denied Environments

Satoshi Suzuki

Faculty of Textile Science and Technology, Shinshu University
3-15-1 Tokida, Ueda-shi, Nagano 386-8567, Japan

Received:
December 12, 2017
Accepted:
May 4, 2018
Published:
June 20, 2018
Keywords:
autonomous drone, GPS-denied environments, integrated navigation, SLAM, Kalman filter
Abstract

[Figure: Autonomous drone entering the tunnel]

In this study, a novel robust navigation system for a drone in both global positioning system (GPS) and GPS-denied environments is proposed. In general, a drone uses position and velocity information from GPS for guidance and control. However, GPS cannot be used in several environments: for example, it exhibits large errors near buildings and trees, and is unavailable indoors. In such GPS-denied environments, a Laser Imaging Detection and Ranging (LIDAR) sensor-based navigation system has generally been used. However, the LIDAR sensor also has a weakness: it cannot be used in open outdoor environments, which are precisely where GPS works well. Therefore, it is advantageous to develop an integrated navigation system that operates seamlessly in both GPS and GPS-denied environments. In this study, an integrated navigation system for a drone using both GPS and LIDAR was developed. The design of the navigation system is based on the extended Kalman filter, and the effectiveness of the developed system is verified by numerical simulation and experiment.
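The fusion idea described in the abstract can be illustrated with a minimal sketch: a single-axis position/velocity Kalman filter that runs a constant prediction step and then corrects with whichever position sensor is currently available, weighting GPS and LIDAR by different measurement noises. This is an assumed, linearized toy model (class name, noise values, and the single-axis simplification are all illustrative), not the paper's actual extended Kalman filter formulation, which handles the full nonlinear drone state.

```python
import numpy as np

class IntegratedNavEKF:
    """Single-axis position/velocity filter that fuses whichever
    position source (GPS or LIDAR-based localization) is available.
    All noise values are illustrative assumptions, not tuned parameters."""

    def __init__(self, dt=0.01):
        self.x = np.zeros(2)                   # state: [position, velocity]
        self.P = np.eye(2)                     # state covariance
        self.F = np.array([[1.0, dt],
                           [0.0, 1.0]])        # constant-velocity model
        self.B = np.array([0.5 * dt**2, dt])   # acceleration (IMU) input
        self.Q = 0.05 * np.eye(2)              # process noise (assumed)
        self.H = np.array([[1.0, 0.0]])        # both sensors observe position
        self.R_gps = np.array([[2.0]])         # GPS noise variance (assumed)
        self.R_lidar = np.array([[0.05]])      # LIDAR noise variance (assumed)

    def predict(self, accel):
        """Time update driven by measured acceleration."""
        self.x = self.F @ self.x + self.B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, source):
        """Measurement update from the currently available sensor."""
        R = self.R_gps if source == "gps" else self.R_lidar
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + R        # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

Because both sensors are reduced to the same position observation, switching environments (e.g., flying from open sky into a tunnel) only changes which measurement and noise matrix are fed to `update`, so the estimate evolves seamlessly across the transition.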

Cite this article as:
S. Suzuki, “Integrated Navigation for Autonomous Drone in GPS and GPS-Denied Environments,” J. Robot. Mechatron., Vol.30, No.3, pp. 373-379, 2018.

