JRM Vol.33 No.2 pp. 242-253
doi: 10.20965/jrm.2021.p0242


Indoor Unmanned Aerial Vehicle Navigation System Using LED Panels and QR Codes

Hiroyuki Ukida

Tokushima University
2-1 Minamijosanjima-cho, Tokushima, Tokushima 770-8506, Japan

Received: October 12, 2020
Accepted: March 5, 2021
Published: April 20, 2021
Keywords: unmanned aerial vehicle, indoor non-GPS environment, on-board camera, LED panel, QR code

In this study, we propose an unmanned aerial vehicle (UAV) navigation system that uses LED panels and QR codes as markers in an indoor environment. An LED panel can display various patterns, so we use it as a command presentation device for the UAV; a QR code can embed various kinds of information, so we use it as a sign for estimating the UAV's position along the flight path. In this paper, we present a method for navigating from a departure position to a destination position when an obstacle lies between them, and we verify the effectiveness of the proposed method in experiments with an actual UAV.
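To make the QR-code idea concrete, the sketch below parses a waypoint payload that such a code might embed. The payload format `POS:x,y,z;YAW:deg` is purely an assumption for illustration; the paper does not specify the actual encoding, and decoding the QR image itself from an onboard-camera frame would be a separate step (e.g., with OpenCV's `cv2.QRCodeDetector().detectAndDecode()`).

```python
# Hypothetical sketch: parsing position information that a waypoint QR code
# might embed. The "POS:x,y,z;YAW:deg" format is an assumed example, not the
# encoding used in the paper.
from dataclasses import dataclass


@dataclass
class Waypoint:
    x: float    # position in the room frame [m]
    y: float
    z: float
    yaw: float  # target heading [deg]


def parse_qr_payload(payload: str) -> Waypoint:
    """Parse a decoded QR string such as 'POS:1.5,0.0,1.2;YAW:90'."""
    fields = dict(item.split(":", 1) for item in payload.split(";"))
    x, y, z = (float(v) for v in fields["POS"].split(","))
    return Waypoint(x, y, z, float(fields["YAW"]))


wp = parse_qr_payload("POS:1.5,0.0,1.2;YAW:90")
print(wp)  # Waypoint(x=1.5, y=0.0, z=1.2, yaw=90.0)
```

In a system like the one proposed, the decoded waypoint would feed the position estimate used to follow the flight path between the departure and destination positions.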

UAV navigation by LED and QR panels

Cite this article as:
H. Ukida, “Indoor Unmanned Aerial Vehicle Navigation System Using LED Panels and QR Codes,” J. Robot. Mechatron., Vol.33 No.2, pp. 242-253, 2021.
