J. Robot. Mechatron. Vol.28 No.4, pp. 451-460 (2016)
doi: 10.20965/jrm.2016.p0451

Paper:

Recognition Method Applied to Smart Dump 9 Using Multi-Beam 3D LiDAR for the Tsukuba Challenge

Yoshihiro Takita*, Shinya Ohkawa*, and Hisashi Date**

*Department of Computer Science, National Defense Academy of Japan
1-10-20 Hashirimizu, Yokosuka, Kanagawa 239-8686, Japan

**Faculty of Engineering, Information and Systems, University of Tsukuba
1-1-1 Tennodai, Tsukuba, Ibaraki 305-8577, Japan

Received: March 9, 2016
Accepted: April 21, 2016
Published: August 20, 2016
Keywords: Real World Robotics Challenge, mobile robot, 3D LiDAR, identification
Abstract
The Tsukuba Challenge course includes a pedestrian road on which walkers, bicyclists, and mobile robots coexist. As a result, mobile robots can encounter dangerous situations when faced with moving bicycles. Navigating the challenge course involves locating target individuals in the search area while also attending to the safety of bicyclists. The target individuals typically wear a cap and a reflective vest and are seated on chairs. This study proposes a method for identifying pedestrians, bicyclists, and seated individuals using a multi-beam 3D LiDAR mounted on Smart Dump 9. A support vector machine (SVM) is employed to identify the seated target individuals. An experiment conducted on the challenge course illustrates the advantages of the proposed method.
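The classification step named above, deciding whether a segmented LiDAR cluster is a seated target, can be sketched with a standard SVM pipeline. This is a minimal illustration, not the paper's implementation: the feature set (cluster extents, point count, mean intensity), the RBF kernel, and the placeholder training data are all assumptions made here for demonstration.

    # Minimal sketch of SVM-based cluster classification (illustrative only;
    # the paper's actual features, kernel, and parameters are not given here).
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def cluster_features(points):
        # points: (N, 4) array of x, y, z, intensity for one segmented cluster.
        xyz, intensity = points[:, :3], points[:, 3]
        extents = xyz.max(axis=0) - xyz.min(axis=0)  # width, depth, height
        return np.array([*extents, len(points), intensity.mean()])

    # Placeholder labeled data: 1 = seated target, 0 = other (pedestrian, etc.).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = rng.integers(0, 2, size=200)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X, y)

    # Classify a newly observed cluster from the 3D LiDAR scan.
    new_cluster = rng.normal(size=(150, 4))
    print(clf.predict(cluster_features(new_cluster).reshape(1, -1)))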
Figure: Smart Dump 9 starting at the Tsukuba Challenge 2015 final

Cite this article as:
Y. Takita, S. Ohkawa, and H. Date, “Recognition Method Applied to Smart Dump 9 Using Multi-Beam 3D LiDAR for the Tsukuba Challenge,” J. Robot. Mechatron., Vol.28 No.4, pp. 451-460, 2016.
