JRM Vol.33 No.6 pp. 1216-1222
doi: 10.20965/jrm.2021.p1216


Field Robotics: Applications and Fundamentals

Takanori Fukao

University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

Received: October 4, 2021
Accepted: October 12, 2021
Published: December 20, 2021
Keywords: field robotics, path planning, robust control, recognition, artificial intelligence
Automated cabbage harvester


Field robotics is, by its nature, an application-driven area. In this paper, I first review several actual application areas of field robotics. I then discuss the current status of field robotics with respect to three common technologies: (1) mapping and path planning; (2) self-localization, recognition, and decision-making; and (3) dynamics and control. I conclude by presenting future perspectives.
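In its simplest form, the first of these technologies, mapping and path planning, reduces to searching an occupancy-grid map for an obstacle-free route. As a generic illustration only (not code from the paper), a breadth-first search over a 4-connected grid finds a shortest such route:

```python
from collections import deque

def grid_path(grid, start, goal):
    """Breadth-first search for a shortest 4-connected path on an
    occupancy grid (0 = free, 1 = obstacle). Returns the list of
    cells from start to goal, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    parent = {start: None}           # also serves as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the parent links back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parent):
                parent[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None

# A 3x3 field with a wall in the middle column forces a detour.
field = [[0, 1, 0],
         [0, 1, 0],
         [0, 0, 0]]
route = grid_path(field, (0, 0), (0, 2))
```

Real field robots plan on far richer representations (continuous maps, vehicle kinematics, crop rows), but the grid search above captures the basic map-to-path step the abstract refers to.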

Cite this article as:
T. Fukao, “Field Robotics: Applications and Fundamentals,” J. Robot. Mechatron., Vol.33 No.6, pp. 1216-1222, 2021.
References:
[1] M. Buehler, K. Iagnemma, and S. Singh (Eds.), “The DARPA Urban Challenge: Autonomous Vehicles in City Traffic,” Springer, 2009.
[2] E. Guizzo, “How Google’s Self-Driving Car Works,” IEEE Spectrum, 2011.
[3] C. Cadena et al., “Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age,” IEEE Trans. on Robotics, Vol.32, No.6, pp. 1309-1332, 2016.
[4] S. Grigorescu, B. Trasnea, T. Cocias, and G. Macesanu, “A survey of deep learning techniques for autonomous driving,” J. of Field Robotics, Vol.37, Issue 3, pp. 362-386, 2020.
[5] K. J. Shin and S. Managi, “Consumer Demand for Fully Automated Driving Technology: Evidence from Japan,” RIETI Discussion Paper Series, 17-E-032, 2017.
[6] T. Sugimachi, T. Fukao, Y. Suzuki, and H. Kawashima, “Development of Autonomous Platooning System for Heavy-Duty Trucks,” IFAC Proc. Volumes, Vol.46, Issue 21, pp. 52-57, 2013.
[7] Y. Orita and T. Fukao, “Robust Human Tracking of a Crawler Robot,” J. Robot. Mechatron., Vol.31, No.2, pp. 194-202, 2019.
[8] H. Saiki, T. Kobayashi, T. Fukao, T. Urakubo, K. Araiba, and H. Amano, “Control for Suppressing Roll Motion of Outdoor Blimp Robots for Disaster Surveillance,” AIAA Infotech at Aerospace, AIAA 2015-0714, 2015.
[9] N. Noguchi, “Agricultural Vehicle Robot,” J. Robot. Mechatron., Vol.30, No.2, pp. 165-172, 2018.
[10] R. Iinuma, Y. Kojima, H. Onoyama, T. Fukao, S. Hattori, and Y. Nonogaki, “Pallet Handling System with an Autonomous Forklift for Outdoor Fields,” J. Robot. Mechatron., Vol.32, No.5, pp. 1071-1079, 2020.
[11] Y. Onishi, T. Yoshida, H. Kurita, T. Fukao, H. Arihara, and A. Iwai, “An automated fruit harvesting robot by using deep learning,” ROBOMECH J., Vol.6, No.13, 2019.
[12] H. Shakhatreh et al., “Unmanned Aerial Vehicles (UAVs): A Survey on Civil Applications and Key Research Challenges,” IEEE Access, Vol.7, pp. 48572-48634, 2019.
[13] M. Kulbacki et al., “Survey of Drones for Agriculture Automation from Planting to Harvest,” Proc. of 2018 IEEE 22nd Int. Conf. on Intelligent Engineering Systems (INES), pp. 353-358, 2018.
[14] N. Inoue, G. Hayashida, T. Urakubo, and T. Fukao, “Development of a Tilt-rotor UAV for Information Gathering,” Proc. of the 2nd Int. Conf. on Maintenance Science and Technology, pp. 239-240, 2014.
[15] C. Toth and G. Jóźków, “Remote sensing platforms and sensors: A survey,” ISPRS J. of Photogrammetry and Remote Sensing, Vol.115, pp. 22-36, 2016.
[16] T. Yoshida and T. Fukao, “Dense 3D Reconstruction Using a Rotational Stereo Camera,” Proc. of 2011 IEEE/SICE Int. Symp. on System Integration, pp. 985-990, 2011.
[17] M. R. U. Saputra, A. Markham, and N. Trigoni, “Visual SLAM and Structure from Motion in Dynamic Environments: A Survey,” ACM Computing Surveys, Vol.51, No.2, Article 37, 2018.
[18] S. Zhang, L. Zheng, and W. Tao, “Survey and Evaluation of RGB-D SLAM,” IEEE Access, Vol.9, pp. 21367-21387, 2021.
[19] H. Xu and J. Zhang, “AANet: Adaptive Aggregation Network for Efficient Stereo Matching,” Proc. of 2020 IEEE/CVF Conf. on Computer Vision and Pattern Recognition (CVPR), pp. 1959-1968, 2020.
[20] T. Yoshida, T. Fukao, and T. Hasegawa, “Fast Detection of Tomato Peduncle Using Point Cloud with a Harvesting Robot,” J. Robot. Mechatron., Vol.30, No.2, pp. 180-186, 2018.
[21] T. Sugimachi, T. Fukao, T. Ario, Y. Suzuki, and H. Kawashima, “Practical Lateral Control for Autonomous Platooning System of Heavy-Duty Trucks,” Proc. of the 20th ITS World Congress, 4100, 2013.
[22] H. Inou, T. Fukao, S. Totsuka, and Y. Okafuji, “Development of Automatic Steering Control System Based on Optical Flow Model,” Proc. of the 12th Int. Symp. on Advanced Vehicle Control (AVEC’14), 2014.
[23] Y. Okafuji, T. Fukao, Y. Yokokohji, and H. Inou, “Design of a Preview Driver Model Based on Optical Flow,” IEEE Trans. on Intelligent Vehicles, Vol.1, No.3, pp. 266-276, 2016.
[24] Y. Okafuji, C. D. Mole, N. Merat, T. Fukao, Y. Yokokohji, H. Inou, and R. M. Wilkie, “Steering bends and changing lanes: The impact of optic flow and road edges on two point steering control,” J. of Vision, Vol.18, No.9, pp. 1-19, 2018.
