
JRM Vol.38 No.2 pp. 562-577 (2026)

Development Report:

Development of a Tomato Harvesting Robot: Integration of Manipulator Configuration, Recognition, and End Effector

Toru Kuga*, Toshiyuki Yokoue*, Keita Kitano*, Hiroki Asano*, and Hisashi Sugiura*,**,***

*Yanmar Holdings Co., Ltd.
2481 Umegahara, Maibara, Shiga 521-8511, Japan

**Kyushu Institute of Technology
2-4 Hibikino, Wakamatsu-ku, Kitakyushu, Fukuoka 808-0196, Japan

***Fukushima University
1 Kanayagawa, Fukushima 960-1248, Japan

Received:
September 26, 2025
Accepted:
February 22, 2026
Published:
April 20, 2026
Keywords:
agricultural robotics, tomato harvesting, collaborative robot, computer vision, robotic manipulation
Abstract

This study presents a comprehensive tomato harvesting robot system that addresses three critical technical aspects: 1) optimal manipulator configuration design, 2) robust environmental recognition, and 3) efficient end effector control. For the manipulator configuration, four mounting configurations were systematically evaluated; the vertical configuration with an offset end effector achieved the highest target reachability, at 97.7%. For environmental recognition, a multi-sensor system combining RGB and depth (RGBD) cameras with light detection and ranging (LiDAR) was implemented, using depth filtering to suppress outliers. The end effector integrates suction and cutting mechanisms, employing a suction pad with conforming motion and Bowden cable-driven scissors. A bunch model based on actual fruit bunches was developed to create a testing environment that is both diverse and reproducible. Field experiments in a commercial greenhouse demonstrated continuous harvesting operations with a 68% suction success rate and a 45% overall harvesting success rate across 159 target fruits from 200 bunches. Additionally, the distribution of fruit positions in the field was measured, which can inform layout optimization. This study contributes to advancing practical agricultural robotics by providing validated solutions for the three fundamental challenges in robotic crop manipulation.
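The depth filtering mentioned in the abstract, which suppresses outliers in the point cloud before fruit localization, is commonly realized as a statistical outlier-removal pass: points whose mean distance to their nearest neighbors is abnormally large are discarded. The following NumPy sketch illustrates the general technique only; the function name, parameters, and brute-force neighbor search are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def statistical_outlier_removal(points, k=8, std_ratio=1.0):
    """Illustrative sketch: drop points whose mean distance to their
    k nearest neighbours exceeds the global mean of those distances
    by more than std_ratio standard deviations."""
    # Brute-force pairwise distances (fine for small clouds; a k-d tree
    # would be used in practice).
    diff = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diff, axis=2)
    # Mean distance of each point to its k nearest neighbours
    # (column 0 of the sorted row is the zero self-distance, so skip it).
    knn = np.sort(dists, axis=1)[:, 1:k + 1]
    mean_knn = knn.mean(axis=1)
    threshold = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= threshold]

# Example: a tight cluster plus one far-away "flying pixel".
rng = np.random.default_rng(0)
cloud = rng.normal(0.0, 0.01, size=(100, 3))
cloud = np.vstack([cloud, [[1.0, 1.0, 1.0]]])
filtered = statistical_outlier_removal(cloud)
```

The isolated point at (1.0, 1.0, 1.0) has a far larger mean neighbor distance than the clustered points, so it falls above the threshold and is removed while the cluster is retained.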

Developed tomato harvesting robot


Cite this article as:
T. Kuga, T. Yokoue, K. Kitano, H. Asano, and H. Sugiura, “Development of a Tomato Harvesting Robot: Integration of Manipulator Configuration, Recognition, and End Effector,” J. Robot. Mechatron., Vol.38 No.2, pp. 562-577, 2026.
