
JRM Vol.38 No.2 pp. 513-524
(2026)

Paper:

End-to-End Position Prediction for Robotic Grape Berry Thinning

Yin Suan Tan*, Prawit Buayai**, Dear Moeurn**, Hiromitsu Nishizaki**, Koji Makino**, and Xiaoyang Mao**

*Integrated Graduate School of Medicine, Engineering, and Agricultural Sciences, University of Yamanashi
4-3-11 Takeda, Kofu, Yamanashi 400-8511, Japan

**Graduate Faculty of Interdisciplinary Research, University of Yamanashi
4-3-11 Takeda, Kofu, Yamanashi 400-8511, Japan

Received:
September 19, 2025
Accepted:
January 14, 2026
Published:
April 20, 2026
Keywords:
smart agriculture, berry thinning, robotic arm, depth sensing, deep neural network
Abstract

Berry thinning is essential for producing high-quality grape varieties such as Shine Muscat because it directly impacts fruit size and quality. To address the labor-intensive nature of this task, this study presents an autonomous robotic arm system that integrates depth sensing with a learning-based transformation method, implemented with a ResNet-18 convolutional neural network, to predict berry coordinates and execute cutting actions. Its performance was compared with a geometric transformation method based on Robot Operating System 2 (ROS2) coordinate transformations in both indoor and outdoor environments. In indoor trials, the learning-based transformation approach achieved an approach accuracy of 96.8% and a cutting accuracy of 78.5%, outperforming the geometric transformation approach, which achieved 94.6% for approach and 69.6% for cutting. On outdoor slopes, environmental challenges degraded the performance of both approaches; however, the learning-based transformation method maintained higher accuracies, achieving 75.6% for approach and 60.3% for cutting, compared with the geometric transformation approach, which achieved 63.1% approach accuracy and 44.1% cutting accuracy. The complete thinning cycle required an average of 3.67 min to process 10 berries, confirming its feasibility for practical use. Limitations of the curved scissor end-effector reduced cutting effectiveness, highlighting the need for improved blade design. This study demonstrates the potential of combining geometric and learning-based transformation methods for artificial intelligence-driven robotic thinning to achieve efficient vineyard management.
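For readers unfamiliar with the geometric baseline mentioned above, the sketch below illustrates the general idea: a detected berry center (pixel coordinates plus a depth reading) is back-projected into the camera frame with the pinhole model, then mapped into the robot base frame by a fixed homogeneous transform, the role played by ROS2 coordinate transformations in the paper. The intrinsics and the extrinsic transform here are made-up example values, not the paper's calibration.

```python
# Minimal sketch of a geometric pixel-to-base-frame transform (example values only).
import numpy as np

# Assumed pinhole intrinsics (fx, fy, cx, cy) of the depth camera.
fx, fy, cx, cy = 600.0, 600.0, 320.0, 240.0

def pixel_to_camera(u, v, depth_m):
    """Back-project pixel (u, v) at depth `depth_m` into the camera frame."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m, 1.0])  # homogeneous coordinates

# Example extrinsics: camera mounted 0.5 m above the robot base, axes aligned.
T_base_camera = np.eye(4)
T_base_camera[:3, 3] = [0.0, 0.0, 0.5]

berry_cam = pixel_to_camera(400.0, 300.0, 0.45)  # berry 0.45 m in front of camera
berry_base = T_base_camera @ berry_cam           # 3D target for the end-effector
print(berry_base[:3])  # -> [0.06  0.045 0.95 ]
```

In contrast, the paper's learning-based method trains a ResNet-18 network to predict the end-effector target coordinates directly, bypassing explicit calibration of intrinsics and extrinsics.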

Robotic system for grape berry thinning

Cite this article as:
Y. Tan, P. Buayai, D. Moeurn, H. Nishizaki, K. Makino, and X. Mao, “End-to-End Position Prediction for Robotic Grape Berry Thinning,” J. Robot. Mechatron., Vol.38 No.2, pp. 513-524, 2026.
