
JRM Vol.38 No.2 pp. 449-459 (2026)

Paper:

LiDAR–Camera Fusion for 3D Fruit Counting and Density Mapping in Horizontal Trellis Orchards

Jaehwan Lee, Meguna Ohata, Hiromichi Itoh, and Eiji Morimoto

Graduate School of Agricultural Science, Kobe University
1-1 Rokkodai-cho, Nada-ku, Kobe, Hyogo 657-8501, Japan

Received:
October 7, 2025
Accepted:
February 22, 2026
Published:
April 20, 2026
Keywords:
fruit counting, LiDAR–camera fusion, SLAM, instance-level fusion, spatial density mapping
Abstract

This study presents a real-time fruit monitoring system that integrates light detection and ranging (LiDAR) and RGB camera data for 3D fruit counting and spatial density mapping in horizontal trellis pear orchards. The system employs instance-level sensor fusion, combining YOLO-based 2D fruit detection with SLAM-generated 3D point clouds to localize and track individual fruits. A customized temporal tracking algorithm mitigates duplicate counts, while center-based spatial filtering improves detection accuracy. Among the four evaluated YOLO models, YOLOv11s was selected for its F1-score, lowest false negative (FN) count, and real-time performance. Field validation in a 6 m × 70 m orchard plot demonstrated high counting accuracy (96.2%) and reliable spatial density estimation, with a mean absolute error of 0.64 fruits/m². The system effectively identified yield variations across different orchard regions. These findings support the use of LiDAR–camera fusion for scalable, high-precision fruit monitoring in orchard environments, particularly in labor-intensive horizontal trellis systems.
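The pipeline outlined in the abstract — 2D detection, 3D localization by projecting LiDAR points into the image, center-based filtering, duplicate-suppressing tracking, and grid-based density mapping — can be sketched as below. This is an illustrative reconstruction under assumed interfaces (YOLO boxes given as `(x1, y1, x2, y2)`, a known camera matrix `K` and LiDAR-to-camera extrinsic `T_cam_lidar`, a fixed deduplication radius), not the authors' implementation; the actual tracking and filtering details are in the paper.

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Project Nx3 LiDAR points into the image plane; return pixels and camera-frame points."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]          # keep points in front of the camera
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3], pts_cam

def fuse_detections(boxes, points_lidar, T_cam_lidar, K):
    """Instance-level fusion: for each 2D box, take the median 3D point among
    LiDAR returns projecting near the box center (center-based spatial filtering)."""
    uv, pts_cam = project_points(points_lidar, T_cam_lidar, K)
    centers = []
    for (x1, y1, x2, y2) in boxes:
        cx, cy = (x1 + x2) / 2, (y1 + y2) / 2
        r = 0.25 * min(x2 - x1, y2 - y1)            # central region of the box
        mask = (np.abs(uv[:, 0] - cx) < r) & (np.abs(uv[:, 1] - cy) < r)
        if mask.sum() >= 3:                         # require a few supporting returns
            centers.append(np.median(pts_cam[mask], axis=0))
    return centers

class FruitTracker:
    """Greedy nearest-neighbor registry: a new center within `radius` of an
    already-registered fruit is treated as a re-observation, not a new count."""
    def __init__(self, radius=0.08):
        self.radius = radius
        self.fruits = []                            # world-frame 3D centers

    def update(self, centers_world):
        for c in centers_world:
            if all(np.linalg.norm(c - f) > self.radius for f in self.fruits):
                self.fruits.append(np.asarray(c, dtype=float))
        return len(self.fruits)                     # cumulative fruit count

def density_map(fruits_xy, x_range, y_range, cell=1.0):
    """Bin fruit centers on the horizontal plane and return fruits per m^2 per cell."""
    H, _, _ = np.histogram2d(
        [p[0] for p in fruits_xy], [p[1] for p in fruits_xy],
        bins=[np.arange(x_range[0], x_range[1] + cell, cell),
              np.arange(y_range[0], y_range[1] + cell, cell)])
    return H / (cell * cell)
```

In this sketch the tracker deduplicates across frames purely by 3D distance after the SLAM pose transforms each center into the world frame; the paper's temporal tracking algorithm is more elaborate, but the distance-gated registry conveys the idea of counting each fruit once across overlapping views.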

UGV with LiDAR–camera fusion system

Cite this article as:
J. Lee, M. Ohata, H. Itoh, and E. Morimoto, “LiDAR–Camera Fusion for 3D Fruit Counting and Density Mapping in Horizontal Trellis Orchards,” J. Robot. Mechatron., Vol.38 No.2, pp. 449-459, 2026.

