
JACIII Vol.20 No.1 pp. 49-56 (2016)
doi: 10.20965/jaciii.2016.p0049

Paper:

Vision-Based Mowing Boundary Detection Algorithm for an Autonomous Lawn Mower

Tomoya Fukukawa*, Kosuke Sekiyama**, Yasuhisa Hasegawa**, and Toshio Fukuda***

*Department of Mechanical Science and Engineering, Nagoya University
Furo-cho, Chikusa-ku, Nagoya 464-8603, Japan

**Department of Micro-Nano Systems Engineering, Nagoya University
Furo-cho, Chikusa-ku, Nagoya 464-8603, Japan

***Faculty of Science and Engineering, Meijo University
1-501 Shiogamaguchi, Tempaku-ku, Nagoya 468-8502, Japan

Received: April 1, 2015
Accepted: November 5, 2015
Online released: January 19, 2016
Published: January 20, 2016
Keywords: autonomous lawn mower, texture classification, a bank of filters, Gabor filter, RANSAC
Abstract
This study proposes a vision-based mowing boundary detection algorithm for an autonomous lawn mower. An autonomous lawn mower requires accurate motion to mow efficiently. We address this problem with a vision system that detects the boundary between two regions of the lawn: the area already mown and the area not yet mown. The mowing boundary cannot be detected directly because it is visually ambiguous. We therefore use a texture classification method based on a bank of filters to classify the input image of the lawn field into the two regions mentioned above. The classification is performed by threshold processing of a chi-squared statistic. The boundary line is then detected from the classified regions using random sample consensus (RANSAC). Finally, we apply the proposed method to 12 images of a lawn field and verify that it can detect a mowing boundary line with centimeter accuracy in a dense lawn field.
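To make the three-stage pipeline in the abstract concrete (filter-bank texture description, chi-squared threshold classification, RANSAC line fitting), the following Python sketch using OpenCV and NumPy illustrates the general approach. It is an illustrative reconstruction, not the authors' implementation: the Gabor-bank layout, block size, histogram binning, chi-squared threshold, RANSAC tolerance, and the reference histogram ref_hist (assumed to come from a labeled patch of mown grass) are all assumed values.

    # Illustrative sketch of the pipeline described in the abstract.
    # All parameters below are assumptions, not the paper's settings.
    import cv2
    import numpy as np

    def gabor_bank(ksize=31, sigmas=(2.0, 4.0),
                   thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4), lambd=10.0):
        # Bank of Gabor kernels: one per (scale, orientation) pair.
        return [cv2.getGaborKernel((ksize, ksize), s, t, lambd, gamma=0.5)
                for s in sigmas for t in thetas]

    def chi2(h, g, eps=1e-9):
        # Chi-squared statistic between two normalized histograms.
        return 0.5 * np.sum((h - g) ** 2 / (h + g + eps))

    def classify_blocks(gray, bank, ref_hist, bins, block=32, thresh=0.25):
        # Stages 1-2: filter responses -> per-block histograms -> threshold
        # on the chi-squared distance to the reference "mown" histogram.
        resp = np.dstack([np.abs(cv2.filter2D(gray, cv2.CV_32F, k)) for k in bank])
        rows, cols = gray.shape[0] // block, gray.shape[1] // block
        labels = np.zeros((rows, cols), bool)
        for i in range(rows):
            for j in range(cols):
                patch = resp[i * block:(i + 1) * block, j * block:(j + 1) * block]
                hist, _ = np.histogram(patch, bins=bins, density=True)
                labels[i, j] = chi2(hist, ref_hist) < thresh  # True = "mown"
        return labels

    def ransac_line(points, iters=500, tol=2.0, seed=0):
        # Stage 3: RANSAC line fit to candidate boundary points, e.g. the
        # centers of blocks where the mown/unmown label changes.
        rng = np.random.default_rng(seed)
        best, best_count = None, 0
        for _ in range(iters):
            p, q = points[rng.choice(len(points), size=2, replace=False)]
            d = q - p
            norm = np.linalg.norm(d)
            if norm < 1e-6:
                continue  # degenerate sample, skip
            d = d / norm
            v = points - p
            dist = np.abs(v[:, 0] * d[1] - v[:, 1] * d[0])  # point-to-line distance
            count = int(np.count_nonzero(dist < tol))
            if count > best_count:
                best, best_count = (p, d), count
        return best  # (point on line, unit direction), or None

In this sketch the block labels would first be converted to boundary points before the RANSAC fit; the per-block classification keeps the chi-squared comparison local, which is what allows a straight boundary to emerge from two otherwise similar grass textures.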
Cite this article as:
T. Fukukawa, K. Sekiyama, Y. Hasegawa, and T. Fukuda, “Vision-Based Mowing Boundary Detection Algorithm for an Autonomous Lawn Mower,” J. Adv. Comput. Intell. Intell. Inform., Vol.20 No.1, pp. 49-56, 2016.
References
  [1] M. Nørremark, H. W. Griepentrog, J. Nielsen, and H. T. Søgaard, “Evaluation of an autonomous GPS-based system for intra-row weed control by assessing the tilled area,” Precision Agriculture, Vol.13, No.2, pp. 149-162, 2012.
  [2] B. Thuilot, C. Cariou, P. Martinet, and M. Berducat, “Automatic guidance of a farm tractor relying on a single CP-DGPS,” Autonomous Robots, Vol.13, No.1, pp. 53-71, 2002.
  [3] C. Cariou, R. Lenain, B. Thuilot, and M. Berducat, “Automatic Guidance of a Four-Wheel-Steering Mobile Robot for Accurate Field Operations,” J. of Field Robotics, Vol.26, No.6-7, pp. 504-518, 2009.
  [4] C. C. Lin and R. L. Tummala, “Mobile Robot Navigation Using Artificial Landmarks,” J. of Robotic Systems, Vol.14, No.2, pp. 93-106, 1997.
  [5] S. Se, D. G. Lowe, and J. J. Little, “Vision-based global localization and mapping for mobile robots,” IEEE Trans. on Robotics, Vol.21, No.3, pp. 364-375, 2005.
  [6] J. Canny, “A Computational Approach to Edge Detection,” IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol.8, No.6, pp. 679-698, 1986.
  [7] A. K. Jain and F. Farrokhnia, “Unsupervised Texture Segmentation Using Gabor Filters,” Pattern Recognition, Vol.24, No.12, pp. 1167-1186, 1991.
  [8] X. Liu and D. Wang, “Texture Classification Using Spectral Histograms,” IEEE Trans. on Image Processing, Vol.12, No.6, pp. 661-670, 2003.
  [9] M. Varma and A. Zisserman, “A Statistical Approach to Texture Classification from Single Images,” Int. J. of Computer Vision, Vol.62, No.1-2, pp. 61-81, 2005.
  [10] Y. Hamamoto, S. Uchimura, M. Watanabe, T. Yasuda, Y. Mitani, and S. Tomita, “A Gabor Filter-based Method for Recognizing Handwritten Numerals,” Pattern Recognition, Vol.31, No.4, pp. 395-400, 1998.
  [11] M. A. Fischler and R. C. Bolles, “Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography,” Communications of the ACM, Vol.24, No.6, pp. 381-395, 1981.

