JRM Vol.22 No.2 pp. 212-220
doi: 10.20965/jrm.2010.p0212
(2010)

Paper:

Generation of Large Mosaic Images for Vegetation Monitoring Using a Small Unmanned Aerial Vehicle

Taro Suzuki*, Yoshiharu Amano*, Takumi Hashizume*,
Shinji Suzuki**, and Atsushi Yamaba***

*Research Institute for Science and Engineering, Waseda University, 17 Kikui-cho, Shinjuku-ku, Tokyo 162-0044, Japan

**Department of Aeronautics and Astronautics, School of Engineering, The University of Tokyo

***Forestry Research Center, Hiroshima Pref. Technology Research Inst.

Received:
October 5, 2009
Accepted:
January 25, 2010
Published:
April 20, 2010
Keywords:
unmanned aerial vehicle, mosaic image, vegetation monitoring, remote sensing
Abstract
This paper describes low-cost, flexible vegetation monitoring and compares it with conventional remote sensing platforms such as airplanes and satellites. The small, lightweight Unmanned Aerial Vehicle (UAV) we have developed carries visible and near-infrared cameras that create a high-resolution, wide-area mosaic image for observing vegetation. We propose integrating a GPS receiver, inertial sensors, and an image sensor to accurately estimate the UAV's position and attitude for mosaic image generation. The vegetation index is then calculated from the generated mosaic image to evaluate vegetation status. Results of monitoring experiments at the Yawata moor in Hiroshima Prefecture showed that our small UAV provides effective, low-cost, and flexible vegetation monitoring.
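The abstract does not name the specific vegetation index, but the standard index computed from co-registered visible and near-infrared imagery is the Normalized Difference Vegetation Index, NDVI = (NIR − R) / (NIR + R). A minimal sketch of this computation (function name and sample reflectance values are illustrative, not from the paper):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R).

    `nir` and `red` are co-registered band images as float arrays,
    e.g. extracted from the near-infrared and visible mosaics.
    `eps` guards against division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 reflectance example: healthy vegetation reflects strongly
# in the near-infrared, so its NDVI approaches 1; bare soil or water
# yields values near or below 0.
nir_band = np.array([[0.6, 0.5], [0.4, 0.1]])
red_band = np.array([[0.1, 0.1], [0.2, 0.1]])
print(np.round(ndvi(nir_band, red_band), 2))
```

NDVI ranges over [−1, 1], which is why mosaicing the two bands into a single geometrically consistent image is a prerequisite: the per-pixel ratio is only meaningful where the bands are aligned.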
Cite this article as:
T. Suzuki, Y. Amano, T. Hashizume, S. Suzuki, and A. Yamaba, “Generation of Large Mosaic Images for Vegetation Monitoring Using a Small Unmanned Aerial Vehicle,” J. Robot. Mechatron., Vol.22 No.2, pp. 212-220, 2010.
References
[1] J. LeBoeuf, “Practical applications of remote sensing technology — an industry perspective,” HortTechnology, Vol.10, pp. 475-480, 2000.
[2] M. A. Wulder et al., “Comparison of Airborne and Satellite High Spatial Resolution Data for the Identification of Individual Trees with Local Maxima Filtering,” Int. J. of Remote Sensing, Vol.25, No.11, pp. 2225-2232, 2004.
[3] S. Labbe et al., “An Operational Solution to Acquire Multispectral Images with Standard Light Cameras: Characterization and Acquisition Guidelines,” Remote Sensing and Photogrammetry Society, 2007.
[4] S. R. Herwitz et al., “Imaging from an unmanned aerial vehicle: agricultural surveillance and decision support,” Computers and Electronics in Agriculture, Vol.44, pp. 49-61, 2004.
[5] M. Funaki, N. Hirasawa, and the Ant-Plane Group, “Outline of a Small Unmanned Aerial Vehicle (Ant-Plane) Designed for Antarctic Research,” Polar Science, Vol.2, Issue 2, pp. 129-142, 2008.
[6] L. Merino, F. Caballero, J. R. Martinez-de Dios, J. Ferruz, and A. Ollero, “A Cooperative Perception System for Multiple UAVs: Application to Automatic Detection of Forest Fires,” J. of Field Robotics, Vol.23, No.3-4, pp. 165-184, 2006.
[7] C. C. D. Lelong, P. Burger, G. Jubelin, B. Roux, S. Labbé, and F. Baret, “Assessment of Unmanned Aerial Vehicles Imagery for Quantitative Monitoring of Wheat Crop in Small Plots,” Sensors, Vol.8, pp. 3557-3585, 2008.
[8] H. Eisenbeiss and L. Zhang, “Comparison of DSMs generated from mini UAV imagery and terrestrial laser scanner in a cultural heritage application,” Int. Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol.XXXVI-Part5, pp. 90-96, 2006.
[9] R. Sugiura, N. Noguchi, and K. Ishii, “Remote sensing technology for vegetation monitoring using an unmanned helicopter,” Biosystems Engineering, Vol.90, pp. 369-379, 2005.
[10] M. T. Bryson, M. Johnson-Roberson, and S. Sukkarieh, “Airborne smoothing and mapping using vision and inertial sensors,” Proc. of the 2009 IEEE Int. Conf. on Robotics & Automation, pp. 2037-2042, 2009.
[11] R. Hirokawa, “Guidance and Control System Design and Evaluation for a Small UAV,” 2nd Int. Symposium on Innovative Aerial/Space Flyer Systems, pp. 105-108, 2005.
[12] D. G. Lowe, “Distinctive Image Features from Scale-Invariant Keypoints,” Int. J. of Computer Vision, Vol.60, pp. 91-110, 2004.
[13] P. H. S. Torr, “Bayesian model estimation and selection for epipolar geometry and generic manifold fitting,” Int. J. of Computer Vision, Vol.50, pp. 35-61, 2002.
[14] B. Triggs, P. McLauchlan, R. Hartley, and A. Fitzgibbon, “Bundle Adjustment — A Modern Synthesis,” Vision Algorithms: Theory & Practice, Springer-Verlag LNCS Vol.1883, pp. 298-372, 2000.
[15] Nature restoration project in Yawata moor Web site, http://www.pref.hiroshima.lg.jp/eco/j/yawata/
[16] N. Yuba, “Toward Application of UAV in Aerial Observation at Forest Area,” 2nd Int. Symposium on Innovative Aerial/Space Flyer Systems, pp. 109-112, 2005.
