
IJAT Vol.8 No.4, pp. 584-591 (2014)
doi: 10.20965/ijat.2014.p0584

Paper:

Study on Underwater Dual-Laser Structured-Light System for ROV Guiding

Xi Zhang*, Laiwei Li*, and Junyi Yang**

*Shanghai University, 149 Yanchang Rd., Zhabei District, Shanghai, China

**Second Institute of Oceanography, SOA, No.36 Baochubei Road, Hangzhou, China

Received: April 14, 2014
Accepted: June 4, 2014
Published: July 5, 2014
Keywords: ROV, underwater manipulator, 3D measurement, guiding
Abstract
Remotely operated vehicles (ROVs) equipped with manipulators are increasingly used for underwater operations. An ROV is usually operated manually, with the aid of an underwater camera, to approach and grasp a target with its manipulator. Owing to the low quality of underwater imaging, it is difficult for the human operator to judge accurate distances and orientations between the ROV and the target of interest. To facilitate this process, this paper proposes an automatic three-dimensional measurement and guidance system for ROVs. Based on the optical triangulation principle, two laser lines and a camera are used to calculate the position and orientation of a cylindrical target. A measurement model incorporating refraction compensation and a joint calibration method for the whole system are proposed. Experiments show that the proposed system can determine the position and orientation of a cylindrical target automatically, accurately, and efficiently. The accuracy of the measurement system is verified both in air and underwater using a prototype system.
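As a rough illustration of the triangulation geometry involved, the Python sketch below back-projects a pixel on a laser stripe to a viewing ray, bends the ray at a flat air-water port using Snell's law, and intersects it with a calibrated laser plane. All numerical values, the flat-port geometry, and the function names are illustrative assumptions for this sketch; they are not the refraction-compensation model or the joint calibration procedure developed in the paper.

import numpy as np

# Assumed pinhole camera intrinsics (illustrative values, not the paper's calibration).
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0

# One calibrated laser plane in camera coordinates, n . X = d (illustrative values).
laser_n = np.array([0.0, -0.3, 1.0])
laser_n = laser_n / np.linalg.norm(laser_n)
laser_d = 0.5  # plane offset in metres

def pixel_to_ray(u, v):
    """Back-project a pixel to a unit viewing ray in camera coordinates."""
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return ray / np.linalg.norm(ray)

def refract(ray, normal, n1, n2):
    """Bend a unit ray at a flat interface using Snell's law (medium n1 -> n2)."""
    cos_i = -np.dot(normal, ray)
    eta = n1 / n2
    k = 1.0 - eta**2 * (1.0 - cos_i**2)
    if k < 0.0:
        raise ValueError("total internal reflection")
    return eta * ray + (eta * cos_i - np.sqrt(k)) * normal

def triangulate(u, v, underwater=True):
    """Intersect the (optionally refraction-corrected) viewing ray with the laser plane."""
    ray = pixel_to_ray(u, v)
    origin = np.zeros(3)
    if underwater:
        # Flat port assumed perpendicular to the optical axis at z = 0.05 m (illustrative).
        port_z = 0.05
        origin = (port_z / ray[2]) * ray            # point where the ray meets the port
        ray = refract(ray, np.array([0.0, 0.0, -1.0]), 1.0, 1.33)  # air -> water
    # Solve n . (origin + t * ray) = d for t, then return the 3D point on the stripe.
    t = (laser_d - laser_n @ origin) / (laser_n @ ray)
    return origin + t * ray

if __name__ == "__main__":
    print(triangulate(350.0, 260.0))  # 3D point on the laser stripe, in metres

In the paper's setup, two such laser planes produce two stripes on the cylindrical target, so that both the position and the orientation of its axis can be recovered; the sketch above covers a single plane only.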
Cite this article as:
X. Zhang, L. Li, and J. Yang, “Study on Underwater Dual-Laser Structured-Light System for ROV Guiding,” Int. J. Automation Technol., Vol.8 No.4, pp. 584-591, 2014.
