
IJAT Vol.13 No.6, pp. 796-802 (2019)
doi: 10.20965/ijat.2019.p0796

Paper:

Gap Detection Using Convolutional Neural Network and Adaptive Control in Robotic Plasma Welding

Satoshi Yamane and Kouki Matsuo

Department of Environmental Science and Technology, Saitama University
255 Shimo-okubo, Sakura-ku, Saitama-shi, Saitama 338-8570, Japan

Corresponding author

Received: May 17, 2019
Accepted: September 2, 2019
Published: November 5, 2019
Keywords: convolutional neural network, visual sensor, robotic welding, plasma arc welding, image processing
Abstract

Welding is an essential technology for joining metal plates. In general, gas metal arc welding (GMAW) generates a large amount of fumes when welding thick metal plates. In contrast, butt joints in thick metal plates can be achieved using plasma arc welding (PAW), which produces far fewer fumes; improving the welding environment in this way is important. In particular, if there is a gap between the base metals, the welding conditions must be adjusted according to that gap. A visual sensor, such as a complementary metal-oxide-semiconductor (CMOS) camera, is useful for observing the welding situation. In this study, such a camera was attached to a plasma torch. During welding, weld pool images were captured with the camera, and gaps were detected by processing these images. Because the arc light is very intense, it is difficult to obtain a clear image of the weld pool in PAW. Whereas conventional welding uses a constant current, a pulsed welding current is used here to obtain a clear image. The pulse frequency is 20 Hz, corresponding to a period of 50 ms, and the welding current is reduced to 30 A while the shutter of the CMOS camera is open to minimize the effect of the intense arc light. The exposure time of the CMOS camera is 1 ms. Gaps are then detected through image processing, which requires distinguishing base metals with a gap from those without one. Because the gap appears darker than the solid area of the base metal, it can be detected by a binarization method. When there is no gap, the center area of the weld pool image is not dark; however, since such images are unevenly bright, binarization alone can yield erroneous detections, making it difficult to decide whether a gap exists. A convolutional neural network (CNN) is well suited to analyzing images, so a CNN was applied to the weld pool images. If the CNN identifies a gap, the binarization method is then used to obtain the gap width. In this way, the welding conditions in PAW are adjusted according to the gap.
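As a rough illustration of the pipeline described in the abstract, the sketch below (Python with OpenCV and PyTorch, illustrative choices not specified by the paper) classifies a weld pool image with a small CNN and, only when a gap is reported, estimates its width by binarization. The network architecture, the names GapClassifier, measure_gap_width, and detect_gap, the use of Otsu's method for the threshold, and the mm-per-pixel calibration constant are all assumptions made for this sketch, not the authors' implementation.

# Illustrative sketch only: a small CNN decides whether the weld pool image
# contains a gap; when it does, Otsu binarization estimates the gap width.
# Layer sizes, helper names, and the mm-per-pixel constant are assumptions.
import cv2
import numpy as np
import torch
import torch.nn as nn

class GapClassifier(nn.Module):
    # Toy two-class CNN (no gap / gap) standing in for the paper's network.
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Linear(16 * 4 * 4, 2)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def measure_gap_width(gray, mm_per_pixel=0.05):
    # Binarize with Otsu's threshold; the gap shows up as a dark band, so count
    # dark pixels per image row and take the median count as the width estimate.
    _, dark = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return float(np.median((dark > 0).sum(axis=1))) * mm_per_pixel

def detect_gap(gray, model):
    # CNN classification first; width measurement only if a gap is reported.
    x = torch.from_numpy(gray).float().div(255.0).unsqueeze(0).unsqueeze(0)
    with torch.no_grad():
        has_gap = model(x).argmax(dim=1).item() == 1
    return measure_gap_width(gray) if has_gap else 0.0

if __name__ == "__main__":
    model = GapClassifier().eval()            # weights would come from training
    frame = np.zeros((120, 160), np.uint8)    # placeholder for a CMOS camera frame
    print("estimated gap width [mm]:", detect_gap(frame, model))

In the paper, this decision feeds the adaptive control step: when the CNN reports a gap, the width obtained by binarization is used to adjust the welding conditions.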

Cite this article as:
S. Yamane and K. Matsuo, “Gap Detection Using Convolutional Neural Network and Adaptive Control in Robotic Plasma Welding,” Int. J. Automation Technol., Vol.13 No.6, pp. 796-802, 2019.
