JRM Vol.24 No.4 pp. 666-676
doi: 10.20965/jrm.2012.p0666


Vision-Force Guided Monitoring for Mating Connectors in Wiring Harness Assembly Systems

Pei Di, Fei Chen, Hironobu Sasaki, Jian Huang,
Toshio Fukuda, and Takayuki Matsuno

Department of Micro-Nano Systems Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya, Aichi 464-8603, Japan

Received: November 22, 2011
Accepted: February 6, 2012
Published: August 20, 2012

Keywords: robotic assembly, vision-force guided monitoring, mating connectors

Correctly mating connectors is vital in robotic wiring harness assembly systems. Although a static force model has been established based on force and position information acquired during mating, monitoring the mating process with insufficient training samples remains difficult. Existing models cannot extract consistent features from successful mating samples obtained under different conditions, so it is hard to recognize whether mating has been completed correctly by learning from only a few successful training samples. In this study, a simple new model is proposed to describe the connector mating process, and more robust features are chosen as its parameters. The model proves more appropriate for the monitoring task, even when varied assembly conditions affect the mating process. Multiple sensors, including a force sensor, encoders, and two CCD cameras, acquire the information the model requires. A real-time algorithm based on a set-membership approach implements the monitoring task. Experiments confirm the effectiveness of these methods. This study provides a simple but effective monitoring approach for real-time industrial system integration.
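The set-membership idea summarized in the abstract can be sketched as follows: bound each feature by the range observed in the few successful samples (inflated by a tolerance margin), then accept an online mating attempt only if its feature vector falls inside that set. This is a minimal illustration, not the paper's actual model; the two-feature layout (peak force, insertion depth), the margin value, and all numbers are assumptions for demonstration.

```python
import numpy as np

def learn_membership_set(samples, margin=0.1):
    """Bound each feature by the min/max seen in successful samples,
    inflated by a relative margin to tolerate assembly-condition variation."""
    samples = np.asarray(samples, dtype=float)
    lo, hi = samples.min(axis=0), samples.max(axis=0)
    span = hi - lo
    return lo - margin * span, hi + margin * span

def is_normal(feature, bounds):
    """A mating attempt is accepted only if every feature lies inside the set."""
    lo, hi = bounds
    f = np.asarray(feature, dtype=float)
    return bool(np.all((f >= lo) & (f <= hi)))

# A few successful training samples: (peak force [N], insertion depth [mm]).
ok_samples = [(4.8, 10.2), (5.1, 10.0), (5.0, 10.4)]
bounds = learn_membership_set(ok_samples)

print(is_normal((5.0, 10.1), bounds))  # inside the learned set -> True
print(is_normal((8.5, 6.0), bounds))   # high force, shallow depth -> False
```

Because the set is learned from ranges rather than a fitted distribution, only a handful of successful samples are needed, which matches the insufficient-training-sample setting the abstract describes.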

Cite this article as:
P. Di, F. Chen, H. Sasaki, J. Huang, T. Fukuda, and T. Matsuno, “Vision-Force Guided Monitoring for Mating Connectors in Wiring Harness Assembly Systems,” J. Robot. Mechatron., Vol.24, No.4, pp. 666-676, 2012.
References:
  [1] M. T. Zhang, S. Niu, S. Deng et al., “Hierarchical Capacity Planning With Reconfigurable Kits in Global Semiconductor Assembly and Test Manufacturing,” IEEE Trans. Automat. Sci. Eng., Vol.4, pp. 543-552, Oct. 2007.
  [2] W. J. Chen and W. Li, “Fiber Assembly of MEMS Optical Switches With U-Groove Channels,” IEEE Trans. Automat. Sci. Eng., Vol.5, No.2, pp. 207-215, Apr. 2008.
  [3] H. Chen, N. Xi, and G. Li, “CAD-guided automated nanoassembly using atomic force microscopy-based nanorobotics,” IEEE Trans. Automat. Sci. Eng., Vol.3, No.3, pp. 208-217, Jul. 2006.
  [4] H. Okuda, A. Noda, K. Sumi, T. Matsuno, S. Kaneko, and T. Fukuda, “Development of Production Robot System that can Assemble Products with Cable and Connector,” J. of Robotics and Mechatronics, Vol.23, No.6, 2011.
  [5] K. Koo, X. Jiang, A. Konno, and M. Uchiyama, “Development of a Wire Harness Assembly Motion Planner for Redundant Multiple Manipulators,” J. of Robotics and Mechatronics, Vol.23, No.6, 2011.
  [6] T. Matsuno, D. Tamaki, F. Arai, and T. Fukuda, “Manipulation of deformable linear objects using knot invariants to classify the object condition based on image sensor information,” IEEE/ASME Trans. Mechatronics, Vol.11, No.4, pp. 401-408, 2006.
  [7] S. R. Chhatpar and M. S. Branicky, “Search strategies for peg-in-hole assemblies with position uncertainty,” Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, pp. 1465-1470, Maui, HI, Oct. 2001.
  [8] F. Duan, M. Morioka, J. Tan, and T. Arai, “Multi-Modal Assembly-Support System for Cell Production,” Int. J. of Automation Technology, Vol.2, No.5, 2008.
  [9] Y. Q. Fei and X. F. Zhao, “An Assembly Process Modeling and Analysis for Robotic Multiple Peg-in-hole,” J. of Intelligent & Robotic Systems, Vol.36, No.2, pp. 175-189, 2003.
  [10] P. Frank, “Diagnosis in dynamic systems using analytical and knowledge-based redundancy - A survey,” Automatica, Vol.26, pp. 459-474, 1990.
  [11] T. Yamada, A. Tanaka, M. Yamada, Y. Funahashi, and H. Yamamoto, “Identification of Contact Conditions by Active Force Sensing Estimated Parameter Uncertainty and Experimental Verification,” J. of Robotics and Mechatronics, Vol.23, No.1, 2011.
  [12] M. Sugi, I. Matsumura, Y. Tamura, M. Nikaido, J. Ota, T. Arai, K. Kotani, K. Takamasu, H. Suzuki, A. Yamamoto, Y. Sato, S. Shin, and F. Kimura, “Quantitative Evaluation of Automatic Parts Delivery in Attentive Workbench Supporting Workers in Cell Production,” J. of Robotics and Mechatronics, Vol.21, No.1, 2009.
  [13] D. Lee, X. Tao, H. Cho, and Y. Cho, “A Dual Imaging System for Flip-Chip Alignment Using Visual Servoing,” J. of Robotics and Mechatronics, Vol.18, No.6, 2006.
  [14] R. Kato and T. Arai, “Assessment of Mental Stress on Human Operators Induced by the Assembly Support in a Robot-Assisted ‘Cellular Manufacturing’ Assembly System,” Int. J. of Automation Technology, Vol.3, No.5, 2009.
  [15] J. Huang, T. Fukuda, and T. Matsuno, “Model-based intelligent fault detection and diagnosis for mating electric connectors in robotic wiring harness assembly systems,” IEEE/ASME Trans. Mechatronics, Vol.13, No.1, pp. 86-94, 2008.
  [16] J. Huang, P. Di, T. Fukuda, and T. Matsuno, “Robust Model-based Online Fault Detection for Mating Process of Electric Connectors in Robotic Wiring Harness Assembly Systems,” IEEE Trans. Control Syst. Technol., Vol.18, No.5, pp. 1207-1215, Sep. 2010.
  [17] J. P. Lewis, “Fast Template Matching,” Vision Interface 95, Canadian Image Processing and Pattern Recognition Society, Quebec City, Canada, pp. 120-123, May 15-19, 1995.
  [18] M. Sonka, V. Hlavac, and R. Boyle, “Image Processing, Analysis and Machine Vision,” CL-Engineering, 2nd edition, 1998.

Supporting Online Materials:
  [a] M. Kvasnica, P. Grieder, and M. Baotic, “MPT Multi-Parametric Toolbox,” ˜mpt/

