
J. Robot. Mechatron., Vol.22 No.1, pp. 50-64, 2010
doi: 10.20965/jrm.2010.p0050

Paper:

Recognition and Removal of Interior Facilities by Vision-Based Robot System

S. Rolando Cruz-Ramírez, Tatsuo Arai, Yasushi Mae,
Tomohito Takubo, and Kenichi Ohara

Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka 560-8531, Japan

Received: July 18, 2009
Accepted: September 8, 2009
Published: February 20, 2010
Keywords: dismantling robot system, human-robot collaboration, multiple viewpoints for recognition, active lighting, on-site experiment
Abstract
For future dismantling jobs in the renovation of office building interiors, we propose a robotic dismantling system that assists human workers with physically demanding tasks. As an application of the system, this paper presents the removal of ceiling fixtures, such as Lamp Panels (LPs) and Air Conditioning Vents (ACVs), carried out jointly by a human worker and a robot. In this collaboration, a robot arm assists by holding and collecting the fixtures, while the human worker only removes screws and/or nuts. To guide the robot to a holding position, the worker indicates a position on the fixture with brief, simple instructions. The robot then estimates the pose of the fixture through 3D model-based object recognition using a hand-mounted stereo camera. Integrating multiple robot viewpoints with an active lighting system makes recognition robust both to natural lighting changes at the site and to variability in the relative pose between the camera and the object to be recognized. As a verification experiment, the sequential removal of several different ceiling fixtures is presented; robust recognition is achieved with an average accuracy of 10 mm. The feasibility of the system is verified against the completion-time and precision requirements of a practical environment.
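
As a rough illustration of the multiple-viewpoint idea described above, the following Python sketch (not the authors' implementation; all names, scores, and thresholds are illustrative assumptions) keeps, among several per-viewpoint pose candidates, the one with the strongest model match, and checks its translation against the 10 mm accuracy figure reported in the abstract.

    # Minimal sketch of multi-viewpoint pose selection (assumed, illustrative).
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class PoseHypothesis:
        rotation: np.ndarray     # 3x3 rotation matrix, camera frame -> fixture model
        translation: np.ndarray  # 3-vector, millimetres
        score: float             # model-matching quality in [0, 1] (hypothetical metric)

    def fuse_viewpoints(hypotheses, min_score=0.5):
        """Keep the best-scoring pose among viewpoints; None means 'move and retry'."""
        valid = [h for h in hypotheses if h.score >= min_score]
        return max(valid, key=lambda h: h.score) if valid else None

    def within_tolerance(estimate, reference_mm, tol_mm=10.0):
        """Compare a pose translation against a reference position (10 mm tolerance)."""
        return float(np.linalg.norm(estimate.translation - reference_mm)) <= tol_mm

    # Illustrative usage with two synthetic viewpoints of one ceiling fixture:
    hyps = [
        PoseHypothesis(np.eye(3), np.array([402.0, 118.0, 1495.0]), score=0.62),
        PoseHypothesis(np.eye(3), np.array([398.0, 121.0, 1502.0]), score=0.81),
    ]
    best = fuse_viewpoints(hyps)
    print(within_tolerance(best, np.array([400.0, 120.0, 1500.0])))  # True

If no viewpoint clears the score threshold, a plausible recovery (consistent with the paper's setup) is to move the hand-mounted camera to another viewpoint and adjust the active lighting before retrying.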
Cite this article as:
S. Cruz-Ramírez, T. Arai, Y. Mae, T. Takubo, and K. Ohara, “Recognition and Removal of Interior Facilities by Vision-Based Robot System,” J. Robot. Mechatron., Vol.22 No.1, pp. 50-64, 2010.
References
  [1] Ministry of Land, Infrastructure and Transport (MLIT, Japan), Task Committee on New Construction Industry, Bureau of Construction Economy. URL: http://www.mlit.go.jp/sogoseisaku/const/sinko/kumiai/honbun01.htm (in Japanese).
  [2] J. Naito, G. Obinata, A. Nakayama, and K. Hase, “Development of a wearable robot for assisting carpentry workers,” Int. J. of Advanced Robotic Systems, Vol.4, No.4, pp. 431-436, 2007.
  [3] S. N. Yu, S. Y. Lee, C. S. Han, K. Y. Lee, and S. H. Lee, “Development of the curtain wall installation robot: Performance and efficiency tests at a construction site,” Autonomous Robots, Vol.22, No.3, pp. 281-291, 2007.
  [4] C. Gordon, F. Boukamp, D. Huber, E. Latimer, K. Park, and B. Akinci, “Combining reality capture technologies for construction defect detection: a case study,” in: EIA9 E-Activities and Intelligent Support in Design and the Built Environment, 9th EuropIA Int. Conf., Istanbul, Turkey, pp. 99-108, 2003.
  [5] J. Maeda, H. Takada, and Y. Abe, “Applicable possibility studies on a humanoid robot to cooperative work on construction site with a human worker,” in: Proc. ISARC 2004 21st Int. Symposium on Automation and Robotics in Construction, Jeju, Korea, pp. 334-339, 2004.
  [6] K. Tanaka, M. Kajitani, C. Kanamori, and Y. Abe, “Development of a Construction Robot for Marking on Ceiling Boards (3rd Report, Prototype of the Laser Pointer System),” Trans. of the Japan Society of Mechanical Engineers, Vol.69, No.679, pp. 676-682, 2003.
  [7] M. J. Bakari, K. M. Zied, and D. W. Seward, “Development of a Multi-Arm Mobile Robot for Nuclear Decommissioning Tasks,” Int. J. of Advanced Robotic Systems, Vol.4, No.4, pp. 387-406, 2007.
  [8] G. Dini, F. Failli, and M. Santochi, “A disassembly planning software system for the optimization of recycling processes,” Production Planning and Control, Vol.12, pp. 2-12, 2001.
  [9] U. Büker, S. Drüe, N. Götze, G. Hartmann, B. Kalkreuter, R. Stemmer, and R. Trapp, “Vision-based control of an autonomous disassembly station,” Robotics and Autonomous Systems, Vol.35, pp. 179-189, 2001.
  [10] Y. Shimoi, K. Kanemaru, T. Morita, K. Fujishima, K. Tomita, and A. Iwasaki, “Operability Assessment of Indoor Dismantlement Assistance Machine,” in: Proc. ISARC 2006 23rd Int. Symposium on Automation and Robotics in Construction, Tokyo, Japan, pp. 822-827, 2006.
  [11] M. Kamezaki, H. Iwata, and S. Sugano, “Development of an Operation Skill-Training Simulator for Double-Front Construction Machinery –Training Effect for a House Demolition Work–,” J. of Robotics and Mechatronics, Vol.20, No.4, pp. 602-609, 2008.
  [12] S. R. Cruz-Ramírez, Y. Ishizuka, Y. Mae, T. Takubo, and T. Arai, “Dismantling interior facilities in buildings by human robot collaboration,” in: Proc. ICRA 2008 IEEE Int. Conf. on Robotics and Automation, Pasadena, CA, USA, pp. 2583-2590, 2008.
  [13] S. Kunimitsu, H. Asama, K. Kawabata, and T. Mishima, “Development of Crane Vision for Positioning Container,” J. of Robotics and Mechatronics, Vol.16, No.2, pp. 186-193, 2004.
  [14] F. Tomita, T. Yoshimi, T. Ueshiba, Y. Kawai, Y. Sumi, T. Matsushita, N. Ichimura, K. Sugimoto, and Y. Ishiyama, “R&D of versatile 3D vision system VVV,” in: Proc. SMC 1998 IEEE Int. Conf. on Systems, Man, and Cybernetics, San Diego, CA, USA, pp. 4510-4516, 1998.
  [15] Y. Sumi, Y. Kawai, T. Yoshimi, and F. Tomita, “3D object recognition in cluttered environments by segment-based stereo vision,” Int. J. of Computer Vision, Vol.46, pp. 5-23, 2002.
  [16] T. Oomichi, T. Arai, K. Inoue, T. Kotoku, T. Tanikawa, and J. Maeda, “Dismantle robot system for categorizing wastes of building renewal,” in: Proc. JSME 2007 Conf. on Robotics and Mechatronics, Akita, Japan, 1P1-M06, 2007 (in Japanese).
  [17] Y. Ishizuka, S. R. Cruz-Ramírez, Y. Mae, T. Takubo, and T. Arai, “Usability of interface devices for human robot collaboration,” in: Proc. ISFA 2008 Int. Symposium on Flexible Automation, Atlanta, GA, USA, JL036, 2008.
  [18] J. Maeda, T. Oomichi, T. Arai, T. Kotoku, and T. Tanikawa, “System for demolishment of ceiling materials with ID-tags in renewal,” in: Proc. SICE 2008 9th System Integration Division, Gifu, Japan, pp. 267-268, 2008 (in Japanese).
  [19] T. Kotoku, T. Tanikawa, G. Biggs, B. K. Kim, and K. Ohba, “Cooperative task support system with ID-tag,” in: Proc. SICE 2008 9th System Integration Division, Gifu, Japan, pp. 269-270, 2008 (in Japanese).
  [20] M. Wakita, T. Nawa, S. Asizawa, K. Inaba, Y. Kuromiya, T. Watanabe, and T. Oomichi, “Development of dismantling system for ceiling board using water jet,” in: Proc. SICE 2008 9th System Integration Division, Gifu, Japan, pp. 275-276, 2008 (in Japanese).
  [21] K. Hashimoto, “A Review on Vision-based Control of Robot Manipulators,” Advanced Robotics, Vol.17, No.10, pp. 969-991, 2003.
  [22] J. Zhu, Y. Mae, and M. Minami, “Finding and quantitative evaluation of minute flaws on metal surface using hairline,” IEEE Trans. on Industrial Electronics, Vol.54, No.3, pp. 1420-1429, June 2007.
  [23] O. Morel, C. Stolz, F. Meriaudeau, and P. Gorria, “Active lighting applied to three-dimensional reconstruction of specular metallic surfaces by polarization imaging,” Applied Optics, Vol.45, No.17, pp. 4062-4068, June 2006.
  [24] S. Yi, R. M. Haralick, and L. G. Shapiro, “Optimal sensor and light source positioning for machine vision,” Computer Vision and Image Understanding, Vol.61, No.1, pp. 122-137, 1995.
  [25] S. K. Kopparapu, “Lighting design for machine vision application,” Image and Vision Computing, Vol.24, No.7, pp. 720-726, July 2006.
  [26] K. Takaki, T. Tomizawa, T. Tanikawa, K. Ohba, and M. Mizukawa, “Distributed actuation module for ubiquitous robot,” in: Proc. ICCAS 2008 Int. Conf. on Control, Automation and Systems, Seoul, Korea, pp. 2394-2399, 2008.
