
JRM Vol.18 No.5 pp. 564-571 (2006)
doi: 10.20965/jrm.2006.p0564

Paper:

Tracing Manipulation in Clothes Spreading by Robot Arms

Khairul Salleh, Hiroaki Seki, Yoshitsugu Kamiya,
and Masatoshi Hikizu

Kanazawa University, Kakuma-machi, Kanazawa, Ishikawa 920-1192, Japan

Received: October 14, 2005
Accepted: December 27, 2005
Published: October 20, 2006
Keywords: edge tracing, deformable object, corner of clothes, clothes spreading, home service robot
Abstract

Edge tracing is important in manipulating deformable objects to reveal their original shape. In this paper, we propose a unique, improved tracing manipulation for spreading a towel, taken as an example of a deformable object, using two robot arms with sensor-equipped grippers and a CCD camera. In the context of this paper, tracing means following the towel's edge. Robot arm movement is based on feedback from the gripper sensors and from images captured by the CCD camera. Our proposed tracing manipulation ensures that the two corners grasped by the robot arms are adjacent rather than diagonally opposite, so the towel can be successfully spread. Experimental results from spreading rectangular towels of different thickness, stiffness, smoothness, and color demonstrated that the improved tracing manipulation is also robust.
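The core idea of the abstract can be illustrated geometrically: starting from one grasped corner, follow the towel's edge until the first sharp change of heading, which is by construction a corner adjacent to the start, never the diagonally opposite one. The sketch below is purely illustrative and is not the authors' implementation: the paper's method uses real gripper-sensor and CCD-camera feedback, whereas here the towel edge is idealized as a sampled rectangle boundary and "sensing" is replaced by a geometric heading test.

```python
import math

def sample_rect_edge(w, h, n_per_side=20):
    """Sample points counter-clockwise along the boundary of a w x h
    rectangle (an idealized, fully flattened towel edge)."""
    pts = []
    for i in range(n_per_side):
        pts.append((w * i / n_per_side, 0.0))          # bottom edge
    for i in range(n_per_side):
        pts.append((w, h * i / n_per_side))            # right edge
    for i in range(n_per_side):
        pts.append((w * (1 - i / n_per_side), h))      # top edge
    for i in range(n_per_side):
        pts.append((0.0, h * (1 - i / n_per_side)))    # left edge
    return pts

def trace_to_adjacent_corner(points, angle_thresh_deg=45.0):
    """Follow the sampled edge from points[0] and stop at the first
    sharp heading change (a corner). Because tracing proceeds along
    the edge, the corner reached is always adjacent to the start."""
    for i in range(1, len(points) - 1):
        ax = points[i][0] - points[i - 1][0]
        ay = points[i][1] - points[i - 1][1]
        bx = points[i + 1][0] - points[i][0]
        by = points[i + 1][1] - points[i][1]
        da = math.degrees(abs(math.atan2(ay, ax) - math.atan2(by, bx)))
        if min(da, 360.0 - da) > angle_thresh_deg:
            return points[i]
    return None  # no corner found before the edge samples ran out
```

Starting from the corner (0, 0) of a 2 x 1 towel, the trace stops at (2, 0), an adjacent corner, rather than the diagonal corner (2, 1), which mirrors the adjacency guarantee the abstract describes.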

Cite this article as:
Khairul Salleh, Hiroaki Seki, Yoshitsugu Kamiya, and Masatoshi Hikizu, "Tracing Manipulation in Clothes Spreading by Robot Arms," J. Robot. Mechatron., Vol.18, No.5, pp. 564-571, 2006.
