JRM Vol.35 No.3 pp. 734-742
doi: 10.20965/jrm.2023.p0734


Through-Hole Detection and Finger Insertion Planning as Preceding Motion for Hooking and Caging Ring-Shaped Objects

Koshi Makihara*, Takuya Otsubo**, and Satoshi Makita***

*Osaka University
1-3 Machikaneyama, Toyonaka, Osaka 560-8531, Japan

**National Institute of Technology, Sasebo College
1-1 Okishincho, Sasebo, Nagasaki 857-1193, Japan

***Fukuoka Institute of Technology
3-30-1 Wajiro-higashi, Higashi-ku, Fukuoka, Fukuoka 811-0295, Japan

Received: December 1, 2022
Accepted: March 29, 2023
Published: June 20, 2023

Keywords: caging, manipulation, objects with holes, perception, motion planning

This study investigated a pregrasp strategy for hooking and caging ring-shaped objects. A through-hole feature enables a robot hand to hook an object by inserting a finger into one of its holes. Compared with directly grasping the ring, the insertion motion is more tolerant of positioning errors and better avoids collisions between the hand and the object. Instead of recognizing the exact shape of the object, we detected only its ring-shaped feature as a through-hole to be inserted into, and estimated its approximate center position and orientation from the point cloud of the object. The estimated geometric properties enabled the robotic gripper to approach the object and complete the insertion. The proposed perception and motion-planning method was demonstrated on rigid and deformable objects with holes.
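The paper itself provides no code; as a rough illustration of the kind of geometric estimation the abstract describes (the function name and the NumPy-based point-cloud representation are assumptions, not the authors' implementation), the approximate hole center and axis of a ring-shaped point cloud can be sketched with a centroid and a best-fit-plane normal:

```python
import numpy as np

def estimate_hole_pose(points):
    """Estimate an approximate through-hole center and axis.

    points: (N, 3) array of 3D points sampled on a ring-shaped object.
    Returns (center, axis): the centroid of the points and the unit
    normal of their best-fit plane, which can serve as the hole axis
    along which a finger-insertion approach is planned.
    """
    center = points.mean(axis=0)
    # PCA via SVD of the centered points: the right singular vector
    # with the smallest singular value is the best-fit-plane normal.
    _, _, vt = np.linalg.svd(points - center)
    axis = vt[-1]          # already unit length
    return center, axis

# Example: a unit ring in the xy-plane with small out-of-plane noise.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
ring = np.stack([np.cos(theta),
                 np.sin(theta),
                 0.01 * np.sin(3 * theta)], axis=1)
center, axis = estimate_hole_pose(ring)
```

For the example ring above, the recovered center lies near the origin and the axis is close to the z-direction (up to sign), which is the information the abstract says is needed for the gripper's approach motion.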

Planning of finger insertion motion

Cite this article as:
K. Makihara, T. Otsubo, and S. Makita, “Through-Hole Detection and Finger Insertion Planning as Preceding Motion for Hooking and Caging Ring-Shaped Objects,” J. Robot. Mechatron., Vol.35, No.3, pp. 734-742, 2023.

Last updated on Sep. 29, 2023