JRM Vol.23 No.3 pp. 328-337
doi: 10.20965/jrm.2011.p0328
(2011)

Paper:

Robot Hand Whose Fingertip Covered with Net-Shape Proximity Sensor – Moving Object Tracking Using Proximity Sensing –

Hiroaki Hasegawa*, Yosuke Suzuki*, Aiguo Ming*,
Masatoshi Ishikawa**, and Makoto Shimojo*

*Department of Mechanical Engineering and Intelligent Systems, The University of Electro-Communications, East-4th Bldg., 1-5-1 Chofugaoka, Chofu-city, Tokyo 182-8585, Japan

**Department of Mathematical Engineering and Information Physics, Faculty of Engineering, The University of Tokyo, #6 Bldg., 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033, Japan

Received:
October 1, 2010
Accepted:
January 31, 2011
Published:
June 20, 2011
Keywords:
robot hand, proximity sensor, sensor fusion
Abstract

Occlusion within several millimeters of an object to be grasped makes it difficult for a vision-sensor-based approach to detect the relative position between the object and the robot fingers just before grasping. The proximity sensor we propose detects the object very effectively at this near range. We developed a thin proximity sensor sheet covering the three fingers of a robot hand. Integrating the sensors with hand control, we implemented an object-tracking controller. Using proximity sensing signals, the controller coordinates wrist positioning based on palm proximity sensors with grasping based on fingertip sensors, enabling the hand to track and capture moving objects.
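The two-layer control structure described above can be sketched in code. This is a minimal illustrative sketch, not the authors' implementation: the palm proximity readings are reduced to a weighted centroid that drives a proportional wrist correction, while the fingertips trigger the grasp once the object is within near range on all fingers. All names, gains, and thresholds (`k_track`, `grasp_threshold`, the sensor data format) are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class HandState:
    wrist_xy: tuple   # wrist position in the palm plane (illustrative units)
    grasping: bool    # whether the fingers have been commanded to close

def palm_centroid(palm_readings):
    """Weighted centroid of palm proximity intensities.

    palm_readings: list of ((x, y), intensity) pairs from the palm sensor.
    Returns the estimated object position in the palm plane, or None
    when nothing is detected.
    """
    total = sum(w for _, w in palm_readings)
    if total == 0:
        return None
    x = sum(p[0] * w for p, w in palm_readings) / total
    y = sum(p[1] * w for p, w in palm_readings) / total
    return (x, y)

def control_step(state, palm_readings, fingertip_distances,
                 k_track=0.5, grasp_threshold=5.0):
    """One control cycle: track the object with the wrist, then check
    the fingertip proximity sensors to decide whether to grasp."""
    target = palm_centroid(palm_readings)
    if target is not None:
        # Proportional wrist correction toward the sensed object.
        dx = k_track * (target[0] - state.wrist_xy[0])
        dy = k_track * (target[1] - state.wrist_xy[1])
        state = HandState((state.wrist_xy[0] + dx, state.wrist_xy[1] + dy),
                          state.grasping)
    # Close the fingers once every fingertip senses the object at near range.
    if all(d < grasp_threshold for d in fingertip_distances):
        state = HandState(state.wrist_xy, True)
    return state

# Example cycle: object sensed up and to the right of the palm center,
# fingertips not yet within grasping range.
state = HandState((0.0, 0.0), False)
readings = [((10.0, 0.0), 2.0), ((0.0, 10.0), 2.0)]
state = control_step(state, readings, [8.0, 7.5, 9.0])
print(state.wrist_xy, state.grasping)
```

The key design point the sketch mirrors is the division of labor in the paper's controller: palm sensing handles coarse tracking of the moving object, and fingertip sensing handles the final capture decision.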

Cite this article as:
Hiroaki Hasegawa, Yosuke Suzuki, Aiguo Ming,
Masatoshi Ishikawa, and Makoto Shimojo, “Robot Hand Whose Fingertip Covered with Net-Shape Proximity Sensor – Moving Object Tracking Using Proximity Sensing –,” J. Robot. Mechatron., Vol.23, No.3, pp. 328-337, 2011.
