
JRM Vol.25 No.1 pp. 72-79 (2013)
doi: 10.20965/jrm.2013.p0072

Paper:

Haptic Augmentation Utilizing the Reaction Force of a Base Object

Yuichi Kurita*, Atsutoshi Ikeda**, Kazuyuki Nagata***,
and Tsukasa Ogasawara**

*Faculty of Engineering, Hiroshima University, 1-4-1 Kagamiyama, Higashi-Hiroshima, Hiroshima 739-8527, Japan

**Graduate School of Information Science, Nara Institute of Science and Technology, 8916-5 Takayama, Ikoma, Nara 630-0192, Japan

***National Institute of Advanced Industrial Science and Technology, 1-1-1 Umezono, Tsukuba, Ibaraki 305-8568, Japan

Received: January 17, 2012
Accepted: April 27, 2012
Published: February 20, 2013
Keywords: haptic interface, haptic augmentation, frequency analysis
Abstract
A haptic device is a promising human-computer interface that provides users with force information in a virtual reality environment. The proposed haptic augmentation presents the force response of a target object by combining the force generated by a haptic device with the reaction force of a base object whose material properties are similar to those of the target object. In this paper, the concept of the haptic augmentation technique is described and a prototype haptic augmentation system is developed. Frequency characteristics and experiments with human participants show that the proposed method outperforms a conventional device-only method.
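As a rough, hypothetical sketch of the idea described in the abstract (not the authors' implementation), the snippet below models both the base and target objects as linear springs; the haptic device then only has to render the difference between the target response and the base response, while the real base object supplies the remaining reaction force. All function names and stiffness values are illustrative assumptions.

```python
# Minimal sketch of the haptic-augmentation idea from the abstract.
# Assumption: base and target objects behave as linear springs; the device
# renders only the residual force so that base + device = target.

def target_force(displacement_m: float) -> float:
    """Assumed force response of the target object (hypothetical stiffness)."""
    k_target = 800.0  # N/m, illustrative value
    return k_target * displacement_m

def base_force(displacement_m: float) -> float:
    """Assumed force response of the base object with similar material properties."""
    k_base = 600.0  # N/m, illustrative value
    return k_base * displacement_m

def device_command(displacement_m: float) -> float:
    """Force the haptic device must add so the combined response matches the target."""
    return target_force(displacement_m) - base_force(displacement_m)

if __name__ == "__main__":
    for x in (0.000, 0.005, 0.010):  # pressing depths in meters
        f_dev = device_command(x)
        f_total = base_force(x) + f_dev
        print(f"x = {x*1000:4.1f} mm: device adds {f_dev:5.2f} N, "
              f"user feels {f_total:5.2f} N (target {target_force(x):5.2f} N)")
```

In this sketch the device only has to produce the (smaller) residual force rather than the full target response, which is the point of borrowing the base object's reaction force.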
Cite this article as:
Y. Kurita, A. Ikeda, K. Nagata, and T. Ogasawara, “Haptic Augmentation Utilizing the Reaction Force of a Base Object,” J. Robot. Mechatron., Vol.25 No.1, pp. 72-79, 2013.
