
JRM Vol.20 No.6 pp. 872-879 (2008)
doi: 10.20965/jrm.2008.p0872

Paper:

Development of a Virtual Arm Wrestling System for Force Display Communication Analysis

Takashi Yamada* and Tomio Watanabe**

*Faculty of Education, Kagawa University, 1-1 Saiwai-cho, Takamatsu-shi, Kagawa 760-8522, Japan

**Faculty of Computer Science and System Engineering, Okayama Prefectural University, 111 Kuboki, Soja, Okayama 719-1197, Japan

Received: March 12, 2008
Accepted: June 4, 2008
Published: December 20, 2008
Keywords: force display communication, virtual arm wrestling, virtual human, friendly human interface, peripheral skin temperature
Abstract
We propose a prototype virtual arm wrestling system for force display communication analysis. The system uses a 5-DOF force display with 4 air cylinders and a force sensor, which we developed for affect display and interaction with a virtual human based on nonverbal human behavior and physiological measurement during arm wrestling. We evaluated the relationship between force display and peripheral finger skin temperature, a physiological index of circulation dynamics that responds to forced action, and confirmed the system's effectiveness for analyzing force display communication.
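The abstract describes a pneumatic 5-DOF force display (4 air cylinders plus a force sensor) rendering a virtual opponent, with peripheral finger skin temperature logged as a physiological index. As a rough illustration only, and not the authors' implementation, the following minimal Python sketch shows one way such a loop could be organized: a hypothetical spring-damper opponent model, placeholder hardware I/O functions, and synchronized logging of displayed force and skin temperature for later analysis. All names, gains, and rates below are assumptions.

import random

# Hypothetical constants; the paper does not give control rates or gains here.
CONTROL_HZ = 100           # control-loop rate [Hz]
OPPONENT_STIFFNESS = 40.0  # virtual opponent spring constant [N*m/rad]
OPPONENT_DAMPING = 2.0     # virtual opponent damping [N*m*s/rad]

def read_force_sensor() -> float:
    """Placeholder for the wrist force sensor; returns the player's torque [N*m]."""
    return random.uniform(-5.0, 5.0)

def read_skin_temperature() -> float:
    """Placeholder for a peripheral finger skin-temperature probe [deg C]."""
    return 33.0 + random.uniform(-0.2, 0.2)

def command_cylinders(torque: float) -> None:
    """Placeholder for driving the air cylinders to render the commanded torque."""
    pass

def run_match(duration_s: float = 5.0) -> list[tuple[float, float, float, float]]:
    """Run a simplified arm-wrestling loop; log (time, angle, player torque, skin temp)."""
    log = []
    angle, velocity = 0.0, 0.0   # arm angle [rad] and angular velocity [rad/s]
    dt = 1.0 / CONTROL_HZ
    t = 0.0
    while t < duration_s:
        player_torque = read_force_sensor()
        # Virtual opponent modeled as a spring-damper pulling the arm back to 0 rad.
        opponent_torque = -OPPONENT_STIFFNESS * angle - OPPONENT_DAMPING * velocity
        # Crude forward-Euler integration of a unit-inertia arm.
        velocity += (player_torque + opponent_torque) * dt
        angle += velocity * dt
        command_cylinders(opponent_torque)  # force display renders the opponent's torque
        log.append((t, angle, player_torque, read_skin_temperature()))
        t += dt
    return log

if __name__ == "__main__":
    samples = run_match(1.0)
    print(f"logged {len(samples)} samples; final arm angle = {samples[-1][1]:.3f} rad")

In the real system, the placeholder I/O functions would be replaced by the pneumatic cylinder drivers and the force and temperature sensors described in the paper; the spring-damper opponent is only a stand-in for the virtual-human force display.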
Cite this article as:
T. Yamada and T. Watanabe, “Development of a Virtual Arm Wrestling System for Force Display Communication Analysis,” J. Robot. Mechatron., Vol.20 No.6, pp. 872-879, 2008.
