
JRM Vol.19 No.2, pp. 148-159 (2007)
doi: 10.20965/jrm.2007.p0148

Paper:

Development of Four Kinds of Mobile Robot with Preliminary-Announcement and Indication Function of Upcoming Operation

Takafumi Matsumaru

Bio-Robotics & Human-Mechatronics Laboratory, Shizuoka University, 3-5-1 Johoku, Hamamatsu, Shizuoka 432-8561, Japan

Received: November 2, 2006
Accepted: February 13, 2007
Published: April 20, 2007
Keywords: mutual coexistence, mobile robot, preliminary-announcement and indication, upcoming operation, informational affinity
Abstract
We propose approaches and equipment by which a mobile robot moving on a two-dimensional plane preliminarily announces and indicates to nearby people the speed and direction of its upcoming movement. We introduce four approaches, categorized into (1) announcing the state just after the present and (2) continuously indicating the operation from the present to some future time. To realize these approaches, we use an omni-directional display (PMR-2), a flat-panel display (PMR-6), a laser pointer (PMR-1), and projection equipment (PMR-5) as the announcement unit of the prototype robots. The four prototypes were exhibited at the 2005 International Robot Exhibition (iREX05), where we had visitors answer questionnaires using a five-stage evaluation. The projector robot PMR-5 received the highest evaluation score of the four. An examination of differences by gender and age suggested that some people prefer simple information, friendly expressions, and a minimum of information presented at one time.
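To make the second category concrete, the sketch below (not taken from the paper; all function names and parameters are illustrative assumptions) shows one way a projector-type robot such as PMR-5 could compute the "operation from the present to some future time": integrating the current translational and rotational speed of a differential-drive base forward a few seconds to obtain the path points to be drawn on the floor ahead of the robot.

# Minimal sketch, assuming a differential-drive robot and a constant
# velocity command over the preview horizon; names are hypothetical.
import math

def predict_path(x, y, theta, v, omega, horizon=3.0, dt=0.1):
    """Integrate the current command (v, omega) forward for `horizon` seconds.

    x, y, theta : current pose [m, m, rad]
    v, omega    : translational speed [m/s] and rotational speed [rad/s]
    Returns a list of (x, y) points a projector could draw ahead of the robot.
    """
    points = []
    t = 0.0
    while t < horizon:
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
        points.append((x, y))
        t += dt
    return points

# Example: moving forward at 0.5 m/s while turning left at 0.2 rad/s.
if __name__ == "__main__":
    path = predict_path(0.0, 0.0, 0.0, v=0.5, omega=0.2)
    print(path[:5])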
Cite this article as:
T. Matsumaru, “Development of Four Kinds of Mobile Robot with Preliminary-Announcement and Indication Function of Upcoming Operation,” J. Robot. Mechatron., Vol.19 No.2, pp. 148-159, 2007.
References
[1] Research Committee on Human Friendly Robot, “Technical Targets of Human Friendly Robots,” J. of the Robotics Society of Japan, 16(3), pp. 288-294, 1998.
[2] N. Sugimoto, “Robot-Safety and Intelligent Fail-Safe,” J. of the Robotics Society of Japan, 2(2), pp. 158-163, 1984.
[3] N. Sugimoto, “Robot and Safety,” J. of the Robotics Society of Japan, 3(1), pp. 56-59, 1985.
[4] E. Prassler, D. Bank, and B. Kluge, “Key Technologies in Robot Assistants: Motion Coordination Between a Human and a Mobile Robot,” Trans. on Control, Automation and Systems Engineering, 4(1), pp. 56-61, 2002.
[5] J. M. H. Wandosell and B. Graf, “Non-Holonomic Navigation System of a Walking-Aid Robot,” Proc. of the 11th IEEE Int. Workshop on Robot and Human Interactive Communication (ROMAN 2002), pp. 518-523, 2002.
[6] Y. Yamada and N. Sugimoto, “Evaluation of Human Pain Tolerance,” J. of the Robotics Society of Japan, 13(5), pp. 639-642, 1995.
[7] M. Inaba, Y. Hoshino, and H. Inoue, “A Full-Body Tactile Sensor Suit Using Electrically Conductive Fabric,” J. of the Robotics Society of Japan, 16(1), pp. 80-86, 1998.
[8] T. Morita, Y. Suzuki, T. Kawasaki, and S. Sugano, “Anticollision Safety Design and Control Methodology for Human-Symbiotic Robot Manipulator,” J. of the Robotics Society of Japan, 16(1), pp. 102-109, 1998.
[9] D. Vischer and O. Khatib, “Design and Development of High Performance Torque Controlled Joints,” IEEE Trans. on Robotics and Automation, 11(4), pp. 537-544, 1995.
[10] K. Koganezawa, M. Yamazaki, and N. Ishikawa, “Mechanical Stiffness Control of Tendon-Driven Joints,” J. of the Robotics Society of Japan, 18(7), pp. 101-108, 2000.
[11] M. Sakaguchi, J. Furusho, G. Zhang, and Z. Wei, “Development of ER Actuator and Basic Study on its Force Control System,” J. of the Robotics Society of Japan, 16(8), pp. 1108-1114, 1998.
[12] K. Mase, “Automatic Extraction and Recognition of Face and Gesture,” J. of the Robotics Society of Japan, 16(6), pp. 745-748, 1998.
[13] E. Ueda, Y. Matsumoto, M. Imai, and T. Ogasawara, “Hand Pose Estimation for Vision Based Human Interface,” Proc. of 10th IEEE Int. Workshop on Robot and Human Communication (ROMAN 2001), pp. 473-478, 2001.
[14] L. Bretzner, I. Laptev, and T. Lindeberg, “Hand Gesture Recognition Using Multi-Scale Colour Features, Hierarchical Models and Particle Filtering,” Proc. 5th IEEE Int. Conf. on Automatic Face and Gesture Recognition, pp. 405-410, 2002.
[15] H. Fei and I. Reid, “Dynamic Classifier for Non-rigid Human Motion Analysis,” British Machine Vision Conf., p. 118, 2004.
[16] Y. Matsumoto, M. Inaba, and H. Inoue, “View-Based Approach to Robot Navigation,” J. of the Robotics Society of Japan, 20(5), pp. 44-52, 2002.
[17] Y. Kuno, N. Shimada, and Y. Shirai, “Look Where You’re Going: A Robotic Wheelchair Based on the Integration of Human and Environmental Observations,” IEEE Robotics and Automation Magazine, 10(1), pp. 26-34, 2003.
[18] T. Ohno and N. Mukawa, “A Free-head, Simple Calibration, Gaze Tracking System That Enables Gaze-Based Interaction,” Proc. of Eye Tracking Research & Application Symposium 2004 (ETRA 2004), pp. 115-122, 2004.
[19] Y. Liu, K. Schmidt, J. F. Cohn, and R. L. Weaver, “Human Facial Asymmetry for Expression-Invariant Facial Identification,” Proc. of the Fifth IEEE Int. Conf. on Automatic Face and Gesture Recognition (FG’02), pp. 208-214, 2002.
[20] Y.-L. Tian, T. Kanade, and J. Cohn, “Facial Expression Analysis,” in S. Z. Li and A. K. Jain (Eds.), “Handbook of Face Recognition,” Springer, 2005.
[21] P. D. Wellner, “The DigitalDesk Calculator: Tactile Manipulation on a Desk Top Display,” Proc. ACM Symp. on User Interface Software and Technology (UIST ’91), pp. 27-33, 1991.
[22] H. Koike, Y. Sato, and Y. Kobayashi, “Integrating Paper and Digital Information on EnhancedDesk: A Method for Realtime Finger Tracking on an Augmented Desk System,” ACM Transactions on Computer-Human Interaction (TOCHI), 8(4), pp. 307-322, 2001.
[23] J. Rekimoto, “SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces,” Proc. SIGCHI Conf. on Human Factors in Computing Systems (CHI ’02), pp. 113-120, 2002.
[24] J. Patten, H. Ishii, J. Hines, and G. Pangaro, “Sensetable: A Wireless Object Tracking Platform for Tangible User Interfaces,” Proc. of the SIGCHI Conf. on Human Factors in Computing Systems (CHI ’01), pp. 253-260, 2001.
[25] B. Piper, C. Ratti, and H. Ishii, “Illuminating Clay: A 3-D Tangible Interface for Landscape Analysis,” Proc. of the SIGCHI Conf. on Human Factors in Computing Systems (CHI ’02), pp. 355-362, 2002.
[26] T. Matsui and M. Tsukamoto, “An Integrated Teleoperation Method for Robots Using Multi-Media-Display,” J. of the Robotics Society of Japan, 6(4), pp. 301-310, 1988.
[27] M. Terashima and S. Sakane, “A Human-Robot Interface Using an Extended Digital Desk Approach,” J. of the Robotics Society of Japan, 16(8), pp. 1091-1098, 1998.
[28] H. K. Keskinpala, J. A. Adams, and K. Kawamura, “PDA-Based Human-Robotic Interface,” Proc. of the 2003 IEEE Int. Conf. on Systems, Man, and Cybernetics, pp. 3931-3936, 2003.
[29] T. W. Fong, C. Thorpe, and B. Glass, “PdaDriver: A Handheld System for Remote Driving,” The 11th Int. Conf. on Advanced Robotics 2003, pp. 88-93, 2003.
[30] T. Ogata and S. Sugano, “Emotional Communication between Humans and the Autonomous Robot WAMOEBA-2 (Waseda Amoeba) which has the Emotion Model,” JSME Int. J., Series C, 43(3), pp. 586-574, 2000.
[31] Y. Wakita, S. Hirai, T. Suehiro, T. Hori, and K. Fujiwara, “Information Sharing via Projection Function for Coexistence of Robot and Human,” Autonomous Robots, 10(3), pp. 267-277, 2001.
[32] Y. Kawakita, R. Ikeura, and K. Mizutani, “Previous Notice Method of Robotic Arm Motion for Suppressing Threat to Human,” Japanese J. of Ergonomics, 37(5), pp. 252-262, 2001.
[33] A. Hagiwara, R. Ikeura, Y. Kawakita, and K. Mizutani, “Previous Notice Method of Robotic Arm Motion for Suppressing Threat to Human,” J. of the Robotics Society of Japan, 21(4), pp. 67-74, 2003.
[34] TOYOTA, “Company – News Release: Toyota, Hino, Daihatsu to Jointly Exhibit at 11th ITS World Congress,” September 22, 2004.
    http://www.toyota.co.jp/en/news/04/0922_2.html
[35] T. Matsumaru and K. Hagiwara, “Method and Effect of Preliminary-Announcement and Display for Translation of Mobile Robot,” Proc. of the 10th Int. Conf. on Advanced Robotics (ICAR 2001), pp. 573-578, 2001.
[36] T. Matsumaru and K. Hagiwara, “Preliminary-Announcement and Display for Translation and Rotation of Human-Friendly Mobile Robot,” Proc. of 10th IEEE Int. Workshop on Robot and Human Communication (ROMAN 2001), pp. 213-218, 2001.
[37] T. Matsumaru, H. Endo, and T. Ito, “Examination by Software Simulation on Preliminary-Announcement and Display of Mobile Robot’s Following Action by Lamp or Blowouts,” 2003 IEEE Int. Conf. on Robotics and Automation (2003 IEEE ICRA), pp. 362-367, 2003.
[38] T. Matsumaru, S. Kudo, T. Kusada, K. Iwase, K. Akiyama, and T. Ito, “Simulation on Preliminary-Announcement and Display of Mobile Robot’s Following Action by Lamp, Party-blowouts, or Beam-light,” IEEE/ASME Int. Conf. on Advanced Intelligent Mechatronics (AIM 2003), pp. 771-777, 2003.
[39] T. Matsumaru, S. Kudo, H. Endo, and T. Ito, “Examination on a Software Simulation of the Method and Effect of Preliminary-announcement and Display of Human-friendly Robot’s Following Action,” Trans. of SICE, 40(2), pp. 189-198, 2004.
[40] Lumino GmbH, “Magicball,”
    http://www.magicball.de/
[41] T. Matsumaru, K. Iwase, K. Akiyama, T. Kusada, and T. Ito, “Mobile Robot with Eyeball Expression as the Preliminary-Announcement and Display of the Robot’s Following Motion,” Autonomous Robots, 18(2), pp. 231-246, 2005.
[42] T. Matsumaru, “Mobile Robot with Preliminary-announcement and Indication Function of Forthcoming Operation using Flat-panel Display,” 2007 IEEE Int. Conf. on Robotics and Automation (ICRA’07), 2007.
[43] T. Matsumaru, T. Kusada, and K. Iwase, “Mobile Robot with Preliminary-Announcement Function of Following Motion using Light-ray,” The 2006 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS 2006), pp. 1516-1523, 2006.
[44] T. Matsumaru, “Mobile Robot with Preliminary-announcement and Display Function of Following Motion using Projection Equipment,” The 15th IEEE Int. Symposium on Robot and Human Interactive Communication (RO-MAN 06), pp. 443-450, 2006.
[45] M. Akamatsu, “Establishing Driving Behavior Database and its Application to Active Safety Technologies,” J. of Society of Automotive Engineers of Japan, 57(12), pp. 34-39, 2003.
