
JRM Vol.25 No.3 pp. 529-537 (2013)
doi: 10.20965/jrm.2013.p0529

Paper:

TouchMe: An Augmented Reality Interface for Remote Robot Control

Sunao Hashimoto*1,*5, Akihiko Ishida*2,*5, Masahiko Inami*3,*5,
and Takeo Igarashi*4,*5

*1Meiji University, 4-21-1 Nakano, Nakano-ku, Tokyo 164-8525, Japan

*2Tokyo University of Science, 1-3 Kagurazaka, Shinjuku-ku, Tokyo 162-8601, Japan

*3Keio University, 4-1-1 Hiyoshi, Kohoku-ku, Yokohama, Kanagawa 223-8521, Japan

*4The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

*5JST ERATO Igarashi Design Interface Project, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

Received: October 18, 2012
Accepted: April 22, 2013
Published: June 20, 2013
Keywords: remote robot control, augmented reality, touch screen, direct manipulation, third-person view
Abstract
Remote-control robots are generally operated with joysticks or gamepads. These devices are difficult for inexperienced users, however, because the relationship between user input and the resulting robot movement is not always intuitive, e.g., tilting the joystick to the right to rotate the robot left. To solve this problem, we propose a touch-based interface called TouchMe for controlling a robot remotely from a third-person point of view. This interface allows the user to directly manipulate individual parts of a robot by touching them as seen through a camera. Our system provides intuitive operation, allowing the user to control the robot with minimal training. In this paper, we describe the TouchMe interaction and its prototype implementation. We also introduce three types of movement for controlling the robot in response to user interaction and report the results of an empirical comparison of these methods.
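The core mechanism the abstract describes, mapping a touch on the camera image to a specific robot part, can be illustrated with a short sketch. The paper itself provides no code; the following Python fragment is a minimal illustration that assumes a pinhole camera model and approximates each robot part with a bounding sphere. All names, part poses, and camera intrinsics below are hypothetical; in the actual system, part poses would come from AR marker tracking rather than hard-coded values.

import numpy as np

# Hypothetical part registry: name -> (center in camera coordinates, bounding radius).
# In TouchMe the robot's pose is registered via AR marker tracking; the values
# here are hard-coded purely for illustration.
PARTS = {
    "body":    (np.array([0.00, 0.00, 1.50]), 0.20),
    "gripper": (np.array([0.15, 0.10, 1.30]), 0.08),
}

def touch_to_ray(u, v, fx, fy, cx, cy):
    """Back-project a touch point (u, v) on the image into a unit ray
    in camera coordinates, using pinhole intrinsics (fx, fy, cx, cy)."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def pick_part(u, v, fx, fy, cx, cy):
    """Return the nearest robot part whose bounding sphere the touch ray hits."""
    ray = touch_to_ray(u, v, fx, fy, cx, cy)
    hit, hit_depth = None, np.inf
    for name, (center, radius) in PARTS.items():
        t = float(ray @ center)               # depth of the ray's closest approach
        if t <= 0:
            continue                          # part lies behind the camera
        miss = np.linalg.norm(center - t * ray)
        if miss <= radius and t < hit_depth:  # ray passes within the sphere
            hit, hit_depth = name, t
    return hit

# Example: a touch near the image center of a 640x480 camera selects the body.
print(pick_part(340, 220, fx=525.0, fy=525.0, cx=320.0, cy=240.0))

Once a part is selected this way, the three movement types compared in the paper differ in how the robot responds while the user drags the touched part; the hit test itself is the same in each case.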
Cite this article as:
S. Hashimoto, A. Ishida, M. Inami, and T. Igarashi, “TouchMe: An Augmented Reality Interface for Remote Robot Control,” J. Robot. Mechatron., Vol.25 No.3, pp. 529-537, 2013.
