
JACIII Vol.16 No.7 pp. 771-783
doi: 10.20965/jaciii.2012.p0771
(2012)

Paper:

Rebo: A Pet-Like Strokable Remote Control

Kazuki Kobayashi*1, Seiji Yamada*2,*3,*4, Shinobu Nakagawa*5,
and Yasunori Saito*6

*1Graduate School of Science and Technology, Shinshu University, 4-17-1 Wakasato, Nagano City 380-8553, Japan

*2National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda, Tokyo 101-8430, Japan

*3SOKENDAI, Shonan Village, Hayama, Kanagawa 240-0193, Japan

*4Tokyo Institute of Technology, 4259 Nagatsuta-cho, Midori-ku, Yokohama, Kanagawa 226-8503, Japan

*5Design Department, Osaka University of Arts, 469 Higashiyama, Kanan-cho, Minami Kawachi-gun, Osaka 585-8555, Japan

*6Faculty of Engineering, Shinshu University, 4-17-1 Wakasato, Nagano City 380-8553, Japan

Received: May 2, 2012
Accepted: September 19, 2012
Published: November 20, 2012
Keywords: pet-like remote control, remote control agent, stroke manipulation, action sloping
Abstract
This paper describes Rebo, a pet-like remote control for home appliances and TVs. Rebo has three new advantages over conventional remote controls: user-friendliness, function awareness, and functional manipulation by stroking its touch panels. Its pet-like presence and facial expressions make it seem friendly to users. Its function awareness lets users easily recognize its functions through expressive feedback that conveys the meaning of each manipulation by showing part of the function about to be executed. Because its functions are manipulated by stroking it as one would stroke a pet, users can operate Rebo without having to look for buttons to push. We conducted experiments in which we monitored users' eye movements while they operated Rebo and another remote control, and we administered questionnaires afterwards. The experimental results revealed significant aspects of Rebo and confirmed its advantages.
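As a rough illustration of the stroke-based manipulation and function-awareness feedback described in the abstract, the following Python sketch is a hypothetical example only: the gesture labels, thresholds, command mapping, and function names are our assumptions and are not taken from the paper. It classifies a touch-panel stroke by its dominant direction, previews the associated function before executing it (in the spirit of showing part of the function to be executed), and then returns the remote-control command.

# Hypothetical sketch: classify a touch-panel stroke and map it to a TV command.
# All names, thresholds, and mappings are illustrative assumptions, not from the paper.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Stroke:
    points: List[Tuple[float, float]]  # (x, y) samples from the touch panel

def classify_stroke(stroke: Stroke) -> str:
    """Return a coarse gesture label based on the stroke's overall direction."""
    (x0, y0), (x1, y1) = stroke.points[0], stroke.points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "stroke_right" if dx > 0 else "stroke_left"
    return "stroke_down" if dy > 0 else "stroke_up"

# Illustrative mapping from stroke gestures to remote-control functions.
GESTURE_TO_COMMAND = {
    "stroke_right": "channel_up",
    "stroke_left": "channel_down",
    "stroke_up": "volume_up",
    "stroke_down": "volume_down",
}

def handle_stroke(stroke: Stroke) -> str:
    command = GESTURE_TO_COMMAND[classify_stroke(stroke)]
    # "Function awareness": give partial feedback (e.g., a facial expression or
    # a brief preview) before the command is fully executed.
    print(f"preview: about to execute {command}")
    return command

if __name__ == "__main__":
    swipe = Stroke(points=[(0.1, 0.5), (0.4, 0.5), (0.8, 0.5)])
    print(handle_stroke(swipe))  # prints the preview, then "channel_up"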
Cite this article as:
K. Kobayashi, S. Yamada, S. Nakagawa, and Y. Saito, “Rebo: A Pet-Like Strokable Remote Control,” J. Adv. Comput. Intell. Intell. Inform., Vol.16 No.7, pp. 771-783, 2012.
