JRM Vol.25 No.5 pp. 855-862
doi: 10.20965/jrm.2013.p0855


Impression Difference Between Intelligent Medicine Case and Small Service Robot in Self-Medication Support Situations

Takuo Suzuki, Yuta Jose, and Yasushi Nakauchi

University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki 305-8573, Japan

Received: February 14, 2012
Accepted: July 23, 2013
Published: October 20, 2013

Keywords: management support, medicine organizers, animal-shaped robots, user interfaces, semantic differential

Medication management support systems have been developed to prevent recipients' mistakes, such as forgetting to take medicine. These systems remind recipients to take medicine at the right time via a medicine case with a built-in speaker or display, and they must also sustain recipients' motivation to keep taking it. In this research, the authors identified factors that sustain this motivation by evaluating recipients' impressions of a management support system. In addition to an intelligent medicine case, a small service robot was used as a reminder, because several researchers have reported that robots have desirable features as user interfaces. The authors defined three experimental situations and two conditions, and compared the conditions in each situation using the semantic differential method. Questionnaires were used for overall evaluation and video analysis for objective evaluation. The experimental results suggest that humane, friendly, flamboyant, sunny, and simple impressions improve recipients' motivation and concentration.
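The semantic differential method mentioned in the abstract scores impressions on bipolar adjective scales (e.g., mechanical–humane) and compares the mean score per scale between conditions. As a rough illustration only — the adjective pairs, participant counts, and ratings below are hypothetical, not the authors' actual data or analysis — such a comparison can be sketched as:

```python
# Hypothetical sketch of a semantic differential comparison.
# Participants rate each bipolar adjective pair on a 7-point scale
# (1 = left adjective, 7 = right adjective) for each condition;
# mean scores per pair are then compared across conditions.

from statistics import mean

# Hypothetical ratings (1..7) from five participants per condition.
ratings = {
    "medicine case": {
        "mechanical-humane": [2, 3, 2, 3, 2],
        "unfriendly-friendly": [3, 3, 4, 2, 3],
    },
    "service robot": {
        "mechanical-humane": [6, 5, 6, 5, 6],
        "unfriendly-friendly": [5, 6, 6, 5, 5],
    },
}

def profile(condition):
    """Mean score per adjective pair for one condition."""
    return {pair: mean(scores) for pair, scores in ratings[condition].items()}

for cond in ratings:
    print(cond, profile(cond))
```

Plotting these per-pair means for both conditions side by side yields the characteristic semantic differential "profile" chart used to visualize impression differences.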

Cite this article as:
Takuo Suzuki, Yuta Jose, and Yasushi Nakauchi, “Impression Difference Between Intelligent Medicine Case and Small Service Robot in Self-Medication Support Situations,” J. Robot. Mechatron., Vol.25, No.5, pp. 855-862, 2013.
