JACIII Vol.21 No.3 pp. 483-495
doi: 10.20965/jaciii.2017.p0483


Investigating Effectiveness of an Expression Education Support Robot That Nods and Gives Hints

Koki Suzuki* and Masayoshi Kanoh**

*Graduate School of Computer and Cognitive Sciences, Chukyo University
101-2 Yagoto Honmachi, Showa-ku, Nagoya 466-8666, Japan

**School of Engineering, Chukyo University
101-2 Yagoto Honmachi, Showa-ku, Nagoya 466-8666, Japan

July 4, 2016
January 10, 2017
Online released: May 19, 2017
May 20, 2017
Keywords: expression education, robot for supporting education, human symbiotic robot, human robot interaction
In recent years, expression education, which improves imagination and communication skills, has been attracting attention. Each expression education student needs support and evaluation appropriate to his or her imagination level and individual learning skills. However, it is difficult to meet individual students' needs in present educational institutions, where instruction is typically offered in group settings. In this study, a robot is used as a substitute for a human instructor. Previous studies in the field proposed the use of robots that offer hints to boost students' imagination, and suggested that such a robot has an education-support effect similar to that of a human instructor. However, to achieve positive learning effects, it is critical that learners remain interested in the robot. We therefore performed an experiment to verify the effectiveness of an expression education support robot that nods as a form of nonverbal communication. The results indicate that the proposed robot improves students' learning and suggest that nodding is effective in addition to providing hints.
Cite this article as:
K. Suzuki and M. Kanoh, “Investigating Effectiveness of an Expression Education Support Robot That Nods and Gives Hints,” J. Adv. Comput. Intell. Intell. Inform., Vol.21 No.3, pp. 483-495, 2017.

