
JACIII Vol.29 No.4 pp. 796-802 (2025)
doi: 10.20965/jaciii.2025.p0796

Research Paper:

Design of a Human-Centric Robotic System for User Support Based on Gaze Information

Yuka Sone* and Jinseok Woo**,†

*Sustainable Engineering Program, Graduate School of Engineering, Tokyo University of Technology
1404-1 Katakuracho, Hachioji, Tokyo 192-0982, Japan

**Department of Mechanical Engineering, School of Engineering, Tokyo University of Technology
1404-1 Katakuracho, Hachioji, Tokyo 192-0982, Japan

†Corresponding author

Received: December 12, 2024
Accepted: March 27, 2025
Published: July 20, 2025
Keywords: human estimation, gaze analysis, human-centric system, human system interaction
Abstract

Recent advancements in mechanization and automation have significantly transformed households and retail environments, with automated services becoming increasingly prevalent. In general, smart appliances utilizing IoT technology have gained widespread adoption, and computerized systems, such as self-checkout machines, are now commonplace in retail settings. However, these services require users to follow specific procedures and operate the systems according to predefined capabilities, which may exclude users who are unfamiliar with the systems or who require additional support. Although robots deliver essential services efficiently, their rigid designs limit their adaptability. By contrast, human service providers can flexibly tailor services by observing a customer’s condition through visual and auditory cues. For robots to offer more inclusive and user-friendly services, they must be capable of assessing user conditions and adapting their behaviors accordingly. Therefore, this paper proposes a control support system that analyzes user gaze behavior during interactions with smart appliances to provide context-aware support. Gaze data were collected using HoloLens 2, a mixed reality device, allowing the system to deliver information tailored to the user’s gaze direction. By providing an information support service through a robot based on an analysis of the user’s gaze, the system can confirm the user’s level of interest in targeted environmental objects and, accordingly, provide a more convenient service tailored to the user. Finally, we discuss the effectiveness of the proposed human-centric robotic system through experiments.
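To illustrate the interest-estimation step described above, the following is a minimal sketch (in Python, not the authors’ implementation) of how per-object interest could be derived from gaze data. It assumes gaze samples have already been exported from the headset as (timestamp, gazed object) pairs, e.g., via ray casts against labeled scene objects; the GazeSample type, field names, and dwell threshold are hypothetical.

```python
# Minimal sketch (assumed design, not the paper's method): estimate which
# environmental object a user is interested in by accumulating per-object
# dwell time from a stream of gaze samples.

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class GazeSample:
    timestamp: float      # seconds since the session started
    target: str | None    # labeled object hit by the gaze ray, if any

def dwell_times(samples: list[GazeSample]) -> dict[str, float]:
    """Accumulate per-object dwell time from consecutive gaze samples."""
    totals: dict[str, float] = defaultdict(float)
    for prev, curr in zip(samples, samples[1:]):
        if prev.target is not None:
            totals[prev.target] += curr.timestamp - prev.timestamp
    return dict(totals)

def most_interesting(samples: list[GazeSample],
                     min_dwell: float = 0.5) -> str | None:
    """Return the object with the longest dwell time above a threshold."""
    totals = dwell_times(samples)
    if not totals:
        return None
    target, dwell = max(totals.items(), key=lambda kv: kv[1])
    return target if dwell >= min_dwell else None

if __name__ == "__main__":
    log = [
        GazeSample(0.0, "kettle"),
        GazeSample(0.4, "kettle"),
        GazeSample(0.8, None),
        GazeSample(1.0, "microwave"),
        GazeSample(1.3, "kettle"),
        GazeSample(2.1, "kettle"),
    ]
    print(most_interesting(log))  # -> "kettle"
```

In this sketch, the object with the longest accumulated dwell time above a small threshold is treated as the current object of interest; a deployed system would additionally smooth the gaze signal and handle blinks and tracking dropouts.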

Scene of the gaze measurement system

Cite this article as:
Y. Sone and J. Woo, “Design of a Human-Centric Robotic System for User Support Based on Gaze Information,” J. Adv. Comput. Intell. Intell. Inform., Vol.29 No.4, pp. 796-802, 2025.
