
JRM Vol.34 No.5, pp. 1192-1204 (2022)
doi: 10.20965/jrm.2022.p1192

Paper:

Waypoint-Based Human-Tracking Navigation for Museum Guide Robot

Kaito Ichihara*, Tadahiro Hasegawa*, Shin’ichi Yuta*, Hirohisa Ichikawa**, and Yoshihide Naruse**

*Shibaura Institute of Technology
3-7-5 Toyosu, Koto-ku, Tokyo 135-8548, Japan

**SANGO Co., Ltd.
3-1 Konosucho, Toyota-shi, Aichi 471-0836, Japan

Received: February 4, 2022
Accepted: August 5, 2022
Published: October 20, 2022
Keywords: guidance robot, human-following driving, waypoint navigation, 2D LiDAR, museum
Abstract

A visitor-following method that guides visitors as they move around the facility was developed without changing the structure of the waypoint navigation system. We previously developed a guidance robot, “EM-Ro,” to provide guidance services at the ECO35 Muffler Museum, using a waypoint navigation system to implement a visitor-escort method along a predetermined route. With the proposed visitor-following method, EM-Ro follows a target visitor along visitor-derived waypoints estimated using 2D LiDAR. The navigation system therefore provides both visitor-escort and visitor-following guidance services, and because both are built on the same waypoint navigation system, the robot can switch between them seamlessly. By switching between prepared and visitor-derived waypoints, a visitor can choose the preferred guidance method at any time by sending a request to EM-Ro from the remote controller. In addition, a re-detection method was developed for cases in which EM-Ro loses the guest it is tracking. Experiments at the Muffler Museum demonstrated both visitor-escort and visitor-following driving by EM-Ro while guiding guests in the facility.

[Figure: Museum guide robot “EM-Ro”]
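The key idea in the abstract is that the same waypoint follower consumes either a prepared route (visitor-escort) or waypoints derived from the tracked visitor's 2D-LiDAR position (visitor-following), so switching modes only changes the waypoint source. The sketch below illustrates that idea; it is a minimal, hypothetical Python illustration, not the authors' implementation. The names (VisitorWaypointGenerator, select_waypoints), the 0.5 m waypoint spacing, and the queue length are assumptions made for the example.

```python
import math
from collections import deque

# Hypothetical sketch (not the paper's code): convert successive 2D-LiDAR
# estimates of the tracked visitor's position into waypoints that the same
# waypoint-following controller used for visitor-escort can consume.

MIN_SPACING = 0.5   # [m] assumed: drop a new waypoint only after the visitor moves this far
QUEUE_LEN = 20      # assumed: keep only a short trail of visitor-derived waypoints

class VisitorWaypointGenerator:
    def __init__(self):
        self.waypoints = deque(maxlen=QUEUE_LEN)

    def update(self, visitor_xy):
        """visitor_xy: (x, y) of the tracked visitor in the map frame,
        e.g., from a LiDAR-based leg detector. Returns the waypoint list."""
        if not self.waypoints or math.dist(self.waypoints[-1], visitor_xy) >= MIN_SPACING:
            self.waypoints.append(visitor_xy)
        return list(self.waypoints)

def select_waypoints(mode, prepared_route, generator, visitor_xy):
    """Mode switching amounts to swapping the waypoint source; the
    waypoint-following controller itself is unchanged."""
    if mode == "escort":
        return prepared_route            # prepared waypoints along a fixed route
    return generator.update(visitor_xy)  # "following": visitor-derived waypoints
```

As described in the abstract, this separation is what allows seamless switching between the two guidance services: only the list of waypoints handed to the navigation system differs between modes.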

Cite this article as:
K. Ichihara, T. Hasegawa, S. Yuta, H. Ichikawa, and Y. Naruse, “Waypoint-Based Human-Tracking Navigation for Museum Guide Robot,” J. Robot. Mechatron., Vol.34 No.5, pp. 1192-1204, 2022.
