JRM Vol.25 No.3 pp. 484-496
doi: 10.20965/jrm.2013.p0484


Autonomous Pedestrian Push Button Activation by Outdoor Mobile Robot in Outdoor Environments

Aneesh N. Chand* and Shin’ichi Yuta**

*Institute of Mechatronic Systems, Zurich University of Applied Sciences, Technikumstrasse 5, CH 8401, Winterthur, Switzerland

**Department of Electrical Engineering, Shibaura Institute of Technology, 3-7-5 Toyosu, Koto-ku, Tokyo 135-8548, Japan

Received: October 3, 2012
Accepted: April 17, 2013
Published: June 20, 2013

Keywords: autonomous button activation, outdoor mobile robot, outdoor environment
The authors have developed an outdoor mobile robot that can cross roads at an intersection or pedestrian crossing fully autonomously while traveling along sidewalks in an urban environment. This gives the robot the capability to travel longer and more complex routes, since it can cross a road and continue on its path. The developed robot has the unique ability to autonomously approach and activate the pedestrian push button with a mechanical finger. We first briefly describe the overall operation of such a road-crossing robot. The rest of this paper then discusses in detail how the robot navigates precisely to the pedestrian push button and activates it with the on-board finger. The contribution of this work is that, although robots exist that perform precision docking or button activation, this robot is one of the few that can perform such an action in a completely unmodified real-world outdoor environment. We demonstrate this by deploying the robot at a real-world road crossing, where it successfully engaged the pedestrian push button.
Cite this article as:
A. Chand and S. Yuta, “Autonomous Pedestrian Push Button Activation by Outdoor Mobile Robot in Outdoor Environments,” J. Robot. Mechatron., Vol.25 No.3, pp. 484-496, 2013.
References:
  1. [1] M. Baker and H. A. Yanco, “Automated Street Crossing for Assistive Robots,” IEEE 9th Int. Conf. on Rehabilitation Robotics, pp. 187-192, Chicago, IL, USA, June 2005.
  2. [2] G. Lidoris, F. Rohrmüller, D. Wollherr, and M. Buss, “The Autonomous City Explorer Project: Mobile robot navigation in highly populated urban environments,” 2009 IEEE Int. Conf. on Robotics and Automation, pp. 1416-1422, May 12-17, Kobe, 2009.
  3. [3] Q. Mühlbauer, S. Sosnowski, T. Xu, T. Zhang, K. Kühnlenz, and M. Buss, “Navigation through Urban Environments by Visual Perception and Interaction,” 2009 IEEE Int. Conf. on Robotics and Automation, pp. 3558-3564, Kobe, May 12-17, 2009.
  4. [4] K. Klasing, G. Lidoris, A. Bauer, F. Rohrmüller, D. Wollherr, and M. Buss, “The autonomous city explorer: Towards semantic navigation in urban environments,” 1st Int. Workshop on Cognition For Technical Systems, 2008.
  5. [5] G. Lidoris, K. Klasing, A. Bauer, T. Xu, K. Kühnlenz, and M. Buss, “The autonomous city explorer project: Aims and systems overview,” Proc. of IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 560-565, Oct. 29 to Nov. 2, San Diego, CA, USA, 2007.
  6. [6] A. Bauer, K. Klasing, G. Lidoris, Q. Mühlbauer, F. Rohrmüller, S. Sosnowski, T. Xu, K. Kühnlenz, D. Wollherr, and M. Buss, “The Autonomous City Explorer: Towards Natural Human-Robot Interaction in Urban Environments,” Int. J. of Social Robotics, Vol.1, No.2, pp. 127-140, 2009.
  7. [7] Y. Morales, E. Takeuchi, A. Carballo, W. Tokunaga, H. Kuniyoshi, A. Aburadani, A. Hirosawa, Y. Nagasaka, Y. Suzuki, and T. Tsubouchi, “1Km Autonomous Robot Navigation on Outdoor Pedestrian Paths Running the Tsukuba Challenge 2007,” Proc. of the 2008 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 219-225, France, Sep. 22-26, 2008.
  8. [8] Y. Morales, A. Carballo, E. Takeuchi, A. Aburadani, and T. Tsubouchi, “Autonomous robot navigation in outdoor cluttered pedestrian walkways,” J. of Field Robotics, Vol.26, Issue 8, pp. 609-635, Aug. 2009.
  9. [9] S. Yuta, M. Mizukawa, and H. Hashimoto, “Tsukuba Challenge: The Purpose and Results,” Special Issue on Tsukuba Challenge, The J. of the Society of Instrument and Control Engineers, Vol.49, pp. 572-578, 2010 (in Japanese).
  10. [10] E. Trulls, A. Corominas Murtra, J. Pérez-Ibarz, G. Ferrer, D. Vasquez, J. M. Mirats-Tur, and A. Sanfeliu, “Autonomous navigation for mobile service robots in urban pedestrian environments,” J. of Field Robotics, Vol.28, pp. 329-354, 2011. doi: 10.1002/rob.20386
  11. [11] W. Meeussen, M. Wise, S. Glaser, S. Chitta, C. McGann, P. Mihelich, E. Marder-Eppstein, M. Muja, V. Eruhimov, T. Foote, J. Hsu, R. B. Rusu, B. Marthi, G. Bradski, K. Konolige, B. Gerkey, and E. Berger, “Autonomous Door Opening and Plugging In with a Personal Robot,” Int. Conf. on Robotics and Automation, Anchorage, Alaska, 2010.
  12. [12] M. C. Silverman, D. Nies, B. Jung, and G. S. Sukhatme, “Staying Alive: A Docking Station for Autonomous Robot Recharging,” 2002 IEEE Int. Conf. on Robotics and Automation, pp. 1050-1055, Washington D.C., May 11-15, 2002.
  13. [13] Y. Hada and S. Yuta, “Robust navigation and battery re-charging systems for long term activity of autonomous mobile robots,” The Ninth Int. Conf. on Advanced Robotics, Tokyo, Oct. 25-27, 1999.
  14. [14] A. Chand and S. Yuta, “Navigation strategy and path planning for autonomous road crossing by outdoor mobile robots,” Proc. of the 15th Int. Conf. on Advanced Robotics (ICAR 2011), Tallinn, Estonia, June 20-23, 2011.
  15. [15] I. Ulrich and I. Nourbakhsh, “Appearance-Based Obstacle Detection with Monocular Color Vision,” Proc. of the AAAI National Conf. on Artificial Intelligence, Austin, TX, July/August 2000.
  16. [16] D. York, “Least-Square Fitting of a Straight Line,” Canad. J. Phys., Vol.44, pp. 1079-1086, 1966.
  17. [17] Y. Kanayama and S. Yuta, “Vehicle path specification by a sequence of straight lines,” IEEE J. of Robotics and Automation, Vol.4, No.3, pp. 265-276, 1988.
  18. [18] S. Iida and S. Yuta, “Vehicle command system and trajectory control for autonomous mobile robots,” Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Intelligent Systems, Vol.1, pp. 212-217, Osaka, 1991.
  19. [19] A. Chand and S. Yuta, “Road Crossing Landmarks Detection by Outdoor Mobile Robots,” J. of Robotics and Mechatronics, Vol.22, No.6, pp. 708-717, December 2010.
  20. [20] P. J. Burt, T. H. Hong, and A. Rosenfeld, “A Segmentation and Estimation of Image Region Properties through Cooperative Hierarchical Computation,” IEEE Trans. on System, Man and Cybernetics, Vol.SMC-11, No.12, Dec. 1981.
  21. [21] R. Laganiere, “Compositing a bird’s eye view mosaic,” in Proc. Conf. Vision Interface, pp. 382-387, Montreal, Canada, 2000.
  22. [22] P. Viola and M. J. Jones, “Rapid Object Detection Using a Boosted Cascade of Simple Features,” IEEE Conf. Computer Vision and Pattern Recognition (CVPR), 2001.
  23. [23] A. Chand and S. Yuta, “Vision and Laser Sensor Data Fusion Technique for Target Approaching by Outdoor Mobile Robot,” 2010 IEEE Int. Conf. on Robotics and Biomimetics (ROBIO 2010), pp. 1624-1629, Tianjin, China, Dec. 14-18, 2010.
  24. [24] H. Kawata, W. Santosh, T. Mori, A. Ohya, and S. Yuta, “Development of ultra-small lightweight optical range sensor system,” Proc. of 2005 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 3277-3282, Alberta, Canada, 2005.
  25. [25] Hokuyo Automatic Co., Ltd., “URG Series Communication Protocol Specification SCIP-Version2.0,” 2006.
  26. [26] S. Yuta, S. Suzuki, and S. Iida, “Implementation of a Small Size Experimental Self-Contained Autonomous Robot: Sensors, Vehicle Control and Description of Sensor Based Behavior,” Lecture Notes in Control and Information Sciences, The 2nd Int. Symp. on Experimental Robotics II, Vol.190, pp. 344-358, 1991.
Supporting Online Materials:
  [a] cvBlobsLib, Blob extraction library. [Accessed: February 17, 2011]
  [b] Willow Garage, Open Source Computer Vision Library.
