
JRM Vol.21 No.3 pp. 332-341 (2009)
doi: 10.20965/jrm.2009.p0332

Paper:

Dynamic Remodeling of Environmental Map Using Range Data for Remote Operation of Mobile Robot

Takafumi Matsumaru, Hiroshi Yamamori, and Takumi Fujita

Faculty of Mechanical Engineering and Graduate School of Engineering, Shizuoka University, Hamamatsu, Japan

Received:
October 20, 2008
Accepted:
February 21, 2009
Published:
June 20, 2009
Keywords:
remote operation, environmental map, range sensor, mobile robot, dynamic remodeling
Abstract

In studying dynamic remodeling of the environmental map around a remotely operated mobile robot, in which data measured by the robot's range sensor is sent from the robot to the operator, we introduce the Line & Hollow method and the Cell & Hollow method for environmental mapping. Results for three types of environmental situation clarify the features and effectiveness of our approach. In the Line & Hollow method, an isosceles triangle is set based on the range data: its base line is drawn to express the obstacle shape, and its interior is hollowed out to express vacant space. In the Cell & Hollow method, the cell value corresponding to the range data is incremented, and an obstacle is assumed to exist if the cell value exceeds the ascending threshold. The cell value is decremented on the line between the cell indicated by the measured data and the cell at the sensor position, and the obstacle is deleted if the value drops below the descending threshold. We confirmed that environmental mapping with either method reflects dynamic environmental change.
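The Cell & Hollow update described above can be sketched as a simple occupancy-grid rule. This is only an illustrative reconstruction from the abstract, not the authors' implementation: the function and variable names, the threshold values, and the use of a Bresenham line to "hollow out" cells along the ray are all assumptions.

```python
import numpy as np

# Assumed threshold values for illustration only; the paper's actual
# ascending/descending thresholds are not given in the abstract.
ASCEND_THRESHOLD = 3    # cell becomes an obstacle when its count exceeds this
DESCEND_THRESHOLD = 0   # obstacle is deleted when its count drops below this

def bresenham(x0, y0, x1, y1):
    """Yield grid cells on the line from (x0, y0) up to, but excluding, (x1, y1)."""
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err = dx - dy
    while (x0, y0) != (x1, y1):
        yield x0, y0
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x0 += sx
        if e2 < dx:
            err += dx
            y0 += sy

def cell_and_hollow_update(counts, occupied, sensor_cell, hit_cell):
    """Apply one range reading: raise the hit cell, lower cells along the ray."""
    sx, sy = sensor_cell
    hx, hy = hit_cell
    counts[hx, hy] += 1                      # increment cell indicated by range data
    if counts[hx, hy] > ASCEND_THRESHOLD:
        occupied[hx, hy] = True              # obstacle appears in the map
    for x, y in bresenham(sx, sy, hx, hy):   # hollow out free space along the ray
        counts[x, y] -= 1
        if counts[x, y] < DESCEND_THRESHOLD:
            counts[x, y] = DESCEND_THRESHOLD
            occupied[x, y] = False           # vanished obstacle is deleted
    return counts, occupied
```

Repeated readings ending in the same cell push it over the ascending threshold, while readings that pass through a previously occupied cell gradually erase it, which is how the map tracks a dynamic environment.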

Cite this article as:
Takafumi Matsumaru, Hiroshi Yamamori, and Takumi Fujita, “Dynamic Remodeling of Environmental Map Using Range Data for Remote Operation of Mobile Robot,” J. Robot. Mechatron., Vol.21, No.3, pp. 332-341, 2009.


Last updated on Nov. 25, 2021