JRM Vol.20 No.3 pp. 367-377
doi: 10.20965/jrm.2008.p0367


Moving-Object Tracking with In-Vehicle Multi-Laser Range Sensors

Masafumi Hashimoto*, Yosuke Matsui**, and Kazuhiko Takahashi*

*Faculty of Engineering, Doshisha University, 1-3 Miyakodani, Tatara, Kyotanabe, Kyoto 610-0321, Japan

**Toyota L&F Company, Toyota Industries Corporation, 2-1-1 Toyoda, Takahama, Aichi 444-1393, Japan

Received: September 27, 2007
Accepted: January 16, 2008
Published: June 20, 2008
Keywords: mobile robot, multi-laser range sensors, moving-object tracking, Kalman filter, data association

This paper presents a method for moving-object tracking with in-vehicle 2D laser range sensors (LRSs) in a cluttered environment. Because the sensing area of a single LRS is limited in orientation, the mobile robot is equipped with multiple LRSs for omnidirectional sensing. Since each LRS captures its laser image on its own local coordinate frame, the laser images are mapped onto a common reference coordinate frame so that object tracking can be achieved through cooperation of the multiple LRSs. For this mapping, the coordinate frames of the LRSs are calibrated; that is, the relative positions and orientations of the LRSs are estimated. The calibration is based on Kalman filtering and chi-square hypothesis testing. Moving-object tracking proceeds in two steps: detection and tracking. Each LRS detects moving objects in its own laser image via a heuristic rule and an occupancy-grid-based method, and tracks them via Kalman filtering and assignment-algorithm-based data association. When moving objects lie in the overlapping sensing areas of the LRSs, those LRSs exchange their tracking data and fuse them in a decentralized manner. A rule-based track management scheme is embedded in the tracking system to enhance tracking performance. An experiment tracking three walking people in an indoor environment validates the proposed method.
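The tracking step described in the abstract pairs a Kalman filter with assignment-based data association. The following is a minimal sketch of that idea, not the authors' implementation: per-axis constant-velocity Kalman filters plus a brute-force global-nearest-neighbour assignment (adequate for the handful of pedestrians in the experiment). The sample period `DT` and noise variances `Q`, `R` are assumed values for illustration.

```python
# Sketch: constant-velocity Kalman tracking with assignment-based
# data association (brute force over permutations, fine for a few objects).
from itertools import permutations

DT = 0.1   # assumed sensor period [s]
Q = 0.5    # assumed process-noise (acceleration) variance
R = 0.01   # assumed measurement-noise variance [m^2]

class Track1D:
    """Per-axis Kalman filter with state [position, velocity]."""
    def __init__(self, pos):
        self.x = [pos, 0.0]
        self.P = [[1.0, 0.0], [0.0, 1.0]]

    def predict(self):
        x, P = self.x, self.P
        self.x = [x[0] + DT * x[1], x[1]]
        # P' = F P F^T + Q_d for F = [[1, DT], [0, 1]]
        p00 = P[0][0] + DT * (P[0][1] + P[1][0]) + DT * DT * P[1][1] + Q * DT**4 / 4
        p01 = P[0][1] + DT * P[1][1] + Q * DT**3 / 2
        p11 = P[1][1] + Q * DT * DT
        self.P = [[p00, p01], [p01, p11]]

    def update(self, z):
        s = self.P[0][0] + R                      # innovation variance
        k0, k1 = self.P[0][0] / s, self.P[1][0] / s  # Kalman gain
        r = z - self.x[0]                         # innovation
        self.x = [self.x[0] + k0 * r, self.x[1] + k1 * r]
        p = self.P                                # P' = (I - K H) P, H = [1, 0]
        self.P = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                  [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]

class Track:
    """2D track: independent filters for the x and y axes."""
    def __init__(self, pos):
        self.fx, self.fy = Track1D(pos[0]), Track1D(pos[1])

    def predict(self):
        self.fx.predict(); self.fy.predict()

    @property
    def pos(self):
        return (self.fx.x[0], self.fy.x[0])

    def update(self, z):
        self.fx.update(z[0]); self.fy.update(z[1])

def associate(tracks, detections):
    """Assignment minimising total squared distance between
    predicted track positions and detections."""
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(detections)), len(tracks)):
        cost = sum((t.pos[0] - detections[j][0]) ** 2 +
                   (t.pos[1] - detections[j][1]) ** 2
                   for t, j in zip(tracks, perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best

def step(tracks, detections):
    """One tracking cycle: predict all tracks, associate, then update."""
    for t in tracks:
        t.predict()
    for t, j in zip(tracks, associate(tracks, detections)):
        t.update(detections[j])
```

The brute-force assignment stands in for the assignment algorithm cited in the paper; a real system would use a polynomial-time solver (e.g. the Hungarian method) and validation gating before assignment.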

Cite this article as:
Masafumi Hashimoto, Yosuke Matsui, and Kazuhiko Takahashi, “Moving-Object Tracking with In-Vehicle Multi-Laser Range Sensors,” J. Robot. Mechatron., Vol.20, No.3, pp. 367-377, 2008.
