
JRM Vol.22 No.2 pp. 221-229 (2010)
doi: 10.20965/jrm.2010.p0221

Paper:

Multiple-Person Tracking by Multiple Cameras and Laser Range Scanners in Indoor Environments

Hiroshi Noguchi*, Taketoshi Mori*, Takashi Matsumoto**, Masamichi Shimosaka*, and Tomomasa Sato*

*The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan

**Mitsubishi Research Institute, Inc., 2-3-6 Otemachi, Chiyoda-ku, Tokyo 100-8141, Japan

Received: September 30, 2009
Accepted: January 27, 2010
Published: April 20, 2010
Keywords: multiple-person tracking, particle filter, laser range scanner and camera
Abstract
In this paper, we propose a method for multiple-person tracking using cameras and laser range scanners. Our method estimates the 3D positions of the human body and head and labels them with their identities. Individual particle filters track each person correctly by integrating information from laser range scanners with target-specific information from cameras, thus compensating for the weak points of each sensor. We also develop a particle filter framework that simultaneously tracks the human head using the estimated body position. Experimental results demonstrate the effectiveness and robustness of the proposed method in tracking multiple persons with multiple scanners and cameras.
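As a rough illustration of the fusion scheme the abstract describes (one particle filter per tracked person, with particle weights combining a laser-range likelihood and a target-specific camera likelihood before resampling), the following minimal Python sketch is given. It is not the authors' implementation: the class name, the random-walk motion model, the likelihood parameters, and the `hist_similarity_fn` camera cue are illustrative assumptions, and the head-tracking stage built on the estimated body position is omitted.

```python
# Minimal sketch of per-person particle filtering with laser/camera fusion.
# All parameters and interfaces are illustrative assumptions, not the paper's.
import numpy as np

class PersonParticleFilter:
    def __init__(self, init_xy, n_particles=500, motion_std=0.05):
        self.n = n_particles
        self.motion_std = motion_std  # random-walk noise on the floor plane [m]
        self.particles = np.tile(init_xy, (n_particles, 1)) + \
            np.random.normal(0.0, 0.1, (n_particles, 2))
        self.weights = np.full(n_particles, 1.0 / n_particles)

    def predict(self):
        # Propagate particles with a simple random-walk motion model.
        self.particles += np.random.normal(0.0, self.motion_std, self.particles.shape)

    def laser_likelihood(self, laser_points, sigma=0.2):
        # Weight particles by proximity to the nearest laser foreground point.
        if len(laser_points) == 0:
            return np.ones(self.n)  # no laser evidence: uninformative likelihood
        d = np.linalg.norm(self.particles[:, None, :] - laser_points[None, :, :], axis=2)
        return np.exp(-d.min(axis=1) ** 2 / (2.0 * sigma ** 2))

    def camera_likelihood(self, hist_similarity_fn):
        # hist_similarity_fn maps a floor position to a [0, 1] similarity score
        # of the target-specific color histogram in the camera image (assumed given).
        return np.array([hist_similarity_fn(p) for p in self.particles])

    def update(self, laser_points, hist_similarity_fn):
        # Fuse the two sensor likelihoods multiplicatively, then resample.
        lik = self.laser_likelihood(laser_points) * self.camera_likelihood(hist_similarity_fn)
        self.weights *= lik + 1e-12
        self.weights /= self.weights.sum()
        idx = np.random.choice(self.n, self.n, p=self.weights)  # multinomial resampling
        self.particles = self.particles[idx]
        self.weights.fill(1.0 / self.n)

    def estimate(self):
        return self.particles.mean(axis=0)  # estimated body position (x, y)

# Example usage with a dummy camera cue:
pf = PersonParticleFilter(init_xy=np.array([1.0, 2.0]))
pf.predict()
pf.update(laser_points=np.array([[1.05, 2.1]]),
          hist_similarity_fn=lambda p: 1.0)
print(pf.estimate())
```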
Cite this article as:
H. Noguchi, T. Mori, T. Matsumoto, M. Shimosaka, and T. Sato, “Multiple-Person Tracking by Multiple Cameras and Laser Range Scanners in Indoor Environments,” J. Robot. Mechatron., Vol.22 No.2, pp. 221-229, 2010.
