
IJAT Vol.5 No.6, pp. 924-931 (2011)
doi: 10.20965/ijat.2011.p0924

Paper:

Improvement of Human Tracking in Stereoscopic Environment Using Subtraction Stereo with Shadow Detection

Kenji Terabayashi*, Yuma Hoshikawa**, Alessandro Moro*,
and Kazunori Umeda*

*Department of Precision Mechanics, Faculty of Science and Engineering, Chuo University / CREST, JST, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan

**Toyota Motor Corporation, 375-1 Imazato, Susono, Shizuoka 410-1104, Japan

Received: May 24, 2011
Accepted: September 22, 2011
Published: November 5, 2011
Keywords: subtraction stereo, human tracking, stereoscopic environment, shadow detection
Abstract
We propose a combination of subtraction stereo and shadow detection that improves people tracking in stereoscopic environments. Subtraction stereo is a stereo matching method that is fast and robust against the correspondence problem, one of the most serious issues in computer vision, because it restricts the search range of matching to foreground regions. Shadow detection yields adequate foreground regions for the tracked people by removing cast shadows, which leads to accurate three-dimensional measurement of their positions. By focusing on the disparity images obtained by subtraction stereo, people can be detected easily using standard labeling. Subtraction stereo also measures object sizes directly, without requiring geometric information about the tracking environment, which makes the tracking system easy to install. To track multiple passing people, we use an extended Kalman filter to address the occlusion problem commonly encountered in crowded environments. The proposed method is verified by experiments in unknown stereoscopic environments.
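As a rough illustration of the pipeline described above, the sketch below assembles standard OpenCV components: MOG2 background subtraction with its built-in shadow flag stands in for the paper's own shadow-detection method, a block-matching disparity map is masked to the foreground (the actual method restricts the matching search itself, which is what makes it fast), and labeled foreground blobs are converted to 3D positions from their disparity. The function names, thresholds, and the focal-length/baseline parameters are assumptions for illustration, not the authors' code; extended-Kalman-filter tracking is omitted.

```python
# Illustrative sketch only: standard OpenCV stand-ins for the components
# described in the abstract, not the authors' implementation.
import cv2
import numpy as np

# Per-camera background subtractors. With detectShadows=True, MOG2 labels
# shadow pixels 127 and foreground pixels 255, so cast shadows can be
# discarded (a stand-in for the paper's own shadow-detection approach).
bgs_left = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
bgs_right = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)


def foreground_mask(bgs, frame):
    """Foreground mask with cast shadows (label 127) removed."""
    mask = bgs.apply(frame)
    return np.where(mask == 255, 255, 0).astype(np.uint8)


def subtraction_stereo(left, right):
    """Disparity restricted to foreground regions.

    Note: true subtraction stereo restricts the matching *search* to the
    foreground for speed; here we simply mask a full disparity map.
    """
    fg_left = foreground_mask(bgs_left, left)
    fg_right = foreground_mask(bgs_right, right)   # kept for symmetry/debugging
    gray_l = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    disparity = stereo.compute(gray_l, gray_r).astype(np.float32) / 16.0
    disparity[fg_left == 0] = 0.0
    return disparity, fg_left


def detect_people(disparity, fg_mask, focal_px, baseline_m, min_area=500):
    """Label foreground blobs and estimate a 3D position for each person."""
    n_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(fg_mask)
    people = []
    for i in range(1, n_labels):                   # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] < min_area:
            continue
        d = float(np.median(disparity[labels == i]))
        if d <= 0:                                 # no valid disparity in blob
            continue
        z = focal_px * baseline_m / d              # depth from disparity
        u, v = centroids[i]
        people.append((u, v, z))                   # image position + depth
    return people
```

Each detection produced this way could then be fed to a per-person tracker (the paper uses an extended Kalman filter) to maintain identities through occlusions.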
Cite this article as:
K. Terabayashi, Y. Hoshikawa, A. Moro, and K. Umeda, “Improvement of Human Tracking in Stereoscopic Environment Using Subtraction Stereo with Shadow Detection,” Int. J. Automation Technol., Vol.5 No.6, pp. 924-931, 2011.
