JRM Vol.23 No.1 pp. 53-65
doi: 10.20965/jrm.2011.p0053


An Intelligent High-Frame-Rate Video Logging System for Abnormal Behavior Analysis

Yao-Dong Wang*, Idaku Ishii*, Takeshi Takaki*, and Kenji Tajima**

*Robotics Laboratory, Graduate School of Engineering, Hiroshima University, 1-4-1 Kagamiyama, Higashi-hiroshima, Hiroshima 739-8527, Japan

**Photron Limited, 1-1-8 Fujimi, Chiyoda-ku, Tokyo 102-0071, Japan

Received: April 2, 2010
Accepted: June 25, 2010
Published: February 20, 2011
Keywords: real-time vision, high-speed vision, abnormal behavior detection, high-frame-rate video logging
This paper introduces a high-speed vision system called IDP Express, which executes real-time image processing and High-Frame-Rate (HFR) video recording simultaneously. In IDP Express, 512×512-pixel images from two camera heads, together with the results processed on a dedicated FPGA (Field Programmable Gate Array) board, are transferred to standard PC memory at a rate of 1000 fps or more. Owing to this simultaneous HFR video processing and recording, IDP Express can be used as an intelligent video logging system for long-term analysis of high-speed phenomena. In this paper, a real-time abnormal behavior detection algorithm was implemented on IDP Express to capture HFR videos of the crucial moments of unpredictable abnormal behaviors in high-speed periodic motions. Several experiments were performed on a high-speed slider machine operating repetitively at a frequency of 15 Hz, and videos of the abnormal behaviors were automatically recorded, verifying the effectiveness of our intelligent HFR video logging system.
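The logging scheme the abstract describes — score each incoming frame against normal periodic motion and flush a pre-event frame buffer to storage when the score indicates an abnormality — can be sketched as below. This is a minimal illustration, not the paper's implementation: the class name `HFRVideoLogger`, the buffer length, the threshold, and the mean-absolute-difference score are all illustrative stand-ins for the detection algorithm described in the paper.

```python
from collections import deque

class HFRVideoLogger:
    """Hypothetical sketch of trigger-based HFR video logging:
    frames accumulate in a ring buffer, and when the per-frame
    abnormality score exceeds a threshold, the buffered clip
    around the event is saved."""

    def __init__(self, buffer_len=1000, threshold=0.5):
        self.buffer = deque(maxlen=buffer_len)  # pre-event frames
        self.threshold = threshold
        self.logged_clips = []                  # saved abnormal clips

    def abnormality_score(self, frame, reference):
        # Toy score: mean absolute difference from a reference frame
        # of the normal periodic motion (a stand-in for the paper's
        # detection algorithm, which is not reproduced here).
        return sum(abs(a - b) for a, b in zip(frame, reference)) / len(frame)

    def process(self, frame, reference):
        self.buffer.append(frame)
        if self.abnormality_score(frame, reference) > self.threshold:
            # Abnormality detected: flush the buffered clip to the log.
            self.logged_clips.append(list(self.buffer))
            self.buffer.clear()
        return len(self.logged_clips)
```

In use, normal frames only fill the buffer; the first frame whose score crosses the threshold causes the buffered clip (the frames leading up to the event) to be saved, which is what lets the system capture the moments just before an unpredictable abnormality.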
Cite this article as:
Y.-D. Wang, I. Ishii, T. Takaki, and K. Tajima, “An Intelligent High-Frame-Rate Video Logging System for Abnormal Behavior Analysis,” J. Robot. Mechatron., Vol.23 No.1, pp. 53-65, 2011.
References:
  [1] W. Hu, T. Tan, L. Wang, and S. Maybank, “A Survey on Visual Surveillance of Object Motion and Behaviors,” IEEE Trans. Syst. Man Cyber. Part C, Vol.34, No.3, pp. 334-352, 2004.
  [2] C. Wren, A. Azarbayejani, T. Darrell, and A. Pentland, “Pfinder: Real-Time Tracking of the Human Body,” IEEE Trans. Pattern Anal. Machine Intell., Vol.19, No.7, pp. 780-785, 1997.
  [3] C. Stauffer and E. L. Grimson, “Learning Patterns of Activity Using Real-Time Tracking,” IEEE Trans. Pattern Anal. Machine Intell., Vol.22, No.8, pp. 747-757, 2000.
  [4] A. Elgammal, R. Duraiswami, D. Harwood, and L. S. Davis, “Background and Foreground Modeling Using Nonparametric Kernel Density for Visual Surveillance,” Proc. IEEE, Vol.90, pp. 1151-1163, 2003.
  [5] T. Nanri and N. Otsu, “Unsupervised Abnormality Detection in Video Surveillance,” Proc. IAPR Conf. Mach. Vis. Appl., pp. 574-577, 2005.
  [6] E. B. Ermis, V. Saligrama, P.-M. Jodoin, and J. Konrad, “Motion Segmentation and Abnormal Behavior Detection via Behavior Clustering,” Proc. IEEE Int. Conf. Image Process., pp. 769-772, 2008.
  [7] T.-H. Yu and Y.-S. Moon, “Unsupervised Real-Time Unusual Behavior Detection for Biometric-Assisted Visual Surveillance,” Proc. Int. Conf. on Advances in Biometrics, pp. 1019-1029, 2009.
  [8] T. M. Bernard, B. Y. Zavidovique, and F. J. Devos, “A Programmable Artificial Retina,” IEEE J. Solid-State Circ., Vol.28, No.7, pp. 789-797, 1993.
  [9] J. E. Eklund, C. Svensson, and A. Astrom, “VLSI Implementation of a Focal Plane Image Processor – A Realization of the Near-Sensor Image Processing Concept,” IEEE Trans. VLSI Syst., Vol.4, No.3, pp. 322-335, 1996.
  [10] T. Komuro, S. Kagami, and M. Ishikawa, “A Dynamically Reconfigurable SIMD Processor for a Vision Chip,” IEEE J. Solid-State Circ., Vol.39, No.1, pp. 265-268, 2004.
  [11] I. Ishii, K. Yamamoto, and M. Kubozono, “Higher Order Autocorrelation Vision Chip,” IEEE Trans. Electron Dev., Vol.53, No.8, pp. 1797-1804, 2006.
  [12] Y. Watanabe, T. Komuro, and M. Ishikawa, “955-Fps Real-Time Shape Measurement of a Moving/Deforming Object Using High-Speed Vision for Numerous-Point Analysis,” Proc. IEEE Int. Conf. Robot. Autom., pp. 3192-3197, 2007.
  [13] S. Hirai, M. Zakoji, A. Masubuchi, and T. Tsuboi, “Realtime FPGA-Based Vision System,” J. of Robotics and Mechatronics, Vol.17, No.4, pp. 401-409, 2005.
  [14] I. Ishii, T. Taniguchi, R. Sukenobe, and K. Yamamoto, “Development of High-Speed and Real-Time Vision Platform, H3 Vision,” Proc. IEEE/RSJ Int. Conf. Intelli. Rob. Sys., pp. 3671-3678, 2009.
  [15] I. Ishii, Y. Nakabo, and M. Ishikawa, “Target Tracking Algorithm for 1ms Visual Feedback System Using Massively Parallel Processing,” Proc. IEEE Int. Conf. Robot. Autom., pp. 2309-2314, 1996.
  [16] A. Namiki, Y. Imai, M. Ishikawa, and M. Kaneko, “Development of a High-Speed Multifingered Hand System and Its Application to Catching,” Proc. IEEE/RSJ Int. Conf. Intelli. Rob. Sys., pp. 2666-2671, 2003.
  [17] Y. Nakamura, K. Kishi, and H. Kawakami, “Heartbeat Synchronization for Robotic Cardiac Surgery,” Proc. IEEE Int. Conf. Robot. Autom., pp. 2014-2019, 2001.
  [18] Y. Nie, I. Ishii, K. Yamamoto, K. Orito, and H. Matsuda, “Real-Time Scratching Behavior Quantification System for Laboratory Mice Using High-Speed Vision,” J. Real-Time Image Proc., Vol.4, No.2, pp. 181-190, 2009.
  [19] I. Ishii, T. Tatebe, Q. Gu, Y. Moriue, T. Takaki, and K. Tajima, “2000 fps Real-Time Vision System with High-Frame-Rate Video Recording,” Proc. IEEE Int. Conf. Robot. Autom., 2010.
