
J. Robot. Mechatron., Vol.30 No.1, pp. 117-127, 2018
doi: 10.20965/jrm.2018.p0117

Paper:

A High-Speed Vision System with Multithread Automatic Exposure Control for High-Dynamic-Range Imaging

Xianwu Jiang*, Qingyi Gu**, Tadayoshi Aoyama***, Takeshi Takaki*, and Idaku Ishii*

*Robotics Laboratory, Graduate School of Engineering, Hiroshima University
1-4-1 Kagamiyama, Higashi-Hiroshima, Hiroshima 739-8527, Japan

**Research Center of Precision Sensing and Control, Institute of Automation, Chinese Academy of Sciences
No.95 Zhongguancun East Road, Haidian District, Beijing 100190, China

***Department of Micro-Nano Mechanical Science and Engineering, Nagoya University
1 Furo-cho, Chikusa-ku, Nagoya 464-8603, Japan

Received: July 7, 2017
Accepted: December 5, 2017
Published: February 20, 2018
Keywords: high-speed vision, AE control, color tracking, HDR imaging
Abstract

In this study, we develop a real-time high-frame-rate vision system with frame-by-frame automatic exposure (AE) control that synthesizes multiple images captured with different exposure times into a single high-dynamic-range (HDR) image, for scenes whose illumination changes dynamically. By accelerating video capture and processing for time-division multithread AE control to the millisecond level, the proposed system can virtually function as multiple AE cameras, each with a different exposure time. The system captures color HDR images of 512 × 512 pixels in real time at 500 fps by synthesizing four 8-bit color images with different exposure times, taken at consecutive frames at 2 ms intervals, using pixel-level parallel processing accelerated by a GPU (Graphics Processing Unit) board. Experimental results for scenes with large changes in illumination confirm the real-time HDR imaging performance of the proposed system.
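The multi-exposure synthesis described above can be sketched as a Debevec-style weighted average of per-exposure radiance estimates. The following is a minimal CPU illustration under the assumption of a linear camera response; the function and weight names are illustrative and do not reproduce the authors' GPU implementation, which performs this per pixel at 500 fps.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge differently exposed 8-bit images into one HDR radiance map.

    Each image is divided by its exposure time to estimate scene radiance,
    and the estimates are averaged with a "hat" weight that trusts mid-tone
    pixels and discounts under- and over-exposed ones (linear response assumed).
    """
    images = [np.asarray(im, dtype=np.float64) for im in images]
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for im, t in zip(images, exposure_times):
        w = 1.0 - 2.0 * np.abs(im / 255.0 - 0.5)  # hat weight: 1 at mid-gray, 0 at 0/255
        num += w * (im / t)                        # radiance estimate from this exposure
        den += w
    return num / np.maximum(den, 1e-8)             # avoid division by zero
```

For example, a pixel saturated in the longest exposure still gets a valid radiance estimate from a shorter one, which is precisely why four exposures at consecutive 2 ms frames extend the effective dynamic range.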

HDR imaging for a dynamic scene


Cite this article as:
X. Jiang, Q. Gu, T. Aoyama, T. Takaki, and I. Ishii, “A High-Speed Vision System with Multithread Automatic Exposure Control for High-Dynamic-Range Imaging,” J. Robot. Mechatron., Vol.30 No.1, pp. 117-127, 2018.
