
JRM Vol.27 No.4 pp. 365-373
doi: 10.20965/jrm.2015.p0365
(2015)

Paper:

Color Extraction Using Multiple Photographs Taken with Different Exposure Time in RWRC

Kenji Yamauchi, Naoki Akai, and Koichi Ozaki

Graduate School of Engineering, Utsunomiya University
7-1-2 Yoto, Utsunomiya-City, Tochigi 321-8585, Japan

Received: February 3, 2015
Accepted: April 21, 2015
Published: August 20, 2015
Keywords: color extraction, exposure time, color transition, Real World Robot Challenge
Abstract

Color extraction result in RWRC

Extracting the color of a target object from images captured under varying illumination conditions, such as outdoors, is difficult because the object's apparent color changes easily. The novel color extraction method we propose extracts the exact color of a target object using multiple photographs taken with different exposure times. The object's apparent color transits as the exposure time changes, and this transition remains consistent as long as the environmental light source does not change significantly, which holds in most outdoor situations. We first demonstrate this through an experimental analysis, then detail our proposed method. Our method evaluates the color transition and realizes precise color extraction of target objects outdoors. We apply this method to an orange cap in the Tsukuba Real-World Robot Challenge. Through experiments, we show that the cap is detected accurately in different environments, and we discuss the method's effectiveness and usefulness in the real world.
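As an illustration of the idea in the abstract, the sketch below (not the authors' implementation; all names, values, and the simple linear exposure model are assumptions) flags pixels whose RGB change between a short- and a long-exposure image matches the change observed for a reference target color, such as the orange cap:

```python
import numpy as np

def transition_mask(img_short, img_long, ref_short, ref_long, tol=20.0):
    """Flag pixels whose RGB transition between a short- and a
    long-exposure image matches a reference color's transition.
    Illustrative only; assumes the scene and light source are static
    between the two shots and pixels are not saturated."""
    # Transition vector of the reference target color across exposures
    ref_delta = ref_long.astype(float) - ref_short.astype(float)
    # Per-pixel transition vectors between the two exposures
    delta = img_long.astype(float) - img_short.astype(float)
    # Accept pixels whose transition is close to the reference transition
    dist = np.linalg.norm(delta - ref_delta, axis=-1)
    return dist < tol

# Synthetic example: pixel response assumed to grow linearly with exposure
rng = np.random.default_rng(0)
img_short = rng.integers(0, 80, size=(4, 4, 3)).astype(float)
img_long = np.clip(img_short * 2.5, 0, 255)   # simulated longer exposure
ref_short = np.array([60.0, 30.0, 10.0])      # assumed target color (short)
ref_long = np.clip(ref_short * 2.5, 0, 255)   # assumed target color (long)
img_short[0, 0] = ref_short                   # plant the target pixel
img_long[0, 0] = ref_long
mask = transition_mask(img_short, img_long, ref_short, ref_long, tol=1.0)
```

Here the planted pixel at (0, 0) is recovered because its transition across exposures matches the reference transition exactly, while other pixels, even ones whose color in a single image happens to be similar, are rejected unless they transit the same way.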

Cite this article as:
K. Yamauchi, N. Akai, and K. Ozaki, “Color Extraction Using Multiple Photographs Taken with Different Exposure Time in RWRC,” J. Robot. Mechatron., Vol.27, No.4, pp. 365-373, 2015.
References
  [1] J. Eguchi et al., “Development of the autonomous mobile robot for target-searching in urban areas in the Tsukuba Challenge 2013,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 166-176, 2014.
  [2] K. Yamauchi et al., “Person detection method based on color layout in real world robot challenge 2013,” J. of Robotics and Mechatronics, Vol.26, No.2, pp. 151-157, 2014.
  [3] K. Yamauchi et al., “Precise color extraction method based on color transition due to change in exposure time,” IEEE/SICE Int. Symposium on System Integration, pp. 275-280, 2014.
  [4] S.-S. Hao et al., “Smoothing algorithms for clip-and-paste model-based video coding,” IEEE Trans. on Consumer Electronics, Vol.45, No.2, pp. 427-435, 1999.
  [5] R. Kawakami et al., “Consistent surface color for texturing large objects in outdoor scenes,” Proc. of the Tenth IEEE Int. Conf. on Computer Vision, Vol.2, pp. 1200-1207, 2005.
  [6] G. Buchsbaum, “A spatial processor model for object colour perception,” J. of the Franklin Institute, Vol.310, No.1, pp. 1-26, 1980.
  [7] E. Y. Lam, “Combining gray world and Retinex theory for automatic white balance in digital photography,” Proc. of the Ninth Int. Symposium on Consumer Electronics, pp. 134-139, 2005.
  [8] R. Kawakami et al., “A robust framework to estimate surface color from changing illumination,” Proc. of the Sixth Asian Conf. on Computer Vision, Vol.2, pp. 1026-1031, 2004.
  [9] G. D. Finlayson et al., “Color constancy under varying illumination,” Proc. of IEEE Int. Conf. on Computer Vision, pp. 720-725, 1995.
  [10] P. E. Debevec et al., “Recovering high dynamic range radiance maps from photographs,” Proc. of the 24th Annual Conf. on Computer Graphics and Interactive Techniques, pp. 369-378, 1997.

