Fusion of Multiple Ultrasonic Sensor Data and Image Data for Measuring an Object’s Motion
Kazunori Umeda*, Jun Ota**, and Hisayuki Kimura***
*Dept. Precision Mechanics, Faculty of Science and Engineering, Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan
**Dept. of Precision Engineering, Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
***Kanagawa Prefectural Shoko Commercial and Technical High School, 743 Imai-cho, Hodogaya-ku, Yokohama 240-0035, Japan
Robot sensing requires two types of observation: intensive and wide-angle. We use multiple ultrasonic sensors for intensive observation and an image sensor for wide-angle observation to measure a moving object’s motion, applying two kinds of sensor fusion: one fuses the data from multiple ultrasonic sensors, and the other fuses the ultrasonic and image data. The fusion of multiple ultrasonic sensor data exploits the object’s movement from the measurement range of one ultrasonic sensor into that of another. Both fusion schemes are formulated within a Kalman filter framework. Simulations and experiments demonstrate their effectiveness and applicability to an actual robot system.
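The Kalman filter formulation itself is not detailed in this abstract. The following is a minimal sketch of how such a fusion loop could look, assuming a planar constant-velocity object model, scalar ultrasonic range readings, and an image-derived position observation; all matrices, noise values, and function names are illustrative assumptions, not the paper’s implementation.

```python
import numpy as np

# Illustrative Kalman-filter fusion sketch (not the paper's implementation).
# State x = [px, py, vx, vy]^T with a constant-velocity model.
# Each ultrasonic sensor returns a range to the object (nonlinear in the
# state, handled by first-order linearization); the image sensor is modeled
# here as giving a position observation.

dt = 0.1                      # sampling period [s] (assumed)
F = np.array([[1, 0, dt, 0],  # state transition: constant velocity
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
Q = np.eye(4) * 1e-3          # process noise covariance (assumed)

def predict(x, P):
    """Time update with the constant-velocity model."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update_range(x, P, z, sensor_pos, r_var=1e-4):
    """Measurement update for one ultrasonic range reading z (EKF-style)."""
    dx, dy = x[0] - sensor_pos[0], x[1] - sensor_pos[1]
    rng = np.hypot(dx, dy)
    H = np.array([[dx / rng, dy / rng, 0.0, 0.0]])  # Jacobian of the range
    S = H @ P @ H.T + r_var                         # innovation covariance
    K = P @ H.T / S                                 # Kalman gain
    x = x + (K * (z - rng)).ravel()
    P = (np.eye(4) - K @ H) @ P
    return x, P

def update_image(x, P, z_xy, r_var=1e-2):
    """Measurement update for an image-derived position observation."""
    H = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0]])
    R = np.eye(2) * r_var
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (z_xy - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

In such a loop, `predict` runs every cycle, `update_range` is called whenever the object falls inside a given ultrasonic sensor’s measurement range, and `update_image` is called when the wide-angle image observation is available; the object’s passage from one sensor’s range into another’s simply contributes successive range updates from different sensor positions.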
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.
Copyright © 2005 by Fuji Technology Press Ltd. and Japan Society of Mechanical Engineers. All rights reserved.