Paper:
Communication Robot Based on Image Processing and Voice Recognition
Noriyuki Kawarazaki and Tadashi Yoshidome
Department of Robotics and Mechatronics, Kanagawa Institute of Technology, 1030 Shimo-Ogino, Atsugi, Kanagawa 243-0292, Japan
This paper discusses a communication robot system based on image processing and voice recognition. We have developed the communication robot system Hakuen, which consists of a multimedia robot with stereo cameras, a wheeled mobile robot, and a PC with a microphone. What makes our robot unique is that it interacts with people in the same way human beings do. For example, the robot approaches a person and holds out its hand in response to predefined voice commands. The robot detects a person's face from skin-color pixel values in the color image. Since the system must calculate the distance between the robot and the person rapidly, we use the stereo disparity of the detected face region. Experimental results clarified the effectiveness of our system.
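A minimal sketch of the two computations the abstract mentions: a skin-color pixel test and a disparity-to-distance conversion for a rectified stereo pair. The thresholds and camera parameters below are illustrative assumptions, not the paper's actual values.

```python
def is_skin(r, g, b):
    """Classic per-pixel RGB skin-color rule (illustrative thresholds)."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and r - g > 15 and r > b)

def distance_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth Z = f * B / d for a rectified pinhole stereo pair.

    disparity_px: horizontal pixel offset between the two views,
    focal_px: focal length in pixels, baseline_m: camera separation in meters.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical camera: 700 px focal length, 0.10 m baseline.
# A 35 px disparity then corresponds to a 2.0 m distance.
print(distance_from_disparity(35, 700.0, 0.10))
```

In a full pipeline, the skin test would segment candidate face pixels in one camera image, and the disparity of the matched face region between the two cameras would give the robot-to-person distance.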
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.