Intelligent Sensor Fusion in Robotic Prosthetic Eye System
Jason J. Gu*, Max Meng**, Albert Cook***, and Peter Xiaoping Liu****
*Department of Electrical and Computer Engineering, Dalhousie University, Halifax, NS B3J 2X4, Canada
**Department of Electronic Engineering, The Chinese University of Hong Kong, Shatin, Hong Kong
***Dean, Faculty of Rehabilitation Medicine, University of Alberta, 3-48 Corbett Hall, Edmonton, AB T6G 2G4, Canada
****Department of Systems and Computer Engineering, Carleton University, Ottawa, ON, Canada K1S 5B6
Received: May 14, 2003; Accepted: March 20, 2004; Published: May 20, 2004
Keywords: robotics, neural networks, median filter, control system, sensor fusion
Abstract
This paper is concerned with the design, sensing, and control of a robotic prosthetic eye that moves horizontally in synchronization with the movement of the natural eye. Two generations of robotic prosthetic eye models have been developed. Theoretical issues in sensor failure detection and recovery, and the signal processing techniques used in sensor data fusion, are studied using statistical methods and artificial neural network based techniques. In addition, a practical control system is designed and implemented on microcontrollers to carry out natural eye movement detection and artificial robotic eye control. Simulation and experimental studies are performed, and the results are included to demonstrate the effectiveness of the research reported in this paper.
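To make the fusion idea concrete, the following is a minimal sketch (not the authors' implementation) of median-filter-based fusion of redundant eye-position sensor readings with a simple statistical residual test for sensor failure detection and recovery. The sensor count, the deviation threshold, and the fallback rule are illustrative assumptions; the actual system also employs neural network techniques and runs on microcontrollers.

```python
# Illustrative sketch only: median-filter fusion of redundant horizontal
# eye-angle readings with a residual-based failure check. All names,
# thresholds, and numbers below are assumptions for demonstration.

import statistics


def detect_failed_sensors(readings, threshold=5.0):
    """Flag sensors whose reading deviates from the median of all
    readings by more than `threshold` (assumed units: degrees)."""
    med = statistics.median(readings)
    return [abs(r - med) > threshold for r in readings]


def fuse_eye_angle(readings, threshold=5.0):
    """Return a fused horizontal eye angle from redundant readings.

    Sensors with a large residual relative to the median are treated
    as failed and excluded; the remaining readings are combined with
    a median filter.
    """
    failed = detect_failed_sensors(readings, threshold)
    healthy = [r for r, bad in zip(readings, failed) if not bad]
    if not healthy:  # all sensors flagged: fall back to the raw median
        healthy = readings
    return statistics.median(healthy), failed


if __name__ == "__main__":
    # Three redundant sensors tracking the natural eye; the third is
    # assumed to be stuck at an erroneous value (made-up numbers).
    samples = [12.1, 11.8, 47.0]
    angle, failed = fuse_eye_angle(samples)
    print(f"fused angle: {angle:.1f} deg, failed sensors: {failed}")
```

The median is used here because it is robust to a single outlying sensor, which is the property that makes median filtering attractive for failure recovery in redundant sensor arrangements.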
Cite this article as: J. Gu, M. Meng, A. Cook, and P. Liu, “Intelligent Sensor Fusion in Robotic Prosthetic Eye System,” J. Adv. Comput. Intell. Intell. Inform., Vol.8, No.3, pp. 313-323, 2004.