Intelligent Sensor Fusion in Robotic Prosthetic Eye System
Jason J. Gu*, Max Meng**, Albert Cook***, and Peter Xiaoping Liu****
*Department of Electrical and Computer Engineering Dalhousie University, Halifax, NS B3J 2X4, Canada
**Department of Electronic Engineering, The Chinese University of Hong Kong, Shatin, Hong Kong
***Dean of Faculty of Rehabilitation Medicine University of Alberta, 3-48 Corbett Hall, Edmonton, AB T6G 2G4, Canada
****Department of Systems and Computer Engineering, Carleton University, Ottawa, ON, Canada K1S 5B6
This paper concerns the design, sensing, and control of a robotic prosthetic eye that moves horizontally in synchronization with the natural eye. Two generations of robotic prosthetic eye models have been developed. Theoretical issues in sensor failure detection and recovery, and in the signal processing techniques used for sensor data fusion, are studied using statistical methods and artificial neural network based techniques. In addition, a practical control system is designed and implemented with microcontrollers to carry out the natural eye movement detection and artificial robotic eye control tasks. Simulation and experimental studies are performed, and the results are included to demonstrate the effectiveness of the research reported in this paper.
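The abstract does not state the fusion equations themselves; as a minimal illustration of the statistical sensor-fusion idea it mentions, the sketch below combines two redundant sensor readings by inverse-variance weighting, a standard statistical fusion rule. All function names and numeric values here are hypothetical, not taken from the paper.

```python
def fuse(x1, var1, x2, var2):
    """Fuse two independent estimates of the same quantity by
    inverse-variance weighting: the less noisy sensor gets more weight."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    x_fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    var_fused = 1.0 / (w1 + w2)  # fused variance is smaller than either input
    return x_fused, var_fused

# Hypothetical example: two horizontal eye-angle readings in degrees,
# with the second sensor four times less noisy than the first.
angle, var = fuse(10.0, 4.0, 12.0, 1.0)
```

A failed sensor in such a scheme can be modeled as one whose variance grows without bound, which drives its weight toward zero and effectively removes it from the fused estimate.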
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.