JACIII Vol.9 No.2 pp. 127-133
doi: 10.20965/jaciii.2005.p0127


Studies on an Electronic Analog of a Recurrent Neural Network with Retrieval Phase Weight Adaptations

Vishwanathan Mohan, Yashwant V. Joshi, Anand Itagi, and Garipelli Gangadhar

Dept. of Electronics and Telecommunication Engineering, SGGS Institute of Engineering and Technology, Vishnupuri, Nanded, Maharashtra State 431606, India

Received: October 20, 2004
Accepted: December 25, 2004
Published: March 20, 2005

Keywords: weight adaptation, retrieval, Hopfield model, capacity, character recognition

It is argued that weight adaptations even during the retrieval phase can greatly enhance the performance of a neurodynamic associative memory. Our simulations with an electronic implementation of an associative memory showed that extending the Hopfield dynamics with an appropriate adaptive law in the retrieval phase yields significant improvements in storage capacity and computational reliability. The weights, which are supposed to encode the information stored in the Hopfield neural network, are usually held constant once training/storage is complete. In our case, the weights also change during retrieval, losing information in the process but producing much better retrieval of stored patterns. We describe and characterize the functional elements comprising the network and the learning system, and present experimental results obtained by applying the network to character recognition under various noisy conditions. Stability issues arising from retrieval-phase weight adaptation, and the implications of using weights as transitory, intermediary variables, are briefly discussed.
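The abstract's central idea can be illustrated with a minimal sketch: a Hopfield network stored with the standard Hebbian rule, whose weights are additionally nudged during retrieval. The specific adaptive law used in the paper is not given in the abstract, so the Hebbian-style update toward the current state (with hypothetical rate `eta`) below is an assumption for illustration only.

```python
import numpy as np

def train_hebbian(patterns):
    # Standard Hebbian storage: W = (1/N) * sum of outer products,
    # with the diagonal zeroed (no self-connections).
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def retrieve(W, probe, eta=0.01, steps=50):
    # Hopfield retrieval dynamics, extended with a hypothetical
    # retrieval-phase weight adaptation (NOT the paper's actual law):
    # after each state update, the weights drift toward the Hebbian
    # outer product of the current state, overwriting stored
    # information but reinforcing the attractor being retrieved.
    s = probe.astype(float).copy()
    W = W.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0          # break ties toward +1
        W += eta * (np.outer(s, s) - W)
        np.fill_diagonal(W, 0.0)
    return s
```

A usage example: store one bipolar pattern, corrupt a few components of it, and run the adaptive retrieval; the network settles back to the stored pattern.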

Cite this article as:
V. Mohan, Y. Joshi, A. Itagi, and G. Gangadhar, “Studies on an Electronic Analog of a Recurrent Neural Network with Retrieval Phase Weight Adaptations,” J. Adv. Comput. Intell. Intell. Inform., Vol.9, No.2, pp. 127-133, 2005.