
JACIII Vol.11 No.7 pp. 780-786
doi: 10.20965/jaciii.2007.p0780
(2007)

Paper:

Neural Network Model for Word Sense Disambiguation Using Up/Down State and Morphoelectrotonic Transform

Norifumi Watanabe and Shun Ishizaki

Graduate School of Media and Governance, Keio University, 5233 Endo, Fujisawa, Kanagawa 252-8520, Japan

Received: January 16, 2007
Accepted: May 2, 2007
Published: September 20, 2007
Keywords: neural network, word sense disambiguation, up/down state, morphoelectrotonic transform
Abstract
We propose a neural network model for word sense disambiguation that uses up/down state neurons and the morphoelectrotonic transform. Relations between stimulus words and their associated words are implemented in the network using an associative ontology. The model disambiguates the senses of words in input sentences through the firing dynamics of the network. Whether a new link is created between two neurons is decided from the co-occurrence frequency of the two words corresponding to those neurons and the attenuation rate of the morphoelectrotonic potential between them. The distance assigned to the new link is obtained by a learning mechanism that computes the morphoelectrotonic transform from the morphoelectrotonic potential of the two neurons. By analyzing the learning behavior with average shortest path lengths and clustering coefficients, we show that the model has a small-world structure.
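To make the link-creation rule and the small-world analysis above concrete, the following is a minimal sketch, not taken from the paper: the co-occurrence counts, the thresholds, and the exponential attenuation of the morphoelectrotonic potential are assumptions made purely for illustration, and networkx stands in for the model's own learning mechanism.

```python
# Hypothetical sketch of the two measurable steps described in the abstract:
# (1) decide whether to link two word neurons from their co-occurrence
#     frequency and an assumed attenuation of morphoelectrotonic potential,
# (2) check the small-world indicators (average shortest path length and
#     clustering coefficient) on the resulting graph.
import math
import networkx as nx

# Toy co-occurrence counts between word pairs (illustrative values only).
cooccurrence = {
    ("bank", "money"): 12,
    ("bank", "river"): 7,
    ("money", "loan"): 9,
    ("river", "water"): 11,
    ("water", "loan"): 1,
}

COOC_THRESHOLD = 3         # assumed minimum co-occurrence for a candidate link
ATTENUATION_RATE = 0.5     # assumed decay constant of the potential per unit distance
POTENTIAL_THRESHOLD = 0.2  # assumed minimum surviving potential to keep the link

G = nx.Graph()
for (w1, w2), freq in cooccurrence.items():
    distance = 1.0 / freq                               # assumed: frequent pairs lie closer
    potential = math.exp(-ATTENUATION_RATE * distance)  # assumed exponential attenuation
    if freq >= COOC_THRESHOLD and potential >= POTENTIAL_THRESHOLD:
        G.add_edge(w1, w2, weight=distance)

# Small-world indicators mentioned in the abstract's analysis.
if G.number_of_nodes() > 1 and nx.is_connected(G):
    print("average shortest path length:", nx.average_shortest_path_length(G))
print("clustering coefficient:", nx.average_clustering(G))
```

In the paper the link distance itself comes out of the morphoelectrotonic transform and the learning mechanism; the exponential decay above merely stands in for that attenuation so the example stays self-contained.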
Cite this article as:
N. Watanabe and S. Ishizaki, “Neural Network Model for Word Sense Disambiguation Using Up/Down State and Morphoelectrotonic Transform,” J. Adv. Comput. Intell. Intell. Inform., Vol.11 No.7, pp. 780-786, 2007.

