Some novel real/complex-valued neural network models

Abstract

Traditional neuron models assume that a synapse is a lumped element represented by a scalar synaptic weight. To model biological neurons more faithfully, however, a synapse can be treated as a linear filter. Starting from this assumption, a new model of a continuous-time neuron is discussed, and it is shown how such a model leads to interesting neural networks. A continuous-time, complex-valued neuron is also discussed. It is further described how a synapse can be modeled as an FIR filter; this neuron model leads to practically useful neural networks. A novel continuous-time associative memory is proposed, and an approach to establishing the convergence of its state is discussed. Various interesting generalizations of these neural networks are described.
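
To make the FIR-synapse idea concrete, the sketch below (not taken from the paper; names, shapes, and the tanh activation are illustrative assumptions) shows a discrete-time neuron in which each synapse holds a short vector of FIR taps, so the neuron's output at time t depends on a window of past inputs rather than only the current input times a scalar weight.

```python
# Minimal sketch, assuming a discrete-time neuron with FIR-filter synapses.
# Each synapse stores a tap vector; the synaptic contribution is the
# convolution of that tap vector with the corresponding input signal.
import numpy as np

def fir_neuron_output(x, taps, activation=np.tanh):
    """x    : (n_inputs, T) input signals over time
       taps : (n_inputs, n_taps) FIR coefficients, one filter per synapse
       Returns a length-T array with the neuron's output at each step."""
    n_inputs, T = x.shape
    _, n_taps = taps.shape
    y = np.zeros(T)
    for t in range(T):
        s = 0.0
        for i in range(n_inputs):
            for k in range(n_taps):
                if t - k >= 0:
                    # FIR synapse: weighted sum over a window of past inputs
                    s += taps[i, k] * x[i, t - k]
        y[t] = activation(s)
    return y

# Hypothetical usage: two input signals, 3-tap synapses
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 50))
taps = 0.5 * rng.standard_normal((2, 3))
print(fir_neuron_output(x, taps)[:5])
```

Setting n_taps to 1 recovers the traditional scalar-weight synapse, which is one way to see the FIR model as a strict generalization of the lumped-element assumption.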

Citation (APA)

Garimella, R. (2006). Some novel real/complex-valued neural network models. In Computational Intelligence, Theory and Applications: International Conference 9th Fuzzy Days in Dortmund, Germany, Sept. 18-20, 2006 Proceedings (pp. 473–483). Springer Berlin Heidelberg. https://doi.org/10.1007/3-540-34783-6_47
