High order eigentensors as symbolic rules in competitive learning

Abstract

We discuss properties of high order neurons in competitive learning. In such neurons, geometric shapes replace the role of classic 'point' neurons in neural networks. Complex analytical shapes are modeled by replacing the classic synaptic weight of the neuron by high-order tensors in homogeneous coordinates. Such neurons permit not only mapping of the data domain but also decomposition of some of its topological properties, which may reveal symbolic structure of the data. Moreover, eigentensors of the synaptic tensors reveal the coefficients of polynomial rules that the network is essentially carrying out. We show how such neurons can be formulated to follow the maximum-correlation activation principle and permit simple local Hebbian learning. We demonstrate decomposition of spatial arrangements of data clusters including very close and partially overlapping clusters, which are difficult to separate using classic neurons.
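The abstract's core idea can be illustrated with a minimal sketch (not the authors' implementation): a second-order neuron whose synaptic weight is a symmetric tensor over homogeneous coordinates [x, y, 1], with activation given by the quadratic form (maximum-correlation principle), a local Hebbian update on the input's outer product, and an eigendecomposition of the learned tensor whose eigenvectors give polynomial-rule coefficients. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def homogeneous(x):
    """Lift a point into homogeneous coordinates: [x, y] -> [x, y, 1]."""
    return np.append(x, 1.0)

def activation(W, x):
    """Maximum-correlation activation: quadratic form h^T W h."""
    h = homogeneous(x)
    return h @ W @ h

def hebbian_update(W, x, lr=0.1):
    """Local Hebbian rule: correlate the tensor with the input's
    outer product, then normalize to keep W bounded."""
    h = homogeneous(x)
    W = W + lr * np.outer(h, h)
    return W / np.linalg.norm(W)

# Train a single second-order neuron on one synthetic cluster.
rng = np.random.default_rng(0)
W = np.eye(3)
for _ in range(200):
    x = rng.normal([2.0, -1.0], 0.1)  # samples near (2, -1)
    W = hebbian_update(W, x)

# Eigentensors of the (symmetric) synaptic tensor: each eigenvector
# [a, b, c] can be read as coefficients of a polynomial rule
# a*x + b*y + c in the data coordinates.
eigvals, eigvecs = np.linalg.eigh(W)
```

After training, the activation is largest for inputs near the cluster, and the dominant eigenvector aligns with the cluster's homogeneous direction — a toy analogue of the symbolic decomposition the paper describes.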

Citation (APA)

Lipson, H., & Siegelmann, H. T. (2000). High order eigentensors as symbolic rules in competitive learning. In Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 1778, pp. 286–297). Springer-Verlag. https://doi.org/10.1007/10719871_20
