
Neural Computation

April 2007, Vol. 19, No. 4, Pages 956-973
(doi: 10.1162/neco.2007.19.4.956)
© 2007 Massachusetts Institute of Technology
Information and Topology in Attractor Neural Networks

A wide range of networks, including those with small-world topology, can be modeled by the connectivity ratio and the randomness of the links. Both the learning and the attractor abilities of a neural network can be measured by the mutual information (MI) as a function of the load and of the overlap between patterns and retrieval states. In this letter, we use MI to search for the topology that optimizes the storage and attractor properties of an Amari-Hopfield network. We find that while optimal storage implies an extremely diluted topology, a large basin of attraction requires moderate levels of connectivity. This optimal topology is related to the clustering and path length of the network. We also build a diagram of the dynamical phases reached from random or local initial overlaps and show that very diluted networks lose their attractor ability.
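The setting described above can be illustrated with a minimal simulation sketch. The code below is not the authors' implementation; it assumes a Watts-Strogatz-style construction in which each neuron receives K incoming links, initially to its nearest ring neighbors and rewired at random with probability omega (the "randomness"), with K/N playing the role of the connectivity ratio. Hebbian weights are restricted to that topology, retrieval runs by synchronous sign updates from a noisy cue, and the MI is approximated per bit by treating retrieval as a binary symmetric channel with flip probability (1 - m)/2 for overlap m; the paper's exact MI expression also involves the load and may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def small_world_mask(N, K, omega):
    """Directed connectivity: each neuron gets K incoming links, starting
    from the K nearest ring neighbours; each link is rewired to a
    uniformly random neuron with probability omega."""
    mask = np.zeros((N, N), dtype=bool)
    half = K // 2
    for i in range(N):
        for d in range(-half, half + 1):
            if d == 0:
                continue
            j = (i + d) % N
            if rng.random() < omega:          # rewire this link at random
                j = int(rng.integers(N))
                while j == i or mask[i, j]:
                    j = int(rng.integers(N))
            mask[i, j] = True
    return mask

def hebbian_weights(patterns, mask):
    """Standard Hebb rule, pruned to the given topology."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W * mask

def retrieve(W, state, steps=20):
    """Synchronous zero-temperature dynamics: s <- sign(W s)."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1.0               # break ties deterministically
    return state

def overlap(pattern, state):
    return float(pattern @ state) / len(pattern)

def mutual_information(m):
    """Per-bit MI (bits) between stored and retrieved bit, assuming a
    binary symmetric channel with flip probability (1 - |m|) / 2."""
    p = (1.0 + abs(m)) / 2.0
    h = 0.0
    for q in (p, 1.0 - p):
        if q > 0.0:
            h -= q * np.log2(q)
    return 1.0 - h

# Illustrative parameters (not taken from the paper).
N, K, P = 400, 40, 4                          # neurons, links/neuron, patterns
patterns = rng.choice([-1.0, 1.0], size=(P, N))
mask = small_world_mask(N, K, omega=0.3)
W = hebbian_weights(patterns, mask)

# Cue with 15% of the bits of pattern 0 flipped, then let the net relax.
noisy = patterns[0] * rng.choice([1.0, -1.0], size=N, p=[0.85, 0.15])
final = retrieve(W, noisy)
m = overlap(patterns[0], final)
print(f"overlap = {m:.3f}, MI = {mutual_information(m):.3f} bits")
```

Sweeping K (the dilution) and omega (the randomness) in such a sketch, and recording m and the MI, is one way to visualize the trade-off the abstract describes between storage and basin size.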