
Neural Computation

December 1, 2004, Vol. 16, No. 12, Pages 2597-2637
(doi: 10.1162/0899766042321805)
© 2004 Massachusetts Institute of Technology
Mean Field and Capacity in Realistic Networks of Spiking Neurons Storing Sparsely Coded Random Memories

Mean-field (MF) theory is extended to realistic networks of spiking neurons that store, in their synaptic couplings, a set of randomly chosen stimuli of a given low coding level. The underlying synaptic matrix is the result of a generic, slow, long-term plasticity of two-state synapses, driven by repeated presentation of the fixed set of stimuli to be stored. The neural populations subtending the MF description are classified by the number of stimuli to which their neurons are responsive (their multiplicity), which gives 2p + 1 populations for a network storing p memories. The computational complexity of the MF description is then significantly reduced by observing that at low coding level f, only a few populations remain relevant: the population of mean multiplicity ∼pf and those with multiplicity within order √(pf) of the mean.
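The reduction rests on a simple statistical fact: if each neuron responds to each of the p stimuli independently with probability f, its multiplicity is binomially distributed with mean pf and standard deviation ≈ √(pf) for small f. The toy sketch below (not the paper's model; the parameter values p = 1000 and f = 0.01 are illustrative assumptions) shows that almost all neurons fall in a narrow band of multiplicities around pf, which is why only a few of the 2p + 1 populations matter.

```python
import math

def binom_pmf(k, n, q):
    # probability that exactly k of n independent trials succeed, each with prob. q
    return math.comb(n, k) * q**k * (1 - q)**(n - k)

p, f = 1000, 0.01          # assumed example values: p memories, coding level f
mean = p * f               # mean multiplicity pf
std = math.sqrt(p * f * (1 - f))   # ~ sqrt(pf) for small f

# fraction of neurons whose multiplicity lies within 3 std of the mean
lo, hi = int(mean - 3 * std), int(math.ceil(mean + 3 * std))
mass = sum(binom_pmf(k, p, f) for k in range(max(lo, 0), hi + 1))
print(f"mean multiplicity = {mean:.1f}, std = {std:.2f}")
print(f"fraction of neurons within mean ± 3·std: {mass:.4f}")
```

For these values the band spans multiplicities roughly 0 to 20 out of a possible 1000, so the vast majority of the 2p + 1 populations carry negligible probability mass.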

The theory is used to predict bifurcation diagrams (the onset of selective delay activity and the rates in its various stationary states) and to compute the storage capacity of the network (the maximal number of items used in training for each of which the network can sustain a persistent, selective activity state). This is done in various regions of the space of constitutive parameters for the neurons and for the learning process. The MF capacity is computed as a function of the potentiation amplitude, the ratio of potentiation to depression probabilities, and the coding level f. The MF results compare well with recordings of delay-activity rate distributions in simulations of the underlying microscopic network of 10,000 neurons.
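The notion of a bifurcation to selective delay activity can be illustrated with a deliberately generic toy model (not the paper's spiking-neuron MF equations): a single-population rate equation ν = φ(Jν + I) with an assumed sigmoidal transfer function φ. For strong enough recurrent coupling J, the fixed-point equation has two stable solutions, the analogues of spontaneous activity and of the persistent, stimulus-selective delay state.

```python
import math

def phi(x, gain=1.0, theta=5.0, nu_max=50.0):
    # assumed sigmoidal rate transfer function (output in Hz);
    # gain, threshold theta, and saturation nu_max are illustrative choices
    return nu_max / (1.0 + math.exp(-gain * (x - theta)))

def fixed_point(nu0, J=0.3, I=1.0, n_iter=200):
    # iterate nu <- phi(J*nu + I) until it settles on a stable fixed point
    nu = nu0
    for _ in range(n_iter):
        nu = phi(J * nu + I)
    return nu

low = fixed_point(0.0)    # start from quiescence
high = fixed_point(50.0)  # start from strong (stimulus-driven) activation
print(f"spontaneous-like state: {low:.2f} Hz")
print(f"delay-activity-like state: {high:.2f} Hz")
```

Sweeping J (or the learning parameters that determine it) and recording where the high-rate branch appears or disappears is what a bifurcation diagram summarizes; the capacity question asks how many selective states of this kind can coexist before interference destroys them.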