Neural Computation

Fall 1991, Vol. 3, No. 3, Pages 402-408
(doi: 10.1162/neco.1991.3.3.402)
© 1991 Massachusetts Institute of Technology
Learning by Asymmetric Parallel Boltzmann Machines
We consider the Little, Shaw, Vasudevan model as a parallel asymmetric Boltzmann machine, in the sense that we extend to this model the entropic learning rule first studied by Ackley, Hinton, and Sejnowski for a sequentially activated network with a symmetric synaptic matrix. The resulting Hebbian learning rule for the parallel asymmetric model derives the signal for updating the synaptic weights from time averages of the discrepancy between expected and actual transitions along the past history of the network. Since we work without the hypothesis of symmetric weights, our analysis also covers feedforward networks, for which the entropic learning rule turns out to be complementary to the error backpropagation rule, in that it "rewards the correct behavior" instead of "penalizing the wrong answers."
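The rule described above can be sketched in code. The following is an illustrative reconstruction, not the authors' implementation: it assumes Little-model parallel dynamics over ±1 units with sigmoidal transition probabilities, a small hypothetical two-pattern training cycle, and a Hebbian update proportional to the difference between the time-averaged transition statistics of the data ("expected") and of the free-running network ("actual").

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6                                    # illustrative network size
W = 0.1 * rng.standard_normal((n, n))    # asymmetric: no W = W.T constraint

def parallel_step(s, W, rng):
    """One step of Little-model dynamics: all units draw their next
    state simultaneously from a sigmoid of the current local field."""
    p = 1.0 / (1.0 + np.exp(-2.0 * (W @ s)))
    return np.where(rng.random(len(s)) < p, 1.0, -1.0)

def transition_stats(states):
    """Time average of s_i(t+1) * s_j(t) along a trajectory."""
    nxt, prv = np.stack(states[1:]), np.stack(states[:-1])
    return nxt.T @ prv / len(prv)

# Hypothetical training trajectory (a two-pattern cycle a -> -a -> a ...),
# assumed here purely for illustration.
a = np.array([1., -1., 1., -1., 1., -1.])
data = [a, -a] * 20

eta = 0.05
for _ in range(50):
    # "Expected" transitions: time averages taken over the data sequence.
    stats_data = transition_stats(data)
    # "Actual" transitions: time averages over the free-running network.
    s = data[0].copy()
    free = [s]
    for _ in range(len(data) - 1):
        s = parallel_step(s, W, rng)
        free.append(s)
    stats_model = transition_stats(free)
    # Hebbian update from the discrepancy: the rule rewards correct
    # transitions rather than backpropagating an error signal.
    W += eta * (stats_data - stats_model)
```

Note that, unlike the symmetric sequential machine of Ackley, Hinton, and Sejnowski, nothing here enforces `W == W.T`, which is what lets the same scheme apply to feedforward architectures.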