
Neural Computation

May 1994, Vol. 6, No. 3, Pages 405-419
(doi: 10.1162/neco.1994.6.3.405)
© 1994 Massachusetts Institute of Technology
Integration and Differentiation in Dynamic Recurrent Neural Networks

Dynamic neural networks with recurrent connections were trained by backpropagation to generate either the differential or the leaky integral of a nonrepeating frequency-modulated sinusoidal signal. The trained networks then performed these operations on arbitrary input waveforms. Reducing network size by deleting ineffective hidden units and combining redundant units, followed by retraining, produced a minimal network that computed the same function and revealed the underlying computational algorithm. Networks could also be trained to simultaneously compute the differential and the integral of the input on two outputs; the two operations were performed in a distributed, overlapping fashion, and the activations of the hidden units were dominated by the integral. Incorporating units with intrinsic time constants into model networks generally enhanced their performance as integrators but interfered with their ability to differentiate.
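The two target operations described above can be sketched numerically. The following is an illustrative reconstruction, not the paper's actual training setup: the FM signal parameters, the time constant tau, and the discretization are all assumptions chosen for clarity. It shows a nonrepeating frequency-modulated sinusoid, its differential (finite-difference approximation), and its leaky integral, the latter computed with the same first-order leak dynamics that a hidden unit with an intrinsic time constant would implement.

```python
import numpy as np

# Assumed parameters for illustration (not taken from the paper).
dt = 0.01                      # time step, s
t = np.arange(0.0, 10.0, dt)   # 10 s of signal

# Nonrepeating FM sinusoid: a carrier whose instantaneous frequency
# drifts slowly, so the waveform never exactly repeats.
inst_freq = 1.0 + 0.5 * np.sin(2 * np.pi * 0.07 * t)   # Hz
phase = 2 * np.pi * np.cumsum(inst_freq) * dt
x = np.sin(phase)

# Target 1: the differential of the input (finite-difference estimate).
dx = np.gradient(x, dt)

# Target 2: the leaky integral with assumed time constant tau, using
# the forward-Euler update  y[n+1] = y[n] + dt * (x[n] - y[n] / tau).
# This leak term is also why units with built-in time constants help
# integration: each such unit already performs this accumulation.
tau = 1.0                      # s
y = np.zeros_like(x)
for n in range(len(t) - 1):
    y[n + 1] = y[n] + dt * (x[n] - y[n] / tau)
```

With a bounded input |x| <= 1, the leaky integral stays bounded by tau, whereas a pure integral of a drifting signal could grow without limit; this is the usual motivation for training against a leaky rather than a perfect integral.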