
Neural Computation

Fall 1991, Vol. 3, No. 3, Pages 375-385
(doi: 10.1162/neco.1991.3.3.375)
© 1991 Massachusetts Institute of Technology
FIR and IIR Synapses, a New Neural Network Architecture for Time Series Modeling
Abstract

A new neural network architecture is proposed, involving a local-feedforward global-feedforward and/or a local-recurrent global-feedforward structure. A learning rule that minimizes a mean-square-error criterion is derived. The performance of this algorithm under the local-recurrent global-feedforward architecture is compared with that of the local-feedforward global-feedforward architecture, and the local-recurrent global-feedforward model is shown to perform better.
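The distinction between the two structures can be illustrated with the filters that replace ordinary scalar weights: an FIR synapse is a tapped delay line over past inputs (local feedforward), while an IIR synapse adds feedback taps over its own past outputs (local recurrence). The sketch below is illustrative only, assuming simple direct-form difference equations; the function names and parameter layout are hypothetical, not the paper's notation.

```python
def fir_synapse(x, w):
    # FIR synapse: weighted sum of current and past inputs,
    # y[t] = sum_k w[k] * x[t-k]  (local feedforward, no recurrence)
    return [sum(w[k] * x[t - k] for k in range(len(w)) if t - k >= 0)
            for t in range(len(x))]

def iir_synapse(x, b, a):
    # IIR synapse: adds feedback taps over the synapse's own past
    # outputs, y[t] = sum_k b[k] * x[t-k] + sum_m a[m] * y[t-1-m]
    # (local recurrence; the network remains globally feedforward).
    y = []
    for t in range(len(x)):
        acc = sum(b[k] * x[t - k] for k in range(len(b)) if t - k >= 0)
        acc += sum(a[m] * y[t - 1 - m] for m in range(len(a)) if t - 1 - m >= 0)
        y.append(acc)
    return y
```

Driving each synapse with a unit impulse shows the difference: the FIR response dies out after the last tap, whereas the IIR response decays geometrically through the feedback term, giving an effectively infinite memory with few parameters.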