
Neural Computation

November 1993, Vol. 5, No. 6, Pages 910-927
(doi: 10.1162/neco.1993.5.6.910)
© 1993 Massachusetts Institute of Technology
On the Geometry of Feedforward Neural Network Error Surfaces

Many feedforward neural network architectures have the property that their overall input-output function is unchanged by certain weight permutations and sign flips. In this paper, the geometric structure of these equioutput weight space transformations is explored for the case of multilayer perceptron networks with tanh activation functions (similar results hold for many other types of neural networks). It is shown that these transformations form an algebraic group isomorphic to a direct product of Weyl groups. Results concerning the root spaces of the Lie algebras associated with these Weyl groups are then used to derive sets of simple equations for minimal sufficient search sets in weight space. These sets, which take the geometric forms of a wedge and a cone, occupy only a minute fraction of the volume of weight space. A separate analysis shows that the action of the equioutput transformation group creates large numbers of copies of any optimum weight vector of a network performance function, and that these copies all lie on the same sphere. Some implications of these results for learning are discussed.
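The equioutput symmetry described above can be verified numerically. The sketch below (not code from the paper; network sizes and the random seed are illustrative assumptions) builds a one-hidden-layer tanh network and checks that sign-flipping a hidden unit (negating its incoming weights, bias, and outgoing weights, using the fact that tanh is odd) and permuting hidden units both leave the output unchanged:

```python
import numpy as np

# Hypothetical one-hidden-layer tanh network: y = W2 @ tanh(W1 @ x + b1) + b2
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 4, 2
W1, b1 = rng.standard_normal((n_hid, n_in)), rng.standard_normal(n_hid)
W2, b2 = rng.standard_normal((n_out, n_hid)), rng.standard_normal(n_out)

def net(x, W1, b1, W2, b2):
    return W2 @ np.tanh(W1 @ x + b1) + b2

x = rng.standard_normal(n_in)
y = net(x, W1, b1, W2, b2)

# Sign flip of hidden unit 1: since tanh(-z) = -tanh(z), negating row 1 of W1,
# b1[1], and column 1 of W2 leaves the input-output function unchanged.
s = np.ones(n_hid)
s[1] = -1.0
y_flip = net(x, s[:, None] * W1, s * b1, W2 * s[None, :], b2)

# Permutation of hidden units: reorder rows of W1 and b1 together with the
# matching columns of W2.
p = rng.permutation(n_hid)
y_perm = net(x, W1[p], b1[p], W2[:, p], b2)

print(np.allclose(y, y_flip), np.allclose(y, y_perm))  # True True
```

For a network with H hidden units in one layer, composing the 2^H sign flips with the H! permutations gives the group of equioutput transformations whose structure the paper analyzes.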