
Neural Computation

Summer 1990, Vol. 2, No. 2, Pages 248-260
(doi: 10.1162/neco.1990.2.2.248)
© 1990 Massachusetts Institute of Technology
The Perceptron Algorithm is Fast for Nonmalicious Distributions
Abstract

Within the context of Valiant's protocol for learning, the perceptron algorithm is shown to learn an arbitrary half-space in time O(n²/∊³) if D, the probability distribution of examples, is taken uniform over the unit sphere Sⁿ. Here ∊ is the accuracy parameter. This is surprisingly fast, as "standard" approaches involve solution of a linear programming problem involving Ω(n/∊) constraints in n dimensions. A modification of Valiant's distribution-independent protocol for learning is proposed in which the distribution and the function to be learned may be chosen by adversaries; these adversaries, however, may not communicate. It is argued that this definition is more reasonable and applicable to real-world learning than Valiant's. Under this definition, the perceptron algorithm is shown to be a distribution-independent learning algorithm. In an appendix we show that, for uniform distributions, some classes of infinite V-C dimension, including convex sets and a class of nested differences of convex sets, are learnable.
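For concreteness, the following is a minimal sketch (not from the paper) of the setting the abstract describes: the classical perceptron update run on examples drawn uniformly from the unit sphere and labeled by an arbitrary half-space through the origin, with the error rate estimated on fresh examples from the same distribution. All names, sample sizes, and parameters here are illustrative assumptions.

```python
import numpy as np

def sample_uniform_sphere(m, n, rng):
    """Draw m points uniformly from the unit sphere in R^n."""
    x = rng.standard_normal((m, n))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def perceptron(examples, labels, max_passes=100):
    """Classical perceptron: add (or subtract) each misclassified example."""
    w = np.zeros(examples.shape[1])
    for _ in range(max_passes):
        mistakes = 0
        for x, y in zip(examples, labels):
            if y * np.dot(w, x) <= 0:   # misclassified (or on the boundary)
                w += y * x              # perceptron update
                mistakes += 1
        if mistakes == 0:               # consistent with all examples seen
            break
    return w

# Target half-space through the origin; examples uniform on the sphere.
rng = np.random.default_rng(0)
n = 20
w_star = rng.standard_normal(n)
X = sample_uniform_sphere(2000, n, rng)
y = np.sign(X @ w_star)
w_hat = perceptron(X, y)

# Estimate the error of the learned half-space on fresh examples.
X_test = sample_uniform_sphere(10000, n, rng)
err = np.mean(np.sign(X_test @ w_hat) != np.sign(X_test @ w_star))
print(f"estimated error: {err:.4f}")
```

The sample and test sizes above are arbitrary choices for illustration; the paper's result concerns how the number of examples and updates needed to reach accuracy ∊ scales with n and ∊ under the uniform distribution, not these particular constants.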