
Neural Computation

Fall 1989, Vol. 1, No. 3, Pages 312-317
(doi: 10.1162/neco.1989.1.3.312)
© 1989 Massachusetts Institute of Technology
The Vapnik-Chervonenkis Dimension: Information versus Complexity in Learning
Abstract

When feasible, learning is a very attractive alternative to explicit programming. This is particularly true in areas where the problems do not lend themselves to systematic programming, such as pattern recognition in natural environments. The feasibility of learning an unknown function from examples depends on two questions:

  1. Do the examples convey enough information to determine the function?

  2. Is there a speedy way of constructing the function from the examples?

These questions contrast the roles of information and complexity in learning. While the two roles share some ground, they are conceptually and technically different. In the common language of learning, the information question is that of generalization and the complexity question is that of scaling. The work of Vapnik and Chervonenkis (1971) provides the key tools for dealing with the information issue. In this review, we develop the main ideas of this framework and discuss how complexity fits in.
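To illustrate the kind of result the Vapnik-Chervonenkis framework yields (the specific bound below is not quoted from the article; it is the standard form associated with Vapnik and Chervonenkis, stated here only as a sketch): for a hypothesis class of VC dimension d, with probability at least 1 - \delta over N independent examples, every hypothesis h in the class satisfies

\[
  R(h) \;\le\; R_{\mathrm{emp}}(h)
       \;+\; \sqrt{\frac{d\left(\ln\frac{2N}{d} + 1\right) + \ln\frac{4}{\delta}}{N}}
\]

where R(h) is the true error and R_{\mathrm{emp}}(h) the error on the examples. Bounds of this type address the information question: they tie generalization to the number of examples and the VC dimension, independently of how (or how quickly) a consistent hypothesis is actually constructed.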