Neural Computation

December 2009, Vol. 21, No. 12, Pages 3532-3561
(doi: 10.1162/neco.2009.11-08-908)
© 2009 Massachusetts Institute of Technology
Adaptive Relevance Matrices in Learning Vector Quantization
We propose a new matrix learning scheme that extends relevance learning vector quantization (RLVQ), an efficient prototype-based classification algorithm, toward a general adaptive metric. By introducing a full matrix of relevance factors into the distance measure, correlations between different features and their importance for the classification can be taken into account, and a general metric adaptation takes place automatically during training. Compared to the weighted Euclidean metric used in RLVQ and its variants, a full matrix is better suited to represent the internal structure of the data. Large-margin generalization bounds can be transferred to this case, yielding bounds that are independent of the input dimensionality. This also holds for local metrics attached to each prototype, which correspond to piecewise quadratic decision boundaries. The algorithm is tested against alternative learning vector quantization schemes on an artificial data set, a benchmark multiclass problem from the UCI repository, and a problem from bioinformatics, the recognition of splice sites in C. elegans.
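The core idea can be sketched as follows. A minimal illustration, assuming the standard matrix-LVQ parametrization in which the relevance matrix is written as Lambda = Omega^T Omega so that it stays positive semidefinite (the function and variable names here are illustrative, not taken from the paper):

```python
import numpy as np

def matrix_distance(x, w, omega):
    """Generalized squared distance d(x, w) = (x - w)^T Lambda (x - w),
    with Lambda = Omega^T Omega, which is positive semidefinite by
    construction. The diagonal-Omega case recovers the weighted
    Euclidean metric of RLVQ; a full Omega additionally captures
    correlations between features."""
    diff = omega @ (x - w)
    return float(diff @ diff)

x = np.array([1.0, 2.0])
w = np.array([0.0, 0.0])

# Identity Omega: reduces to the plain squared Euclidean distance.
d_euclid = matrix_distance(x, w, np.eye(2))      # 1^2 + 2^2 = 5.0

# A full (here rank-deficient) Omega mixes the two features, so the
# metric is sensitive to their correlation rather than to each
# coordinate separately.
omega_full = np.array([[1.0, 1.0],
                       [0.0, 0.0]])
d_corr = matrix_distance(x, w, omega_full)       # (1 + 2)^2 = 9.0
```

In the local-metric variant described above, each prototype w carries its own Omega, so the induced decision boundaries between prototypes become piecewise quadratic rather than piecewise linear.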