
Neural Computation

July 2013, Vol. 25, No. 7, Pages 1926-1951
(doi: 10.1162/NECO_a_00457)
© 2013 Massachusetts Institute of Technology
Multiple Spectral Kernel Learning and a Gaussian Complexity Computation

Multiple kernel learning (MKL) partially solves the kernel selection problem in support vector machines and similar classifiers by minimizing the empirical risk over a set of linear combinations of given kernel matrices. For large sample sets, the size of the kernel matrices becomes a numerical issue. In many cases, the kernel matrix has low effective rank, but existing MKL algorithms do not exploit this low-rank structure efficiently. Here, we propose multiple spectral kernel learning, which exploits the low-rank property by selecting a kernel matrix from a set of Gram matrices built from a few eigenvectors of all given kernel matrices, called a spectral kernel set. We provide a new bound on the gaussian complexity of the proposed kernel set, which depends on both the geometry of the kernel set and the number of Gram matrices. This characterization of the complexity implies that, in an MKL setting, adding more kernels may not monotonically increase the complexity, whereas previous bounds suggest otherwise.
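The construction of the spectral kernel set can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`spectral_kernel_set`, `combine`) and the choice of NumPy are assumptions, the number of retained eigenvectors `r` is a free parameter, and the empirical-risk minimization over the combination weights is omitted.

```python
import numpy as np

def spectral_kernel_set(kernels, r):
    """Build a spectral kernel set: rank-one Gram matrices u u^T formed
    from the top-r eigenvectors of each given kernel matrix."""
    basis = []
    for K in kernels:
        # eigh returns eigenvalues in ascending order for a symmetric matrix,
        # so the leading eigenvectors are the last r columns of V.
        _, V = np.linalg.eigh(K)
        for j in range(1, r + 1):
            u = V[:, -j]
            basis.append(np.outer(u, u))  # rank-one PSD Gram matrix
    return basis

def combine(basis, weights):
    """Candidate kernel: a nonnegative linear combination of the
    spectral Gram matrices (weights would be learned in MKL)."""
    return sum(w * B for w, B in zip(weights, basis))
```

Because each element of the set is a rank-one matrix, the combined kernel has rank at most the number of retained eigenvectors, which is the low-rank structure the method is designed to exploit.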