
Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators


We derive new bounds for the generalization error of kernel machines, such as support vector machines and related regularization networks, by obtaining new bounds on their covering numbers. The proofs make use of a viewpoint that is apparently novel in the field of statistical learning theory. The hypothesis class is described in terms of a linear operator mapping from a possibly infinite-dimensional unit ball in feature space into a finite-dimensional space. The covering numbers of the class are then determined via the entropy numbers of the operator. These numbers, which characterize the degree of compactness of the operator, can be bounded in terms of the eigenvalues of an integral operator induced by the kernel function used by the machine. As a consequence, we are able to theoretically explain the effect of the choice of kernel function on the generalization performance of support vector machines.
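The abstract links the covering numbers of the hypothesis class to the eigenvalue decay of the kernel's integral operator. As a hedged numerical illustration (not taken from the paper), the sketch below approximates those operator eigenvalues by the eigenvalues of a scaled Gram matrix on random sample points, so that the decay rates of two example kernels can be compared; the kernels, sample size, and bandwidth are illustrative assumptions.

```python
# Minimal sketch: approximate eigenvalues of the kernel integral operator on U[-1, 1]
# via the scaled Gram matrix K/n (a Nystroem-type approximation). Faster eigenvalue
# decay corresponds, via entropy-number bounds of the kind described above, to smaller
# covering numbers of the induced hypothesis class. All concrete choices are illustrative.
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    return np.exp(-(x - y) ** 2 / (2.0 * sigma ** 2))

def polynomial_kernel(x, y, degree=3):
    return (1.0 + x * y) ** degree

def operator_eigenvalues(kernel, n=200, seed=0):
    """Approximate eigenvalues of the integral operator induced by `kernel`
    on the uniform distribution over [-1, 1], using n random sample points."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, size=n)
    K = kernel(x[:, None], x[None, :])      # n x n Gram matrix
    eigvals = np.linalg.eigvalsh(K / n)     # symmetric matrix, real eigenvalues
    return np.sort(eigvals)[::-1]           # descending order

if __name__ == "__main__":
    for name, kern in [("Gaussian (sigma=1)", gaussian_kernel),
                       ("Polynomial (degree 3)", polynomial_kernel)]:
        lam = operator_eigenvalues(kern)
        print(f"{name}: top 8 approximate eigenvalues {np.round(lam[:8], 5)}")
```

Running this shows the Gaussian kernel's eigenvalues decaying rapidly while the degree-3 polynomial kernel has only a handful of nonzero eigenvalues, which is the kind of kernel-dependent behavior the paper's bounds quantify.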

Author(s): Williamson, RC. and Smola, AJ. and Schölkopf, B.
Journal: IEEE Transactions on Information Theory
Volume: 47
Number (issue): 6
Pages: 2516-2532
Year: 2001
Month: September

Department(s): Empirical Inference
Bibtex Type: Article (article)

DOI: 10.1109/18.945262
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik

BibTeX

@article{787,
  title = {Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators},
  author = {Williamson, RC. and Smola, AJ. and Sch{\"o}lkopf, B.},
  journal = {IEEE Transactions on Information Theory},
  volume = {47},
  number = {6},
  pages = {2516--2532},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  month = sep,
  year = {2001},
  doi = {10.1109/18.945262}
}