Breaking SVM Complexity with Cross-Training
2005
Conference Paper
We propose an algorithm for selectively removing examples from the training set using probabilistic estimates related to editing algorithms (Devijver and Kittler, 1982). The procedure creates a separable distribution of training examples with minimal impact on the position of the decision boundary. It breaks the linear dependency between the number of support vectors (SVs) and the number of training examples, and sharply reduces the complexity of SVMs during both the training and prediction stages.
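The idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact cross-training algorithm: it uses scikit-learn's `SVC` with Platt-scaled probability estimates as the editing criterion, and the 0.5 threshold, dataset, and noise level are illustrative assumptions.

```python
# Sketch: editing-style example removal before retraining an SVM.
# Assumption: a point whose estimated probability for its own label is low
# sits on the wrong side of (or too close to) the boundary, so removing it
# leaves a near-separable set that needs far fewer support vectors.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Noisy binary problem: flip_y injects ~10% label noise.
X, y = make_classification(n_samples=2000, n_features=20,
                           flip_y=0.1, random_state=0)

# Stage 1: fit an SVM with probability estimates on the full noisy set.
svm_full = SVC(kernel="rbf", probability=True, random_state=0).fit(X, y)

# Editing step: keep only examples the model assigns confidently
# to their own label (illustrative threshold of 0.5).
proba_own = svm_full.predict_proba(X)[np.arange(len(y)), y]
keep = proba_own > 0.5
X_edit, y_edit = X[keep], y[keep]

# Stage 2: retrain on the edited, near-separable set.
svm_edit = SVC(kernel="rbf", random_state=0).fit(X_edit, y_edit)

print("SVs before editing:", len(svm_full.support_))
print("SVs after editing: ", len(svm_edit.support_))
```

Because bounded SVs are largely the noisy or boundary-straddling points, the retrained machine typically keeps a much smaller SV set, which is what breaks the linear growth of SVs with training-set size.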
Author(s): | Bakir, GH. and Bottou, L. and Weston, J. |
Book Title: | Advances in Neural Information Processing Systems 17 |
Journal: | Advances in Neural Information Processing Systems |
Pages: | 81-88 |
Year: | 2005 |
Month: | July |
Editors: | Saul, L.K., Weiss, Y., Bottou, L. |
Publisher: | MIT Press |
Department(s): | Empirical Inference |
Bibtex Type: | Conference Paper (inproceedings) |
Event Name: | Eighteenth Annual Conference on Neural Information Processing Systems (NIPS 2004) |
Event Place: | Vancouver, BC, Canada |
Address: | Cambridge, MA, USA |
ISBN: | 0-262-19534-8 |
Organization: | Max-Planck-Gesellschaft |
School: | Biologische Kybernetik |
Links: | PDF, Web |
BibTex:
@inproceedings{2846,
  title        = {Breaking SVM Complexity with Cross-Training},
  author       = {Bakir, GH. and Bottou, L. and Weston, J.},
  booktitle    = {Advances in Neural Information Processing Systems 17},
  pages        = {81--88},
  editor       = {Saul, L.K. and Weiss, Y. and Bottou, L.},
  publisher    = {MIT Press},
  organization = {Max-Planck-Gesellschaft},
  school       = {Biologische Kybernetik},
  address      = {Cambridge, MA, USA},
  month        = jul,
  year         = {2005}
}