
Discrete vs. Continuous: Two Sides of Machine Learning

2004

Talk



We consider the problem of transductive inference. In many real-world problems, unlabeled data is far easier to obtain than labeled data, so transductive inference is of great practical significance. According to Vapnik, one should predict the function values only at the given points directly, rather than estimate a function defined on the whole space, the latter being a more difficult problem. Inspired by this idea, we develop discrete calculus on finite discrete spaces and then build discrete regularization on top of it. A family of transductive algorithms is naturally derived from this regularization framework. We validate the algorithms on both synthetic and real-world data, from text/web categorization to bioinformatics problems. A significant by-product of this work is a powerful way of ranking data based on examples, including images, documents, proteins and many other kinds of data. This talk is mainly based on the following contributions: (1) D. Zhou and B. Schölkopf: Transductive Inference with Graphs, MPI Technical Report, August 2004; (2) D. Zhou, B. Schölkopf and T. Hofmann: Semi-supervised Learning on Directed Graphs, NIPS 2004; (3) D. Zhou, O. Bousquet, T.N. Lal, J. Weston and B. Schölkopf: Learning with Local and Global Consistency, NIPS 2003.
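The simplest instance of the regularization framework sketched above is the local-and-global-consistency method of reference (3). The following is a minimal NumPy sketch of that iteration, not the exact code behind the talk; the kernel width sigma, the trade-off alpha and the function name are illustrative choices.

import numpy as np

def consistency_method(X, y, alpha=0.99, sigma=1.0, n_iter=200):
    """Sketch of local-and-global-consistency (Zhou et al., NIPS 2003).

    X : (n, d) array of points.
    y : length-n array with class labels in {0, ..., c-1} for labeled points
        and -1 for unlabeled points.
    Returns predicted labels for all n points.
    """
    n = X.shape[0]
    classes = np.unique(y[y >= 0])
    c = classes.size

    # Affinity matrix W from an RBF kernel on pairwise distances, zero diagonal.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Symmetric normalization S = D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1) + 1e-12)
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    # Initial label matrix Y: one-hot rows for labeled points, zero rows otherwise.
    Y = np.zeros((n, c))
    for k, cls in enumerate(classes):
        Y[y == cls, k] = 1.0

    # Iterate F <- alpha * S F + (1 - alpha) * Y, which converges to
    # F* = (1 - alpha) (I - alpha S)^{-1} Y, the minimizer of the discrete
    # regularization functional.
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * (S @ F) + (1.0 - alpha) * Y

    return classes[np.argmax(F, axis=1)]

The ranking by-product mentioned in the abstract uses the same machinery: encoding only the query items in Y and reading the entries of the converged F as scores gives a ranking of all points with respect to the examples.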

Author(s): Zhou, D.
Year: 2004
Month: October
Day: 12

Department(s): Empirical Inference
Bibtex Type: Talk (talk)

Digital: 0
Event Place: IBM Watson Research Center, Yorktown Heights, New York
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik

Links: PDF

BibTeX

@talk{2902,
  title = {Discrete vs. Continuous: Two Sides of Machine Learning},
  author = {Zhou, D.},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  month = oct,
  year = {2004},
  month_numeric = {10}
}