2015


Causal Inference for Empirical Time Series Based on the Postulate of Independence of Cause and Mechanism

Besserve, M.

53rd Annual Allerton Conference on Communication, Control, and Computing, September 2015 (talk)

[BibTex]

Kernel methods in medical imaging

Charpiat, G., Hofmann, M., Schölkopf, B.

In Handbook of Biomedical Imaging, pages: 63-81, 4, (Editors: Paragios, N., Duncan, J. and Ayache, N.), Springer, Berlin, Germany, June 2015 (inbook)

Web link (url) [BibTex]

Independence of cause and mechanism in brain networks

Besserve, M.

DALI workshop on Networks: Processes and Causality, April 2015 (talk)

[BibTex]

Information-Theoretic Implications of Classical and Quantum Causal Structures

Chaves, R., Majenz, C., Luft, L., Maciel, T., Janzing, D., Schölkopf, B., Gross, D.

18th Conference on Quantum Information Processing (QIP), 2015 (talk)

Web link (url) [BibTex]

Justifying Information-Geometric Causal Inference

Janzing, D., Steudel, B., Shajarisales, N., Schölkopf, B.

In Measures of Complexity: Festschrift for Alexey Chervonenkis, pages: 253-265, 18, (Editors: Vovk, V., Papadopoulos, H. and Gammerman, A.), Springer, 2015 (inbook)

DOI [BibTex]

The search for single exoplanet transits in the Kepler light curves

Foreman-Mackey, D., Hogg, D. W., Schölkopf, B.

IAU General Assembly, 22, pages: 2258352, 2015 (talk)

link (url) [BibTex]

2013


Studying large-scale brain networks: electrical stimulation and neural-event-triggered fMRI

Logothetis, N., Eschenko, O., Murayama, Y., Augath, M., Steudel, T., Evrard, H., Besserve, M., Oeltermann, A.

Twenty-Second Annual Computational Neuroscience Meeting (CNS*2013), BMC Neuroscience, 14(Supplement 1):A1, July 2013 (talk)

Web [BibTex]

A Review of Performance Variations in SMR-Based Brain–Computer Interfaces (BCIs)

Grosse-Wentrup, M., Schölkopf, B.

In Brain-Computer Interface Research, pages: 39-51, 4, SpringerBriefs in Electrical and Computer Engineering, (Editors: Guger, C., Allison, B. Z. and Edlinger, G.), Springer, 2013 (inbook)

PDF DOI [BibTex]

Semi-supervised learning in causal and anticausal settings

Schölkopf, B., Janzing, D., Peters, J., Sgouritsa, E., Zhang, K., Mooij, J.

In Empirical Inference, pages: 129-141, 13, Festschrift in Honor of Vladimir Vapnik, (Editors: Schölkopf, B., Luo, Z. and Vovk, V.), Springer, 2013 (inbook)

DOI [BibTex]

Tractable large-scale optimization in machine learning

Sra, S.

In Tractability: Practical Approaches to Hard Problems, pages: 202-230, 7, (Editors: Bordeaux, L., Hamadi, Y., Kohli, P. and Mateescu, R.), Cambridge University Press, 2013 (inbook)

[BibTex]

Domain Generalization via Invariant Feature Representation

Muandet, K.

30th International Conference on Machine Learning (ICML2013), 2013 (talk)

PDF [BibTex]

On the Relations and Differences between Popper Dimension, Exclusion Dimension and VC-Dimension

Seldin, Y., Schölkopf, B.

In Empirical Inference - Festschrift in Honor of Vladimir N. Vapnik, pages: 53-57, 6, (Editors: Schölkopf, B., Luo, Z. and Vovk, V.), Springer, 2013 (inbook)

[BibTex]

2004


Discrete vs. Continuous: Two Sides of Machine Learning

Zhou, D.

October 2004 (talk)

Abstract
We consider the problem of transductive inference. In many real-world problems, unlabeled data is far easier to obtain than labeled data. Hence transductive inference is very significant in many practical problems. According to Vapnik's point of view, one should predict the function values on the given points directly rather than estimate a function defined on the whole space, the latter being a more complicated problem. Inspired by this idea, we develop discrete calculus on finite discrete spaces, and then build discrete regularization. A family of transductive algorithms is naturally derived from this regularization framework. We validate the algorithms on both synthetic and real-world data from text/web categorization to bioinformatics problems. A significant by-product of this work is a powerful way of ranking data based on examples including images, documents, proteins and many other kinds of data. This talk is mainly based on the following contributions: (1) D. Zhou and B. Schölkopf: Transductive Inference with Graphs, MPI Technical Report, August 2004; (2) D. Zhou, B. Schölkopf and T. Hofmann. Semi-supervised Learning on Directed Graphs. NIPS 2004; (3) D. Zhou, O. Bousquet, T.N. Lal, J. Weston and B. Schölkopf. Learning with Local and Global Consistency. NIPS 2003.

PDF [BibTex]

Grundlagen von Support Vector Maschinen und Anwendungen in der Bildverarbeitung [Foundations of Support Vector Machines and Applications in Image Processing]

Eichhorn, J.

September 2004 (talk)

Abstract
Invited talk at the workshop "Numerical, Statistical and Discrete Methods in Image Processing" at the TU München (in German)

PDF [BibTex]

Riemannian Geometry on Graphs and its Application to Ranking and Classification

Zhou, D.

June 2004 (talk)

Abstract
We consider the problem of transductive inference. In many real-world problems, unlabeled data is far easier to obtain than labeled data. Hence transductive inference is very significant in many practical problems. According to Vapnik's point of view, one should predict the function values on the given points directly rather than estimate a function defined on the whole space, the latter being a more complicated problem. Inspired by this idea, we develop discrete calculus on finite discrete spaces, and then build discrete regularization. A family of transductive algorithms is naturally derived from this regularization framework. We validate the algorithms on both synthetic and real-world data from text/web categorization to bioinformatics problems. A significant by-product of this work is a powerful way of ranking data based on examples including images, documents, proteins and many other kinds of data.
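As an illustration of the ranking by-product described above, here is a minimal sketch of ranking by diffusion on a similarity graph, written in the spirit of this line of work rather than taken from the talk; the similarity matrix W, the damping parameter alpha, and the query indicator are illustrative assumptions.

```python
import numpy as np

def rank_on_graph(W, query_idx, alpha=0.9, n_iter=100):
    """Rank all nodes of a similarity graph with respect to a query node
    by spreading the query's score along the normalized graph."""
    n = W.shape[0]
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ W @ D_inv_sqrt            # symmetrically normalized similarity
    y = np.zeros(n)
    y[query_idx] = 1.0                          # indicator of the query item
    f = y.copy()
    for _ in range(n_iter):
        f = alpha * S @ f + (1 - alpha) * y     # diffuse scores, keep a pull toward the query
    return np.argsort(-f)                       # item indices, most relevant first

# Toy usage: five items, two loose clusters, query item 0.
W = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
print(rank_on_graph(W, query_idx=0))
```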

PDF [BibTex]

Distributed Command Execution

Stark, S., Berlin, M.

In BSD Hacks: 100 industrial-strength tips & tools, pages: 152-152, (Editors: Lavigne, Dru), O’Reilly, Beijing, May 2004 (inbook)

Abstract
Often you want to execute a command not only on one computer, but on several at once. For example, you might want to report the current statistics on a group of managed servers or update all of your web servers at once.
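The recipe itself is a shell hack; as a hedged, minimal Python analogue of the same idea, the sketch below runs one command on several machines over ssh and collects the output. The host names and the assumption of password-less ssh access are illustrative, not taken from the book.

```python
import subprocess

# Hypothetical host list; assumes password-less ssh access to each machine.
HOSTS = ["web1.example.org", "web2.example.org", "db1.example.org"]

def run_everywhere(command, hosts=HOSTS, timeout=30):
    """Run the same shell command on every host and collect (exit code, stdout)."""
    results = {}
    for host in hosts:
        proc = subprocess.run(["ssh", host, command],
                              capture_output=True, text=True, timeout=timeout)
        results[host] = (proc.returncode, proc.stdout.strip())
    return results

if __name__ == "__main__":
    # e.g. report the current load statistics on a group of managed servers
    for host, (code, out) in run_everywhere("uptime").items():
        print(f"{host} [{code}]: {out}")
```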

[BibTex]

Learning from Labeled and Unlabeled Data: Semi-supervised Learning and Ranking

Zhou, D.

January 2004 (talk)

Abstract
We consider the general problem of learning from labeled and unlabeled data, which is often called semi-supervised learning or transductive inference. A principled approach to semi-supervised learning is to design a classifying function which is sufficiently smooth with respect to the intrinsic structure collectively revealed by known labeled and unlabeled points. We present a simple algorithm to obtain such a smooth solution. Our method yields encouraging experimental results on a number of classification problems and demonstrates effective use of unlabeled data.
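To make the notion of a smooth classifying function concrete, here is a minimal sketch of a closed-form label-spreading step in the spirit of the approach summarized above (my own illustration, not the talk's code); W is an assumed similarity matrix over all points, Y a one-hot label matrix with zero rows for unlabeled points, and alpha a smoothness parameter.

```python
import numpy as np

def propagate_labels(W, Y, alpha=0.99):
    """Spread the known labels in Y over the similarity graph W and
    return a predicted class for every point, labeled or not."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ W @ D_inv_sqrt                 # normalized similarity
    n = W.shape[0]
    F = np.linalg.solve(np.eye(n) - alpha * S, Y)   # smooth score matrix
    return F.argmax(axis=1)                         # predicted class per point
```

The parameter alpha trades off smoothness over the graph against fidelity to the given labels: values close to 1 trust the graph structure more, smaller values trust the initial labels more.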

PDF [BibTex]

Introduction to Category Theory

Bousquet, O.

Internal Seminar, January 2004 (talk)

Abstract
A brief introduction to the general idea behind category theory with some basic definitions and examples. A perspective on higher dimensional categories is given.

PDF [BibTex]

Gaussian Processes in Machine Learning

Rasmussen, CE.

In Lecture Notes in Computer Science 3176, pages: 63-71, (Editors: Bousquet, O., U. von Luxburg and G. Rätsch), Springer, Heidelberg, 2004 (inbook)

Abstract
We give a basic introduction to Gaussian Process regression models. We focus on understanding the role of the stochastic process and how it is used to define a distribution over functions. We present the simple equations for incorporating training data and examine how to learn the hyperparameters using the marginal likelihood. We explain the practical advantages of Gaussian Processes and end with conclusions and a look at the current trends in GP work.
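As a companion to the equations the chapter refers to, here is a minimal Gaussian process regression sketch under standard assumptions (a squared-exponential kernel and a fixed noise level); the kernel choice and hyperparameter values are illustrative, and the log marginal likelihood is returned so hyperparameters could in principle be tuned as the chapter discusses.

```python
import numpy as np

def sq_exp_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_regression(X, y, X_star, noise=0.1, lengthscale=1.0, variance=1.0):
    """Posterior mean/variance at X_star plus the log marginal likelihood of (X, y)."""
    K = sq_exp_kernel(X, X, lengthscale, variance) + noise**2 * np.eye(len(X))
    K_s = sq_exp_kernel(X, X_star, lengthscale, variance)
    K_ss = sq_exp_kernel(X_star, X_star, lengthscale, variance)
    L = np.linalg.cholesky(K)                      # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = K_s.T @ alpha                           # posterior mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)     # posterior variance
    log_ml = (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
              - 0.5 * len(X) * np.log(2 * np.pi))  # log marginal likelihood
    return mean, var, log_ml

# Toy usage with illustrative data.
X = np.linspace(0, 5, 10)
y = np.sin(X)
mean, var, log_ml = gp_regression(X, y, np.linspace(0, 5, 50))
```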

PDF PostScript [BibTex]

Protein Classification via Kernel Matrix Completion

Kin, T., Kato, T., Tsuda, K.

In Kernel Methods in Computational Biology, pages: 261-274, (Editors: Schölkopf, B., K. Tsuda and J. P. Vert), MIT Press, Cambridge, MA, USA, 2004 (inbook)

PDF [BibTex]

Statistische Lerntheorie und Empirische Inferenz [Statistical Learning Theory and Empirical Inference]

Schölkopf, B.

Jahrbuch der Max-Planck-Gesellschaft, pages: 377-382, 2004 (misc)

Abstract
Statistical learning theory studies the process of inferring regularities from empirical data. The fundamental problem is what is called generalization: how is it possible to infer a law which will be valid for an infinite number of future observations, given only a finite amount of data? This problem hinges upon fundamental issues of statistics and science in general, such as the problems of complexity of explanations, a priori knowledge, and representation of data.

PDF Web [BibTex]

Introduction to Statistical Learning Theory

Bousquet, O., Boucheron, S., Lugosi, G.

In Lecture Notes in Artificial Intelligence 3176, pages: 169-207, (Editors: Bousquet, O., U. von Luxburg and G. Rätsch), Springer, Heidelberg, Germany, 2004 (inbook)

PDF [BibTex]

A Primer on Kernel Methods

Vert, J., Tsuda, K., Schölkopf, B.

In Kernel Methods in Computational Biology, pages: 35-70, (Editors: B Schölkopf and K Tsuda and JP Vert), MIT Press, Cambridge, MA, USA, 2004 (inbook)

PDF [BibTex]

Concentration Inequalities

Boucheron, S., Lugosi, G., Bousquet, O.

In Lecture Notes in Artificial Intelligence 3176, pages: 208-240, (Editors: Bousquet, O., U. von Luxburg and G. Rätsch), Springer, Heidelberg, Germany, 2004 (inbook)

PDF [BibTex]

Kernels for graphs

Kashima, H., Tsuda, K., Inokuchi, A.

In Kernel Methods in Computational Biology, pages: 155-170, (Editors: Schölkopf, B., K. Tsuda and J. P. Vert), MIT Press, Cambridge, MA, USA, 2004 (inbook)

PDF [BibTex]

A primer on molecular biology

Zien, A.

In Kernel Methods in Computational Biology, pages: 3-34, (Editors: Schölkopf, B., K. Tsuda and J. P. Vert), MIT Press, Cambridge, MA, USA, 2004 (inbook)

Abstract
Modern molecular biology provides a rich source of challenging machine learning problems. This tutorial chapter aims to provide the necessary biological background knowledge required to communicate with biologists and to understand and properly formalize a number of the most interesting problems in this application domain. The largest part of the chapter (its first section) is devoted to the cell as the basic unit of life. Four aspects of cells are reviewed in sequence: (1) the molecules that cells make use of (above all, proteins, RNA, and DNA); (2) the spatial organization of cells ("compartmentalization"); (3) the way cells produce proteins ("protein expression"); and (4) cellular communication and evolution (of cells and organisms). In the second section, an overview is provided of the most frequent measurement technologies, data types, and data sources. Finally, important open problems in the analysis of these data (bioinformatics challenges) are briefly outlined.

PDF PostScript Web [BibTex]

Advanced Statistical Learning Theory

Bousquet, O.

Machine Learning Summer School, 2004 (talk)

PDF [BibTex]