

2016


Contextual Policy Search for Linear and Nonlinear Generalization of a Humanoid Walking Controller

Abdolmaleki, A., Lau, N., Reis, L., Peters, J., Neumann, G.

Journal of Intelligent & Robotic Systems, 83(3-4):393-408, (Editors: Luis Almeida, Lino Marques), September 2016, Special Issue: Autonomous Robot Systems (article)

ei

DOI [BibTex]

Acquiring and Generalizing the Embodiment Mapping from Human Observations to Robot Skills

Maeda, G., Ewerton, M., Koert, D., Peters, J.

IEEE Robotics and Automation Letters, 1(2):784-791, July 2016 (article)

ei

DOI [BibTex]

Autofocusing-based correction of B0 fluctuation-induced ghosting

Loktyushin, A., Ehses, P., Schölkopf, B., Scheffler, K.

24th Annual Meeting and Exhibition of the International Society for Magnetic Resonance in Medicine (ISMRM), May 2016 (poster)

ei

link (url) [BibTex]

On estimation of functional causal models: General results and application to post-nonlinear causal model

Zhang, K., Wang, Z., Zhang, J., Schölkopf, B.

ACM Transactions on Intelligent Systems and Technology, 7(2):article no. 13, January 2016 (article)

ei

PDF DOI [BibTex]

Gaussian Process-Based Predictive Control for Periodic Error Correction

Klenske, E. D., Zeilinger, M., Schölkopf, B., Hennig, P.

IEEE Transactions on Control Systems Technology, 24(1):110-121, 2016 (article)

ei pn

PDF DOI [BibTex]

Pymanopt: A Python Toolbox for Optimization on Manifolds using Automatic Differentiation

Townsend, J., Koep, N., Weichwald, S.

Journal of Machine Learning Research, 17(137):1-5, 2016 (article)

ei

PDF Arxiv Code Project page link (url) [BibTex]


A Causal, Data-driven Approach to Modeling the Kepler Data

Wang, D., Hogg, D. W., Foreman-Mackey, D., Schölkopf, B.

Publications of the Astronomical Society of the Pacific, 128(967):094503, 2016 (article)

ei

link (url) DOI Project Page [BibTex]

Probabilistic Inference for Determining Options in Reinforcement Learning

Daniel, C., van Hoof, H., Peters, J., Neumann, G.

Machine Learning, Special Issue, 104(2):337-357, (Editors: Gärtner, T., Nanni, M., Passerini, A., and Robardet, C.), ECML-PKDD 2016 Journal Track, 2016, Best Student Paper Award of ECML-PKDD 2016 (article)

am ei

DOI Project Page [BibTex]

PGO wave-triggered functional MRI: mapping the networks underlying synaptic consolidation

Logothetis, N. K., Murayama, Y., Ramirez-Villegas, J. F., Besserve, M., Evrard, H.

47th Annual Meeting of the Society for Neuroscience (Neuroscience), 2016 (poster)

ei

[BibTex]

Influence of initial fixation position in scene viewing

Rothkegel, L. O. M., Trukenbrod, H. A., Schütt, H. H., Wichmann, F. A., Engbert, R.

Vision Research, 129, pages: 33-49, 2016 (article)

ei

link (url) DOI Project Page [BibTex]

Testing models of peripheral encoding using metamerism in an oddity paradigm

Wallis, T. S. A., Bethge, M., Wichmann, F. A.

Journal of Vision, 16(2), 2016 (article)

ei

DOI Project Page [BibTex]

Modeling Confounding by Half-Sibling Regression

Schölkopf, B., Hogg, D., Wang, D., Foreman-Mackey, D., Janzing, D., Simon-Gabriel, C. J., Peters, J.

Proceedings of the National Academy of Sciences, 113(27):7391-7398, 2016 (article)

ei

Code link (url) DOI Project Page [BibTex]

Dual Control for Approximate Bayesian Reinforcement Learning

Klenske, E. D., Hennig, P.

Journal of Machine Learning Research, 17(127):1-30, 2016 (article)

ei pn

PDF link (url) [BibTex]

A Population Based Gaussian Mixture Model Incorporating 18F-FDG-PET and DW-MRI Quantifies Tumor Tissue Classes

Divine, M. R., Katiyar, P., Kohlhofer, U., Quintanilla-Martinez, L., Disselhorst, J. A., Pichler, B. J.

Journal of Nuclear Medicine, 57(3):473-479, 2016 (article)

ei

DOI [BibTex]

Probabilistic Duality for Parallel Gibbs Sampling without Graph Coloring

Mescheder, L., Nowozin, S., Geiger, A.

arXiv, 2016 (article)

Abstract
We present a new notion of probabilistic duality for random variables involving mixture distributions. Using this notion, we show how to implement a highly-parallelizable Gibbs sampler for weakly coupled discrete pairwise graphical models with strictly positive factors that requires almost no preprocessing and is easy to implement. Moreover, we show how our method can be combined with blocking to improve mixing. Even though our method leads to inferior mixing times compared to a sequential Gibbs sampler, we argue that our method is still very useful for large dynamic networks, where factors are added and removed on a continuous basis, as it is hard to maintain a graph coloring in this setup. Similarly, our method is useful for parallelizing Gibbs sampling in graphical models that do not allow for graph colorings with a small number of colors such as densely connected graphs.
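The sequential baseline the abstract compares against can be made concrete. Below is a minimal sketch, not the paper's dual construction, of one sequential Gibbs sweep over a weakly coupled binary pairwise model with strictly positive factors; the chain topology, coupling strength, and bias values are illustrative assumptions.

```python
import numpy as np

def gibbs_sweep(x, J, h, beta=1.0, rng=None):
    """One sequential Gibbs sweep over a pairwise binary (+/-1) model.

    p(x) is proportional to exp(beta * (sum_i h_i x_i + sum_{i<j} J_ij x_i x_j)),
    with J symmetric and zero on the diagonal.
    """
    rng = np.random.default_rng() if rng is None else rng
    for i in range(len(x)):
        field = h[i] + J[i] @ x                      # local field at site i
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * field))
        x[i] = 1 if rng.random() < p_plus else -1
    return x

rng = np.random.default_rng(0)
n = 8
J = np.zeros((n, n))
for i in range(n - 1):                               # weakly coupled chain
    J[i, i + 1] = J[i + 1, i] = 0.2
h = np.full(n, 0.5)                                  # positive bias
x = rng.choice([-1, 1], size=n)
samples = []
for sweep in range(2000):
    x = gibbs_sweep(x, J, h, rng=rng)
    if sweep >= 500:                                 # discard burn-in
        samples.append(x.copy())
mean_spin = np.mean(samples)
print(mean_spin)
```

With a positive bias the average spin should come out clearly positive; a parallel scheme like the one proposed would update many sites of such a model per step instead of one.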

avg

pdf [BibTex]


Painfree and accurate Bayesian estimation of psychometric functions for (potentially) overdispersed data

Schütt, H. H., Harmeling, S., Macke, J. H., Wichmann, F. A.

Vision Research, 122, pages: 105-123, 2016 (article)

ei

link (url) DOI Project Page [BibTex]

Hierarchical Relative Entropy Policy Search

Daniel, C., Neumann, G., Kroemer, O., Peters, J.

Journal of Machine Learning Research, 17(93):1-50, 2016 (article)

ei

link (url) Project Page [BibTex]

Kernel Mean Shrinkage Estimators

Muandet, K., Sriperumbudur, B., Fukumizu, K., Gretton, A., Schölkopf, B.

Journal of Machine Learning Research, 17(48):1-41, 2016 (article)

ei

link (url) [BibTex]

Learning to Deblur

Schuler, C. J., Hirsch, M., Harmeling, S., Schölkopf, B.

IEEE Transactions on Pattern Analysis and Machine Intelligence, 38(7):1439-1451, IEEE, 2016 (article)

ei

DOI [BibTex]

Transfer Learning in Brain-Computer Interfaces

Jayaram, V., Alamgir, M., Altun, Y., Schölkopf, B., Grosse-Wentrup, M.

IEEE Computational Intelligence Magazine, 11(1):20-31, 2016 (article)

ei

PDF DOI [BibTex]

MERLiN: Mixture Effect Recovery in Linear Networks

Weichwald, S., Grosse-Wentrup, M., Gretton, A.

IEEE Journal of Selected Topics in Signal Processing, 10(7):1254-1266, 2016 (article)

ei

Arxiv Code PDF DOI Project Page [BibTex]

Causal inference using invariant prediction: identification and confidence intervals

Peters, J., Bühlmann, P., Meinshausen, N.

Journal of the Royal Statistical Society, Series B (Statistical Methodology), 78(5):947-1012, 2016, (with discussion) (article)

ei

link (url) DOI [BibTex]

Causal discovery and inference: concepts and recent methodological advances

Spirtes, P., Zhang, K.

Applied Informatics, 3(3):1-28, 2016 (article)

ei

DOI [BibTex]

Self-regulation of brain rhythms in the precuneus: a novel BCI paradigm for patients with ALS

Fomina, T., Lohmann, G., Erb, M., Ethofer, T., Schölkopf, B., Grosse-Wentrup, M.

Journal of Neural Engineering, 13(6):066021, 2016 (article)

ei

link (url) Project Page [BibTex]


Influence Estimation and Maximization in Continuous-Time Diffusion Networks

Gomez-Rodriguez, M., Song, L., Du, N., Zha, H., Schölkopf, B.

ACM Transactions on Information Systems, 34(2):9:1-9:33, 2016 (article)

ei

DOI Project Page Project Page [BibTex]

The population of long-period transiting exoplanets

Foreman-Mackey, D., Morton, T. D., Hogg, D. W., Agol, E., Schölkopf, B.

The Astronomical Journal, 152(6):206, 2016 (article)

ei

link (url) Project Page [BibTex]

Statistical source separation of rhythmic LFP patterns during sharp wave ripples in the macaque hippocampus

Ramirez-Villegas, J. F., Logothetis, N. K., Besserve, M.

47th Annual Meeting of the Society for Neuroscience (Neuroscience), 2016 (poster)

ei

[BibTex]

An overview of quantitative approaches in Gestalt perception

Jäkel, F., Singh, M., Wichmann, F. A., Herzog, M. H.

Vision Research, 126, pages: 3-8, 2016 (article)

ei

link (url) DOI Project Page [BibTex]

Bootstrat: Population Informed Bootstrapping for Rare Variant Tests

Huang, H., Peloso, G. M., Howrigan, D., Rakitsch, B., Simon-Gabriel, C. J., Goldstein, J. I., Daly, M. J., Borgwardt, K., Neale, B. M.

bioRxiv, 2016, preprint (article)

ei

link (url) DOI [BibTex]

Probabilistic Movement Models Show that Postural Control Precedes and Predicts Volitional Motor Control

Rueckert, E., Camernik, J., Peters, J., Babic, J.

Nature PG: Scientific Reports, 6(Article number: 28455), 2016 (article)

ei

DOI Project Page [BibTex]

Learning Taxonomy Adaptation in Large-scale Classification

Babbar, R., Partalas, I., Gaussier, E., Amini, M., Amblard, C.

Journal of Machine Learning Research, 17(98):1-37, 2016 (article)

ei

link (url) Project Page [BibTex]

Hippocampal neural events predict ongoing brain-wide BOLD activity

Besserve, M., Logothetis, N. K.

47th Annual Meeting of the Society for Neuroscience (Neuroscience), 2016 (poster)

ei

[BibTex]

BOiS—Berlin Object in Scene Database: Controlled Photographic Images for Visual Search Experiments with Quantified Contextual Priors

Mohr, J., Seyfarth, J., Lueschow, A., Weber, J. E., Wichmann, F. A., Obermayer, K.

Frontiers in Psychology, 2016 (article)

ei

DOI [BibTex]

Preface to the ACM TIST Special Issue on Causal Discovery and Inference

Zhang, K., Li, J., Bareinboim, E., Schölkopf, B., Pearl, J.

ACM Transactions on Intelligent Systems and Technology, 7(2):article no. 17, 2016 (article)

ei

DOI [BibTex]

Map-Based Probabilistic Visual Self-Localization

Brubaker, M. A., Geiger, A., Urtasun, R.

IEEE Trans. on Pattern Analysis and Machine Intelligence (PAMI), 2016 (article)

Abstract
Accurate and efficient self-localization is a critical problem for autonomous systems. This paper describes an affordable solution to vehicle self-localization which uses odometry computed from two video cameras and road maps as the sole inputs. The core of the method is a probabilistic model for which an efficient approximate inference algorithm is derived. The inference algorithm is able to utilize distributed computation in order to meet the real-time requirements of autonomous systems in some instances. Because of the probabilistic nature of the model the method is capable of coping with various sources of uncertainty including noise in the visual odometry and inherent ambiguities in the map (e.g., in a Manhattan world). By exploiting freely available, community developed maps and visual odometry measurements, the proposed method is able to localize a vehicle to 4m on average after 52 seconds of driving on maps which contain more than 2,150km of drivable roads.

avg ps

pdf Project Page [BibTex]

Recurrent Spiking Networks Solve Planning Tasks

Rueckert, E., Kappel, D., Tanneberg, D., Pecevski, D., Peters, J.

Nature PG: Scientific Reports, 6(Article number: 21142), 2016 (article)

ei

DOI Project Page [BibTex]

Bio-inspired feedback-circuit implementation of discrete, free energy optimizing, winner-take-all computations

Genewein, T., Braun, D. A.

Biological Cybernetics, 110(2):135-150, June 2016 (article)

Abstract
Bayesian inference and bounded rational decision-making require the accumulation of evidence or utility, respectively, to transform a prior belief or strategy into a posterior probability distribution over hypotheses or actions. Crucially, this process cannot be simply realized by independent integrators, since the different hypotheses and actions also compete with each other. In continuous time, this competitive integration process can be described by a special case of the replicator equation. Here we investigate simple analog electric circuits that implement the underlying differential equation under the constraint that we only permit a limited set of building blocks that we regard as biologically interpretable, such as capacitors, resistors, voltage-dependent conductances and voltage- or current-controlled current and voltage sources. The appeal of these circuits is that they intrinsically perform normalization without requiring an explicit divisive normalization. However, even in idealized simulations, we find that these circuits are very sensitive to internal noise as they accumulate error over time. We discuss in how far neural circuits could implement these operations that might provide a generic competitive principle underlying both perception and action.
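The competitive integration process the abstract refers to can be sketched numerically. The fragment below Euler-integrates the replicator equation dp_i/dt = p_i (f_i - <f>) for three hypothetical utilities; the utility values and step size are illustrative, and this is a plain numerical sketch rather than the analog circuit studied in the paper. The dynamics normalize intrinsically: the probabilities stay on the simplex while the highest-utility option takes over.

```python
import numpy as np

def replicator_step(p, f, dt=0.01):
    """One Euler step of the replicator equation dp_i/dt = p_i (f_i - <f>)."""
    avg = p @ f                       # mean utility under current distribution
    return p + dt * p * (f - avg)

f = np.array([1.0, 2.0, 3.5])         # utilities of three competing options (assumed)
p = np.full(3, 1.0 / 3.0)             # uniform prior over options
for _ in range(5000):
    p = replicator_step(p, f)
print(p, p.sum())
```

Because the increments sum to zero exactly when p sums to one, no explicit divisive normalization step is needed, which is the property the circuits in the paper exploit.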

ei

DOI [BibTex]

Decision-Making under Ambiguity Is Modulated by Visual Framing, but Not by Motor vs. Non-Motor Context: Experiments and an Information-Theoretic Ambiguity Model

Grau-Moya, J., Ortega, P. A., Braun, D. A.

PLoS ONE, 11(4):1-21, April 2016 (article)

Abstract
A number of recent studies have investigated differences in human choice behavior depending on task framing, especially comparing economic decision-making to choice behavior in equivalent sensorimotor tasks. Here we test whether decision-making under ambiguity exhibits effects of task framing in motor vs. non-motor context. In a first experiment, we designed an experience-based urn task with varying degrees of ambiguity and an equivalent motor task where subjects chose between hitting partially occluded targets. In a second experiment, we controlled for the different stimulus design in the two tasks by introducing an urn task with bar stimuli matching those in the motor task. We found ambiguity attitudes to be mainly influenced by stimulus design. In particular, we found that the same subjects tended to be ambiguity-preferring when choosing between ambiguous bar stimuli, but ambiguity-avoiding when choosing between ambiguous urn sample stimuli. In contrast, subjects’ choice pattern was not affected by changing from a target hitting task to a non-motor context when keeping the stimulus design unchanged. In both tasks subjects’ choice behavior was continuously modulated by the degree of ambiguity. We show that this modulation of behavior can be explained by an information-theoretic model of ambiguity that generalizes Bayes-optimal decision-making by combining Bayesian inference with robust decision-making under model uncertainty. Our results demonstrate the benefits of information-theoretic models of decision-making under varying degrees of ambiguity for a given context, but also demonstrate the sensitivity of ambiguity attitudes across contexts that theoretical models struggle to explain.

ei

DOI [BibTex]

2005


Kernel Methods for Measuring Independence

Gretton, A., Herbrich, R., Smola, A., Bousquet, O., Schölkopf, B.

Journal of Machine Learning Research, 6, pages: 2075-2129, December 2005 (article)

Abstract
We introduce two new functionals, the constrained covariance and the kernel mutual information, to measure the degree of independence of random variables. These quantities are both based on the covariance between functions of the random variables in reproducing kernel Hilbert spaces (RKHSs). We prove that when the RKHSs are universal, both functionals are zero if and only if the random variables are pairwise independent. We also show that the kernel mutual information is an upper bound near independence on the Parzen window estimate of the mutual information. Analogous results apply for two correlation-based dependence functionals introduced earlier: we show the kernel canonical correlation and the kernel generalised variance to be independence measures for universal kernels, and prove the latter to be an upper bound on the mutual information near independence. The performance of the kernel dependence functionals in measuring independence is verified in the context of independent component analysis.

ei

PDF PostScript PDF [BibTex]

A Unifying View of Sparse Approximate Gaussian Process Regression

Quiñonero-Candela, J., Rasmussen, C.

Journal of Machine Learning Research, 6, pages: 1935-1959, December 2005 (article)

Abstract
We provide a new unifying view, including all existing proper probabilistic sparse approximations for Gaussian process regression. Our approach relies on expressing the effective prior which the methods are using. This allows new insights to be gained, and highlights the relationship between existing methods. It also allows for a clear theoretically justified ranking of the closeness of the known approximations to the corresponding full GPs. Finally we point directly to designs of new better sparse approximations, combining the best of the existing strategies, within attractive computational constraints.
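To make one member of the unified family concrete, here is a sketch of the subset-of-regressors (SoR) approximation, whose effective prior is among those the paper analyzes. The RBF kernel, inducing-point placement, and toy sine data are assumptions for illustration, not details from the paper.

```python
import numpy as np

def rbf(A, B, ell=1.0):
    """Squared-exponential kernel between row-vector inputs."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))                # training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
Xu = np.linspace(-3, 3, 10)[:, None]                 # inducing inputs u
Xs = np.array([[0.0], [1.5]])                        # test inputs
noise = 0.1 ** 2

Kuf = rbf(Xu, X)
Kuu = rbf(Xu, Xu) + 1e-8 * np.eye(len(Xu))           # jitter for stability
Ksu = rbf(Xs, Xu)
# SoR predictive: mean = s^-2 K*u Sigma Kuf y, var = diag(K*u Sigma Ku*)
Sigma = np.linalg.inv(Kuu + Kuf @ Kuf.T / noise)
mean = Ksu @ Sigma @ Kuf @ y / noise
var = np.einsum('ij,jk,ik->i', Ksu, Sigma, Ksu)
print(mean, var)
```

The predictions at 0 and 1.5 should land near sin(0) and sin(1.5); the unifying view in the paper recovers this and the other sparse schemes by varying the effective prior over the inducing variables.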

ei

PDF [BibTex]

Kernel methods for dependence testing in LFP-MUA

Gretton, A., Belitski, A., Murayama, Y., Schölkopf, B., Logothetis, N.

35th Annual Meeting of the Society for Neuroscience (Neuroscience), 35(689.17), November 2005 (poster)

Abstract
A fundamental problem in neuroscience is determining whether or not particular neural signals are dependent. The correlation is the most straightforward basis for such tests, but considerable work also focuses on the mutual information (MI), which is capable of revealing dependence of higher orders that the correlation cannot detect. That said, there are other measures of dependence that share with the MI an ability to detect dependence of any order, but which can be easier to compute in practice. We focus in particular on tests based on the functional covariance, which derive from work originally accomplished in 1959 by Renyi. Conceptually, our dependence tests work by computing the covariance between (infinite dimensional) vectors of nonlinear mappings of the observations being tested, and then determining whether this covariance is zero - we call this measure the constrained covariance (COCO). When these vectors are members of universal reproducing kernel Hilbert spaces, we can prove this covariance to be zero only when the variables being tested are independent. The greatest advantage of these tests, compared with the mutual information, is their simplicity – when comparing two signals, we need only take the largest eigenvalue (or the trace) of a product of two matrices of nonlinearities, where these matrices are generally much smaller than the number of observations (and are very simple to construct). We compare the mutual information, the COCO, and the correlation in the context of finding changes in dependence between the LFP and MUA signals in the primary visual cortex of the anaesthetized macaque, during the presentation of dynamic natural stimuli. We demonstrate that the MI and COCO reveal dependence which is not detected by the correlation alone (which we prove by artificially removing all correlation between the signals, and then testing their dependence with COCO and the MI); and that COCO and the MI give results consistent with each other on our data.
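The eigenvalue computation described above can be sketched in a few lines. This is a simplified empirical form of the constrained covariance, the square root of the largest eigenvalue of the product of centered Gram matrices; the Gaussian kernel, bandwidth, sample size, and 1/n scaling are illustrative assumptions rather than the exact estimator used in the poster.

```python
import numpy as np

def centered_gram(x, sigma=1.0):
    """Centered Gaussian Gram matrix of a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    K = np.exp(-0.5 * d2 / sigma ** 2)
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n    # centering matrix
    return H @ K @ H

def coco(x, y, sigma=1.0):
    """Empirical constrained covariance (sketch): largest eigenvalue of the
    product of centered Gram matrices, via sqrt and 1/n scaling."""
    Kc, Lc = centered_gram(x, sigma), centered_gram(y, sigma)
    eigs = np.linalg.eigvals(Kc @ Lc).real
    return np.sqrt(max(eigs.max(), 0.0)) / len(x)

rng = np.random.default_rng(1)
x = rng.standard_normal(300)
y_dep = np.cos(2 * x) + 0.1 * rng.standard_normal(300)  # dependent on x
y_ind = rng.standard_normal(300)                        # independent of x
c_dep, c_ind = coco(x, y_dep), coco(x, y_ind)
print(c_dep, c_ind)
```

By symmetry, x and cos(2x) are uncorrelated, so this is exactly the kind of higher-order dependence the abstract argues plain correlation misses while COCO detects.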

ei

Web [BibTex]

Maximal Margin Classification for Metric Spaces

Hein, M., Bousquet, O., Schölkopf, B.

Journal of Computer and System Sciences, 71(3):333-359, October 2005 (article)

Abstract
In order to apply the maximum margin method in arbitrary metric spaces, we suggest to embed the metric space into a Banach or Hilbert space and to perform linear classification in this space. We propose several embeddings and recall that an isometric embedding in a Banach space is always possible while an isometric embedding in a Hilbert space is only possible for certain metric spaces. As a result, we obtain a general maximum margin classification algorithm for arbitrary metric spaces (whose solution is approximated by an algorithm of Graepel. Interestingly enough, the embedding approach, when applied to a metric which can be embedded into a Hilbert space, yields the SVM algorithm, which emphasizes the fact that its solution depends on the metric and not on the kernel. Furthermore we give upper bounds of the capacity of the function classes corresponding to both embeddings in terms of Rademacher averages. Finally we compare the capacities of these function classes directly.

ei

PDF PDF DOI [BibTex]

Selective integration of multiple biological data for supervised network inference

Kato, T., Tsuda, K., Asai, K.

Bioinformatics, 21(10):2488, October 2005 (article)

ei

PDF [BibTex]

Assessing Approximate Inference for Binary Gaussian Process Classification

Kuss, M., Rasmussen, C.

Journal of Machine Learning Research, 6, pages: 1679, October 2005 (article)

Abstract
Gaussian process priors can be used to define flexible, probabilistic classification models. Unfortunately exact Bayesian inference is analytically intractable and various approximation techniques have been proposed. In this work we review and compare Laplace's method and Expectation Propagation for approximate Bayesian inference in the binary Gaussian process classification model. We present a comprehensive comparison of the approximations, their predictive performance and marginal likelihood estimates to results obtained by MCMC sampling. We explain theoretically and corroborate empirically the advantages of Expectation Propagation compared to Laplace's method.

ei

PDF PDF [BibTex]

Clustering on the Unit Hypersphere using von Mises-Fisher Distributions

Banerjee, A., Dhillon, I., Ghosh, J., Sra, S.

Journal of Machine Learning Research, 6, pages: 1345-1382, September 2005 (article)

Abstract
Several large scale data mining applications, such as text categorization and gene expression analysis, involve high-dimensional data that is also inherently directional in nature. Often such data is L2 normalized so that it lies on the surface of a unit hypersphere. Popular models such as (mixtures of) multi-variate Gaussians are inadequate for characterizing such data. This paper proposes a generative mixture-model approach to clustering directional data based on the von Mises-Fisher (vMF) distribution, which arises naturally for data distributed on the unit hypersphere. In particular, we derive and analyze two variants of the Expectation Maximization (EM) framework for estimating the mean and concentration parameters of this mixture. Numerical estimation of the concentration parameters is non-trivial in high dimensions since it involves functional inversion of ratios of Bessel functions. We also formulate two clustering algorithms corresponding to the variants of EM that we derive. Our approach provides a theoretical basis for the use of cosine similarity that has been widely employed by the information retrieval community, and obtains the spherical kmeans algorithm (kmeans with cosine similarity) as a special case of both variants. Empirical results on clustering of high-dimensional text and gene-expression data based on a mixture of vMF distributions show that the ability to estimate the concentration parameter for each vMF component, which is not present in existing approaches, yields superior results, especially for difficult clustering tasks in high-dimensional spaces.
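The spherical k-means special case mentioned at the end of the abstract is easy to sketch: k-means with cosine similarity on L2-normalized data, with each mean direction renormalized after averaging. The farthest-first initialization and the toy two-cluster data below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def spherical_kmeans(X, k, iters=20):
    """Spherical k-means with deterministic farthest-first seeding."""
    X = X / np.linalg.norm(X, axis=1, keepdims=True)   # project onto unit sphere
    centers = [X[0]]
    for _ in range(1, k):                              # seed with mutually dissimilar points
        sims = np.max(X @ np.array(centers).T, axis=1)
        centers.append(X[np.argmin(sims)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmax(X @ centers.T, axis=1)      # assign by cosine similarity
        for j in range(k):
            members = X[labels == j]
            if len(members):
                m = members.sum(axis=0)
                centers[j] = m / np.linalg.norm(m)     # renormalized mean direction
    return labels, centers

rng = np.random.default_rng(0)
ang_a = 0.1 * rng.standard_normal(50)                  # directions near angle 0
ang_b = np.pi + 0.1 * rng.standard_normal(50)          # directions near angle pi
X = np.stack([np.cos(np.r_[ang_a, ang_b]), np.sin(np.r_[ang_a, ang_b])], axis=1)
labels, centers = spherical_kmeans(X, 2)
print(labels)
```

The full vMF mixture in the paper additionally estimates a concentration parameter per component, which this hard-assignment special case fixes implicitly.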

ei

PDF [BibTex]
