2015


Derivation of phenomenological expressions for transition matrix elements for electron-phonon scattering

Illg, C., Haag, M., Müller, B. Y., Czycholl, G., Fähnle, M.

2015 (misc)

mms

link (url) [BibTex]

2014


Advanced Structured Prediction

Nowozin, S., Gehler, P. V., Jancsary, J., Lampert, C. H.

Advanced Structured Prediction, pages: 432, Neural Information Processing Series, MIT Press, November 2014 (book)

Abstract
The goal of structured prediction is to build machine learning models that predict relational information that itself has structure, such as being composed of multiple interrelated parts. These models, which reflect prior knowledge, task-specific relations, and constraints, are used in fields including computer vision, speech recognition, natural language processing, and computational biology. They can carry out such tasks as predicting a natural language sentence, or segmenting an image into meaningful components. These models are expressive and powerful, but exact computation is often intractable. A broad research effort in recent years has aimed at designing structured prediction models and approximate inference and learning procedures that are computationally efficient. This volume offers an overview of this recent research in order to make the work accessible to a broader research community. The chapters, by leading researchers in the field, cover a range of topics, including research trends, the linear programming relaxation approach, innovations in probabilistic modeling, recent theoretical progress, and resource-aware learning.
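A minimal illustration of the kind of model the book treats, not code from the book: exact MAP inference in a chain-structured model via the Viterbi recursion. The function name and the toy scores below are assumptions made for this sketch.

import numpy as np

def viterbi(unary, pairwise):
    """Exact MAP inference in a chain-structured model.

    unary:    (T, K) array of per-position label scores
    pairwise: (K, K) array of transition scores between adjacent labels
    Returns the highest-scoring label sequence of length T.
    """
    T, K = unary.shape
    score = unary[0].copy()              # best score of any path ending in each label
    backptr = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        # cand[i, j]: best path ending in label i at t-1, extended to label j at t
        cand = score[:, None] + pairwise + unary[t][None, :]
        backptr[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # trace back the best sequence from the final position
    labels = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        labels.append(int(backptr[t, labels[-1]]))
    return labels[::-1]

# toy example: 5 positions, 3 labels, transitions favoring label persistence
rng = np.random.default_rng(0)
print(viterbi(rng.normal(size=(5, 3)), np.eye(3) * 0.5))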

ps

publisher link (url) [BibTex]

Local Gaussian Regression

Meier, F., Hennig, P., Schaal, S.

arXiv preprint, March 2014, clmc (misc)

Abstract
Locally weighted regression was created as a nonparametric learning method that is computationally efficient, can learn from very large amounts of data, and can add data incrementally. An interesting feature of locally weighted regression is that it can work with ...
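For context (this sketches classical locally weighted regression, the baseline the abstract refers to, not the paper's own local Gaussian regression model; the function name, bandwidth, and toy data are assumptions):

import numpy as np

def lwr_predict(X, y, x_query, bandwidth=0.5):
    """Locally weighted (linear) regression at a single query point.

    Fits a weighted least-squares line around x_query, with Gaussian
    weights that decay with distance from the query.
    """
    w = np.exp(-0.5 * ((X - x_query) / bandwidth) ** 2)  # local weights
    A = np.column_stack([np.ones_like(X), X])            # design matrix with bias
    W = np.diag(w)
    # weighted normal equations: (A^T W A) beta = A^T W y
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return beta[0] + beta[1] * x_query

# toy data: noisy sine; each prediction uses only nearby points
X = np.linspace(0, 2 * np.pi, 200)
y = np.sin(X) + 0.1 * np.random.default_rng(1).normal(size=X.size)
print(lwr_predict(X, y, x_query=np.pi / 2))  # close to sin(pi/2) = 1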

am pn

Web link (url) [BibTex]

Fibrillar structures to reduce viscous drag on aerodynamic and hydrodynamic wall surfaces

Castillo, L., Aksak, B., Sitti, M.

March 2014, US Patent App. 14/774,767 (misc)

pi

[BibTex]

The design of microfibers with mushroom-shaped tips for optimal adhesion

Sitti, M., Aksak, B.

February 2014, US Patent App. 14/766,561 (misc)

pi

[BibTex]

Learning Motor Skills: From Algorithms to Robot Experiments

Kober, J., Peters, J.

Volume 97, pages: 191, Springer Tracts in Advanced Robotics, Springer, 2014 (book)

ei

DOI [BibTex]

2004


Kernel Methods in Computational Biology

Schölkopf, B., Tsuda, K., Vert, J.

pages: 410, Computational Molecular Biology, MIT Press, Cambridge, MA, USA, August 2004 (book)

Abstract
Modern machine learning techniques are proving to be extremely valuable for the analysis of data in computational biology problems. One branch of machine learning, kernel methods, lends itself particularly well to the difficult aspects of biological data, which include high dimensionality (as in microarray measurements), representation as discrete and structured data (as in DNA or amino acid sequences), and the need to combine heterogeneous sources of information. This book provides a detailed overview of current research in kernel methods and their applications to computational biology. Following three introductory chapters—an introduction to molecular and computational biology, a short review of kernel methods that focuses on intuitive concepts rather than technical details, and a detailed survey of recent applications of kernel methods in computational biology—the book is divided into three sections that reflect three general trends in current research. The first part presents different ideas for the design of kernel functions specifically adapted to various biological data; the second part covers different approaches to learning from heterogeneous data; and the third part offers examples of successful applications of support vector machine methods.
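As one concrete example of a kernel tailored to discrete biological sequences, of the sort surveyed in the book: the k-spectrum kernel, which scores two sequences by the length-k substrings they share (a sketch; the function name and toy sequences are assumptions).

from collections import Counter

def spectrum_kernel(s, t, k=3):
    """k-spectrum kernel: inner product of k-mer count vectors.

    A standard string kernel for DNA or protein sequences: two sequences
    are similar if they share many length-k substrings.
    """
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[kmer] * ct[kmer] for kmer in cs if kmer in ct)

print(spectrum_kernel("ACGTACGT", "ACGTTTGT"))  # shared 3-mers: ACG, CGT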

ei

Web [BibTex]

Statistische Lerntheorie und Empirische Inferenz [Statistical Learning Theory and Empirical Inference]

Schölkopf, B.

Jahrbuch der Max-Planck-Gesellschaft 2004, pages: 377-382, 2004 (misc)

Abstract
Statistical learning theory studies the process of inferring regularities from empirical data. The fundamental problem is what is called generalization: how is it possible to infer a law which will be valid for an infinite number of future observations, given only a finite amount of data? This problem hinges upon fundamental issues of statistics and science in general, such as the problems of complexity of explanations, a priori knowledge, and representation of data.
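The trade-off between fit and complexity of explanations that the abstract alludes to is made quantitative by results such as Vapnik's classical VC bound (one standard form; constants differ across statements): with probability at least $1-\delta$ over an i.i.d. sample of size $n$, every $f$ in a class of VC dimension $h$ satisfies

$$ R(f) \;\le\; R_{\mathrm{emp}}(f) + \sqrt{\frac{h\left(\ln\frac{2n}{h} + 1\right) + \ln\frac{4}{\delta}}{n}}. $$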

ei

PDF Web [BibTex]

Nanoscale Materials for Energy Storage

Materials Science & Engineering B, 108, pages: 292, Elsevier, 2004 (misc)

mms

[BibTex]


2003


Magnetism and the Microstructure of Ferromagnetic Solids

Kronmüller, H., Fähnle, M.

pages: 432, 1st ed., Cambridge University Press, Cambridge, 2003 (book)

mms

[BibTex]


2000


Advances in Large Margin Classifiers

Smola, A., Bartlett, P., Schölkopf, B., Schuurmans, D.

pages: 422, Neural Information Processing, MIT Press, Cambridge, MA, USA, October 2000 (book)

Abstract
The concept of large margins is a unifying principle for the analysis of many different approaches to the classification of data from examples, including boosting, mathematical programming, neural networks, and support vector machines. The insight that what matters is the margin, or confidence level, of a classification (that is, a scale parameter) rather than a raw training error has become a key tool for dealing with classifiers. This book shows how this idea applies to both the theoretical analysis and the design of algorithms. The book provides an overview of recent developments in large margin classifiers, examines connections with other methods (e.g., Bayesian inference), and identifies strengths and weaknesses of the method, as well as directions for future research. Among the contributors are Manfred Opper, Vladimir Vapnik, and Grace Wahba.
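A minimal sketch of the quantity at issue (the function name and toy data are assumptions): the geometric margin of a linear classifier, the scale parameter that large-margin methods maximize rather than merely minimizing training error.

import numpy as np

def margins(w, b, X, y):
    """Geometric margins y_i (w.x_i + b) / ||w|| of a linear classifier.

    The margin of the classifier on a dataset is the smallest of these;
    large-margin methods such as SVMs maximize it.
    """
    return y * (X @ w + b) / np.linalg.norm(w)

# toy linearly separable data and a separating hyperplane
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, b = np.array([1.0, 1.0]), 0.0
print(margins(w, b, X, y).min())  # the geometric margin of this classifier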

ei

Web [BibTex]
