2005


Micromagnetic simulation as a bridge between magnetic-force and magnetic-transmission X-ray microscopy

Bolte, M., Eiselt, R., Eimüller, T.

Journal of Magnetism and Magnetic Materials, 290, pages: 723-726, 2005 (article)

mms

[BibTex]



Grain-boundary melting phase transition in the Cu-Bi system

Divinski, S., Lohmann, M., Herzig, C., Straumal, B., Baretzky, B., Gust, W.

Physical Review B, 71, 2005 (article)

mms

[BibTex]


1997


Comparing support vector machines with Gaussian kernels to radial basis function classifiers

Schölkopf, B., Sung, K., Burges, C., Girosi, F., Niyogi, P., Poggio, T., Vapnik, V.

IEEE Transactions on Signal Processing, 45(11):2758-2765, November 1997 (article)

Abstract
The support vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines centers, weights, and threshold that minimize an upper bound on the expected test error. The present study is devoted to an experimental comparison of these machines with a classical approach, where the centers are determined by k-means clustering, and the weights are computed using error backpropagation. We consider three machines, namely, a classical RBF machine, an SV machine with Gaussian kernel, and a hybrid system with the centers determined by the SV method and the weights trained by error backpropagation. Our results show that on the United States postal service database of handwritten digits, the SV machine achieves the highest recognition accuracy, followed by the hybrid system. The SV approach is thus not only theoretically well-founded but also superior in a practical application.

ei

Web DOI [BibTex]
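The abstract above contrasts an SV machine with a Gaussian kernel against a classical RBF network whose centers come from k-means clustering. The sketch below only illustrates the shape of that comparison; it is not the paper's setup (the study used the USPS digits and trained the RBF-network weights by error backpropagation), and the dataset, kernel width, and logistic-regression read-out are stand-in choices.

```python
# Minimal sketch: SVM with Gaussian (RBF) kernel vs. a classical RBF network
# with k-means centers. Dataset, gamma, C, and the logistic read-out are
# illustrative assumptions, not the original experimental protocol.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# SV machine with Gaussian kernel: centers (support vectors), weights, and
# threshold all come out of the SV optimization.
svm = SVC(kernel="rbf", gamma=0.001, C=10.0).fit(X_train, y_train)

# Classical RBF network: centers fixed by k-means, then a linear output layer
# is trained on the Gaussian activations.
centers = KMeans(n_clusters=100, n_init=10, random_state=0).fit(X_train).cluster_centers_

def rbf_features(X, centers, gamma=0.001):
    # Pairwise squared distances to the centers, mapped through a Gaussian.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

rbf_net = LogisticRegression(max_iter=2000).fit(rbf_features(X_train, centers), y_train)

print("SVM (Gaussian kernel):", svm.score(X_test, y_test))
print("k-means RBF network:  ", rbf_net.score(rbf_features(X_test, centers), y_test))
```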



ATM-dependent telomere loss in aging human diploid fibroblasts and DNA damage lead to the post-translational activation of p53 protein involving poly(ADP-ribose) polymerase.

Vaziri, H., West, M. D., Allsopp, R. C., Davison, T. S., Wu, Y. S., Arrowsmith, C. H., Poirier, G. G., Benchimol, S.

The European Molecular Biology Organization Journal, 16(19):6018-6033, 1997 (article)

ei

Web [BibTex]



Recognizing facial expressions in image sequences using local parameterized models of image motion

Black, M. J., Yacoob, Y.

Int. Journal of Computer Vision, 25(1):23-48, 1997 (article)

Abstract
This paper explores the use of local parametrized models of image motion for recovering and recognizing the non-rigid and articulated motion of human faces. Parametric flow models (for example affine) are popular for estimating motion in rigid scenes. We observe that within local regions in space and time, such models not only accurately model non-rigid facial motions but also provide a concise description of the motion in terms of a small number of parameters. These parameters are intuitively related to the motion of facial features during facial expressions and we show how expressions such as anger, happiness, surprise, fear, disgust, and sadness can be recognized from the local parametric motions in the presence of significant head motion. The motion tracking and expression recognition approach performed with high accuracy in extensive laboratory experiments involving 40 subjects as well as in television and movie sequences.

ps

pdf pdf from publisher abstract video [BibTex]
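The abstract above builds on local parameterized flow models, of which the affine model is the simplest example: within a region, the flow at (x, y) is u = a0 + a1*x + a2*y, v = a3 + a4*x + a5*y, so six parameters summarize the region's motion. The sketch below is a simplification: it fits those parameters to a given synthetic flow field by ordinary least squares, whereas the paper estimates them robustly from image brightness.

```python
# Minimal sketch of the affine motion parameterization: fit a0..a5 to a dense
# flow field over an image region by least squares. The synthetic flow and the
# plain least-squares fit are assumptions made here for brevity.
import numpy as np

def fit_affine_flow(xs, ys, u, v):
    """Fit a0..a5 so that u ~ a0 + a1*x + a2*y and v ~ a3 + a4*x + a5*y."""
    A = np.column_stack([np.ones_like(xs), xs, ys])
    a_u, *_ = np.linalg.lstsq(A, u, rcond=None)
    a_v, *_ = np.linalg.lstsq(A, v, rcond=None)
    return np.concatenate([a_u, a_v])

# Example: a region translating to the right while rotating slightly.
xs, ys = np.meshgrid(np.arange(-10, 11), np.arange(-10, 11))
xs, ys = xs.ravel().astype(float), ys.ravel().astype(float)
u = 1.0 - 0.05 * ys          # horizontal flow: translation plus rotation term
v = 0.05 * xs                # vertical flow: rotation term
params = fit_affine_flow(xs, ys, u, v)
print("affine motion parameters a0..a5:", np.round(params, 3))
```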


Locally weighted learning

Atkeson, C. G., Moore, A. W., Schaal, S.

Artificial Intelligence Review, 11(1-5):11-73, 1997, clmc (article)

Abstract
This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control. Keywords: locally weighted regression, LOESS, LWR, lazy learning, memory-based learning, least commitment learning, distance functions, smoothing parameters, weighting functions, global tuning, local tuning, interference.

am

link (url) [BibTex]
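The survey above centers on locally weighted linear regression: each query point gets its own linear fit, with training points weighted by a kernel on their distance to the query. The sketch below is one minimal instantiation; the Gaussian kernel, bandwidth h, and ridge term are illustrative choices among the many alternatives the survey discusses.

```python
# Minimal sketch of locally weighted linear regression (LWR): a Gaussian-
# weighted least-squares fit around each query point. Kernel, bandwidth, and
# ridge term are illustrative assumptions.
import numpy as np

def lwr_predict(x_query, X, y, h=0.3, ridge=1e-8):
    # Gaussian weights based on distance to the query point.
    w = np.exp(-np.sum((X - x_query) ** 2, axis=1) / (2.0 * h ** 2))
    # Augment with a bias column and solve the weighted normal equations.
    Xa = np.column_stack([X, np.ones(len(X))])
    A = Xa.T @ (w[:, None] * Xa) + ridge * np.eye(Xa.shape[1])
    beta = np.linalg.solve(A, Xa.T @ (w * y))
    return np.append(x_query, 1.0) @ beta

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
print("prediction at x=1.0:", lwr_predict(np.array([1.0]), X, y))
print("true sin(1.0):      ", np.sin(1.0))
```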



Locally weighted learning for control

Atkeson, C. G., Moore, A. W., Schaal, S.

Artificial Intelligence Review, 11(1-5):75-113, 1997, clmc (article)

Abstract
Lazy learning methods provide useful representations and training algorithms for learning about complex phenomena during autonomous adaptive control of complex systems. This paper surveys ways in which locally weighted learning, a type of lazy learning, has been applied by us to control tasks. We explain various forms that control tasks can take, and how this affects the choice of learning paradigm. The discussion section explores the interesting impact that explicitly remembering all previous experiences has on the problem of learning to control. Keywords: locally weighted regression, LOESS, LWR, lazy learning, memory-based learning, least commitment learning, forward models, inverse models, linear quadratic regulation (LQR), shifting setpoint algorithm, dynamic programming.

am

link (url) [BibTex]
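One way the ideas surveyed above appear in control is as an inverse model: remember observed transitions (state, action, next state) and answer "which action moves the system from x toward a desired next state?" with a locally weighted fit over the stored experiences. The toy sketch below assumes a simple linear plant, a Gaussian kernel, and a fixed bandwidth purely for illustration; it is not a particular method from the paper.

```python
# Toy sketch: locally weighted inverse model over remembered transitions.
# Plant, kernel, and bandwidth are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(1)

def plant(x, u):
    # Dynamics unknown to the learner: a simple stable linear system.
    return 0.9 * x + 0.5 * u

# Collect experiences by applying random actions from random states.
xs = rng.uniform(-1, 1, 500)
us = rng.uniform(-1, 1, 500)
xnexts = plant(xs, us)

def inverse_model(x, x_desired, h=0.2):
    # Weight stored experiences by how close their (x, x_next) pair is to the
    # queried transition, then return the locally weighted least-squares action.
    w = np.exp(-((xs - x) ** 2 + (xnexts - x_desired) ** 2) / (2 * h ** 2))
    Xa = np.column_stack([xs, xnexts, np.ones_like(xs)])
    A = Xa.T @ (w[:, None] * Xa) + 1e-8 * np.eye(3)
    beta = np.linalg.solve(A, Xa.T @ (w * us))
    return np.array([x, x_desired, 1.0]) @ beta

x, target = 0.8, 0.5
u = inverse_model(x, target)
print("suggested action:", round(float(u), 3), "-> next state:", round(float(plant(x, u)), 3))
```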


1992


Ins CAD integrierte Kostenkalkulation (CAD-Integrated Cost Calculation)

Ehrlenspiel, K., Schaal, S.

Konstruktion, 44(12), pages: 407-414, 1992, clmc (article)

am

[BibTex]
