

1998


Local adaptive subspace regression

Vijayakumar, S., Schaal, S.

Neural Processing Letters, 7(3):139-149, 1998, clmc (article)

Abstract
Incremental learning of sensorimotor transformations in high dimensional spaces is one of the basic prerequisites for the success of autonomous robot devices as well as biological movement systems. So far, due to the sparsity of data in high dimensional spaces, learning in such settings requires a significant amount of prior knowledge about the learning task, usually provided by a human expert. In this paper we suggest a partial revision of this view. Based on empirical studies, we observed that, despite being globally high dimensional and sparse, data distributions from physical movement systems are locally low dimensional and dense. Under this assumption, we derive a learning algorithm, Locally Adaptive Subspace Regression, that exploits this property by combining a dynamically growing local dimensionality reduction technique as a preprocessing step with a nonparametric learning technique, locally weighted regression, that also learns the region of validity of the regression. The usefulness of the algorithm and the validity of its assumptions are illustrated for a synthetic data set, and for data of the inverse dynamics of human arm movements and an actual 7 degree-of-freedom anthropomorphic robot arm.
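The core idea, reducing the input locally to a low-dimensional subspace and then fitting a locally weighted linear model there, can be sketched in a few lines. The sketch below is a simplified illustration under assumed choices (a fixed Gaussian kernel width h, a fixed subspace dimension k, and plain local PCA), not the paper's algorithm, which grows the subspace dimensionality and learns the region of validity adaptively.

# Minimal sketch of the idea behind locally adaptive subspace regression:
# project data near a query onto a low-dimensional local subspace (here via
# PCA on a kernel-weighted neighborhood), then fit a weighted linear model
# in that subspace. h and k are illustrative, not adaptively learned.
import numpy as np

def lasr_predict(X, y, x_query, h=1.0, k=2):
    """Predict y at x_query from data (X, y).

    h : Gaussian kernel width (fixed here; adapted in the paper).
    k : local subspace dimension (fixed here; grown adaptively in the paper).
    """
    d = X - x_query                                  # offsets to the query point
    w = np.exp(-0.5 * np.sum(d**2, axis=1) / h**2)   # Gaussian kernel weights

    # Weighted mean and covariance of the local neighborhood
    mu = (w[:, None] * X).sum(0) / w.sum()
    Xc = X - mu
    C = (w[:, None] * Xc).T @ Xc / w.sum()

    # Local PCA: keep the top-k eigenvectors as the local subspace
    _, eigvecs = np.linalg.eigh(C)                   # eigenvalues ascending
    U = eigvecs[:, ::-1][:, :k]                      # D x k projection matrix

    # Weighted linear regression in the reduced coordinates (plus bias term)
    Z = np.hstack([Xc @ U, np.ones((len(X), 1))])
    W = np.diag(w)
    beta = np.linalg.lstsq(Z.T @ W @ Z, Z.T @ W @ y, rcond=None)[0]

    z_q = np.append((x_query - mu) @ U, 1.0)
    return z_q @ beta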


link (url) [BibTex]


1995


Memory-based neural networks for robot learning

Atkeson, C. G., Schaal, S.

Neurocomputing, 9, pages: 1-27, 1995, clmc (article)

Abstract
This paper explores a memory-based approach to robot learning, using memory-based neural networks to learn models of the task to be performed. Steinbuch and Taylor presented neural network designs to explicitly store training data and do nearest neighbor lookup in the early 1960s. In this paper their nearest neighbor network is augmented with a local model network, which fits a local model to a set of nearest neighbors. This network design is equivalent to a statistical approach known as locally weighted regression, in which a local model is formed to answer each query, using a weighted regression in which nearby points (similar experiences) are weighted more than distant points (less relevant experiences). We illustrate this approach by describing how it has been used to enable a robot to learn a difficult juggling task. Keywords: memory-based, robot learning, locally weighted regression, nearest neighbor, local models.
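A minimal sketch of the query procedure described above: retrieve the nearest stored experiences and fit a distance-weighted linear model to answer the query. The neighborhood size k and the inverse-distance weighting below are illustrative assumptions, not the exact design used in the paper.

# Minimal sketch of a memory-based query: nearest neighbor lookup over the
# stored experiences, followed by a locally weighted linear fit. k and the
# weighting scheme are illustrative choices.
import numpy as np

def lwr_query(X_memory, y_memory, x_query, k=20):
    """Answer a query by locally weighted regression over k nearest neighbors."""
    dist = np.linalg.norm(X_memory - x_query, axis=1)
    idx = np.argsort(dist)[:k]                       # nearest neighbor lookup
    Xn, yn, dn = X_memory[idx], y_memory[idx], dist[idx]

    w = 1.0 / (dn + 1e-8)                            # nearby points weigh more
    A = np.hstack([Xn, np.ones((len(Xn), 1))])       # local linear model + bias
    W = np.diag(w)
    beta = np.linalg.lstsq(A.T @ W @ A, A.T @ W @ yn, rcond=None)[0]
    return np.append(x_query, 1.0) @ beta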


link (url) [BibTex]


1993


Design concurrent calculation: A CAD- and data-integrated approach

Schaal, S., Ehrlenspiel, K.

Journal of Engineering Design, 4, pages: 71-85, 1993, clmc (article)

Abstract
Beyond functional requirements, product design increasingly demands broader considerations. Quality alone no longer suffices to compete in the market; design for manufacturability, for assembly, for recycling, etc., are well-known keywords. These can largely be reduced to the necessity of design for costs. This paper focuses on a CAD-based approach to design concurrent calculation. It discusses how now well-established tools such as feature technology, knowledge-based systems, and relational databases can be blended into one coherent concept to achieve an entirely CAD- and data-integrated cost information tool. This system is able to extract data from the CAD system, combine it with data about the company-specific manufacturing environment, and subsequently autonomously evaluate manufacturability aspects and costs of the given CAD model. Within minutes, the designer receives quantitative information about the major cost sources of his/her design. Additionally, some alternative methods for approximating manufacturing times from empirical data, namely neural networks and locally weighted regression, are introduced.
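As a rough illustration of the last point, the manufacturing time of a new part could be approximated from recorded examples with a kernel-weighted average, one simple form of locally weighted regression. The feature set, records, and kernel width below are hypothetical; the paper itself compares neural networks and locally weighted regression for this step.

# Hypothetical empirical records: [volume_cm3, n_holes, tolerance_class] -> minutes
import numpy as np

X_records = np.array([[120.0, 4, 2], [80.0, 2, 1], [200.0, 6, 3], [150.0, 4, 2]])
t_records = np.array([35.0, 18.0, 62.0, 41.0])

def estimate_time(x_new, h=1.0):
    """Kernel-weighted average of recorded manufacturing times (normalized features)."""
    scale = X_records.std(0) + 1e-8                  # normalize feature ranges
    d = (X_records - x_new) / scale
    w = np.exp(-0.5 * np.sum(d**2, axis=1) / h**2)   # similarity weights
    return float(w @ t_records / w.sum())

print(estimate_time(np.array([130.0, 4, 2])))        # estimate for a part similar to records 1 and 4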


[BibTex]
