

2012


Magnetic proximity effect in YBa2Cu3O7 / La2/3Ca1/3MnO3 and YBa2Cu3O7 / LaMnO3+δ superlattices

Satapathy, D. K., Uribe-Laverde, M. A., Marozau, I., Malik, V. K., Das, S., Wagner, T., Marcelot, C., Stahn, J., Brück, S., Rühm, A., Macke, S., Tietze, T., Goering, E., Frañó, A., Kim, J., Wu, M., Benckiser, E., Keimer, B., Devishvili, A., Toperverg, B. P., Merz, M., Nagel, P., Schuppler, S., Bernhard, C.

Physical Review Letters, 108, 2012 (article)

mms

DOI [BibTex]



Structural and chemical characterization on the nanoscale

Stierle, A., Carstanjen, H.-D., Hofmann, S.

In Nanoelectronics and Information Technology. Advanced Electronic Materials and Novel Devices, pages: 233-254, Wiley-VCH, Weinheim, 2012 (incollection)

mms

[BibTex]



Noble gases and microporous frameworks; from interaction to application

Soleimani Dorcheh, A., Denysenko, D., Volkmer, D., Donner, W., Hirscher, M.

Microporous and Mesoporous Materials, 162, pages: 64-68, Elsevier, Amsterdam, 2012 (article)

mms

DOI [BibTex]



Note: Unique characterization possibilities in the ultra high vacuum scanning transmission x-ray microscope (UHV-STXM) "MAXYMUS" using a rotatable permanent magnetic field up to 0.22 T

Nolle, D., Weigand, M., Audehm, P., Goering, E., Wiesemann, U., Wolter, C., Nolle, E., Schütz, G.

Review of Scientific Instruments, 83(4), 2012 (article)

mms

DOI [BibTex]


Rutherford Backscattering

Carstanjen, H. D.

In Nanoelectronics and Information Technology. Advanced Electronic Materials and Novel Devices, pages: 250-252, Wiley-VCH, Weinheim, Germany, 2012 (incollection)

mms

[BibTex]



Microstructure and superconducting properties of MgB2 films prepared by solid state reaction of multilayer precursors of the elements

Kugler, B., Stahl, C., Treiber, S., Soltan, S., Haug, S., Schütz, G., Albrecht, J.

Thin Solid Films, 520, pages: 6985-6988, 2012 (article)

mms

DOI [BibTex]



Consumer Depth Cameras for Computer Vision - Research Topics and Applications

Fossati, A., Gall, J., Grabner, H., Ren, X., Konolige, K.

Advances in Computer Vision and Pattern Recognition, Springer, 2012 (book)

ps

workshop publisher's site [BibTex]



Spatial Measures between Human Poses for Classification and Understanding

Hauberg, S., Pedersen, K. S.

In Articulated Motion and Deformable Objects, 7378, pages: 26-36, LNCS, (Editors: Perales, F. J., Fisher, R. B., Moeslund, T. B.), Springer Berlin Heidelberg, 2012 (inproceedings)

ps

Publisher's site Project Page [BibTex]



A Geometric Take on Metric Learning

Hauberg, S., Freifeld, O., Black, M. J.

In Advances in Neural Information Processing Systems (NIPS) 25, pages: 2033-2041, (Editors: P. Bartlett and F.C.N. Pereira and C.J.C. Burges and L. Bottou and K.Q. Weinberger), MIT Press, 2012 (inproceedings)

Abstract
Multi-metric learning techniques learn local metric tensors in different parts of a feature space. With such an approach, even simple classifiers can be competitive with the state-of-the-art because the distance measure locally adapts to the structure of the data. The learned distance measure is, however, non-metric, which has prevented multi-metric learning from generalizing to tasks such as dimensionality reduction and regression in a principled way. We prove that, with appropriate changes, multi-metric learning corresponds to learning the structure of a Riemannian manifold. We then show that this structure gives us a principled way to perform dimensionality reduction and regression according to the learned metrics. Algorithmically, we provide the first practical algorithm for computing geodesics according to the learned metrics, as well as algorithms for computing exponential and logarithmic maps on the Riemannian manifold. Together, these tools let many Euclidean algorithms take advantage of multi-metric learning. We illustrate the approach on regression and dimensionality reduction tasks that involve predicting measurements of the human body from shape data.
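The core object in the abstract above, a smoothly varying local metric tensor under which curve lengths (and hence geodesics) are computed, can be sketched numerically. This is a minimal illustration, not the paper's construction: `local_metric` is a hypothetical hand-picked metric, and `curve_length` is the standard discretized Riemannian length.

```python
import numpy as np

def local_metric(x):
    # Hypothetical location-dependent metric tensor M(x): a conformal
    # metric that is expensive near the origin. An illustrative stand-in
    # for a learned local metric, not the learned tensors from the paper.
    s = 1.0 + 4.0 * np.exp(-np.sum(x ** 2))
    return s * np.eye(2)

def curve_length(path):
    # Riemannian length of a discretized curve: sum over segments of
    # sqrt(d^T M(mid) d), with the metric evaluated at segment midpoints.
    total = 0.0
    for a, b in zip(path[:-1], path[1:]):
        mid = 0.5 * (a + b)
        d = b - a
        total += np.sqrt(d @ local_metric(mid) @ d)
    return total

t = np.linspace(0.0, 1.0, 400)[:, None]
# A straight line through the origin, where this metric inflates distances...
through = (1 - t) * np.array([-2.0, 0.0]) + t * np.array([2.0, 0.0])
# ...and a line far from the origin, where the metric is nearly Euclidean.
far = (1 - t) * np.array([10.0, 0.0]) + t * np.array([14.0, 0.0])

print(curve_length(through))  # > 4: longer than the Euclidean length
print(curve_length(far))      # ~ 4: essentially the Euclidean length
```

Because length depends on where a curve passes, the shortest path (geodesic) between two points under such a metric generally bends away from expensive regions, which is what makes the learned structure usable for dimensionality reduction and regression.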

ps

PDF Youtube Suppl. material Poster Project Page [BibTex]


1994


Robot juggling: An implementation of memory-based learning

Schaal, S., Atkeson, C. G.

Control Systems Magazine, 14(1):57-71, 1994, clmc (article)

Abstract
This paper explores issues involved in implementing robot learning for a challenging dynamic task, using a case study from robot juggling. We use a memory-based local modeling approach (locally weighted regression) to represent a learned model of the task to be performed. Statistical tests are given to examine the uncertainty of a model, to optimize its prediction quality, and to deal with noisy and corrupted data. We develop an exploration algorithm that explicitly deals with prediction accuracy requirements during exploration. Using all these ingredients in combination with methods from optimal control, our robot achieves fast real-time learning of the task within 40 to 100 trials.
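The memory-based local modeling step the abstract describes, locally weighted regression, can be sketched in a few lines. This is a minimal 1-D illustration under assumed names and data (`lwr_predict`, the bandwidth `tau`, and the toy sine data are not from the paper): each query fits a linear model with Gaussian kernel weights centered on the query point.

```python
import numpy as np

def lwr_predict(X, y, xq, tau=0.3):
    # Locally weighted regression: solve the weighted normal equations
    # (A^T W A) beta = A^T W y around the query xq, then evaluate the
    # resulting local linear model at xq.
    w = np.exp(-(X - xq) ** 2 / (2.0 * tau ** 2))  # Gaussian kernel weights
    A = np.column_stack([np.ones_like(X), X])      # design matrix [1, x]
    WA = A * w[:, None]                            # weight each training row
    beta = np.linalg.solve(A.T @ WA, WA.T @ y)
    return beta[0] + beta[1] * xq

# Noisy samples of a nonlinear function; local linear fits track it.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, 200)
y = np.sin(X) + 0.05 * rng.normal(size=200)
print(lwr_predict(X, y, 1.0))  # close to sin(1.0) ~ 0.84
```

Because the fit is redone per query, nothing is retained between queries; this is the "memory-based" character the abstract refers to, with the statistical tests and exploration strategy built on top of such local models.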

am

link (url) [BibTex]



Robot learning by nonparametric regression

Schaal, S., Atkeson, C. G.

In Proceedings of the International Conference on Intelligent Robots and Systems (IROS’94), pages: 478-485, Munich, Germany, 1994, clmc (inproceedings)

Abstract
We present an approach to robot learning grounded on a nonparametric regression technique, locally weighted regression. The model of the task to be performed is represented by infinitely many local linear models, i.e., the (hyper-) tangent planes at every query point. Such a model, however, is only generated when a query is performed and is not retained. This is in contrast to other methods using a finite set of linear models to accomplish a piecewise linear model. Architectural parameters of our approach, such as distance metrics, are also a function of the current query point instead of being global. Statistical tests are presented for when a local model is good enough such that it can be reliably used to build a local controller. These statistical measures also direct the exploration of the robot. We explicitly deal with the case where prediction accuracy requirements exist during exploration: By gradually shifting a center of exploration and controlling the speed of the shift with local prediction accuracy, a goal-directed exploration of state space takes place along the fringes of the current data support until the task goal is achieved. We illustrate this approach by describing how it has been used to enable a robot to learn a challenging juggling task: Within 40 to 100 trials the robot accomplished the task goal starting out with no initial experiences.

am

[BibTex]



Assessing the quality of learned local models

Schaal, S., Atkeson, C. G.

In Advances in Neural Information Processing Systems 6, pages: 160-167, (Editors: Cowan, J., Tesauro, G., Alspector, J.), Morgan Kaufmann, San Mateo, CA, 1994, clmc (inproceedings)

Abstract
An approach is presented to learning high dimensional functions in the case where the learning algorithm can affect the generation of new data. A local modeling algorithm, locally weighted regression, is used to represent the learned function. Architectural parameters of the approach, such as distance metrics, are also localized and become a function of the query point instead of being global. Statistical tests are given for when a local model is good enough and sampling should be moved to a new area. Our methods explicitly deal with the case where prediction accuracy requirements exist during exploration: By gradually shifting a "center of exploration" and controlling the speed of the shift with local prediction accuracy, a goal-directed exploration of state space takes place along the fringes of the current data support until the task goal is achieved. We illustrate this approach with simulation results and results from a real robot learning a complex juggling task.

am

link (url) [BibTex]



Memory-based robot learning

Schaal, S., Atkeson, C. G.

In IEEE International Conference on Robotics and Automation, 3, pages: 2928-2933, San Diego, CA, 1994, clmc (inproceedings)

Abstract
We present a memory-based local modeling approach to robot learning using a nonparametric regression technique, locally weighted regression. The model of the task to be performed is represented by infinitely many local linear models, the (hyper-) tangent planes at every query point. This is in contrast to other methods using a finite set of linear models to accomplish a piece-wise linear model. Architectural parameters of our approach, such as distance metrics, are a function of the current query point instead of being global. Statistical tests are presented for when a local model is good enough such that it can be reliably used to build a local controller. These statistical measures also direct the exploration of the robot. We explicitly deal with the case where prediction accuracy requirements exist during exploration: By gradually shifting a center of exploration and controlling the speed of the shift with local prediction accuracy, a goal-directed exploration of state space takes place along the fringes of the current data support until the task goal is achieved. We illustrate this approach by describing how it has been used to enable a robot to learn a challenging juggling task: within 40 to 100 trials the robot accomplished the task goal starting out with no initial experiences.

am

[BibTex]



Nonparametric regression for learning

Schaal, S.

In Conference on Adaptive Behavior and Learning, Center for Interdisciplinary Research (ZIF), Bielefeld, Germany; also Technical Report TR-H-098, ATR Human Information Processing Research Laboratories, 1994, clmc (inproceedings)

Abstract
In recent years, learning theory has been increasingly influenced by the fact that many learning algorithms have at least in part a comprehensive interpretation in terms of well established statistical theories. Furthermore, with little modification, several statistical methods can be directly cast into learning algorithms. One family of such methods stems from nonparametric regression. This paper compares nonparametric learning with the more widely used parametric counterparts and investigates how these two families differ in their properties and their applicability. 

am

link (url) [BibTex]
