

2016


Supplemental material for ‘Communication Rate Analysis for Event-based State Estimation’

Ebner, S., Trimpe, S.

Max Planck Institute for Intelligent Systems, January 2016 (techreport)


PDF [BibTex]



Interface-controlled phenomena in nanomaterials

Mittemeijer, E. J., Wang, Z.

2016 (mpi_year_book)

Abstract
Nanosized material systems characteristically exhibit an exceptionally high density of internal interfaces. A series of previously unknown phenomena in nanomaterials, fundamentally caused by the presence of these interfaces, has been revealed: anomalously large and small lattice parameters in nanocrystalline metals, quantum stress oscillations in growing nanofilms, and extraordinary atomic mobility at ultra-low temperatures have been observed and explained. The understanding gained of these new phenomena can lead to new, sophisticated applications of nanomaterials in advanced technologies.

link (url) [BibTex]



Robots learn how to see

Geiger, A.

2016 (mpi_year_book)

Abstract
Autonomous vehicles and intelligent service robots could soon contribute to making our lives more pleasant and secure. However, for autonomous operation, such systems first need to learn the perception process itself. This involves measuring distances and motions, detecting objects, and interpreting the three-dimensional world as a whole. While humans perceive their environment with seemingly little effort, computers first need to be trained for these tasks. Our research is concerned with developing mathematical models that allow computers to robustly perceive their environment.

link (url) DOI [BibTex]

2002


Kernel Dependency Estimation

Weston, J., Chapelle, O., Elisseeff, A., Schölkopf, B., Vapnik, V.

(98), Max Planck Institute for Biological Cybernetics, August 2002 (techreport)

Abstract
We consider the learning problem of finding a dependency between a general class of objects and another, possibly different, general class of objects. The objects can be, for example, vectors, images, strings, trees, or graphs. Such a task is made possible by employing similarity measures in both input and output spaces using kernel functions, thus embedding the objects into vector spaces. Output kernels also make it possible to encode prior information and/or invariances in the loss function in an elegant way. We experimentally validate our approach on several tasks: mapping strings to strings, pattern recognition, and reconstruction from partial images.


PDF [BibTex]
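As a reading aid, here is a minimal NumPy sketch of the idea summarized in the abstract above: kernels in both input and output space, a kernel-PCA embedding of the outputs, kernel ridge regression from the input kernel onto that embedding, and a crude nearest-training-output pre-image step. All function names, kernel choices, and parameters are illustrative assumptions, not the implementation from the report.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def fit_kde(X, Y, gamma_x=1.0, gamma_y=1.0, n_components=2, ridge=1e-2):
    # Embed the outputs via kernel PCA on the centred output kernel ...
    n = len(Y)
    Kx = rbf_kernel(X, X, gamma_x)
    Ky = rbf_kernel(Y, Y, gamma_y)
    H = np.eye(n) - np.ones((n, n)) / n
    evals, evecs = np.linalg.eigh(H @ Ky @ H)
    top = np.argsort(evals)[::-1][:n_components]
    Z = evecs[:, top] * np.sqrt(np.maximum(evals[top], 0.0))
    # ... then learn a kernel ridge regression from the input kernel to that embedding.
    alpha = np.linalg.solve(Kx + ridge * np.eye(n), Z)
    return alpha, Z

def predict_kde(X_train, Y_train, alpha, Z, x_new, gamma_x=1.0):
    # Map a new input into the output embedding and use the nearest training
    # output as a simple pre-image stand-in (an illustrative shortcut).
    z_hat = rbf_kernel(x_new[None, :], X_train, gamma_x) @ alpha
    j = np.argmin(np.sum((Z - z_hat) ** 2, axis=1))
    return Y_train[j]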



A compression approach to support vector model selection

von Luxburg, U., Bousquet, O., Schölkopf, B.

(101), Max Planck Institute for Biological Cybernetics, 2002, see more detailed JMLR version (techreport)

Abstract
In this paper we investigate connections between statistical learning theory and data compression on the basis of support vector machine (SVM) model selection. Inspired by several generalization bounds, we construct "compression coefficients" for SVMs, which measure the amount by which the training labels can be compressed by some classification hypothesis. The main idea is to relate the coding precision of this hypothesis to the width of the margin of the SVM. The compression coefficients connect well-known quantities such as the radius-margin ratio R^2/rho^2, the eigenvalues of the kernel matrix, and the number of support vectors. To test whether they are useful in practice, we ran model selection experiments on several real-world datasets. We found that compression coefficients can fairly accurately predict the parameters for which the test error is minimized.


[BibTex]
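For reference, the radius-margin ratio R^2/rho^2 mentioned in the abstract can be written out as follows; this is standard hard-margin SVM notation, stated from textbook theory rather than taken from the report itself.

% w: SVM weight vector in feature space, phi: feature map induced by the kernel,
% rho: geometric margin, R: radius of the smallest ball enclosing the mapped data.
\rho = \frac{1}{\lVert w \rVert}, \qquad
R = \min_{c} \max_{i} \lVert \phi(x_i) - c \rVert, \qquad
\frac{R^{2}}{\rho^{2}} = R^{2} \lVert w \rVert^{2}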


2001


Inference Principles and Model Selection

Buhmann, J., Schölkopf, B.

(01301), Dagstuhl Seminar, 2001 (techreport)


Web [BibTex]
