2020

Advances in Latent Variable and Causal Models
University of Cambridge, UK, 2020 (Cambridge-Tübingen Fellowship) (phdthesis)

ei

2013

Camera-specific Image Denoising
Eberhard Karls Universität Tübingen, Germany, October 2013 (diplomathesis)

ei pn

Statistics on Manifolds with Applications to Modeling Shape Deformations
Brown University, August 2013 (phdthesis)

Abstract
Statistical models of non-rigid deformable shape have wide application in many fields, including computer vision, computer graphics, and biometry. We show that shape deformations are well represented through nonlinear manifolds that are also matrix Lie groups. These pattern-theoretic representations lead to several advantages over other alternatives, including a principled measure of shape dissimilarity and a natural way to compose deformations. Moreover, they enable building models using statistics on manifolds. Consequently, such models are superior to those based on Euclidean representations. We demonstrate this by modeling 2D and 3D human body shape. Shape deformations are only one example of manifold-valued data. More generally, in many computer-vision and machine-learning problems, nonlinear manifold representations arise naturally and provide a powerful alternative to Euclidean representations. Statistics is traditionally concerned with data in a Euclidean space, relying on the linear structure and the distances associated with such a space; this renders it inappropriate for nonlinear spaces. Statistics can, however, be generalized to nonlinear manifolds. Moreover, by respecting the underlying geometry, the statistical models result in not only more effective analysis but also consistent synthesis. We go beyond previous work on statistics on manifolds by showing how, even on these curved spaces, problems related to modeling a class from scarce data can be dealt with by leveraging information from related classes residing in different regions of the space. We show the usefulness of our approach with 3D shape deformations.
To summarize our main contributions: 1) We define a new 2D articulated model -- more expressive than traditional ones -- of deformable human shape that factors body-shape, pose, and camera variations. Its high realism is obtained from training data generated from a detailed 3D model. 2) We define a new manifold-based representation of 3D shape deformations that yields statistical deformable-template models that are better than the current state-of-the-art. 3) We generalize a transfer-learning idea from Euclidean spaces to Riemannian manifolds. This work demonstrates the value of modeling manifold-valued data and their statistics explicitly on the manifold. Specifically, the methods here provide new tools for shape analysis.
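The statistics-on-manifolds idea above can be made concrete with a minimal sketch: a Fréchet (Karcher) mean of rotation matrices, computed by repeatedly averaging the samples in the tangent space (Lie algebra) at the current estimate and mapping back with the matrix exponential. SO(3) stands in here for the thesis's richer deformation manifolds; `rot_z`, the tolerance, and the iteration cap are illustrative assumptions, not the author's implementation.

```python
import numpy as np
from scipy.linalg import expm, logm

def karcher_mean(rotations, iters=50, tol=1e-10):
    """Fréchet mean on the rotation group: average in the tangent
    space at the current estimate m, then map back via expm."""
    m = rotations[0]
    for _ in range(iters):
        # lift each sample into the Lie algebra at m
        tangents = [np.real(logm(m.T @ R)) for R in rotations]
        delta = np.mean(tangents, axis=0)
        m = m @ expm(delta)
        if np.linalg.norm(delta) < tol:
            break
    return m

def rot_z(t):
    """Rotation by angle t about the z-axis."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# the intrinsic mean of two z-rotations is the z-rotation by the
# average angle -- unlike a naive entrywise average of the matrices
mean = karcher_mean([rot_z(0.1), rot_z(0.3)])
```

Note that the entrywise (Euclidean) average of two rotation matrices is generally not a rotation at all, which is precisely the motivation for doing the statistics on the manifold itself.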

ps

Modelling and Learning Approaches to Image Denoising

Burger, HC.

Eberhard Karls Universität Tübingen, Germany, 2013 (phdthesis)

ei

Linear mixed models for genome-wide association studies

Lippert, C.

University of Tübingen, Germany, 2013 (phdthesis)

ei

Modeling and Learning Complex Motor Tasks: A Case Study on Robot Table Tennis
Technical University Darmstadt, Germany, 2013 (phdthesis)

ei

Intention Inference and Decision Making with Hierarchical Gaussian Process Dynamics Models
Technical University Darmstadt, Germany, 2013 (phdthesis)

ei

2012

Virtual Human Bodies with Clothing and Hair: From Images to Animation
Brown University, Department of Computer Science, December 2012 (phdthesis)

ps

Scalable graph kernels
Eberhard Karls Universität Tübingen, Germany, October 2012 (phdthesis)

ei

From Pixels to Layers: Joint Motion Estimation and Segmentation
Brown University, Department of Computer Science, July 2012 (phdthesis)

ps

Learning Motor Skills: From Algorithms to Robot Experiments
Technische Universität Darmstadt, Germany, March 2012 (phdthesis)

ei

Structure and Dynamics of Diffusion Networks
Department of Electrical Engineering, Stanford University, 2012 (phdthesis)

ei

Blind Deconvolution in Scientific Imaging & Computational Photography
Eberhard Karls Universität Tübingen, Germany, 2012 (phdthesis)

ei

Restricted structural equation models for causal inference
ETH Zurich, Switzerland, 2012 (phdthesis)

ei

Combinatorial Problems with Submodular Coupling in Machine Learning and Computer Vision
ETH Zürich, Switzerland, 2012 (phdthesis)

ei

Automatische Seitenkettenzuordnung zur NMR Proteinstrukturaufklärung mittels ganzzahliger linearer Programmierung [Automatic Side-Chain Assignment for NMR Protein Structure Determination via Integer Linear Programming]
University of Tübingen, Germany, 2012 (diplomathesis)

ei

Nonparametric System Identification and Control for Periodic Error Correction in Telescopes
University of Stuttgart, 2012 (diplomathesis)

ei pn

2009

Kernel Learning Approaches for Image Classification

Gehler, PV.

Biologische Kybernetik, Universität des Saarlandes, Saarbrücken, Germany, October 2009 (phdthesis)

Abstract
This thesis extends the use of kernel learning techniques to specific problems of image classification. Kernel learning is a paradigm in the field of machine learning that generalizes the use of inner products to compute similarities between arbitrary objects. In image classification one aims to separate images based on their visual content. We address two important problems that arise in this context: learning with weak label information and combination of heterogeneous data sources. The contributions we report on are not unique to image classification, and apply to a more general class of problems. We study the problem of learning with label ambiguity in the multiple instance learning framework. We discuss several different image classification scenarios that arise in this context and argue that the standard multiple instance learning requires a more detailed disambiguation. Finally we review kernel learning approaches proposed for this problem and derive a more efficient algorithm to solve them. The multiple kernel learning framework is an approach to automatically select kernel parameters. We extend it to its infinite limit and present an algorithm to solve the resulting problem. This result is then applied in two directions. We show how to learn kernels that adapt to the special structure of images. Finally we compare different ways of combining image features for object classification and present significant improvements compared to previous methods.
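The feature-combination part of the abstract can be illustrated with a small sketch: two Gram matrices built from heterogeneous (here randomly generated, hypothetical) feature sets are blended by a convex combination, which is guaranteed to remain a valid positive-semidefinite kernel. In multiple kernel learning the weights `beta` would be learned jointly with the classifier; here they are fixed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X_color = rng.normal(size=(20, 8))   # hypothetical color features
X_shape = rng.normal(size=(20, 5))   # hypothetical shape features

def rbf_gram(X, gamma):
    """RBF Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

# a convex combination of base kernels (weights on the simplex)
# is again a valid positive-semidefinite kernel
beta = np.array([0.7, 0.3])
K = beta[0] * rbf_gram(X_color, 0.1) + beta[1] * rbf_gram(X_shape, 0.5)

eigs = np.linalg.eigvalsh(K)  # all eigenvalues should be >= 0
```

Any kernel machine (e.g. an SVM) can then be trained on `K` in place of a single-source kernel; learning `beta` from data is what turns this into multiple kernel learning.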

ei

Kernel Methods in Computer Vision: Object Localization, Clustering, and Taxonomy Discovery

Blaschko, MB.

Biologische Kybernetik, Technische Universität Berlin, Berlin, Germany, March 2009 (phdthesis)

ei

Motor Control and Learning in Table Tennis
Eberhard Karls Universität Tübingen, Germany, 2009 (diplomathesis)

ei

Hierarchical Clustering and Density Estimation Based on k-nearest-neighbor graphs

Drewe, P.

Eberhard Karls Universität Tübingen, Germany, 2009 (diplomathesis)

ei

Learning with Structured Data: Applications to Computer Vision
Technische Universität Berlin, Germany, 2009 (phdthesis)

ei

From Differential Equations to Differential Geometry: Aspects of Regularisation in Machine Learning
Universität des Saarlandes, Saarbrücken, Germany, 2009 (phdthesis)

ei

2005

Extension to Kernel Dependency Estimation with Applications to Robotics
Biologische Kybernetik, Technische Universität Berlin, Berlin, November 2005 (phdthesis)

Abstract
Kernel Dependency Estimation (KDE) is a novel technique designed to learn mappings between sets without making assumptions about the type of the input and output data involved. It learns the mapping in two stages. In a first step, it estimates coordinates of a feature-space representation of elements of the set by solving a high-dimensional multivariate regression problem in feature space. Following this, it reconstructs the original representation given the estimated coordinates. This thesis introduces various algorithmic extensions to both stages of KDE. One contribution of this thesis is a novel linear regression algorithm that explores low-dimensional subspaces during learning. Furthermore, various existing strategies for reconstructing patterns from the feature maps involved in KDE are discussed, and novel pre-image techniques are introduced. In particular, pre-image techniques for data types of a discrete nature, such as graphs and strings, are investigated. KDE is then explored in the context of robot pose imitation, where the input is an image of a human operator and the output is the robot's articulated variables. Thus, using KDE, robot pose imitation is formulated as a regression problem.
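The two-stage procedure described above -- regress into feature-space coordinates, then reconstruct a pre-image -- can be sketched on toy vector data. The RBF kernel, ridge regularizer, and nearest-neighbour pre-image over the training outputs are simplifying assumptions for illustration, not the algorithms developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy data: inputs X, structured outputs Y (here just vectors)
X = rng.normal(size=(50, 3))
Y = np.sin(X) @ rng.normal(size=(3, 4))

def rbf(A, B, gamma=0.5):
    """RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Stage 1: kernel ridge regression from the input kernel to the
# output feature coordinates (Y itself serves as coordinates here)
Kx = rbf(X, X)
alpha = np.linalg.solve(Kx + 1e-3 * np.eye(len(X)), Y)

def predict_coords(x_new):
    return rbf(x_new, X) @ alpha

# Stage 2: pre-image by nearest neighbour among training outputs
def predict(x_new):
    coords = predict_coords(x_new)
    d = ((coords[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return Y[d.argmin(axis=1)]
```

The nearest-neighbour step is the crudest possible pre-image map; for discrete outputs such as graphs or strings, this reconstruction step is exactly where the harder pre-image techniques discussed in the thesis come in.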

ei

Implicit Surfaces For Modelling Human Heads
Biologische Kybernetik, Eberhard-Karls-Universität, Tübingen, September 2005 (diplomathesis)

ei

Machine Learning Methods for Brain-Computer Interfaces

Lal, TN.

Biologische Kybernetik, University of Darmstadt, September 2005 (phdthesis)

ei

Efficient Adaptive Sampling of the Psychometric Function by Maximizing Information Gain

Tanner, TG.

Biologische Kybernetik, Eberhard-Karls University Tübingen, Tübingen, Germany, May 2005 (diplomathesis)

Abstract
A common task in psychophysics is to measure the psychometric function. A psychometric function can be described by its shape and four parameters: offset or threshold, slope or width, false-alarm rate or chance level, and miss or lapse rate. Depending on the parameters of interest, some points on the psychometric function may be more informative than others. Adaptive methods attempt to place trials on the most informative points based on the data collected in previous trials. A new Bayesian adaptive psychometric method is introduced that places trials by minimising the expected entropy of the posterior probability distribution over a set of possible stimuli. The method is more flexible, faster, and at least as efficient as the established method (Kontsevich and Tyler, 1999). Comparably accurate (2 dB) threshold and slope estimates can be obtained after about 30 and 500 trials, respectively. By using a dynamic termination criterion the efficiency can be further improved. The method can be applied to all experimental designs, including yes/no designs, and allows acquisition of any set of free parameters. By weighting the importance of parameters one can include nuisance parameters and adjust the relative expected errors. Use of nuisance parameters may lead to more accurate estimates than assuming a guessed fixed value. Block designs are supported and do not harm performance if a sufficient number of trials are performed. The method was evaluated by computer simulations in which the role of parametric assumptions, its robustness, the quality of different point estimates, the effect of dynamic termination criteria, and many other settings were investigated.
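The trial-placement rule described above -- choose the stimulus minimising the expected entropy of the posterior -- can be sketched for a single free parameter (the threshold), with the slope, guess rate, and lapse rate held fixed. The grid sizes, logistic shape, and 40-trial run are illustrative assumptions, not the thesis's actual settings.

```python
import numpy as np

# grid over the single free parameter (threshold) and candidate stimuli
thresholds = np.linspace(-2, 2, 81)
stimuli = np.linspace(-3, 3, 61)

def p_yes(stim, thr, slope=2.0, guess=0.02, lapse=0.02):
    """Logistic psychometric function with fixed nuisance parameters."""
    core = 1.0 / (1.0 + np.exp(-slope * (stim - thr)))
    return guess + (1 - guess - lapse) * core

def entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return -(p * np.log(p)).sum()

def best_stimulus(posterior):
    """Pick the stimulus minimising the expected posterior entropy,
    averaged over the two possible responses."""
    scores = []
    for s in stimuli:
        like_yes = p_yes(s, thresholds)
        p_resp = (posterior * like_yes).sum()         # P("yes" | data)
        post_yes = posterior * like_yes / p_resp
        post_no = posterior * (1 - like_yes) / (1 - p_resp)
        scores.append(p_resp * entropy(post_yes)
                      + (1 - p_resp) * entropy(post_no))
    return stimuli[int(np.argmin(scores))]

# simulate a short run against a true threshold of 0.5
rng = np.random.default_rng(0)
posterior = np.full(len(thresholds), 1.0 / len(thresholds))
for _ in range(40):
    s = best_stimulus(posterior)
    resp = rng.random() < p_yes(s, 0.5)               # observer's answer
    like = p_yes(s, thresholds) if resp else 1 - p_yes(s, thresholds)
    posterior = posterior * like
    posterior /= posterior.sum()

estimate = thresholds @ posterior                      # posterior mean
```

Estimating the slope and nuisance parameters as well, as the thesis does, would replace the 1-D grid with a joint grid over all free parameters; the placement rule stays the same.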

ei

2004

Statistical Learning with Similarity and Dissimilarity Functions
pages: 1-166, Technische Universität Berlin, Germany, 2004 (phdthesis)

ei

Classification and Feature Extraction in Man and Machine

Graf, AAB.

Biologische Kybernetik, University of Tübingen, Germany, 2004, online publication (phdthesis)

ei