2017


Elements of Causal Inference - Foundations and Learning Algorithms

Peters, J., Janzing, D., Schölkopf, B.

Adaptive Computation and Machine Learning Series, The MIT Press, Cambridge, MA, USA, 2017 (book)

ei

PDF [BibTex]

New Directions for Learning with Kernels and Gaussian Processes (Dagstuhl Seminar 16481)

Gretton, A., Hennig, P., Rasmussen, C., Schölkopf, B.

Dagstuhl Reports, 6(11):142-167, 2017 (book)

ei pn

DOI [BibTex]

Development and Evaluation of a Portable BCI System for Remote Data Acquisition

Emde, T.

Graduate School of Neural Information Processing, Eberhard Karls Universität Tübingen, Germany, 2017 (mastersthesis)

ei

[BibTex]

Brain-Computer Interfaces for patients with Amyotrophic Lateral Sclerosis

Fomina, T.

Eberhard Karls Universität Tübingen, Germany, 2017 (phdthesis)

ei

[BibTex]

Causal models for decision making via integrative inference

Geiger, P.

University of Stuttgart, Germany, 2017 (phdthesis)

ei

[BibTex]

Learning Optimal Configurations for Modeling Frowning by Transcranial Electrical Stimulation

Sücker, K.

Graduate School of Neural Information Processing, Eberhard Karls Universität Tübingen, Germany, 2017 (mastersthesis)

ei

[BibTex]


2012


Support Vector Machines, Support Measure Machines, and Quasar Target Selection

Muandet, K.

Center for Cosmology and Particle Physics (CCPP), New York University, December 2012 (talk)

ei

[BibTex]

Hilbert Space Embedding for Dirichlet Process Mixtures

Muandet, K.

NIPS Workshop on Confluence between Kernel Methods and Graphical Models, December 2012 (talk)

ei

[BibTex]

Scalable graph kernels

Shervashidze, N.

Eberhard Karls Universität Tübingen, Germany, October 2012 (phdthesis)

ei

Web [BibTex]

Simultaneous small animal PET/MR in activated and resting state reveals multiple brain networks

Wehrl, H., Lankes, K., Hossain, M., Bezrukov, I., Liu, C., Martirosian, P., Schick, F., Pichler, B.

20th Annual Meeting and Exhibition of the International Society for Magnetic Resonance in Medicine (ISMRM), May 2012 (talk)

ei

Web [BibTex]

A new PET insert for simultaneous PET/MR small animal imaging

Wehrl, H., Lankes, K., Hossain, M., Bezrukov, I., Liu, C., Martirosian, P., Reischl, G., Schick, F., Pichler, B.

20th Annual Meeting and Exhibition of the International Society for Magnetic Resonance in Medicine (ISMRM), May 2012 (talk)

ei

Web [BibTex]

Evaluation of a new, large field of view, small animal PET/MR system

Hossain, M., Wehrl, H., Lankes, K., Liu, C., Bezrukov, I., Reischl, G., Pichler, B.

50. Jahrestagung der Deutschen Gesellschaft für Nuklearmedizin (NuklearMedizin), April 2012 (talk)

ei

Web [BibTex]

Learning Motor Skills: From Algorithms to Robot Experiments

Kober, J.

Technische Universität Darmstadt, Germany, March 2012 (phdthesis)

ei

PDF [BibTex]

Simultaneous small animal PET/MR reveals different brain networks during stimulation and rest

Wehrl, H., Hossain, M., Lankes, K., Liu, C., Bezrukov, I., Martirosian, P., Reischl, G., Schick, F., Pichler, B.

World Molecular Imaging Congress (WMIC), 2012 (talk)

ei

[BibTex]

Support Measure Machines for Quasar Target Selection

Muandet, K.

Astro Imaging Workshop, 2012 (talk)

Abstract
In this talk I will discuss the problem of quasar target selection. Object attributes in astronomy, such as fluxes, are often subject to substantial and heterogeneous measurement uncertainties, especially for medium-redshift quasars (2.2 < z < 3.5), which are relatively rare and must be targeted down to g ~ 22 mag. Most previous approaches to quasar target selection, including UV-excess, kernel density estimation, a likelihood approach, and artificial neural networks, cannot directly deal with heterogeneous input uncertainties. Recently, extreme deconvolution (XD) has been used to tackle this problem in a well-posed manner. In this work, we present a discriminative approach to quasar target selection that handles input uncertainties directly: we represent each object as a Gaussian distribution whose mean is the object's attribute vector and whose covariance is the given flux measurement uncertainty. Given a training set of Gaussian distributions, the support measure machine (SMM) algorithm is trained and used to build the quasar targeting catalog. Preliminary results will also be presented. Joint work with Jo Bovy and Bernhard Schölkopf.

ei

Web [BibTex]
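The expected-kernel construction sketched in the abstract above can be illustrated in a few lines: each object is a Gaussian (attribute vector plus per-object covariance), the kernel between two objects is the closed-form expectation of a Gaussian RBF kernel under both distributions, and the resulting Gram matrix is fed to a standard SVM. This is a minimal sketch on synthetic data, assuming an RBF base kernel and a precomputed-kernel SVM; it is not the authors' actual quasar-selection pipeline.

```python
import numpy as np
from sklearn.svm import SVC

def expected_rbf_kernel(m1, S1, m2, S2, sigma2=1.0):
    """Closed-form E_{x~N(m1,S1), y~N(m2,S2)}[exp(-||x-y||^2 / (2*sigma2))]."""
    d = len(m1)
    C = S1 + S2 + sigma2 * np.eye(d)     # covariance of x - y plus kernel width
    diff = m1 - m2
    quad = diff @ np.linalg.solve(C, diff)
    # determinant factor |I + (S1 + S2)/sigma2|^{-1/2}
    logdet = np.linalg.slogdet(np.eye(d) + (S1 + S2) / sigma2)[1]
    return np.exp(-0.5 * (quad + logdet))

def gram(means_a, covs_a, means_b, covs_b, sigma2=1.0):
    return np.array([[expected_rbf_kernel(ma, Sa, mb, Sb, sigma2)
                      for mb, Sb in zip(means_b, covs_b)]
                     for ma, Sa in zip(means_a, covs_a)])

# Toy data: each "object" is a Gaussian (attribute vector + per-object uncertainty).
rng = np.random.default_rng(0)
n, d = 60, 4
means = rng.normal(size=(n, d))
covs = [np.diag(rng.uniform(0.01, 0.2, size=d)) for _ in range(n)]
labels = (means[:, 0] + 0.5 * means[:, 1] > 0).astype(int)   # synthetic target / non-target

K = gram(means, covs, means, covs)
clf = SVC(kernel="precomputed").fit(K, labels)
print("training accuracy:", clf.score(K, labels))
```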


PAC-Bayesian Analysis: A Link Between Inference and Statistical Physics

Seldin, Y.

Workshop on Statistical Physics of Inference and Control Theory, 2012 (talk)

ei

Web [BibTex]

PET Performance Measurements of a Next Generation Dedicated Small Animal PET/MR Scanner

Liu, C., Hossain, M., Lankes, K., Bezrukov, I., Wehrl, H., Kolb, A., Judenhofer, M., Pichler, B.

Nuclear Science Symposium and Medical Imaging Conference (NSS-MIC), 2012 (talk)

ei

[BibTex]

Structure and Dynamics of Diffusion Networks

Gomez Rodriguez, M.

Department of Electrical Engineering, Stanford University, 2012 (phdthesis)

ei

Web [BibTex]

Blind Deconvolution in Scientific Imaging & Computational Photography

Hirsch, M.

Eberhard Karls Universität Tübingen, Germany, 2012 (phdthesis)

ei

Web [BibTex]

PAC-Bayesian Analysis of Supervised, Unsupervised, and Reinforcement Learning

Seldin, Y., Laviolette, F., Shawe-Taylor, J.

Tutorial at the 29th International Conference on Machine Learning (ICML), 2012 (talk)

ei

Web Web [BibTex]

Influence of MR-based attenuation correction on lesions within bone and susceptibility artifact regions

Bezrukov, I., Schmidt, H., Mantlik, F., Schwenzer, N., Brendle, C., Pichler, B.

Molekulare Bildgebung (MoBi), 2012 (talk)

ei

[BibTex]

Structured Apprenticeship Learning

Boularias, A., Kroemer, O., Peters, J.

European Workshop on Reinforcement Learning (EWRL), 2012 (talk)

ei

[BibTex]

PAC-Bayesian Analysis and Its Applications

Seldin, Y., Laviolette, F., Shawe-Taylor, J.

Tutorial at The European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD), 2012 (talk)

ei

Web [BibTex]

Kernel Bellman Equations in POMDPs

Nishiyama, Y., Boularias, A., Gretton, A., Fukumizu, K.

Technical Committee on Information-Based Induction Sciences and Machine Learning (IBISML'12), 2012 (talk)

ei

[BibTex]

Beta oscillations propagate as traveling waves in the macaque prefrontal cortex

Panagiotaropoulos, T., Besserve, M., Logothetis, N.

42nd Annual Meeting of the Society for Neuroscience (Neuroscience), 2012 (talk)

ei

[BibTex]


2005


Some thoughts about Gaussian Processes

Chapelle, O.

NIPS Workshop on Open Problems in Gaussian Processes for Machine Learning, December 2005 (talk)

ei

PDF Web [BibTex]

Extension to Kernel Dependency Estimation with Applications to Robotics

BakIr, G.

Biologische Kybernetik, Technische Universität Berlin, Berlin, November 2005 (phdthesis)

Abstract
Kernel Dependency Estimation (KDE) is a technique designed to learn mappings between sets without making assumptions about the type of the input and output data involved. It learns the mapping in two stages. In the first step, it estimates the coordinates of a feature-space representation of the elements of the set by solving a high-dimensional multivariate regression problem in feature space. It then reconstructs the original representation from the estimated coordinates. This thesis introduces various algorithmic extensions to both stages of KDE. One contribution is a novel linear regression algorithm that explores low-dimensional subspaces during learning. Furthermore, various existing strategies for reconstructing patterns from feature maps in KDE are discussed, and novel pre-image techniques are introduced, in particular for discrete data types such as graphs and strings. KDE is then explored in the context of robot pose imitation, where the input is an image of a human operator and the output is the robot's articulated variables; using KDE, robot pose imitation is thus formulated as a regression problem.

ei

PDF PDF [BibTex]
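The two-stage structure described in the abstract (feature-space regression followed by a pre-image step) can be sketched with off-the-shelf components. The sketch below only illustrates that idea on synthetic data: kernel PCA supplies output feature coordinates, kernel ridge regression maps inputs to those coordinates, and a nearest-training-output rule stands in for a proper pre-image technique. These component choices are illustrative assumptions, not the algorithms developed in the thesis.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # inputs (e.g. image features)
Y = np.tanh(X @ rng.normal(size=(5, 3)))       # structured outputs (e.g. joint angles)

# Stage 1: regress from inputs onto a kernel feature-space representation of the outputs.
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=0.5).fit(Y)
Z = kpca.transform(Y)                          # output coordinates in feature space
reg = KernelRidge(kernel="rbf", alpha=1e-2).fit(X, Z)

# Stage 2: "pre-image" step -- map predicted feature coordinates back to an output.
# Here we simply pick the nearest training output in feature space.
def predict(x_new):
    z_hat = reg.predict(x_new.reshape(1, -1))
    idx = np.argmin(np.linalg.norm(Z - z_hat, axis=1))
    return Y[idx]

print(predict(X[0]), "vs. true", Y[0])
```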

Geometrical aspects of statistical learning theory

Hein, M.

Biologische Kybernetik, Darmstadt, November 2005 (phdthesis)

ei

PDF [BibTex]

Implicit Surfaces For Modelling Human Heads

Steinke, F.

Biologische Kybernetik, Eberhard-Karls-Universität, Tübingen, September 2005 (diplomathesis)

ei

[BibTex]

Machine Learning Methods for Brain-Computer Interfaces

Lal, TN.

Biologische Kybernetik, University of Darmstadt, September 2005 (phdthesis)

ei

Web [BibTex]

Building Sparse Large Margin Classifiers

Wu, M., Schölkopf, B., BakIr, G.

The 22nd International Conference on Machine Learning (ICML), August 2005 (talk)

ei

PDF [BibTex]

Learning from Labeled and Unlabeled Data on a Directed Graph

Zhou, D.

The 22nd International Conference on Machine Learning, August 2005 (talk)

Abstract
We propose a general framework for learning from labeled and unlabeled data on a directed graph in which the structure of the graph including the directionality of the edges is considered. The time complexity of the algorithm derived from this framework is nearly linear due to recently developed numerical techniques. In the absence of labeled instances, this framework can be utilized as a spectral clustering method for directed graphs, which generalizes the spectral clustering approach for undirected graphs. We have applied our framework to real-world web classification problems and obtained encouraging results.

ei

PDF [BibTex]
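One common way to realize a framework of this kind uses a teleporting random walk on the directed graph: labels are propagated through a symmetrized operator built from the walk's transition matrix and stationary distribution, and the resulting linear system can be solved efficiently with iterative methods. The sketch below follows that construction on a toy graph; the parameters alpha and eta, the dense linear solve, and the toy adjacency matrix are illustrative simplifications, not necessarily the exact formulation used in the talk.

```python
import numpy as np

def directed_graph_labels(A, y, alpha=0.9, eta=0.01):
    """Semi-supervised label propagation on a directed graph.

    A     : (n, n) adjacency matrix of the directed graph (A[i, j] = edge i -> j)
    y     : (n,) labels in {+1, -1}, 0 for unlabeled nodes
    alpha : trade-off between graph smoothness and fitting the labels
    eta   : teleporting probability, makes the random walk irreducible
    """
    n = A.shape[0]
    out_deg = A.sum(axis=1, keepdims=True)
    P = np.where(out_deg > 0, A / np.maximum(out_deg, 1e-12), 1.0 / n)  # random-walk transitions
    P = (1 - eta) * P + eta / n                                         # teleporting random walk

    # stationary distribution of P via power iteration
    pi = np.full(n, 1.0 / n)
    for _ in range(1000):
        pi = pi @ P
        pi /= pi.sum()

    Pi_sqrt = np.diag(np.sqrt(pi))
    Pi_isqrt = np.diag(1.0 / np.sqrt(pi))
    Theta = 0.5 * (Pi_sqrt @ P @ Pi_isqrt + Pi_isqrt @ P.T @ Pi_sqrt)   # symmetrized operator

    # closed-form minimizer of the smoothness + label-fitting objective
    f = np.linalg.solve(np.eye(n) - alpha * Theta, (1 - alpha) * y)
    return np.sign(f)

# tiny example: a directed 4-cycle with two labeled nodes
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)
print(directed_graph_labels(A, np.array([1.0, 0.0, -1.0, 0.0])))
```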

Machine-Learning Approaches to BCI in Tübingen

Bensch, M., Bogdan, M., Hill, N., Lal, T., Rosenstiel, W., Schölkopf, B., Schröder, M.

Brain-Computer Interface Technology, June 2005, Talk given by NJH. (talk)

ei

[BibTex]

Efficient Adaptive Sampling of the Psychometric Function by Maximizing Information Gain

Tanner, TG.

Biologische Kybernetik, Eberhard-Karls University Tübingen, Tübingen, Germany, May 2005 (diplomathesis)

Abstract
A common task in psychophysics is to measure the psychometric function. A psychometric function can be described by its shape and four parameters: offset or threshold, slope or width, false-alarm rate or chance level, and miss or lapse rate. Depending on the parameters of interest, some points on the psychometric function may be more informative than others. Adaptive methods attempt to place trials on the most informative points based on the data collected in previous trials. A new Bayesian adaptive psychometric method is introduced that places trials by minimising the expected entropy of the posterior probability distribution over a set of possible stimuli. The method is more flexible, faster, and at least as efficient as the established method (Kontsevich and Tyler, 1999). Comparably accurate (2 dB) threshold and slope estimates can be obtained after about 30 and 500 trials, respectively. By using a dynamic termination criterion the efficiency can be further improved. The method can be applied to all experimental designs, including yes/no designs, and allows acquisition of any set of free parameters. By weighting the importance of parameters one can include nuisance parameters and adjust the relative expected errors. Use of nuisance parameters may lead to more accurate estimates than assuming a guessed fixed value. Block designs are supported and do not harm performance if a sufficient number of trials are performed. The method was evaluated by computer simulations in which the role of parametric assumptions, its robustness, the quality of different point estimates, the effect of dynamic termination criteria, and many other settings were investigated.

ei

[BibTex]
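The core placement rule described in the abstract (choose the next stimulus that minimises the expected entropy of the posterior over the psychometric-function parameters) can be written down compactly. The sketch below uses a logistic psychometric function on a small threshold/slope grid with fixed guess and lapse rates; these modelling choices are illustrative assumptions and do not reproduce the thesis implementation.

```python
import numpy as np

# Grid over the two free parameters of a logistic psychometric function
# (threshold and slope); guess and lapse rates are held fixed in this sketch.
thresholds = np.linspace(-2, 2, 41)
slopes = np.linspace(0.5, 5, 21)
stimuli = np.linspace(-3, 3, 61)
guess, lapse = 0.5, 0.02

T, S, X = np.meshgrid(thresholds, slopes, stimuli, indexing="ij")
p_correct = guess + (1 - guess - lapse) / (1 + np.exp(-S * (X - T)))  # (threshold, slope, stimulus)

prior = np.ones(T.shape[:2])
prior /= prior.sum()

def entropy(p):
    p = np.clip(p, 1e-12, 1)
    return -(p * np.log(p)).sum(axis=(0, 1))

def next_stimulus(posterior):
    """Pick the stimulus index minimising the expected posterior entropy."""
    p_resp = (posterior[..., None] * p_correct).sum(axis=(0, 1))       # P(correct | stimulus)
    post_if_correct = posterior[..., None] * p_correct
    post_if_wrong = posterior[..., None] * (1 - p_correct)
    h = (p_resp * entropy(post_if_correct / p_resp)
         + (1 - p_resp) * entropy(post_if_wrong / (1 - p_resp)))
    return np.argmin(h)

def update(posterior, stim_idx, correct):
    like = p_correct[..., stim_idx] if correct else 1 - p_correct[..., stim_idx]
    posterior = posterior * like
    return posterior / posterior.sum()

# one simulated trial against a "true" observer with threshold 0.5 and slope 2
idx = next_stimulus(prior)
true_p = guess + (1 - guess - lapse) / (1 + np.exp(-2 * (stimuli[idx] - 0.5)))
posterior = update(prior, idx, np.random.rand() < true_p)
```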

Kernel Constrained Covariance for Dependence Measurement

Gretton, A., Smola, A., Bousquet, O., Herbrich, R., Belitski, A., Augath, M., Murayama, Y., Schölkopf, B., Logothetis, N.

AISTATS, January 2005 (talk)

Abstract
We discuss reproducing kernel Hilbert space (RKHS)-based measures of statistical dependence, with emphasis on constrained covariance (COCO), a novel criterion to test dependence of random variables. We show that COCO is a test for independence if and only if the associated RKHSs are universal. That said, no independence test exists that can distinguish dependent and independent random variables in all circumstances. Dependent random variables can result in a COCO which is arbitrarily close to zero when the source densities are highly non-smooth. All current kernel-based independence tests share this behaviour. We demonstrate exponential convergence between the population and empirical COCO. Finally, we use COCO as a measure of joint neural activity between voxels in MRI recordings of the macaque monkey, and compare the results to the mutual information and the correlation. We also show the effect of removing breathing artefacts from the MRI recording.

ei

PostScript [BibTex]
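For reference, one form of the empirical estimator used in this line of work computes the constrained covariance from centred Gram matrices as 1/n times the square root of the largest eigenvalue of their product. The sketch below implements that estimator with Gaussian RBF kernels on synthetic data; the kernel, bandwidth, and data are illustrative assumptions, and the exact estimator and normalisation in the talk may differ.

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2 * sigma ** 2))

def coco(X, Y, sigma=1.0):
    """Empirical constrained covariance from centred Gram matrices."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    Kc = H @ rbf_gram(X, sigma) @ H
    Lc = H @ rbf_gram(Y, sigma) @ H
    lam = np.max(np.real(np.linalg.eigvals(Kc @ Lc)))
    return np.sqrt(max(lam, 0.0)) / n

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
print("dependent:  ", coco(x, np.sin(3 * x) + 0.1 * rng.normal(size=(200, 1))))
print("independent:", coco(x, rng.normal(size=(200, 1))))
```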


2002


Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond

Schölkopf, B., Smola, A.

pages: 644, Adaptive Computation and Machine Learning, MIT Press, Cambridge, MA, USA, December 2002, Parts of this book, including an introduction to kernel methods, can be downloaded here. (book)

Abstract
In the 1990s, a new type of learning algorithm was developed, based on results from statistical learning theory: the Support Vector Machine (SVM). This gave rise to a new class of theoretically elegant learning machines that use a central concept of SVMs, kernels, for a number of learning tasks. Kernel machines provide a modular framework that can be adapted to different tasks and domains by the choice of the kernel function and the base algorithm. They are replacing neural networks in a variety of fields, including engineering, information retrieval, and bioinformatics. Learning with Kernels provides an introduction to SVMs and related kernel methods. Although the book begins with the basics, it also includes the latest research. It provides all of the concepts necessary to enable a reader equipped with some basic mathematical knowledge to enter the world of machine learning using theoretically well-founded yet easy-to-use kernel algorithms and to understand and apply the powerful algorithms that have been developed over the last few years.

ei

Web [BibTex]


2001


Variationsverfahren zur Untersuchung von Grundzustandseigenschaften des Ein-Band Hubbard-Modells (Variational Methods for Investigating Ground-State Properties of the One-Band Hubbard Model)

Eichhorn, J.

Biologische Kybernetik, Technische Universität Dresden, Dresden/Germany, May 2001 (diplomathesis)

Abstract
Using different modifications of a new variational approach, static ground-state properties of the one-band Hubbard model, such as the energy and the staggered magnetisation, are calculated. By taking additional fluctuations into account, the method is gradually improved so that a very good description of the energy in one and two dimensions can be achieved. After a detailed discussion of the application in one dimension, extensions to two dimensions are introduced. Using a modified version of the variational ansatz, a description of the quantum phase transition in the magnetisation, in particular, should be possible.

ei

PostScript [BibTex]
