2017


Elements of Causal Inference - Foundations and Learning Algorithms

Peters, J., Janzing, D., Schölkopf, B.

Adaptive Computation and Machine Learning Series, The MIT Press, Cambridge, MA, USA, 2017 (book)


PDF [BibTex]

New Directions for Learning with Kernels and Gaussian Processes (Dagstuhl Seminar 16481)

Gretton, A., Hennig, P., Rasmussen, C., Schölkopf, B.

Dagstuhl Reports, 6(11):142-167, 2017 (book)


DOI [BibTex]

Development and Evaluation of a Portable BCI System for Remote Data Acquisition

Emde, T.

Graduate School of Neural Information Processing, Eberhard Karls Universität Tübingen, Germany, 2017 (mastersthesis)


[BibTex]

Brain-Computer Interfaces for patients with Amyotrophic Lateral Sclerosis

Fomina, T.

Eberhard Karls Universität Tübingen, Germany, 2017 (phdthesis)


[BibTex]

Causal models for decision making via integrative inference

Geiger, P.

University of Stuttgart, Germany, 2017 (phdthesis)


[BibTex]

Learning Optimal Configurations for Modeling Frowning by Transcranial Electrical Stimulation

Sücker, K.

Graduate School of Neural Information Processing, Eberhard Karls Universität Tübingen, Germany, 2017 (mastersthesis)


[BibTex]


2015


easyGWAS: An Integrated Computational Framework for Advanced Genome-Wide Association Studies

Grimm, D.

Eberhard Karls Universität Tübingen, November 2015 (phdthesis)


[BibTex]

Causal Discovery Beyond Conditional Independences

Sgouritsa, E.

Eberhard Karls Universität Tübingen, Germany, October 2015 (phdthesis)


link (url) [BibTex]

From Points to Probability Measures: A Statistical Learning on Distributions with Kernel Mean Embedding

Muandet, K.

University of Tübingen, Germany, September 2015 (phdthesis)


[BibTex]

Machine Learning Approaches to Image Deconvolution

Schuler, C.

University of Tübingen, Germany, September 2015 (phdthesis)


[BibTex]

Blind Retrospective Motion Correction of MR Images

Loktyushin, A.

University of Tübingen, Germany, May 2015 (phdthesis)


[BibTex]

A Cognitive Brain-Computer Interface for Patients with Amyotrophic Lateral Sclerosis

Hohmann, M.

Graduate Training Centre of Neuroscience, University of Tübingen, Germany, 2015 (mastersthesis)


[BibTex]

Sequential Image Deconvolution Using Probabilistic Linear Algebra

Gao, M.

Technical University of Munich, Germany, 2015 (mastersthesis)


[BibTex]

Causal Inference in Neuroimaging

Casarsa de Azevedo, L.

Graduate Training Centre of Neuroscience, University of Tübingen, Germany, 2015 (mastersthesis)


[BibTex]

The effect of frowning on attention

Ibarra Chaoul, A.

Graduate Training Centre of Neuroscience, University of Tübingen, Germany, 2015 (mastersthesis)


[BibTex]


2011


Optimization for Machine Learning

Sra, S., Nowozin, S., Wright, S.

pages: 494, Neural information processing series, MIT Press, Cambridge, MA, USA, December 2011 (book)

Abstract
The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to researchers in both fields. Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions. This book starts the process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.


Web [BibTex]

Bayesian Time Series Models

Barber, D., Cemgil, A., Chiappa, S.

pages: 432, Cambridge University Press, Cambridge, UK, August 2011 (book)


[BibTex]

Crowdsourcing for optimisation of deconvolution methods via an iPhone application

Lang, A.

Hochschule Reutlingen, Germany, April 2011 (mastersthesis)


[BibTex]

Handbook of Statistical Bioinformatics

Lu, H., Schölkopf, B., Zhao, H.

pages: 627, Springer Handbooks of Computational Statistics, Springer, Berlin, Germany, 2011 (book)


Web DOI [BibTex]

Model Learning in Robot Control

Nguyen-Tuong, D.

Albert-Ludwigs-Universität Freiburg, Germany, 2011 (phdthesis)


[BibTex]


2001


Variationsverfahren zur Untersuchung von Grundzustandseigenschaften des Ein-Band Hubbard-Modells [Variational Methods for Investigating Ground-State Properties of the One-Band Hubbard Model]

Eichhorn, J.

Biologische Kybernetik, Technische Universität Dresden, Dresden/Germany, May 2001 (diplomathesis)

Abstract
Using different modifications of a new variational approach, static ground-state properties of the one-band Hubbard model, such as energy and staggered magnetisation, are calculated. By taking additional fluctuations into account, the method is gradually improved until a very good description of the energy in one and two dimensions is achieved. After a detailed discussion of the application in one dimension, extensions to two dimensions are introduced. Using a modified version of the variational ansatz, a description of the quantum phase transition for the magnetisation, in particular, should be possible.


PostScript [BibTex]
