Design of a visualization scheme for functional connectivity data of Human Brain

Bramlage, L.

Hochschule Osnabrück - University of Applied Sciences, 2017 (thesis)

sf

Bramlage_BSc_2017.pdf [BibTex]

2011


Combined whole-body PET/MR imaging: MR contrast agents do not affect the quantitative accuracy of PET following attenuation correction

Lois, C., Kupferschläger, J., Bezrukov, I., Schmidt, H., Werner, M., Mannheim, J., Pichler, B., Schwenzer, N., Beyer, T.

(SST15-05), 97th Scientific Assembly and Annual Meeting of the Radiological Society of North America (RSNA), December 2011 (talk)

Abstract
PURPOSE: Combined PET/MR imaging entails the use of MR contrast agents (MRCA) as part of integrated protocols. We assess additional attenuation of the PET emission signals in the presence of oral and intravenous (iv) MRCA made up of iron oxide and Gd-chelates, respectively. METHOD AND MATERIALS: Phantom scans were performed on a clinical PET/CT (Biograph HiRez16, Siemens) and an integrated whole-body PET/MR (Biograph mMR, Siemens) using oral (Lumirem) and intravenous (Gadovist) MRCA. Reference PET attenuation values were determined on a small-animal PET (Inveon, Siemens) using standard PET transmission imaging (TX). Seven syringes of 5 mL were filled with (a) Water, (b) Lumirem_100 (100% conc.), (c) Gadovist_100 (100%), (d) Gadovist_18 (18%), (e) Gadovist_02 (0.2%), (f) Imeron-400 CT iv-contrast (100%) and (g) Imeron-400 (2.4%). The same set of syringes was scanned on CT (Sensation16, Siemens) at 120 kVp and 160 mAs. The effect of MRCA on the attenuation of PET emission data was evaluated using a 20 cm cylinder filled uniformly with [18F]-FDG (FDG) in water (BGD). Three 4.5 cm diameter cylinders were inserted into the phantom: (C1) Teflon, (C2) Water+FDG (2:1) and (C3) Lumirem_100+FDG (2:1). Two 50 mL syringes filled with Gadovist_02+FDG (Sy1) and water+FDG (Sy2) were attached to the sides of (C1) to mimic the effects of iv-contrast in vessels near bone. The syringe-to-background activity ratio was 4-to-1. PET emission data were acquired for 10 min each using the PET/CT and the PET/MR. Images were reconstructed using CT- and MR-based attenuation correction. RESULTS: Mean linear PET attenuation (cm-1) on TX was (a) 0.098, (b) 0.098, (c) 0.300, (d) 0.134, (e) 0.095, (f) 0.397 and (g) 0.105. Corresponding CT attenuation (HU) was: (a) 5, (b) 14, (c) 3070, (d) 1040, (e) 13, (f) 3070 and (g) 347. Lumirem had little effect on PET attenuation, with (C3) being 13% and 10% higher than (C2) on PET/CT and PET/MR, respectively. Gadovist_02 had even smaller effects, with (Sy1) being 2.5% lower than (Sy2) on PET/CT and 1.2% higher than (Sy2) on PET/MR. CONCLUSION: MRCA in high and clinically relevant concentrations have attenuation values similar to those of CT contrast and water, respectively. In clinical PET/MR scenarios, MRCA are not expected to lead to significant attenuation of the PET emission signals.

ei

Web [BibTex]

Cooperative Cuts: a new use of submodularity in image segmentation

Jegelka, S.

Second I.S.T. Austria Symposium on Computer Vision and Machine Learning, October 2011 (talk)

ei

Web [BibTex]

Effect of MR Contrast Agents on Quantitative Accuracy of PET in Combined Whole-Body PET/MR Imaging

Lois, C., Bezrukov, I., Schmidt, H., Schwenzer, N., Werner, M., Pichler, B., Kupferschläger, J., Beyer, T.

(MIC3-3), 2011 IEEE Nuclear Science Symposium, Medical Imaging Conference (NSS-MIC), October 2011 (talk)

Abstract
Combined whole-body PET/MR systems are being tested in clinical practice today. Integrated imaging protocols entail the use of MR contrast agents (MRCA) that could bias PET attenuation correction. In this work, we assess the effect of MRCA in PET/MR imaging. We analyze the effect of oral and intravenous MRCA on PET activity after attenuation correction. We conclude that in clinical scenarios, MRCA are not expected to lead to significant attenuation of PET signals, and that attenuation maps are not biased after the ingestion of adequate oral contrasts.

ei

Web [BibTex]

First Results on Patients and Phantoms of a Fully Integrated Clinical Whole-Body PET/MRI

Schmidt, H., Schwenzer, N., Bezrukov, I., Kolb, A., Mantlik, F., Kupferschläger, J., Lois, C., Sauter, A., Brendle, C., Pfannenberg, C., Pichler, B.

(J2-8), 2011 IEEE Nuclear Science Symposium, Medical Imaging Conference (NSS-MIC), October 2011 (talk)

Abstract
The first clinical fully integrated whole-body PET/MR scanners are just entering the field. Here, we present studies of the quantification accuracy and its variation within the PET field of view for small lesions on our BrainPET/MRI, a dedicated clinical brain scanner installed three years ago in Tübingen. We also present first results for patient and phantom scans of a fully integrated whole-body PET/MRI, which was installed two months ago at our department. The quantification accuracy and homogeneity of the BrainPET insert (Siemens Medical Solutions, Germany), installed inside the magnet bore of a clinical 3T MRI scanner (Magnetom TIM Trio, Siemens Medical Solutions, Germany), was evaluated using eight hollow spheres with inner diameters from 3.95 to 7.86 mm placed at different positions inside a homogeneous cylinder phantom with 9:1 and 6:1 sphere-to-background ratios. The quantification accuracy for small lesions at different positions in the PET FoV shows a standard deviation of up to 11% and is acceptable for quantitative brain studies where homogeneity of quantification over the entire FoV is essential. Image quality and resolution of the new Siemens whole-body PET/MR system (Biograph mMR, Siemens Medical Solutions, Germany) were evaluated according to the NEMA NU2 2007 protocol using a body phantom containing six spheres with inner diameters from 10 to 37 mm at sphere-to-background ratios of 8:1 and 4:1, and F-18 point sources located at different positions inside the PET FoV, respectively. The evaluation of the whole-body PET/MR system reveals good PET image quality and resolution comparable to state-of-the-art clinical PET/CT scanners. First images of patient studies carried out on the whole-body PET/MR are presented, highlighting the potential of combined PET/MR imaging.

ei

Web [BibTex]

Effect of MR contrast agents on quantitative accuracy of PET in combined whole-body PET/MR imaging

Lois, C., Kupferschläger, J., Bezrukov, I., Schmidt, H., Werner, M., Mannheim, J., Pichler, B., Schwenzer, N., Beyer, T.

(OP314), Annual Congress of the European Association of Nuclear Medicine (EANM), October 2011 (talk)

Abstract
PURPOSE: Combined PET/MR imaging entails the use of MR contrast agents (MRCA) as part of integrated protocols. MRCA are made up of iron oxide and Gd-chelates for oral and intravenous (iv) application, respectively. We assess additional attenuation of the PET emission signals in the presence of oral and iv MRCA. MATERIALS AND METHODS: Phantom scans were performed on a clinical PET/CT (Biograph HiRez16, Siemens) and an integrated whole-body PET/MR (Biograph mMR, Siemens). Two common MRCA were evaluated: Lumirem (oral) and Gadovist (iv). Reference PET attenuation values were determined on a dedicated small-animal PET (Inveon, Siemens) using equivalent standard PET transmission source imaging (TX). Seven syringes of 5 mL were filled with (a) Water, (b) Lumirem_100 (100% concentration), (c) Gadovist_100 (100%), (d) Gadovist_18 (18%), (e) Gadovist_02 (0.2%), (f) Imeron-400 CT iv-contrast (100%) and (g) Imeron-400 (2.4%). The same set of syringes was scanned on CT (Sensation16, Siemens) at 120 kVp and 160 mAs. The effect of MRCA on the attenuation of PET emission data was evaluated using a 20 cm cylinder filled uniformly with [18F]-FDG (FDG) in water (BGD). Three 4.5 cm diameter cylinders were inserted into the phantom: (C1) Teflon, (C2) Water+FDG (2:1) and (C3) Lumirem_100+FDG (2:1). Two 50 mL syringes filled with Gadovist_02+FDG (Sy1) and water+FDG (Sy2) were attached to the sides of (C1) to mimic the effects of iv-contrast in vessels near bone. The syringe-to-background activity ratio was 4-to-1. PET emission data were acquired for 10 min each using the PET/CT and the PET/MR. Images were reconstructed using CT- and MR-based attenuation correction (AC). Since Teflon is not correctly identified on MR, PET(/MR) data were reconstructed using MR-AC and CT-AC. RESULTS: Mean linear PET attenuation (cm-1) on TX was (a) 0.098, (b) 0.098, (c) 0.300, (d) 0.134, (e) 0.095, (f) 0.397 and (g) 0.105. Corresponding CT attenuation (HU) was: (a) 5, (b) 14, (c) 3070, (d) 1040, (e) 13, (f) 3070 and (g) 347. Lumirem had little effect on PET attenuation, with (C3) being 13%, 10% and 11% higher than (C2) on PET/CT, PET/MR with MR-AC, and PET/MR with CT-AC, respectively. Gadovist_02 had even smaller effects, with (Sy1) being 2.5% lower, 1.2% higher, and 3.5% lower than (Sy2) on PET/CT, PET/MR with MR-AC, and PET/MR with CT-AC, respectively. CONCLUSION: MRCA in high and clinically relevant concentrations have attenuation values similar to those of CT contrast and water, respectively. In clinical PET/MR scenarios, MRCA are not expected to lead to significant attenuation of the PET emission signals.

ei

Web [BibTex]

Multi-parametric Tumor Characterization and Therapy Monitoring using Simultaneous PET/MRI: initial results for Lung Cancer and GvHD

Sauter, A., Schmidt, H., Gueckel, B., Brendle, C., Bezrukov, I., Mantlik, F., Kolb, A., Mueller, M., Reimold, M., Federmann, B., Hetzel, J., Claussen, C., Pfannenberg, C., Horger, M., Pichler, B., Schwenzer, N.

(T110), 2011 World Molecular Imaging Congress (WMIC), September 2011 (talk)

Abstract
Hybrid imaging modalities such as [18F]FDG-PET/CT are superior for staging of, e.g., lung cancer compared with stand-alone modalities. Clinical PET/MRI systems are about to enter the field of hybrid imaging and offer potential advantages. One added value could be a deeper insight into tumor metabolism and tumorigenesis due to the combination of PET with dedicated MR methods such as MRS and DWI. Additionally, therapy monitoring of difficult-to-diagnose diseases such as chronic sclerodermic GvHD (csGvHD) can potentially be improved by this combination. We have applied PET/MRI in 3 patients with lung cancer and 4 patients with csGvHD before and during therapy. All 3 patients had lung cancer confirmed by histology (2 adenocarcinoma, 1 carcinoid). First, a [18F]FDG-PET/CT was performed with the following parameters: injected dose 351.7±25.1 MBq, uptake time 59.0±2.6 min, 3 min/bed. Subsequently, patients were brought to the PET/MRI imaging facility. The whole-body PET/MRI Biograph mMR system comprises 56 detector cassettes with a 59.4 cm transaxial and 25.8 cm axial FoV. The MRI is a modified Verio system with a magnet bore of 60 cm. The following parameters for PET acquisition were applied: uptake time 121.3±2.3 min, 3 bed positions, 6 min/bed. T1w, T2w, and DWI MR images were recorded simultaneously for each bed. Acquired PET data were reconstructed with an iterative 3D OSEM algorithm using 3 iterations and 21 subsets and a Gaussian filter of 3 mm. The 4 patients with GvHD were brought to the BrainPET/MRI imaging facility 2:10-2:28 h after tracer injection. A 9 min BrainPET acquisition with simultaneous MRI of the lower extremities was performed. The MRI examination included T1-weighted (pre and post gadolinium) and T2-weighted sequences. Attenuation correction was calculated based on manual bone segmentation and thresholds for soft tissue, fat and air. The soleus muscle (m), crural fascia (f1) and posterior crural intermuscular septum fascia (f2) were delineated with ROIs based on the pre-treatment T1-weighted images and coregistered using IRW (Siemens). Fascia-to-muscle ratios were calculated for PET (f/m), T1 contrast uptake ((post-contrast_f − pre-contrast_f)/(post-contrast_m − pre-contrast_m)) and T2 (f/m). Both patients with adenocarcinoma show a lower ADC value compared with the carcinoid patient, suggesting a higher cellularity. This is also reflected in FDG-PET with higher SUV values. Our initial results reveal that PET/MRI can provide complementary information for a profound tumor characterization and therapy monitoring. The high soft tissue contrast provided by MRI is valuable for the assessment of fascial inflammation. While in the first patient FDG and contrast uptake as well as edema, represented by T2 signals, decreased with ongoing therapy, all parameters remained comparatively stable in the second patient. Contrary to expectations, an increase in FDG uptake in patients 3 and 4 was accompanied by an increase of the T2 signals but a decrease in contrast uptake. These initial results suggest that PET/MRI provides complementary information on the complex disease mechanisms in fibrosing disorders.

ei

Web [BibTex]

Statistical Image Analysis and Percolation Theory

Langovoy, M., Habeck, M., Schölkopf, B.

2011 Joint Statistical Meetings (JSM), August 2011 (talk)

Abstract
We develop a novel method for the detection of signals and reconstruction of images in the presence of random noise. The method uses results from percolation theory. We specifically address the problem of detecting multiple objects of unknown shapes in the case of nonparametric noise. The noise density is unknown and can be heavy-tailed. The objects of interest have unknown, varying intensities. No boundary shape constraints are imposed on the objects; only a set of weak bulk conditions is required. We view the object detection problem as hypothesis testing for discrete statistical inverse problems. We present an algorithm that detects greyscale objects of various shapes in noisy images. We prove results on the consistency and algorithmic complexity of our procedures. Applications to cryo-electron microscopy are presented.

ei

Web [BibTex]

PAC-Bayesian Analysis of Martingales and Multiarmed Bandits

Seldin, Y., Laviolette, F., Shawe-Taylor, J., Peters, J., Auer, P.

Max Planck Institute for Biological Cybernetics, Tübingen, Germany, May 2011 (techreport)

Abstract
We present two alternative ways to apply PAC-Bayesian analysis to sequences of dependent random variables. The first is based on a new lemma that makes it possible to bound expectations of convex functions of certain dependent random variables by expectations of the same functions of independent Bernoulli random variables. This lemma provides an alternative tool to the Hoeffding-Azuma inequality for bounding the concentration of martingale values. Our second approach is based on integrating the Hoeffding-Azuma inequality with PAC-Bayesian analysis. We also introduce a way to apply PAC-Bayesian analysis in situations of limited feedback. We combine the new tools to derive PAC-Bayesian generalization and regret bounds for the multiarmed bandit problem. Although our regret bound is not yet as tight as state-of-the-art regret bounds based on other well-established techniques, our results significantly expand the range of potential applications of PAC-Bayesian analysis and introduce a new analysis tool to reinforcement learning and many other fields where martingales and limited feedback are encountered.

ei

PDF Web [BibTex]

Non-stationary Correction of Optical Aberrations

Schuler, C., Hirsch, M., Harmeling, S., Schölkopf, B.

(1), Max Planck Institute for Intelligent Systems, Tübingen, Germany, May 2011 (techreport)

Abstract
Taking a sharp photo at several megapixel resolution traditionally relies on high-grade lenses. In this paper, we present an approach to alleviate image degradations caused by imperfect optics. We rely on a calibration step to encode the optical aberrations in a space-variant point spread function and obtain a corrected image by non-stationary deconvolution. By including the Bayer array in our image formation model, we can perform demosaicing as part of the deconvolution.

ei

PDF [BibTex]

Cooperative Cuts

Jegelka, S.

COSA Workshop: Combinatorial Optimization, Statistics, and Applications, March 2011 (talk)

Abstract
Combinatorial problems with submodular cost functions have recently drawn interest. In a standard combinatorial problem, the sum-of-weights cost is replaced by a submodular set function. The result is a powerful model that is, however, very hard to solve. In this talk, I will introduce cooperative cuts: minimum cuts with submodular edge weights. I will outline methods to approximately solve this problem and show an application in computer vision. If time permits, the talk will also sketch regret-minimizing online algorithms for submodular-cost combinatorial problems. This is joint work with Jeff Bilmes (University of Washington).

ei

Web [BibTex]

Multiple Kernel Learning: A Unifying Probabilistic Viewpoint

Nickisch, H., Seeger, M.

Max Planck Institute for Biological Cybernetics, March 2011 (techreport)

Abstract
We present a probabilistic viewpoint on multiple kernel learning that unifies well-known regularised risk approaches and recent advances in approximate Bayesian inference relaxations. The framework proposes a general objective function, suitable for regression, robust regression and classification, that is a lower bound on the marginal likelihood and contains many regularised risk approaches as special cases. Furthermore, we derive an efficient and provably convergent optimisation algorithm.

ei

Web [BibTex]

Multiple testing, uncertainty and realistic pictures

Langovoy, M., Wittich, O.

(2011-004), EURANDOM, Technische Universiteit Eindhoven, January 2011 (techreport)

Abstract
We study statistical detection of grayscale objects in noisy images. The object of interest is of unknown shape and has an unknown intensity that can vary over the object and can be negative. No boundary shape constraints are imposed on the object; only a weak bulk condition for the object's interior is required. We propose an algorithm that can be used to detect grayscale objects of unknown shapes in the presence of nonparametric noise of unknown level. Our algorithm is based on a nonparametric multiple testing procedure. We establish the limit of applicability of our method via an explicit, closed-form, non-asymptotic and nonparametric consistency bound. This bound is valid for a wide class of nonparametric noise distributions. We achieve this by proving an uncertainty principle for percolation on finite lattices.

ei

PDF [BibTex]

Nonconvex proximal splitting: batch and incremental algorithms

Sra, S.

(2), Max Planck Institute for Intelligent Systems, Tübingen, Germany, 2011 (techreport)

Abstract
Within the unmanageably large class of nonconvex optimization problems, we consider the rich subclass of nonsmooth problems having composite objectives (this includes the extensively studied convex, composite-objective problems as a special case). For this subclass, we introduce a powerful, new framework that permits asymptotically non-vanishing perturbations. In particular, we develop perturbation-based batch and incremental (online-like) nonconvex proximal splitting algorithms. To our knowledge, this is the first time that such perturbation-based nonconvex splitting algorithms have been proposed and analyzed. While the main contribution of the paper is the theoretical framework, we complement our results by presenting some empirical results on matrix factorization.

ei

PDF [BibTex]


2005


Some thoughts about Gaussian Processes

Chapelle, O.

NIPS Workshop on Open Problems in Gaussian Processes for Machine Learning, December 2005 (talk)

ei

PDF Web [BibTex]

Popper, Falsification and the VC-dimension

Corfield, D., Schölkopf, B., Vapnik, V.

(145), Max Planck Institute for Biological Cybernetics, November 2005 (techreport)

ei

PDF [BibTex]

A Combinatorial View of Graph Laplacians

Huang, J.

(144), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, August 2005 (techreport)

Abstract
Discussions about different graph Laplacians, mainly the normalized and unnormalized versions, have been ardent with respect to various methods in clustering and graph-based semi-supervised learning. Previous research on graph Laplacians investigated their convergence properties to Laplacian operators on continuous manifolds. There is still no strong proof of convergence for the normalized Laplacian. In this paper, we analyze different variants of graph Laplacians directly from the ways of solving the original graph partitioning problem. The graph partitioning problem is a well-known combinatorial NP-hard optimization problem. The spectral solutions provide evidence that the normalized Laplacian encodes more reasonable considerations for graph partitioning. We also provide some examples to show their differences.

ei

[BibTex]

Beyond Pairwise Classification and Clustering Using Hypergraphs

Zhou, D., Huang, J., Schölkopf, B.

(143), Max Planck Institute for Biological Cybernetics, August 2005 (techreport)

Abstract
In many applications, relationships among objects of interest are more complex than pairwise. Simply approximating complex relationships as pairwise ones can lead to loss of information. An alternative for these applications is to analyze complex relationships among data directly, without the need to first convert the complex relationships into pairwise ones. A natural way to describe complex relationships is to use hypergraphs. A hypergraph is a graph in which edges can connect more than two vertices. Thus we consider learning from a hypergraph, and develop a general framework which is applicable to classification and clustering for complex relational data. We have applied our framework to real-world web classification problems and obtained encouraging results.

ei

PDF [BibTex]

Building Sparse Large Margin Classifiers

Wu, M., Schölkopf, B., BakIr, G.

The 22nd International Conference on Machine Learning (ICML), August 2005 (talk)

ei

PDF [BibTex]

Learning from Labeled and Unlabeled Data on a Directed Graph

Zhou, D.

The 22nd International Conference on Machine Learning, August 2005 (talk)

Abstract
We propose a general framework for learning from labeled and unlabeled data on a directed graph in which the structure of the graph including the directionality of the edges is considered. The time complexity of the algorithm derived from this framework is nearly linear due to recently developed numerical techniques. In the absence of labeled instances, this framework can be utilized as a spectral clustering method for directed graphs, which generalizes the spectral clustering approach for undirected graphs. We have applied our framework to real-world web classification problems and obtained encouraging results.

ei

PDF [BibTex]

Machine-Learning Approaches to BCI in Tübingen

Bensch, M., Bogdan, M., Hill, N., Lal, T., Rosenstiel, W., Schölkopf, B., Schröder, M.

Brain-Computer Interface Technology, June 2005, Talk given by NJH. (talk)

ei

[BibTex]

Measuring Statistical Dependence with Hilbert-Schmidt Norms

Gretton, A., Bousquet, O., Smola, A., Schölkopf, B.

(140), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, June 2005 (techreport)

Abstract
We propose an independence criterion based on the eigenspectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs), consisting of an empirical estimate of the Hilbert-Schmidt norm of the cross-covariance operator (we term this a Hilbert-Schmidt Independence Criterion, or HSIC). This approach has several advantages, compared with previous kernel-based independence criteria. First, the empirical estimate is simpler than any other kernel dependence test, and requires no user-defined regularisation. Second, there is a clearly defined population quantity which the empirical estimate approaches in the large sample limit, with exponential convergence guaranteed between the two: this ensures that independence tests based on HSIC do not suffer from slow learning rates. Finally, we show in the context of independent component analysis (ICA) that the performance of HSIC is competitive with that of previously published kernel-based criteria, and of other recently published ICA methods.
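For a concrete picture of the estimator mentioned in the abstract, the biased empirical HSIC reduces to a single matrix trace, tr(KHLH)/(n-1)^2, where K and L are kernel matrices on the two samples and H is the centering matrix. The sketch below is not taken from the report itself; the Gaussian kernel and its bandwidth are arbitrary choices for illustration.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic_biased(X, Y, sigma=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n - 1)^2."""
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    L = gaussian_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
print(hsic_biased(x, x + 0.1 * rng.normal(size=(200, 1))))  # dependent: clearly positive
print(hsic_biased(x, rng.normal(size=(200, 1))))            # independent: close to zero
```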

ei

PDF [BibTex]

Kernel Constrained Covariance for Dependence Measurement

Gretton, A., Smola, A., Bousquet, O., Herbrich, R., Belitski, A., Augath, M., Murayama, Y., Schölkopf, B., Logothetis, N.

AISTATS, January 2005 (talk)

Abstract
We discuss reproducing kernel Hilbert space (RKHS)-based measures of statistical dependence, with emphasis on constrained covariance (COCO), a novel criterion to test dependence of random variables. We show that COCO is a test for independence if and only if the associated RKHSs are universal. That said, no independence test exists that can distinguish dependent and independent random variables in all circumstances. Dependent random variables can result in a COCO which is arbitrarily close to zero when the source densities are highly non-smooth. All current kernel-based independence tests share this behaviour. We demonstrate exponential convergence between the population and empirical COCO. Finally, we use COCO as a measure of joint neural activity between voxels in MRI recordings of the macaque monkey, and compare the results to the mutual information and the correlation. We also show the effect of removing breathing artefacts from the MRI recording.

ei

PostScript [BibTex]

Approximate Inference for Robust Gaussian Process Regression

Kuss, M., Pfingsten, T., Csato, L., Rasmussen, C.

(136), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, 2005 (techreport)

Abstract
Gaussian process (GP) priors have been successfully used in non-parametric Bayesian regression and classification models. Inference can be performed analytically only for the regression model with Gaussian noise. For all other likelihood models inference is intractable and various approximation techniques have been proposed. In recent years expectation-propagation (EP) has been developed as a general method for approximate inference. This article provides a general summary of how expectation-propagation can be used for approximate inference in Gaussian process models. Furthermore we present a case study describing its implementation for a new robust variant of Gaussian process regression. To gain further insights into the quality of the EP approximation we present experiments in which we compare to results obtained by Markov chain Monte Carlo (MCMC) sampling.

ei

PDF [BibTex]

Maximum-Margin Feature Combination for Detection and Categorization

BakIr, G., Wu, M., Eichhorn, J.

Max Planck Institute for Biological Cybernetics, Tübingen, Germany, 2005 (techreport)

Abstract
In this paper we are concerned with the optimal combination of features of possibly different types for detection and estimation tasks in machine vision. We propose to combine features such that the resulting classifier maximizes the margin between classes. In contrast to existing approaches which are non-convex and/or generative, we propose to use a discriminative model, leading to a convex problem formulation and complexity control. Furthermore, we assert that decision functions should not compare apples and oranges by comparing features of different types directly. Instead, we propose to combine different similarity measures for each different feature type. Furthermore, we argue that the question "Which feature type is more discriminative for task X?" is ill-posed and show empirically that the answer to this question might depend on the complexity of the decision function.

ei

PDF [BibTex]

Towards a Statistical Theory of Clustering

von Luxburg, U., Ben-David, S.

Presented at the PASCAL workshop on clustering, London, 2005 (techreport)

Abstract
The goal of this paper is to discuss statistical aspects of clustering in a framework where the data to be clustered has been sampled from some unknown probability distribution. Firstly, the clustering of the data set should reveal some structure of the underlying data rather than model artifacts due to the random sampling process. Secondly, the more sample points we have, the more reliable the clustering should be. We discuss which methods can and cannot be used to tackle those problems. In particular we argue that generalization bounds as they are used in statistical learning theory of classification are unsuitable in a general clustering framework. We suggest that the main replacements of generalization bounds should be convergence proofs and stability considerations. This paper should be considered as a road map paper which identifies important questions and potentially fruitful directions for future research about statistical clustering. We do not attempt to present a complete statistical theory of clustering.

ei

PDF [BibTex]

Approximate Bayesian Inference for Psychometric Functions using MCMC Sampling

Kuss, M., Jäkel, F., Wichmann, F.

(135), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, 2005 (techreport)

Abstract
In psychophysical studies, the psychometric function is used to model the relation between the physical stimulus intensity and the observer's ability to detect or discriminate between stimuli of different intensities. In this report we propose the use of Bayesian inference to extract the information contained in experimental data and estimate the parameters of psychometric functions. Since Bayesian inference cannot be performed analytically, we describe how a Markov chain Monte Carlo method can be used to generate samples from the posterior distribution over parameters. These samples are used to estimate Bayesian confidence intervals and other characteristics of the posterior distribution. In addition, we discuss the parameterisation of psychometric functions and the role of prior distributions in the analysis. The proposed approach is exemplified using artificially generated data and in a case study with real experimental data. Furthermore, we compare our approach with traditional methods based on maximum-likelihood parameter estimation combined with bootstrap techniques for confidence interval estimation. The appendix describes an implementation for the R environment for statistical computing and provides the code for reproducing the results discussed in the experiments section.

ei

PDF [BibTex]

Linear and Nonlinear Estimation models applied to Hemodynamic Model

Theodorou, E.

Technical Report-2005-1, Computational Action and Vision Lab, University of Minnesota, 2005, clmc (techreport)

Abstract
The relation between the BOLD signal and neural activity is still poorly understood. The Gaussian Linear Model, known as the GLM, is broadly used in fMRI data analysis for recovering the underlying neural activity. Although the GLM has proved to be a useful tool for analyzing fMRI data, it cannot be used to describe the complex biophysical process of neural metabolism. In this technical report we make use of a system of stochastic differential equations, based on the Buxton model [1], to describe the underlying computational principles of the hemodynamic process. Based on this SDE system, we build a Kalman filter estimator to estimate the induced neural signal as well as the blood inflow under physiologic and sensor noise. The performance of the Kalman filter estimator is investigated under different physiologic noise characteristics and measurement frequencies.
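For orientation, the estimator described above follows the standard linear-Gaussian Kalman filter recursion. The sketch below shows only that generic predict/update step with placeholder system matrices; it is not the Buxton-model state equations used in the report.

```python
import numpy as np

def kalman_step(x, P, z, A, C, Q, R):
    """One generic Kalman filter iteration (placeholder matrices,
    not the hemodynamic SDE model from the report).
    x, P : prior state mean and covariance
    z    : new measurement
    A, C : state transition and observation matrices
    Q, R : process (physiologic) and measurement (sensor) noise covariances
    """
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update
    S = C @ P_pred @ C.T + R                 # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new
```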

am

PDF [BibTex]


2003


Support Vector Channel Selection in BCI

Lal, T., Schröder, M., Hinterberger, T., Weston, J., Bogdan, M., Birbaumer, N., Schölkopf, B.

(120), Max Planck Institute for Biological Cybernetics, Tuebingen, Germany, December 2003 (techreport)

Abstract
When designing a Brain Computer Interface (BCI) system, one can choose from a variety of features that may be useful for classifying brain activity during a mental task. For the special case of classifying EEG signals, we propose the use of the state-of-the-art feature selection algorithms Recursive Feature Elimination [3] and Zero-Norm Optimization [13], which are based on the training of Support Vector Machines (SVM) [11]. These algorithms can provide more accurate solutions than standard filter methods for feature selection [14]. We adapt the methods for the purpose of selecting EEG channels. For a motor imagery paradigm we show that the number of used channels can be reduced significantly without increasing the classification error. The resulting best channels agree well with the expected underlying cortical activity patterns during the mental tasks. Furthermore, we show how time-dependent task-specific information can be visualized.

ei

PDF Web [BibTex]

Image Reconstruction by Linear Programming

Tsuda, K., Rätsch, G.

(118), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, October 2003 (techreport)

ei

PDF [BibTex]

Statistical Learning Theory

Bousquet, O.

Machine Learning Summer School, August 2003 (talk)

ei

PDF [BibTex]

Remarks on Statistical Learning Theory

Bousquet, O.

Machine Learning Summer School, August 2003 (talk)

ei

PDF [BibTex]

Ranking on Data Manifolds

Zhou, D., Weston, J., Gretton, A., Bousquet, O., Schölkopf, B.

(113), Max Planck Institute for Biological Cybernetics, 72076 Tuebingen, Germany, June 2003 (techreport)

Abstract
The Google search engine has had huge success with its PageRank web page ranking algorithm, which exploits the global, rather than local, hyperlink structure of the World Wide Web using a random walk. This algorithm can only be used for graph data, however. Here we propose a simple universal ranking algorithm for vectorial data, based on the exploration of the intrinsic global geometric structure revealed by a large amount of data. Experimental results ranging from image and text to bioinformatics data illustrate the validity of our algorithm.

ei

PDF [BibTex]

Kernel Hebbian Algorithm for Iterative Kernel Principal Component Analysis

Kim, K., Franz, M., Schölkopf, B.

(109), Max Planck Institute for Biological Cybernetics, Tübingen, June 2003 (techreport)

Abstract
A new method for performing a kernel principal component analysis is proposed. By kernelizing the generalized Hebbian algorithm, one can iteratively estimate the principal components in a reproducing kernel Hilbert space with only linear order memory complexity. The derivation of the method, a convergence proof, and preliminary applications in image hyperresolution are presented. In addition, we discuss the extension of the method to the online learning of kernel principal components.

ei

PDF [BibTex]

Learning with Local and Global Consistency

Zhou, D., Bousquet, O., Lal, T., Weston, J., Schölkopf, B.

(112), Max Planck Institute for Biological Cybernetics, Tuebingen, Germany, June 2003 (techreport)

Abstract
We consider the learning problem in the transductive setting. Given a set of points of which only some are labeled, the goal is to predict the label of the unlabeled points. A principled clue to solve such a learning problem is the consistency assumption that a classifying function should be sufficiently smooth with respect to the structure revealed by these known labeled and unlabeled points. We present a simple algorithm to obtain such a smooth solution. Our method yields encouraging experimental results on a number of classification problems and demonstrates effective use of unlabeled data.
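For concreteness, the smooth solution referred to in the abstract is usually written in closed form as F* = (I - alpha*S)^(-1) Y, with S = D^(-1/2) W D^(-1/2) the symmetrically normalized affinity matrix. The sketch below assumes a precomputed affinity matrix W and a one-hot label matrix Y with all-zero rows for unlabeled points; alpha is a placeholder value, not taken from the report.

```python
import numpy as np

def local_global_consistency(W, Y, alpha=0.99):
    """Closed-form label propagation: F* = (I - alpha * S)^(-1) Y,
    where S = D^(-1/2) W D^(-1/2). W is an affinity matrix, Y a one-hot
    label matrix with all-zero rows for unlabeled points."""
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    F = np.linalg.solve(np.eye(W.shape[0]) - alpha * S, Y)
    return F.argmax(axis=1)   # predicted class for every point
```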

ei

[BibTex]

Implicit Wiener Series

Franz, M., Schölkopf, B.

(114), Max Planck Institute for Biological Cybernetics, June 2003 (techreport)

Abstract
The Wiener series is one of the standard methods to systematically characterize the nonlinearity of a neural system. The classical estimation method of the expansion coefficients via cross-correlation suffers from severe problems that prevent its application to high-dimensional and strongly nonlinear systems. We propose a new estimation method based on regression in a reproducing kernel Hilbert space that overcomes these problems. Numerical experiments show performance advantages in terms of convergence, interpretability and system size that can be handled.

ei

PDF [BibTex]

Machine Learning approaches to protein ranking: discriminative, semi-supervised, scalable algorithms

Weston, J., Leslie, C., Elisseeff, A., Noble, W.

(111), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, June 2003 (techreport)

Abstract
A key tool in protein function discovery is the ability to rank databases of proteins given a query amino acid sequence. The most successful method so far is a web-based tool called PSI-BLAST, which uses heuristic alignment of a profile built using a large unlabeled database. It has been shown that such use of global information via unlabeled data improves over a local measure derived from a basic pairwise alignment such as that performed by PSI-BLAST's predecessor, BLAST. In this article we look at ways of leveraging techniques from the field of machine learning for the problem of ranking. We show how clustering and semi-supervised learning techniques, which aim to capture global structure in data, can significantly improve over PSI-BLAST.

ei

PDF [BibTex]

The Geometry Of Kernel Canonical Correlation Analysis

Kuss, M., Graepel, T.

(108), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, May 2003 (techreport)

Abstract
Canonical correlation analysis (CCA) is a classical multivariate method concerned with describing linear dependencies between sets of variables. After a short exposition of the linear sample CCA problem and its analytical solution, the article proceeds with a detailed characterization of its geometry. Projection operators are used to illustrate the relations between canonical vectors and variates. The article then addresses the problem of CCA between spaces spanned by objects mapped into kernel feature spaces. An exact solution for this kernel canonical correlation (KCCA) problem is derived from a geometric point of view. It shows that the expansion coefficients of the canonical vectors in their respective feature space can be found by linear CCA in the basis induced by kernel principal component analysis. The effect of mappings into higher dimensional feature spaces is considered critically since it simplifies the CCA problem in general. Then two regularized variants of KCCA are discussed. Relations to other methods are illustrated, e.g., multicategory kernel Fisher discriminant analysis, kernel principal component regression and possible applications thereof in blind source separation.

ei

PDF [BibTex]

The Kernel Mutual Information

Gretton, A., Herbrich, R., Smola, A.

Max Planck Institute for Biological Cybernetics, April 2003 (techreport)

Abstract
We introduce two new functions, the kernel covariance (KC) and the kernel mutual information (KMI), to measure the degree of independence of several continuous random variables. The former is guaranteed to be zero if and only if the random variables are pairwise independent; the latter shares this property, and is in addition an approximate upper bound on the mutual information, as measured near independence, and is based on a kernel density estimate. We show that Bach and Jordan's kernel generalised variance (KGV) is also an upper bound on the same kernel density estimate, but is looser. Finally, we suggest that the addition of a regularising term in the KGV causes it to approach the KMI, which motivates the introduction of this regularisation. The performance of the KC and KMI is verified in the context of instantaneous independent component analysis (ICA), by recovering both artificial and real (musical) signals following linear mixing.

ei

PostScript [BibTex]

Rademacher and Gaussian averages in Learning Theory

Bousquet, O.

Universite de Marne-la-Vallee, March 2003 (talk)

ei

PDF [BibTex]

Introduction: Robots with Cognition?

Franz, MO.

6, pages: 38, (Editors: H.H. Bülthoff, K.R. Gegenfurtner, H.A. Mallot, R. Ulrich, F.A. Wichmann), 6. Tübinger Wahrnehmungskonferenz (TWK), February 2003 (talk)

Abstract
Using robots as models of cognitive behaviour has a long tradition in robotics. Parallel to the historical development in cognitive science, one observes two major, subsequent waves in cognitive robotics. The first is based on ideas of classical, cognitivist Artificial Intelligence (AI). According to the AI view of cognition as rule-based symbol manipulation, these robots typically try to extract symbolic descriptions of the environment from their sensors that are used to update a common, global world representation from which, in turn, the next action of the robot is derived. The AI approach has been successful in strongly restricted and controlled environments requiring well-defined tasks, e.g. in industrial assembly lines. AI-based robots mostly failed, however, in the unpredictable and unstructured environments that have to be faced by mobile robots. This has provoked the second wave in cognitive robotics which tries to achieve cognitive behaviour as an emergent property from the interaction of simple, low-level modules. Robots of the second wave are called animats as their architecture is designed to closely model aspects of real animals. Using only simple reactive mechanisms and Hebbian-type or evolutionary learning, the resulting animats often outperformed the highly complex AI-based robots in tasks such as obstacle avoidance, corridor following etc. While successful in generating robust, insect-like behaviour, typical animats are limited to stereotyped, fixed stimulus-response associations. If one adopts the view that cognition requires a flexible, goal-dependent choice of behaviours and planning capabilities (H.A. Mallot, Kognitionswissenschaft, 1999, 40-48) then it appears that cognitive behaviour cannot emerge from a collection of purely reactive modules. It rather requires environmentally decoupled structures that work without directly engaging the actions that it is concerned with. This poses the current challenge to cognitive robotics: How can we build cognitive robots that show the robustness and the learning capabilities of animats without falling back into the representational paradigm of AI? The speakers of the symposium present their approaches to this question in the context of robot navigation and sensorimotor learning. In the first talk, Prof. Helge Ritter introduces a robot system for imitation learning capable of exploring various alternatives in simulation before actually performing a task. The second speaker, Angelo Arleo, develops a model of spatial memory in rat navigation based on his electrophysiological experiments. He validates the model on a mobile robot which, in some navigation tasks, shows a performance comparable to that of the real rat. A similar model of spatial memory is used to investigate the mechanisms of territory formation in a series of robot experiments presented by Prof. Hanspeter Mallot. In the last talk, we return to the domain of sensorimotor learning where Ralf Möller introduces his approach to generate anticipatory behaviour by learning forward models of sensorimotor relationships.

ei

Web [BibTex]

A Note on Parameter Tuning for On-Line Shifting Algorithms

Bousquet, O.

Max Planck Institute for Biological Cybernetics, Tübingen, Germany, 2003 (techreport)

Abstract
In this short note, building on ideas of M. Herbster [2] we propose a method for automatically tuning the parameter of the FIXED-SHARE algorithm proposed by Herbster and Warmuth [3] in the context of on-line learning with shifting experts. We show that this can be done with a memory requirement of $O(nT)$ and that the additional loss incurred by the tuning is the same as the loss incurred for estimating the parameter of a Bernoulli random variable.
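For context, the FIXED-SHARE update whose parameter is being tuned mixes each expert's exponentially re-weighted mass with a small share of the total, controlled by the parameter alpha. The sketch below shows that standard update only; the learning rate and losses are placeholders, and the tuning procedure of the note itself is not reproduced.

```python
import numpy as np

def fixed_share_update(w, losses, eta=0.5, alpha=0.05):
    """One FIXED-SHARE step (Herbster & Warmuth style):
    exponential weight update followed by sharing a fraction alpha
    of each expert's weight uniformly among the other experts."""
    n = len(w)
    v = w * np.exp(-eta * losses)
    v = v / v.sum()                                # loss update, normalized
    return (1 - alpha) * v + alpha * (v.sum() - v) / (n - 1)
```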

ei

PDF PostScript [BibTex]

Interactive Images

Toyama, K., Schölkopf, B.

(MSR-TR-2003-64), Microsoft Research, Cambridge, UK, 2003 (techreport)

Abstract
Interactive Images are a natural extension of three recent developments: digital photography, interactive web pages, and browsable video. An interactive image is a multi-dimensional image, displayed two dimensions at a time (like a standard digital image), but with which a user can interact to browse through the other dimensions. One might consider a standard video sequence viewed with a video player as a simple interactive image with time as the third dimension. Interactive images are a generalization of this idea, in which the third (and greater) dimensions may be focus, exposure, white balance, saturation, and other parameters. Interaction is handled via a variety of modes including those we call ordinal, pixel-indexed, cumulative, and comprehensive. Through exploration of three novel forms of interactive images based on color, exposure, and focus, we will demonstrate the compelling nature of interactive images.

ei

Web [BibTex]
