
Local dimensionality reduction


Conference Paper


If globally high-dimensional data has locally only low-dimensional distributions, it is advantageous to perform a local dimensionality reduction before further processing the data. In this paper we examine several techniques for local dimensionality reduction in the context of locally weighted linear regression. As possible candidates, we derive local versions of factor analysis regression, principal component regression, principal component regression on joint distributions, and partial least squares regression. After outlining the statistical bases of these methods, we perform Monte Carlo simulations to evaluate their robustness with respect to violations of their statistical assumptions. One surprising outcome is that locally weighted partial least squares regression offers the best average results, thus outperforming even factor analysis, the theoretically most appealing of our candidate techniques.
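The core idea of locally weighted partial least squares can be sketched as follows: weight the training data by a kernel centered on the query point, then extract a few input directions that covary most with the output and regress along them. This is a minimal illustrative sketch, not the paper's exact formulation; the Gaussian kernel, the NIPALS-style deflation, and all variable names are assumptions for the example.

```python
import numpy as np

def lwpls_predict(X, y, x_query, n_components=2, bandwidth=1.0):
    """Predict y at x_query via locally weighted partial least squares.

    Illustrative sketch: Gaussian weighting around the query point,
    followed by PLS components extracted from the weighted, centered data.
    """
    # Gaussian weights centered on the query point (bandwidth is a free parameter)
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))

    # Weighted means for local centering
    x_mean = w @ X / w.sum()
    y_mean = w @ y / w.sum()
    Xc = X - x_mean
    yc = y - y_mean

    y_pred = y_mean
    xq = x_query - x_mean
    for _ in range(n_components):
        # PLS direction: weighted input-output covariance
        u = (w * yc) @ Xc
        u /= np.linalg.norm(u)
        s = Xc @ u                            # scores along this direction
        beta = (w * yc) @ s / (w @ (s * s))   # weighted 1-D regression on scores
        y_pred += beta * (xq @ u)
        # Deflate inputs, output residuals, and the query point
        p = (w * s) @ Xc / (w @ (s * s))
        Xc -= np.outer(s, p)
        yc -= beta * s
        xq = xq - (xq @ u) * p
    return y_pred
```

With as many components as input dimensions and noiseless linear data, this reduces to locally weighted linear regression; with fewer components it regresses only along the directions most relevant to the output, which is the local dimensionality reduction the abstract refers to.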

Author(s): Schaal, S. and Vijayakumar, S. and Atkeson, C. G.
Book Title: Advances in Neural Information Processing Systems 10
Pages: 633-639
Year: 1998
Editors: Jordan, M. I.; Kearns, M. J.; Solla, S. A.
Publisher: MIT Press

Department(s): Autonomous Motion
Bibtex Type: Conference Paper (inproceedings)

Address: Cambridge, MA
Cross Ref: p1245
Note: clmc
URL: http://www-clmc.usc.edu/publications/S/schaal-NIPS1998.pdf


@inproceedings{schaal-NIPS1998,
  title = {Local dimensionality reduction},
  author = {Schaal, S. and Vijayakumar, S. and Atkeson, C. G.},
  booktitle = {Advances in Neural Information Processing Systems 10},
  pages = {633--639},
  editor = {Jordan, M. I. and Kearns, M. J. and Solla, S. A.},
  publisher = {MIT Press},
  address = {Cambridge, MA},
  year = {1998},
  note = {clmc},
  crossref = {p1245},
  url = {http://www-clmc.usc.edu/publications/S/schaal-NIPS1998.pdf}
}