Rank priors for continuous non-linear dimensionality reduction

Discovering the underlying low-dimensional latent structure in high-dimensional perceptual observations (e.g., images, video) can, in many cases, greatly improve performance in recognition and tracking. However, non-linear dimensionality reduction methods are often susceptible to local minima and perform poorly when initialized far from the global optimum, even when the intrinsic dimensionality is known a priori. In this work we introduce a prior over the dimensionality of the latent space that penalizes high-dimensional spaces, and simultaneously optimize both the latent space and its intrinsic dimensionality in a continuous fashion. Ad-hoc initialization schemes are unnecessary with our approach; we initialize the latent space to the observation space and automatically infer the latent dimensionality. We report results applying our prior to various probabilistic non-linear dimensionality reduction tasks, and show that our method can outperform graph-based dimensionality reduction techniques as well as previously suggested initialization strategies. We demonstrate the effectiveness of our approach when tracking and classifying human motion.
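The abstract's core idea, penalizing high-dimensional latent spaces while optimizing the embedding itself, can be illustrated with a deliberately simplified sketch. The code below is not the paper's probabilistic (GP-LVM-style) formulation; it substitutes a nuclear-norm penalty as a continuous surrogate for rank, solved by proximal gradient descent, purely to show the pattern the abstract describes: initialize the latent coordinates at the observations and let the penalty shrink unneeded dimensions to zero. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def rank_penalized_embedding(Y, lam=0.5, steps=200, lr=0.1):
    """Minimize 0.5*||X - Y||_F^2 + lam*||X||_* over latent coords X.

    The nuclear norm ||X||_* (sum of singular values) is a convex,
    continuous surrogate for rank, so low-dimensional structure is
    discovered during optimization rather than fixed in advance.
    """
    X = Y.copy()                            # initialize latents to the observation space
    for _ in range(steps):
        X = X - lr * (X - Y)                # gradient step on the data-fit term
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s = np.maximum(s - lr * lam, 0.0)   # proximal step: soft-threshold singular values
        X = U @ np.diag(s) @ Vt
    return X

# Noisy observations of an intrinsically 1-D structure embedded in 3-D:
rng = np.random.default_rng(0)
t = rng.standard_normal((100, 1))
Y = t @ np.array([[1.0, 2.0, -1.0]]) + 0.01 * rng.standard_normal((100, 3))

X = rank_penalized_embedding(Y)
ranks = np.linalg.matrix_rank(X, tol=1e-3)  # the penalty should leave an effectively 1-D latent space
```

Because the penalty is applied through the singular values, no discrete search over candidate dimensionalities is needed; the effective dimensionality falls out of a single continuous optimization, which is the property the abstract emphasizes.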

Bibliographic Details
Main Authors: Darrell, Trevor J., Urtasun, Raquel, Geiger, Andreas
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: en_US
Published: Institute of Electrical and Electronics Engineers, 2010
Online Access: http://hdl.handle.net/1721.1/59287
Citation: Geiger, A., R. Urtasun, and T. Darrell. "Rank priors for continuous non-linear dimensionality reduction." IEEE Conference on Computer Vision and Pattern Recognition, 2009 (CVPR 2009), pp. 880-887. © 2009 Institute of Electrical and Electronics Engineers.
DOI: http://dx.doi.org/10.1109/CVPRW.2009.5206672
ISBN: 978-1-4244-3992-8
ISSN: 1063-6919
INSPEC Accession Number: 10835871
Rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.