DocumentCode :
2396201
Title :
Dimensionality reduction using covariance operator inverse regression
Author :
Kim, Minyoung ; Pavlovic, Vladimir
Author_Institution :
Dept. of Comput. Sci., Rutgers Univ., Newark, NJ
fYear :
2008
fDate :
23-28 June 2008
Firstpage :
1
Lastpage :
8
Abstract :
We consider the task of dimensionality reduction for regression (DRR), whose goal is to find a low-dimensional representation of input covariates while preserving their statistical correlation with output targets. DRR is particularly suited to visualization of high-dimensional data as well as efficient regressor design with a reduced input dimension. In this paper we propose a novel nonlinear method for DRR that exploits the kernel Gram matrices of the input and the output. While most existing DRR techniques rely on inverse regression, our approach removes the need for explicit slicing of the output space by using covariance operators in reproducing kernel Hilbert space (RKHS). This unique property makes DRR applicable to problem domains with high-dimensional output data that may contain significant amounts of noise. Although recent kernel dimensionality reduction algorithms also use RKHS covariance operators to quantify the conditional dependency between the input and the targets via the dimension-reduced input, they are either limited to a transduction setting or to linear input subspaces, and lack closed-form solutions. In contrast, our approach provides a closed-form solution for the nonlinear basis functions onto which any new input point can be easily projected. We demonstrate the benefits of the proposed method in a comprehensive set of evaluations on several important regression problems that arise in computer vision.
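To make the abstract's "closed-form solution via kernel Gram matrices" concrete, the following is a minimal illustrative sketch of the general pattern such methods follow: build centered Gram matrices of input and output, solve a generalized eigenproblem in closed form, and obtain coefficient vectors defining nonlinear basis functions onto which new points project. This is an assumption-laden toy (the specific eigenproblem, the RBF kernel, and all function names are illustrative), not the paper's exact COIR algorithm.

```python
# Hedged sketch of a kernel inverse-regression style reduction.
# NOT the paper's exact COIR formulation: the eigenproblem below
# (Kx Ky Kx v = lam (Kx Kx + eps I) v) is a generic closed-form pattern
# used here only to illustrate the idea described in the abstract.
import numpy as np

def rbf_gram(A, B, gamma=1.0):
    """RBF (Gaussian) Gram matrix between rows of A and rows of B."""
    d2 = (np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * d2)

def center(K):
    """Center a Gram matrix in feature space: H K H, H = I - 11^T / n."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def reduce_dim(X, Y, d=2, eps=1e-6):
    """Return coefficient vectors V (one column per nonlinear basis
    function) and the d-dim projections Z of the training inputs."""
    Kx = center(rbf_gram(X, X))
    Ky = center(rbf_gram(Y, Y))
    n = X.shape[0]
    A = Kx @ Ky @ Kx                  # output-informed covariance term
    B = Kx @ Kx + eps * np.eye(n)     # regularized input covariance term
    # Symmetric-definite generalized eigenproblem via Cholesky whitening.
    L = np.linalg.cholesky(B)
    Linv = np.linalg.inv(L)
    M = Linv @ A @ Linv.T
    M = 0.5 * (M + M.T)               # symmetrize against round-off
    w, V = np.linalg.eigh(M)
    V = Linv.T @ V[:, ::-1][:, :d]    # top-d generalized eigenvectors
    Z = Kx @ V                        # projections of training inputs
    return V, Z

# A new point x* projects in closed form: z* = k(x*, X) @ V,
# with k the same (centered) kernel used for training.
```

The key property the abstract emphasizes is visible here: once V is computed, projecting an unseen input needs only one kernel evaluation against the training set, with no retraining or slicing of the output space.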
Keywords :
correlation methods; covariance analysis; learning (artificial intelligence); matrix algebra; nonlinear functions; regression analysis; covariance operator inverse regression; dimensionality reduction for regression; kernel Gram matrices; linear input subspaces; nonlinear basis functions; nonlinear method; statistical correlation; transduction setting; Closed-form solution; Computer science; Computer vision; Covariance matrix; Data visualization; Kernel; Linear discriminant analysis; Noise reduction; Principal component analysis; Supervised learning;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Computer Vision and Pattern Recognition, 2008. CVPR 2008. IEEE Conference on
Conference_Location :
Anchorage, AK
ISSN :
1063-6919
Print_ISBN :
978-1-4244-2242-5
Electronic_ISBN :
1063-6919
Type :
conf
DOI :
10.1109/CVPR.2008.4587404
Filename :
4587404