DocumentCode :
2221851
Title :
A new structure-preserving dimensionality reduction approach and OI-net implementation
Author :
Öten, Remzi ; de Figueiredo, Rui J.P.
Author_Institution :
California Univ., Irvine, CA, USA
Volume :
1
fYear :
1998
fDate :
4-8 May 1998
Firstpage :
690
Abstract :
A new generic nonlinear feature extraction map f is presented based on concepts from approximation theory. Let f map an input data vector x∈ℜn, where n is high, to an appropriate feature vector y∈ℜm, where m is sufficiently low. Also let X={X1,…,XN} denote an available training set in ℜn. In this paper, f is derived by requiring that the geometric structure (metric space attributes) of the points f(X)={f(X1),…,f(XN)} in the feature space ℜm be as similar as possible to the structure of the points X in the data space ℜn. This is accomplished by first selecting an appropriate dimension m for the feature space ℜm according to the size N of the available training set X, subject to bounds on the distortion of the data structure caused by f and on the error in the estimation of the underlying likelihood functions in the feature space. The map f is then designed by a multi-dimensional scaling (MDS) approach that minimizes Sammon's cost function. This approach uses graph-theoretic (minimal spanning tree) and genetic algorithmic concepts to search efficiently for the optimal structure-preserving point-to-point mapping of the training samples X1,…,XN to their images Y1,…,YN in ℜm. Finally, an optimal interpolating (OI) artificial neural network is used to recover the entire function f:ℜn→ℜm by interpolating the values Yi at Xi, i=1,…,N. Preliminary simulation results based on this approach are also given.
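Code Sketch :
To make the MDS step described above concrete, the following minimal Python sketch computes and numerically minimizes Sammon's stress, the cost function named in the abstract. It is illustrative only: the helper names (sammon_stress, sammon_map) are hypothetical, plain finite-difference gradient descent stands in for the paper's minimal-spanning-tree and genetic-algorithm search, and the OI-net interpolation stage is not reproduced. It assumes numpy and distinct training points, so that no pairwise input distance is zero.

import numpy as np

def sammon_stress(D_high, Y):
    # Sammon's cost: distance-weighted squared error between pairwise
    # distances in the data space (D_high) and in the feature space (Y),
    # normalized by the sum of data-space distances.
    D_low = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
    iu = np.triu_indices(len(Y), k=1)
    d_star, d = D_high[iu], D_low[iu]
    return np.sum((d_star - d) ** 2 / d_star) / np.sum(d_star)

def sammon_map(X, m=2, iters=200, lr=0.1, seed=0):
    # Map X (N x n) to Y (N x m) by numerically minimizing Sammon's stress
    # with finite-difference gradient descent (illustrative stand-in for the
    # paper's MST/genetic-algorithm optimizer).
    rng = np.random.default_rng(seed)
    D_high = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Y = rng.normal(scale=1e-2, size=(len(X), m))
    eps = 1e-4
    for _ in range(iters):
        base = sammon_stress(D_high, Y)
        grad = np.zeros_like(Y)
        for idx in np.ndindex(*Y.shape):
            Yp = Y.copy()
            Yp[idx] += eps
            grad[idx] = (sammon_stress(D_high, Yp) - base) / eps
        Y -= lr * grad
    return Y

# Example: embed 30 random 10-dimensional points into the plane.
X = np.random.default_rng(1).normal(size=(30, 10))
Y = sammon_map(X, m=2)
print(sammon_stress(np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1), Y))

In the paper itself, the resulting pairs (Xi, Yi) would then be interpolated by the OI neural network to extend f from the training samples to all of ℜn.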
Keywords :
approximation theory; feature extraction; genetic algorithms; neural nets; trees (mathematics); Sammon's cost function; approximation theory; generic nonlinear feature extraction map; genetic algorithmic concepts; geometric structure; likelihood functions; metric space attributes; minimal spanning tree; multi-dimensional scaling; optimal interpolating artificial neural network; optimal structure-preserving point-to-point mapping; structure-preserving dimensionality reduction approach; training set; Bayesian methods; Kernel; Nonlinear distortion; Pattern recognition; Recursive estimation; Statistics; Stress; System testing; Training data; Tree graphs;
fLanguage :
English
Publisher :
ieee
Conference_Title :
1998 IEEE International Joint Conference on Neural Networks Proceedings (IEEE World Congress on Computational Intelligence)
Conference_Location :
Anchorage, AK
ISSN :
1098-7576
Print_ISBN :
0-7803-4859-1
Type :
conf
DOI :
10.1109/IJCNN.1998.682364
Filename :
682364