Title :
Complexity of learning subspace juntas and ICA
Author :
Vempala, Santosh ; Xiao, Ying
Author_Institution :
School of Computer Science, Georgia Institute of Technology, Atlanta, GA, USA
Abstract :
Inspired by feature selection problems in machine learning and statistics, we study classification problems in which the label function depends only on an unknown low-dimensional relevant subspace of the data (we call this a k-subspace junta). Assuming that the relevant subspace is statistically independent of the irrelevant subspace, and that the distribution over the irrelevant subspace is Gaussian, we give a polynomial-time algorithm that recovers the relevant subspace and learns the label function using only a polynomial number of samples. Our main tool is the solution of a tensor optimization problem. In general, finding the global optimum of a tensor optimization problem is NP-hard, but we avoid this difficulty by using only local optima.
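Illustrative_Sketch :
The following is a minimal sketch (not the authors' algorithm) of the local-optimum idea the abstract refers to, under the assumption that the fourth-order tensor in question is the directional fourth cumulant of the data, as in ICA: when the irrelevant coordinates are Gaussian, any unit direction with nonzero excess kurtosis must lie in the relevant subspace, so a local optimum of the directional kurtosis found by projected gradient ascent recovers one relevant direction. All function names and parameters below are illustrative.

# Sketch: recover one direction of the relevant (non-Gaussian) subspace by locally
# maximizing |empirical excess kurtosis| of the projection u . x over unit vectors u.
# Assumes the data are (approximately) isotropic; the synthetic data below are.
import numpy as np

def directional_kurtosis(X, u):
    # empirical excess kurtosis of the 1-d projection X @ u
    z = X @ u
    return np.mean(z**4) - 3.0 * np.mean(z**2)**2

def local_opt_direction(X, steps=500, lr=0.1, seed=0):
    # projected gradient ascent on |kurtosis| over the unit sphere (a local optimum)
    rng = np.random.default_rng(seed)
    n, d = X.shape
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)
    for _ in range(steps):
        z = X @ u
        # gradient of E[z^4] - 3 E[z^2]^2 with respect to u
        grad = 4.0 * (X.T @ (z**3)) / n - 12.0 * np.mean(z**2) * (X.T @ z) / n
        sign = 1.0 if directional_kurtosis(X, u) >= 0 else -1.0
        u = u + lr * sign * grad      # ascend |kurtosis|
        u /= np.linalg.norm(u)        # project back onto the unit sphere
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, d = 20000, 6
    # one non-Gaussian (uniform) relevant coordinate, five Gaussian irrelevant ones
    relevant = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n, 1))
    irrelevant = rng.standard_normal((n, d - 1))
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))   # hide the subspace by a rotation
    X = np.hstack([relevant, irrelevant]) @ Q.T
    u = local_opt_direction(X)
    print("alignment with the relevant direction:", abs(u @ Q[:, 0]))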
Keywords :
Gaussian distribution; computational complexity; feature selection; independent component analysis; learning (artificial intelligence); optimisation; polynomials; statistics; tensors; ICA; NP-hard problems; classification problems; feature selection problems; irrelevant subspace; k-subspace junta; label function; learning subspace juntas; low dimensional relevant subspace; machine learning; polynomial-time algorithm; tensor optimization problem; Complexity theory; Polynomials; Principal component analysis; Robustness; Standards; Tensile stress; Vectors
Conference_Title :
2013 Asilomar Conference on Signals, Systems and Computers
Conference_Location :
Pacific Grove, CA
Print_ISBN :
978-1-4799-2388-5
DOI :
10.1109/ACSSC.2013.6810286