DocumentCode :
2645054
Title :
A polynomial-time algorithm for learning noisy linear threshold functions
Author :
Blum, Avrim ; Frieze, Alan ; Kannan, Ravi ; Vempala, Santosh
Author_Institution :
Sch. of Comput. Sci., Carnegie Mellon Univ., Pittsburgh, PA, USA
fYear :
1996
fDate :
14-16 Oct 1996
Firstpage :
330
Lastpage :
338
Abstract :
The authors consider the problem of learning a linear threshold function (a halfspace in n dimensions, also called a “perceptron”). Methods for solving this problem generally fall into two categories. In the absence of noise, the problem can be formulated as a linear program and solved in polynomial time with the ellipsoid algorithm (or interior point methods). On the other hand, simple greedy algorithms such as the perceptron algorithm seem to work well in practice and can be made noise tolerant, but their running time depends on a separation parameter (which quantifies the amount of “wiggle room” available) and can be exponential in the description length of the input. The authors show how simple greedy methods can be used to find weak hypotheses (hypotheses that correctly classify noticeably more than half of the examples) in polynomial time, without dependence on any separation parameter. This yields a polynomial-time algorithm for learning linear threshold functions in the PAC model in the presence of random classification noise. The algorithm is based on a new method for removing outliers from data. Specifically, for any set S of points in R^n, each given to b bits of precision, they show that one can remove only a small fraction of S so that in the remaining set T, for every vector v, max_{x∈T} (v·x)^2 ≤ poly(n,b) · |T|^{-1} · Σ_{x∈T} (v·x)^2. After removing these outliers, they show that a modified version of the perceptron learning algorithm runs in polynomial time, even in the presence of random classification noise.
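The outlier-removal condition stated in the abstract can be checked directly: for the second-moment matrix M = |T|^{-1} Σ_{x∈T} x x^T, the worst ratio max_v (v·x)^2 / (v^T M v) equals x^T M^+ x, so a single pseudo-inverse computation bounds the condition over every direction v at once. The sketch below is only an illustration of that check, not the authors' exact procedure; the threshold parameter beta stands in for the poly(n,b) bound from the paper.

```python
import numpy as np

def remove_outliers(points, beta):
    """Illustrative sketch (not the paper's exact algorithm): repeatedly drop
    the most extreme point until, for every direction v,
        max_{x in T} (v.x)^2 <= beta * |T|^{-1} * sum_{x in T} (v.x)^2.
    The per-point score x^T M^+ x, with M the empirical second-moment matrix,
    equals the worst such ratio over all v, so one check per point suffices.
    beta is a hypothetical stand-in for the paper's poly(n, b) bound."""
    T = np.array(points, dtype=float)
    while len(T) > 0:
        M = T.T @ T / len(T)                      # second-moment matrix of T
        M_pinv = np.linalg.pinv(M)                # handles rank-deficient T
        scores = np.einsum('ij,jk,ik->i', T, M_pinv, T)  # x^T M^+ x per row
        worst = np.argmax(scores)
        if scores[worst] <= beta:                 # condition holds for all v
            return T
        T = np.delete(T, worst, axis=0)           # remove the outlier, repeat
    return T
```

In the paper, this kind of preprocessing is followed by a modified perceptron-style update on the retained set; the sketch covers only the outlier-removal condition itself.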
Keywords :
computational complexity; heuristic programming; learning (artificial intelligence); noise; pattern classification; perceptrons; PAC model; data outlier removal; ellipsoid algorithm; greedy algorithms; input description length; linear program; noise tolerance; noisy linear threshold function learning; perceptron algorithm; polynomial-time algorithm; random classification noise; separation parameter; weak hypothesis finding; Computer science; Ellipsoids; Greedy algorithms; Linear programming; Machine learning; Machine learning algorithms; Polynomials; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 37th Annual Symposium on Foundations of Computer Science, 1996
Conference_Location :
Burlington, VT
ISSN :
0272-5428
Print_ISBN :
0-8186-7594-2
Type :
conf
DOI :
10.1109/SFCS.1996.548492
Filename :
548492