Proceedings of 37th Conference on Foundations of Computer Science (1996)

Burlington, VT

Oct. 14, 1996 to Oct. 16, 1996

ISBN: 0-8186-7594-2

pp: 330

A. Blum , Sch. of Comput. Sci., Carnegie Mellon Univ., Pittsburgh, PA, USA

A. Frieze , Sch. of Comput. Sci., Carnegie Mellon Univ., Pittsburgh, PA, USA

R. Kannan , Sch. of Comput. Sci., Carnegie Mellon Univ., Pittsburgh, PA, USA

S. Vempala , Sch. of Comput. Sci., Carnegie Mellon Univ., Pittsburgh, PA, USA

ABSTRACT

The authors consider the problem of learning a linear threshold function (a halfspace in n dimensions, also called a "perceptron"). Methods for solving this problem generally fall into two categories. In the absence of noise, this problem can be formulated as a linear program and solved in polynomial time with the ellipsoid algorithm (or interior point methods). On the other hand, simple greedy algorithms such as the perceptron algorithm seem to work well in practice and can be made noise tolerant; but their running time depends on a separation parameter (which quantifies the amount of "wiggle room" available) and can be exponential in the description length of the input. They show how simple greedy methods can be used to find weak hypotheses (hypotheses that classify noticeably more than half of the examples) in polynomial time, without dependence on any separation parameter. This results in a polynomial-time algorithm for learning linear threshold functions in the PAC model in the presence of random classification noise. The algorithm is based on a new method for removing outliers in data. Specifically, for any set S of points in R^n, each given to b bits of precision, they show that one can remove only a small fraction of S so that in the remaining set T, for every vector v, max_{x∈T} (v·x)^2 ≤ poly(n,b) · |T|^{-1} · Σ_{x∈T} (v·x)^2. After removing these outliers, they are able to show that a modified version of the perceptron learning algorithm works in polynomial time, even in the presence of random classification noise.
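The outlier condition above can be illustrated numerically: for a fixed direction v, compare the largest squared projection of the set onto v with the average squared projection. A minimal sketch (not the paper's algorithm, which bounds this ratio by poly(n,b) for every direction simultaneously) shows how removing a single far-away point shrinks the ratio:

```python
# Sketch of the quantity bounded by the outlier-removal condition:
# ratio(T, v) = max_{x in T} (v.x)^2  /  ( |T|^{-1} * sum_{x in T} (v.x)^2 ).
# The names and data here are illustrative, not from the paper.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def projection_ratio(T, v):
    """Max squared projection onto v, divided by the mean squared projection."""
    sq = [dot(v, x) ** 2 for x in T]
    return max(sq) / (sum(sq) / len(sq))

# Five well-spread inliers plus one extreme outlier along v = (1, 0).
T = [(1.0, 0.0), (0.5, 0.2), (-0.8, 0.1), (0.3, -0.4), (-0.6, 0.5), (100.0, 0.0)]
v = (1.0, 0.0)

with_outlier = projection_ratio(T, v)        # dominated by the point (100, 0)
without_outlier = projection_ratio(T[:-1], v)  # outlier removed
```

Removing the one outlier brings the max-to-mean ratio down sharply in this direction; the paper's contribution is a procedure that removes only a small fraction of S while certifying such a bound for all directions v at once.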

INDEX TERMS

learning (artificial intelligence); polynomial-time algorithm; noisy linear threshold function learning; linear program; ellipsoid algorithm; greedy algorithms; perceptron algorithm; separation parameter; noise tolerance; input description length; weak hypothesis finding; PAC model; random classification noise; data outlier removal

CITATION

S. Vempala, R. Kannan, A. Blum and A. Frieze, "A polynomial-time algorithm for learning noisy linear threshold functions,"

*Proceedings of 37th Conference on Foundations of Computer Science (FOCS)*, Burlington, VT, 1996, pp. 330.

doi:10.1109/SFCS.1996.548492
