Pittsburgh, Pennsylvania, USA
October 23-25, 2005
ISBN: 0-7695-2468-0
pp: 11-20
Adam Tauman Kalai , TTI-Chicago
Adam R. Klivans , UT-Austin
Yishay Mansour , Tel Aviv University
Rocco A. Servedio , Columbia University
ABSTRACT
We give the first algorithm that (under distributional assumptions) efficiently learns halfspaces in the notoriously difficult agnostic framework of Kearns, Schapire, and Sellie, where a learner is given access to labeled examples drawn from a distribution, without restriction on the labels (e.g., adversarial noise). The algorithm constructs a hypothesis whose error rate on future examples is within an additive ε of the optimal halfspace, in time poly(n) for any constant ε > 0, under the uniform distribution over {-1,1}^n or the unit sphere in R^n, as well as under any log-concave distribution over R^n. It also agnostically learns Boolean disjunctions in time 2^{Õ(√n)} with respect to any distribution. The new algorithm, essentially L1 polynomial regression, is a noise-tolerant, arbitrary-distribution generalization of the "low-degree" Fourier algorithm of Linial, Mansour, and Nisan. We also give a new algorithm for PAC learning halfspaces under the uniform distribution on the unit sphere with the current best bounds on the tolerable rate of "malicious noise."
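To make the L1 polynomial regression idea mentioned in the abstract concrete, the following is a minimal sketch (not the authors' implementation; the feature expansion, LP formulation, and function names are illustrative assumptions): expand each example into all monomials of degree at most d, fit the degree-d polynomial minimizing the average absolute error via a linear program, and output the thresholded polynomial as the hypothesis.

```python
# Minimal sketch of degree-d L1 polynomial regression (illustrative only).
# Given labeled examples (x_i, y_i) with y_i in {-1, +1}, fit the polynomial p
# of degree <= d minimizing (1/m) * sum_i |p(x_i) - y_i| via a linear program,
# then predict with sign(p(x) - t) for a threshold t.
import itertools

import numpy as np
from scipy.optimize import linprog


def monomial_features(X, degree):
    """Design matrix of all multilinear monomials of total degree <= `degree`."""
    m, n = X.shape
    cols = [np.ones(m)]                                    # constant term
    for d in range(1, degree + 1):
        for subset in itertools.combinations(range(n), d):
            cols.append(np.prod(X[:, list(subset)], axis=1))
    return np.column_stack(cols)


def l1_poly_fit(X, y, degree):
    """Coefficients c minimizing (1/m) * ||Phi c - y||_1, solved as a linear program."""
    Phi = monomial_features(X, degree)
    m, k = Phi.shape
    # Variables: k polynomial coefficients c, then m slacks s with s_i >= |Phi_i c - y_i|.
    objective = np.concatenate([np.zeros(k), np.full(m, 1.0 / m)])
    A_ub = np.block([[Phi, -np.eye(m)],                    #  Phi c - s <=  y
                     [-Phi, -np.eye(m)]])                  # -Phi c - s <= -y
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * k + [(0, None)] * m
    res = linprog(objective, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:k]


def predict(coeffs, X, degree, threshold=0.0):
    """Hypothesis h(x) = sign(p(x) - threshold)."""
    return np.sign(monomial_features(X, degree) @ coeffs - threshold)
```

In practice one would try several thresholds t on held-out data and keep the one with the smallest empirical error; the degree d needed to get within ε of the optimal halfspace's error depends on the accuracy parameter and on the distributional assumption (uniform, spherical, or log-concave) analyzed in the paper.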
CITATION
Adam Tauman Kalai, Adam R. Klivans, Yishay Mansour, Rocco A. Servedio, "Agnostically Learning Halfspaces", Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2005), pp. 11-20, doi:10.1109/SFCS.2005.13
