Pittsburgh, Pennsylvania, USA
Oct. 23, 2005 to Oct. 25, 2005
ISBN: 0-7695-2468-0
pp: 491-500
Anirban Dasgupta
John Hopcroft
Jon Kleinberg
Mark Sandler
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/SFCS.2005.56
ABSTRACT
We consider the problem of learning mixtures of arbitrary symmetric distributions. We formulate sufficient separation conditions and present a learning algorithm with provable guarantees for mixtures of distributions that satisfy these separation conditions. Our bounds are independent of the variances of the distributions; to the best of our knowledge, no previous algorithms were known with provable learning guarantees for distributions having infinite variance and/or expectation.

For Gaussians and log-concave distributions, our results match the best known sufficient separation conditions [1, 15]. Our algorithm requires a sample of size Õ(dk), where d is the number of dimensions and k is the number of distributions in the mixture. We also show that for isotropic power-law, exponential, and Gaussian distributions, our separation condition is optimal up to a constant factor.
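The sketch below is not the paper's algorithm; it is a minimal, self-contained illustration of the setting the abstract describes: a mixture of heavy-tailed components (here, each component has independent standard Cauchy coordinates, so variance and expectation are infinite) whose centers are well separated. It contrasts the empirical mean, which fails to concentrate for such distributions, with the coordinate-wise median, a variance-independent center estimate. The dimension d, number of components k, sample sizes, and the separation of 20 between centers are all hypothetical choices for the demo.

```python
# Illustrative sketch only (not the algorithm from the paper): sample a mixture of
# k heavy-tailed components in d dimensions and compare center estimators.
import numpy as np

rng = np.random.default_rng(0)
d, k, n_per = 50, 2, 2000  # hypothetical dimension, #components, samples per component

# Two well-separated centers (separation 20 in Euclidean norm, spread across coordinates).
centers = np.stack([np.zeros(d), 20.0 * np.ones(d) / np.sqrt(d)])

# Each component: independent standard Cauchy coordinates around its center
# (Student's t with 1 degree of freedom), so variance and mean are undefined.
samples = [c + rng.standard_t(df=1, size=(n_per, d)) for c in centers]

for i, x in enumerate(samples):
    mean_err = np.linalg.norm(x.mean(axis=0) - centers[i])        # unstable under heavy tails
    median_err = np.linalg.norm(np.median(x, axis=0) - centers[i])  # robust, variance-free
    print(f"component {i}: |mean - center| = {mean_err:.2f}, "
          f"|median - center| = {median_err:.2f}")
```

Running this typically shows the coordinate-wise median landing close to the true center while the empirical mean wanders, which is why guarantees that do not depend on the variance (as in the abstract) matter for such distributions.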
CITATION
Anirban Dasgupta, John Hopcroft, Jon Kleinberg, Mark Sandler, "On Learning Mixtures of Heavy-Tailed Distributions", Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science (FOCS 2005), pp. 491-500, doi:10.1109/SFCS.2005.56