46th Annual IEEE Symposium on Foundations of Computer Science (FOCS'05) (2005)

Pittsburgh, Pennsylvania, USA

Oct. 23-25, 2005

ISBN: 0-7695-2468-0

pp: 491-500

DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/SFCS.2005.56

Anirban Dasgupta

John Hopcroft

Jon Kleinberg

Mark Sandler

ABSTRACT

We consider the problem of learning mixtures of arbitrary symmetric distributions. We formulate sufficient separation conditions and present a learning algorithm with provable guarantees for mixtures of distributions that satisfy these separation conditions. Our bounds are independent of the variances of the distributions; to the best of our knowledge, there were no previous algorithms known with provable learning guarantees for distributions having infinite variance and/or expectation.

For Gaussians and log-concave distributions, our results match the best known sufficient separation conditions [1, 15]. Our algorithm requires a sample of size o(dk), where d is the number of dimensions and k is the number of distributions in the mixture. We also show that for isotropic power-laws, exponential, and Gaussian distributions, our separation condition is optimal up to a constant factor.

CITATION

A. Dasgupta, J. Hopcroft, J. Kleinberg and M. Sandler, "On Learning Mixtures of Heavy-Tailed Distributions,"

*46th Annual IEEE Symposium on Foundations of Computer Science (FOCS'05)*, Pittsburgh, Pennsylvania, USA, 2005, pp. 491-500.

doi:10.1109/SFCS.2005.56
