
41st Annual Symposium on Foundations of Computer Science
Extracting randomness via repeated condensing
Redondo Beach, California
November 12–14, 2000
ISBN: 0-7695-0850-2
On an input probability distribution with some (min-)entropy, an extractor outputs a distribution with a (near-)maximum entropy rate (namely, the uniform distribution). A natural weakening of this concept is a condenser, whose output distribution has a higher entropy rate than the input distribution (without losing much of the initial entropy). We construct efficient explicit condensers. The condenser constructions combine (variants or more efficient versions of) ideas from several works, including the block extraction scheme of Nisan and Zuckerman (1996), the observation made by Srinivasan and Zuckerman (1994) and Nisan and Ta-Shma (1999) that a failure of the block extraction scheme is also useful, the recursive "win-win" case analysis of Impagliazzo et al. (1999, 2000), and the error correction of random sources used by Trevisan (1999). As a natural byproduct (via repeated iteration of condensers), we obtain new extractor constructions. The new extractors give significant qualitative improvements over previous ones for sources of arbitrary min-entropy; they are nearly optimal simultaneously in the two main parameters: seed length and output length. Specifically, our extractors can make either of these two parameters optimal (up to a constant factor), at only a polylogarithmic loss in the other. Previous constructions require polynomial loss in both cases for general sources. We also give a simple reduction converting "standard" extractors (which are good for an average seed) to "strong" ones (which are good for most seeds), with essentially the same parameters.
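The abstract's central objects can be illustrated with a toy sketch (this is not the paper's construction): the min-entropy of a distribution, and a simple seeded extractor built from a pairwise-independent Toeplitz hash via the leftover hash lemma. Such hashing-based extractors need a seed comparable to the input length, which is the kind of cost the condenser-based constructions above improve on. The names `min_entropy` and `toeplitz_extract` are illustrative, not from the paper.

```python
from math import log2

def min_entropy(dist):
    """Min-entropy H_min(X) = -log2(max_x Pr[X = x]), in bits."""
    return -log2(max(dist.values()))

# A 3-bit source that is uniform on 4 of the 8 possible strings:
# every output has probability 1/4, so H_min = 2 bits, i.e. an
# entropy *rate* of 2/3 (2 bits of min-entropy in 3 output bits).
flat_source = {format(x, "03b"): 0.25 for x in range(4)}

def toeplitz_extract(x, seed, m):
    """Toy seeded extractor: multiply the n source bits by a random
    Toeplitz matrix over GF(2). The seed (n + m - 1 bits) defines the
    matrix; by the leftover hash lemma, the m output bits are close
    to uniform whenever H_min of the source exceeds m by enough."""
    n = len(x)
    assert len(seed) == n + m - 1
    # Row i of the Toeplitz matrix is seed[i], seed[i+1], ..., seed[i+n-1].
    return [sum(x[j] & seed[i + j] for j in range(n)) % 2 for i in range(m)]

print(min_entropy(flat_source))          # 2.0
print(toeplitz_extract([1, 0, 1], [1, 1, 0, 1], 2))  # [1, 0]
```

Note the trade-off visible even in this toy: extracting m nearly uniform bits here costs a seed of n + m - 1 bits, whereas the extractors of this paper aim for seed length polylogarithmic in n.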
Index Terms:
probability; entropy; random processes; computational complexity; randomness extraction; repeated condensing; input probability distribution; maximum entropy rate; condenser; output distribution; block extraction scheme; recursive win-win case analysis; error correction; random sources; polynomial loss
Citation:
O. Reingold, R. Shaltiel, A. Wigderson, "Extracting randomness via repeated condensing," Proc. 41st Annual Symposium on Foundations of Computer Science (FOCS), p. 22, 2000.