Singer Island, FL

Oct. 24, 1984 to Oct. 26, 1984

ISBN: 0-8186-0591-X

pp: 100-108

A. El Gamal, Stanford University

ABSTRACT

Let X and Y be two random variables with joint probability distribution p(x,y), joint entropy H(X,Y), and conditional entropies H(X|Y) and H(Y|X). Person P_X knows X and person P_Y knows Y. They communicate over a noiseless two-way channel so that both learn X and Y. It is proved that, on the average, at least H(X|Y) + H(Y|X) bits must be exchanged and that H(X,Y) + 2 bits are sufficient. If p(x,y) > 0 for all (x,y), then at least H(X,Y) bits must be communicated on the average. However, if p(x,y) is uniform over its support set, the average number of bits needed is close to H(X|Y) + H(Y|X). Randomized protocols can reduce the amount of communication considerably, but only when some probability of error is acceptable.
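The bounds in the abstract are all entropy quantities computable from p(x,y) via the chain rule H(X|Y) = H(X,Y) - H(Y). The sketch below, using a hypothetical two-by-two joint distribution (not one from the paper), evaluates the lower bound H(X|Y) + H(Y|X) and the sufficient budget H(X,Y) + 2:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a distribution given as probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

H_XY = entropy(p.values())  # joint entropy H(X,Y)

# Marginal distributions of X and Y.
px = {x: sum(v for (a, _), v in p.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in p.items() if b == y) for y in (0, 1)}

# Chain rule: H(X|Y) = H(X,Y) - H(Y) and H(Y|X) = H(X,Y) - H(X).
H_X_given_Y = H_XY - entropy(py.values())
H_Y_given_X = H_XY - entropy(px.values())

lower = H_X_given_Y + H_Y_given_X  # at least this many bits on average
upper = H_XY + 2                   # this many bits suffice on average

print(f"H(X,Y)          = {H_XY:.3f} bits")
print(f"H(X|Y) + H(Y|X) = {lower:.3f} bits (lower bound)")
print(f"H(X,Y) + 2      = {upper:.3f} bits (sufficient)")
```

Since p(x,y) > 0 everywhere in this example, the stronger lower bound H(X,Y) also applies, and it sits between the two printed bounds, consistent with the abstract's claims.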

CITATION

A. El Gamal, A. Orlitsky, "Interactive Data Compression", Proc. IEEE Symposium on Foundations of Computer Science (FOCS), 1984, pp. 100-108, doi:10.1109/SFCS.1984.715906