Article
Keywords:
$d$-wise-independent variables; entropy; lower bound
Summary:
How low can the joint entropy of $n$ $d$-wise independent (for $d\geq 2$) discrete random variables be, subject to given constraints on the individual distributions (say, no value may be taken by a variable with probability greater than $p$, for $p< 1$)? This question has been posed and partially answered in a recent work of Babai [Entropy versus pairwise independence (preliminary version), http://people.cs.uchicago.edu/~laci/papers/13augEntropy.pdf, 2013]. In this paper we improve some of his bounds, prove new bounds in a wider range of parameters and show matching upper bounds in some special cases. In particular, we prove tight lower bounds for the min-entropy (as well as the entropy) of pairwise and three-wise independent balanced binary variables for infinitely many values of $n$.
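To see why the joint entropy can fall far below $n$, consider the classic XOR construction (not necessarily the construction used in the paper): from $k$ independent uniform bits, form one variable per nonempty subset of the seed bits by XOR-ing that subset. This yields $n = 2^k - 1$ pairwise independent balanced binary variables whose joint entropy is only $k = \log_2(n+1)$ bits. The sketch below verifies these properties exhaustively for $k = 3$; the variable names are illustrative, not from the paper.

```python
from itertools import product, combinations
from collections import Counter
import math

k = 3                                  # number of independent uniform seed bits
subsets = list(range(1, 2**k))         # nonempty subsets of seed bits, as bitmasks

def parity(x):
    # XOR of the bits of x
    return bin(x).count("1") % 2

# For each of the 2**k equally likely seeds, evaluate all n = 2**k - 1 variables.
samples = [tuple(parity(seed & s) for s in subsets) for seed in range(2**k)]
n = len(subsets)                       # n = 7 variables

# Each variable is balanced: it takes 0 and 1 equally often over the seeds.
for i in range(n):
    counts = Counter(row[i] for row in samples)
    assert counts[0] == counts[1] == 2**(k - 1)

# Every pair is independent: the joint distribution is uniform on {0,1}^2.
# (X_S xor X_T = X_{S symmetric-difference T}, which is itself balanced.)
for i, j in combinations(range(n), 2):
    counts = Counter((row[i], row[j]) for row in samples)
    assert all(counts[v] == 2**k // 4 for v in product([0, 1], repeat=2))

# The joint entropy is only k = log2(n + 1) bits, far below n.
joint = Counter(samples)
H = -sum((c / 2**k) * math.log2(c / 2**k) for c in joint.values())
print(n, H)                            # 7 variables, joint entropy 3.0 bits
```

Running it confirms seven pairwise independent balanced bits with joint entropy 3 bits; the paper's lower bounds quantify how close to this kind of logarithmic behavior one can get, also for $d > 2$.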