[1] Barbour, A. D., Holst, L., Janson, S.:
Poisson Approximation. Oxford Studies in Probability 2, Clarendon Press, Oxford 1992.
MR 1163825
[2] Cover, T. M., Thomas, J. A.:
Elements of Information Theory. Wiley Series in Telecommunications. John Wiley & Sons, New York 1991.
MR 1122806
[3] Csiszár, I., Shields, P.:
Information Theory and Statistics: A Tutorial. Foundations and Trends in Communications and Information Theory, Now Publishers Inc., 1 (2004), 4, 417-528.
MR 0886841
[4] Diaconis, P., Freedman, D.:
A dozen de Finetti-style results in search of a theory. Ann. Inst. Henri Poincaré 23 (1987), 2, 397-423.
MR 0898502
[5] Harremoës, P.:
Mutual information of contingency tables and related inequalities. In: 2014 IEEE International Symposium on Information Theory, IEEE 2014, pp. 2474-2478.
[6] Harremoës, P., Johnson, O., Kontoyiannis, I.:
Thinning and information projections. arXiv:1601.04255, 2016.
MR 2807322
[7] Harremoës, P., Ruzankin, P.:
Rate of convergence to Poisson law in terms of information divergence. IEEE Trans. Inform. Theory 50 (2004), 9, 2145-2149.
MR 2097199
[8] Matúš, F.:
Urns and entropies revisited. In: 2017 IEEE International Symposium on Information Theory (ISIT), IEEE 2017, pp. 1451-1454.
[9] Stam, A. J.:
Distance between sampling with and without replacement. Statistica Neerlandica 32 (1978), 2, 81-91.
MR 0518630