[1] L. Baladová: Minimum of average conditional entropy for given minimum probability of error. Kybernetika 2 (1966), 416-422. MR 0215641 | Zbl 0199.21502
[7] R. M. Fano: Transmission of Information: A Statistical Theory of Communications. MIT Press and John Wiley & Sons, New York 1961. MR 0134389 | Zbl 0151.24402
[10] M. Gell-Mann, C. Tsallis, eds.: Nonextensive Entropy - Interdisciplinary Applications. Oxford University Press, Oxford 2004. MR 2073730 | Zbl 1127.82004
[11] G. H. Hardy, J. E. Littlewood, G. Pólya: Inequalities. Cambridge University Press, London 1934. Zbl 0634.26008
[12] J. Havrda, F. Charvát: Quantification methods of classification processes: concept of structural $\alpha$-entropy. Kybernetika 3 (1967), 30-35. MR 0209067
[15] A. Novikov: Optimal sequential procedures with Bayes decision rules. Kybernetika 46 (2010), 754-770. MR 2722099 | Zbl 1201.62095
[16] A. Perez: Information-theoretic risk estimates in statistical decision. Kybernetika 3 (1967), 1-21. MR 0208775 | Zbl 0153.48403
[19] A. E. Rastegin: Continuity estimates on the Tsallis relative entropy. E-print arXiv:1102.5154v2 [math-ph] (2011). MR 2841748
[20] A. E. Rastegin: Fano type quantum inequalities in terms of $q$-entropies. Quantum Information Processing (2011), doi 10.1007/s11128-011-0347-6.
[21] A. Rényi: On measures of entropy and information. In: Proc. 4th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley - Los Angeles 1961, pp. 547-561. MR 0132570 | Zbl 0106.33001
[22] A. Rényi: On the amount of missing information in a random variable concerning an event. J. Math. Sci. 1 (1966), 30-33. MR 0210263
[24] A. Rényi: On some basic problems of statistics from the point of view of information theory. In: Proc. 5th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley - Los Angeles 1967, pp. 531-543. MR 0212963 | Zbl 0201.51905
[27] I. Vajda: On the statistical decision problem with discrete parameter space. Kybernetika 3 (1967), 110-126. MR 0215428
[28] I. Vajda: Bounds of the minimal error probability on checking a finite or countable number of hypotheses. Problemy Peredachi Informatsii 4 (1968), 9-19 (in Russian); translated as Problems of Information Transmission 4 (1968), 6-14. MR 0267685
[29] K. Życzkowski: Rényi extrapolation of Shannon entropy. Open Sys. Inform. Dyn. 10 (2003), 297-310; corrigendum in the e-print version arXiv:quant-ph/0305062v2. MR 1998623 | Zbl 1030.94022