[1] S. I. Amari:
A foundation of information geometry. Electronics and Communications in Japan 66-A (1983), 1–10.
MR 0747878
[2] C. Atkinson, A. F. S. Mitchell:
Rao’s distance measure. Sankhyā Ser. A 43 (1981), 345–365.
MR 0665876
[4] J. Burbea:
Informative geometry of probability spaces. Expositiones Mathematicae 4 (1986), 347–378.
MR 0867963
[5] J. Burbea, C. R. Rao:
Entropy differential metric, distance and divergence measures in probability spaces: A unified approach. J. Multivariate Analysis 12 (1982a), 575–596.
DOI 10.1016/0047-259X(82)90065-3
MR 0680530
[6] J. Burbea, C. R. Rao:
On the convexity of some divergence measures based on entropy functions. IEEE Trans. Inform. Theory IT-28 (1982b), 489–495.
MR 0672884
[7] J. Burbea, C. R. Rao:
On the convexity of higher order Jensen differences based on entropy functions. IEEE Trans. Inform. Theory IT-28 (1982c), 961–963.
MR 0687297
N. N. Čencov:
Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs 53, American Mathematical Society, Providence 1982.
MR 0645898
[9] I. Csiszár:
Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299–318.
MR 0219345
[10] K. Ferentinos, T. Papaioannou:
New parametric measures of information. Information and Control 51 (1981), 193–208.
MR 0686839
[11] C. Ferreri:
Hypoentropy and related heterogeneity divergence measures. Statistica (Bologna) 40 (1980), 55–118.
MR 0586545
[12] J. Havrda, F. Charvát: Concept of structural $\alpha$-entropy. Kybernetika 3 (1967), 30–35.
[13] D. Morales, L. Pardo, L. Salicrú, M. L. Menéndez:
New parametric measures of information based on generalized $R$-divergences. 1993, pp. 473–488.
MR 1268437
[14] R. J. Muirhead:
Aspects of Multivariate Statistical Theory. Wiley, New York 1982.
MR 0652932
[15] O. Onicescu:
Énergie informationnelle. C. R. Acad. Sci. Paris 263 (1966), 841–842.
MR 0229478
[16] C. R. Rao:
Information and the accuracy attainable in the estimation of statistical parameters. Bull. Calcutta Math. Soc. 37 (1945), 81–91.
MR 0015748
[17] C. R. Rao: Differential metrics in probability spaces. In: Differential Geometry in Statistical Inference, IMS Lecture Notes-Monograph Series 10, Institute of Mathematical Statistics, Hayward 1987.
[18] A. Rényi:
On measures of entropy and information. In: Proc. Fourth Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, Univ. of California Press, Berkeley 1961, pp. 547–561.
MR 0132570
[19] M. Salicrú, M. L. Menéndez, D. Morales, L. Pardo:
Asymptotic distribution of $(h,\Phi)$-entropies. Comm. Statist. Theory Methods 22 (1993), no. 7, 2015–2031.
MR 1238377
[21] B. D. Sharma, I. J. Taneja:
Entropy of type $(\alpha, \beta)$ and other generalized measures in information theory. Metrika 22 (1975), 205–215.
MR 0398670
[22] B. D. Sharma, P. Mittal:
New non-additive measures of relative information. Vol. 2 (1975), 122–133.
MR 0476167
[23] I. J. Taneja: A study of generalized measures in information theory. Ph.D. Thesis, University of Delhi, 1975.
[24] I. J. Taneja: On generalized information measures and their applications. Advances in Electronics and Electron Physics 76 (1989), 327–413.
[25] I. Vajda, K. Vašek:
Majorization, concave entropies and comparison of experiments. Problems Control Inform. Theory 14 (1985), 105–115.
MR 0806056
[26] J. C. A. Van der Lubbe: $R$-norm information and a general class of measures for certainty and information. M.Sc. Thesis, Delft University of Technology, Dept. of Electrical Engineering, 1977. (In Dutch.)
[27] J. C. A. Van der Lubbe: A generalized probabilistic theory of the measurement of certainty and information. Ph.D. Thesis, Delft University of Technology, Dept. of Electrical Engineering, 1981.
[28] R. S. Varma:
Generalizations of Rényi’s entropy of order $\alpha$. J. Math. Sci. 1 (1966), 34–48.
MR 0210515
Zbl 0166.15401