[1] M. W. Birch: A new proof of the Pearson-Fisher Theorem. Ann. Math. Statist. 35 (1964), 817-824. MR 0169324, Zbl 0259.62017
[2] J. Burbea, C. R. Rao: Entropy differential metric, distance and divergence measures in probability spaces: A unified approach. J. Multivariate Anal. 12 (1982), 575-596. MR 0680530, Zbl 0526.60015
[4] I. Csiszár: Generalized entropy and quantization problems. In: Trans. of the Sixth Prague Conference, Academia, Prague 1973, pp. 159-174. MR 0359995
[5] S. G. Ghurye, B. Johnson: Discrete approximations to the information integral. Canad. J. Statist. 9 (1981), 27-37. MR 0638384, Zbl 0473.62007
[6] D. Morales, L. Pardo, M. Salicrú, M. L. Menéndez: Information measures associated to R-divergences. In: Multivariate Analysis: Future Directions 2 (C. M. Cuadras and C. R. Rao, eds.), Elsevier Science Publishers B.V., 1993.
[7] M. Salicrú, M. L. Menéndez, L. Pardo, D. Morales: Asymptotic distribution of $(h,\phi)$-entropies. Comm. Statist. Theory Methods (to appear). MR 1238377
[8] I. J. Taneja: On generalized information measures and their applications. Adv. Elect. and Elect. Phys. 76 (1989), 327-413.
[9] I. Vajda, K. Vašek: Majorization, concave entropies and comparison of experiments. Problems Control Inform. Theory 14 (1985), 105-115. MR 0806056
[10] K. Zografos, K. Ferentinos, T. Papaioannou: Discrete approximations to the Csiszár, Rényi, and Fisher measures of information. Canad. J. Statist. 14 (1986), 355-366. MR 0876762