[1] I. Csiszár:
Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 (1967), 299–318.
MR 0219345
[2] I. Csiszár:
On topological properties of $f$-divergences. Studia Sci. Math. Hungar. 2 (1967), 329–339.
MR 0219346
[3] B. Fuglede, F. Topsøe:
Jensen–Shannon divergence and Hilbert space embedding. In: Proc. IEEE Internat. Symposium on Inform. Theory, IEEE Publications, New York 2004, p. 31.
[4] P. Kafka, F. Österreicher, I. Vincze:
On powers of $f$-divergences defining a distance. Studia Sci. Math. Hungar. 26 (1991), 415–422.
MR 1197090
[5] M. Khosravifard, D. Fooladivanda, T. A. Gulliver:
Confliction of the convexity and metric properties in $f$-divergences. IEICE Trans. on Fundamentals E90-A (2007), 1848–1853.
DOI 10.1093/ietfec/e90-a.9.1848
[6] V. Kůs, D. Morales, I. Vajda:
Extensions of the parametric families of divergences used in statistical inference. Kybernetika 44 (2008), 95–112.
MR 2405058 | Zbl 1142.62002
[11] F. Österreicher:
On a class of perimeter-type distances of probability distributions. Kybernetika 32 (1996), 389–393.
MR 1420130
[12] F. Österreicher, I. Vajda:
A new class of metric divergences on probability spaces and its statistical applications. Ann. Inst. Statist. Math. 55 (2003), 639–653.
DOI 10.1007/BF02517812 | MR 2007803
[14] I. Vincze:
On the concept and measure of information contained in an observation. In: Contributions to Probability (J. Gani and V. F. Rohatgi, eds.), Academic Press, New York 1981, pp. 207–214.
MR 0618690 | Zbl 0531.62002