[1] L. L. Campbell:
Characterization of entropy of probability distributions on the real line. Information and Control 21 (1972), 329-338.
MR 0323447
[2] A. Hobson:
A new theorem of information theory. J. Stat. Phys. 1 (1969), 383-391.
MR 0321609
[3] P. L. Kannappan:
On Shannon entropy, directed divergence and inaccuracy. Z. Wahrs. verw. Geb. 22 (1972), 95-100.
MR 0308635
[4] S. Kullback:
Information theory and statistics. Dover Publications, Inc., New York (1959).
MR 0103557 | Zbl 0088.10406
[5] P. N. Rathie, P. L. Kannappan:
A directed divergence function of type $\beta$. Information and Control 20 (1972), 38-45.
MR 0414234
[6] A. Rényi:
On measures of entropy and information. Proc. 4th Berkeley Symposium on Probability and Stat., Berkeley 1961, Vol. 1, 547-561.
MR 0132570
[7] B. D. Sharma, R. Autar:
Relative information functions and their type ($\alpha$, $\beta$) generalizations. Metrika 21 (1973).
MR 0363704
[8] B. D. Sharma, I. J. Taneja:
On axiomatic characterization of information-theoretic measures. J. Statist. Phys. 10 (1974), 4, 337-346.
MR 0403809
[9] I. Vajda:
On the amount of information contained in a sequence of independent observations. Kybernetika 6 (1970), 4, 306-324.
MR 0301831 | Zbl 0202.17802
[10] I. Vajda:
Limit theorem for total variation of Cartesian product measures. Studia Scientiarum Mathematicarum Hungarica 6 (1971), 317-333.
MR 0310950
[11] C. T. Ng:
Representation for measures of information with the branching property. Information and Control 25 (1974), 45-56.
MR 0342273 | Zbl 0279.94018