Article
Keywords:
unified $(r,s)$-entropy measure; order statistics; Shannon entropy; logistic distribution.
Summary:
K. M. Wong and S. Chen [9] analyzed the Shannon entropy of a sequence of random variables under order restrictions. These results are generalized here using the unified $(r,s)$-entropies of I. J. Taneja [8]. Upper and lower bounds on the entropy reduction caused by ordering the sequence are derived, together with the conditions under which they are attained. Theorems are presented characterizing the difference between the average entropy of the individual order statistics and the entropy of a member of the original independent identically distributed (i.i.d.) population. Finally, the entropies of the individual order statistics are studied when the probability density function (p.d.f.) of the original i.i.d. sequence is symmetric about its mean.
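As a numeric illustration of the kind of entropy computation involved (a generic sketch, not the paper's derivation): the standard logistic distribution named in the keywords has differential Shannon entropy exactly $2$ nats, and a short Monte Carlo estimate of $h(X) = -\mathbb{E}[\log f(X)]$ recovers this value.

```python
import math
import random

random.seed(0)

# Standard logistic distribution: f(x) = e^{-x} / (1 + e^{-x})^2.
# Its differential Shannon entropy is known in closed form: h = 2 nats.

def logistic_sample():
    # Inverse-CDF sampling: F^{-1}(u) = log(u / (1 - u)).
    u = random.random()
    return math.log(u / (1.0 - u))

def logistic_log_pdf(x):
    # log f(x) = -x - 2 log(1 + e^{-x})
    return -x - 2.0 * math.log1p(math.exp(-x))

# Monte Carlo estimate of h(X) = -E[log f(X)].
n = 200_000
h_est = -sum(logistic_log_pdf(logistic_sample()) for _ in range(n)) / n
print(h_est)  # close to 2
```

The same estimator applied to samples of an individual order statistic (sort a block of i.i.d. draws and keep one position) exhibits the entropy differences the paper bounds analytically.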
References:
[2] N. Balakrishnan and A. C. Cohen:
Order statistics and inference: estimation methods. Academic Press, 1991.
MR 1084812
[3] I. S. Gradshteyn and I. M. Ryzhik:
Table of integrals, series and products. Academic Press, 1980.
MR 1398882
[4] J. Havrda and F. Charvát:
Quantification method of classification processes: concept of structural $\alpha $-entropy. Kybernetika 3 (1967), 30–35.
MR 0209067
[5] A. Rényi:
On measures of entropy and information. Proc. 4th Berkeley Symp. Math. Statist. and Prob. 1 (1961), 547–561.
MR 0132570 |
Zbl 0106.33001
[7] B. D. Sharma and D. P. Mittal:
New nonadditive measures of entropy for discrete probability distributions. J. Math. Sci. 10 (1975), 28–40.
MR 0539493
[9] K. M. Wong and S. Chen:
The entropy of ordered sequences and order statistics. IEEE Transactions on Information Theory 36(2) (1990), 276–284.
DOI 10.1109/18.52473 |
MR 1052779