[2] A. Basu, S. Sarkar:
The trade-off between robustness and efficiency and the effect of model smoothing. J. Statist. Comput. Simulation 50 (1994), 173–185.
DOI 10.1080/00949659408811609
[4] E. Bofinger:
Goodness-of-fit using sample quantiles. J. Roy. Statist. Soc. Ser. B 35 (1973), 277–284.
MR 0336896
[5] H. Cramér:
Mathematical Methods of Statistics. Princeton University Press, Princeton, 1946.
MR 0016588
[6] N. A. C. Cressie, T. R. C. Read:
Multinomial goodness-of-fit tests. J. Roy. Statist. Soc. Ser. B 46 (1984), 440–464.
MR 0790631
[7] R. A. Fisher: Statistical Methods for Research Workers (8th edition). Oliver and Boyd, London, 1941.
[8] F. Liese, I. Vajda:
Convex Statistical Distances. Teubner, Leipzig, 1987.
MR 0926905
[9] B. G. Lindsay:
Efficiency versus robustness: The case for minimum Hellinger distance and related methods. Ann. Statist. 22 (1994), 1081–1114.
DOI 10.1214/aos/1176325512
MR 1292557
[10] M. L. Menéndez, D. Morales, L. Pardo and I. Vajda:
Two approaches to grouping of data and related disparity statistics. Comm. Statist. Theory Methods 27 (1998), 609–633.
DOI 10.1080/03610929808832117
MR 1619038
[11] M. L. Menéndez, D. Morales, L. Pardo and I. Vajda:
Minimum divergence estimators based on grouped data. Ann. Inst. Statist. Math. 53 (2001), 277–288.
DOI 10.1023/A:1012466605316
MR 1841136
[13] J. Neyman:
Contribution to the theory of the $\chi ^2$ test. In: Proc. Berkeley Symp. Math. Statist. Probab., University of California Press, Berkeley, CA, 1949, pp. 239–273.
MR 0028003
[14] Ch. Park, A. Basu and S. Basu:
Robust minimum distance inference based on combined distances. Comm. Statist. Simulation Comput. 24 (1995), 653–673.
DOI 10.1080/03610919508813265
[15] C. R. Rao:
Asymptotic efficiency and limiting information. In: Proc. 4th Berkeley Symp. Math. Statist. Probab., University of California Press, Berkeley, CA, 1961, pp. 531–545.
MR 0133192
Zbl 0156.39802
[16] C. R. Rao:
Linear Statistical Inference and its Applications (2nd edition). Wiley, New York, 1973.
MR 0346957
[17] T. R. C. Read, N. A. C. Cressie:
Goodness-of-Fit Statistics for Discrete Multivariate Data. Springer-Verlag, New York, 1988.
MR 0955054
[19] I. Vajda:
$\chi ^2$-divergence and generalized Fisher information. In: Transactions of the Sixth Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, Academia, Prague, 1973, pp. 223–234.
MR 0356302
Zbl 0297.62003
[20] I. Vajda:
Theory of Statistical Inference and Information. Kluwer Academic Publishers, Boston, 1989.
Zbl 0711.62002
[21] B. L. van der Waerden: Mathematische Statistik. Springer-Verlag, Berlin, 1957.
[22] K. Pearson: On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling. Philosophical Magazine 50 (1900), 157–172.
[23] I. Csiszár:
Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten. Publications of the Mathematical Institute of the Hungarian Academy of Sciences, Series A 8 (1963), 85–108.
MR 0164374
[24] S. M. Ali, S. D. Silvey:
A general class of coefficients of divergence of one distribution from another. J. Roy. Statist. Soc. Ser. B 28 (1966), 131–140.
MR 0196777
[25] A. Rényi:
On measures of entropy and information. In: Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, Vol. 1, University of California Press, Berkeley, 1961, pp. 547–561.
MR 0132570
[26] A. W. Marshall, I. Olkin:
Inequalities: Theory of Majorization and its Applications. Academic Press, New York, 1979.
MR 0552278