[1] A. Barron, L. Györfi, E. van der Meulen:
Distribution estimation consistent in total variation and in two types of information divergence. IEEE Trans. Inform. Theory 38 (1992), 1437-1454.
MR 1178189
[2] T. Berger:
Rate Distortion Theory: A Mathematical Basis for Data Compression. Prentice-Hall, Englewood Cliffs, NJ 1971.
MR 0408988
[3] H. Chernoff, E. L. Lehmann:
The use of maximum likelihood estimates in $\chi^2$ tests of goodness of fit. Ann. Math. Statist. 25 (1954), 579-586.
MR 0065109, Zbl 0056.37103
[4] B. S. Clarke, A. R. Barron:
Information-theoretic asymptotics and Bayes methods. IEEE Trans. Inform. Theory 36 (1990), 453-471.
MR 1053841
[6] N. Cressie, T. R. C. Read:
Multinomial goodness-of-fit tests. J. Roy. Statist. Soc. Ser. B 46 (1984), 440-464.
MR 0790631, Zbl 0571.62017
[7] I. Csiszár:
Information-type measures of difference of probability distributions and their indirect observation. Studia Sci. Math. Hungar. 2 (1967), 299-318.
MR 0219345
[8] I. Csiszár:
Generalized cutoff rates and Rényi's information measures. IEEE Trans. Inform. Theory 41 (1995), 26-34.
MR 1366742, Zbl 0822.94003
[9] R. C. Dahiya, J. Gurland:
Pearson chi-squared test of fit with random intervals. Biometrika 59 (1972), 147-153.
MR 0314191, Zbl 0232.62017
[10] A. Gersho, R. M. Gray: Vector Quantization and Signal Compression. Kluwer, Boston 1991.
[11] L. Györfi, I. Vajda, E. van der Meulen:
Minimum Hellinger distance point estimates consistent under weak family regularity. Math. Methods Statist. 3 (1994), 25-45.
MR 1272629
[12] L. Györfi, I. Vajda, E. van der Meulen:
Parameter estimation by projecting on structural families. In: Proc. 5th Prague Symp. on Asympt. Statistics (P. Mandl and H. Hušková, eds.), Physica Verlag, Wien 1994, pp. 261-272.
MR 1311945
[13] W. C. M. Kallenberg, J. Oosterhoff, B. F. Schriever:
The number of classes in chi-squared goodness of fit tests. J. Amer. Statist. Assoc. 80 (1985), 959-968.
MR 0819601
[14] M. Menéndez, D. Morales, L. Pardo, I. Vajda:
Divergence-based estimation and testing of statistical models of classification. J. Multivariate Anal. 54 (1995), 329-354.
MR 1345543
[16] D. S. Moore:
A chi-square statistic with random cell boundaries. Ann. Math. Statist. 42 (1971), 147-156.
MR 0275601
[17] D. Morales, L. Pardo, I. Vajda:
Asymptotic divergence of estimates of discrete distributions. J. Statist. Plann. Inference 49 (1995).
MR 1368984
[18] F. Österreicher, I. Vajda:
Statistical information and discrimination. IEEE Trans. Inform. Theory 39 (1993), 1036-1039.
MR 1237725
[19] F. H. Ruymgaart:
A note on chi-square statistics with random cell boundaries. Ann. Statist. 3 (1975), 965-968.
MR 0378183, Zbl 0325.62015
[20] M. Teboulle, I. Vajda:
Convergence of best $\phi$-entropy estimates. IEEE Trans. Inform. Theory 39 (1993), 297-301.
MR 1211512, Zbl 0765.94001
[21] I. Vajda: From perceptron to Boltzmann machine: Information processing by cognitive networks. In: Proc. of the Third European School of System Sciences (I. Figuearas, A. Moncho and R. Torres, eds.), Univ. of Valencia, Valencia 1994, pp. 65-68.
[22] A. Veselý, I. Vajda: Classification of random signals by neural networks. In: Proc. of 14th Internat. Congress of Cybernetics, University of Namur, Namur 1996, in press.