Article
Keywords:
minimum $\phi $-divergence estimation; subdivergence; superdivergence; PC simulation; relative efficiency; robustness
Summary:
Point estimators based on the minimization of information-theoretic divergences between the empirical and a hypothetical distribution run into a problem when working with continuous families, which are measure-theoretically orthogonal to the family of empirical distributions. In this case the $\phi$-divergence always equals its upper bound, and the minimum $\phi$-divergence estimates are trivial. Broniatowski and Vajda [3] proposed several modifications of the minimum divergence rule to overcome this problem. We examine these new estimation methods with respect to consistency, robustness and efficiency through an extended simulation study. We focus on the well-known family of power divergences parametrized by $\alpha \in \mathbb{R}$ in the Gaussian model, and we perform a comparative computer simulation for several randomly selected contaminated and uncontaminated data sets, different sample sizes and different $\phi$-divergence parameters.
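For orientation, the power-divergence family mentioned above is commonly defined as follows (this is the usual Liese–Vajda convention; the paper's exact normalization may differ), where $p$ and $q$ denote densities of $P$ and $Q$ with respect to a dominating measure $\lambda$:
\[
  D_{\alpha}(P,Q) \;=\; \int \phi_{\alpha}\!\left(\frac{p}{q}\right) q \,\mathrm{d}\lambda,
  \qquad
  \phi_{\alpha}(t) \;=\; \frac{t^{\alpha} - \alpha t + \alpha - 1}{\alpha(\alpha-1)},
  \quad \alpha \in \mathbb{R}\setminus\{0,1\},
\]
with the limiting cases obtained by continuity,
\[
  \phi_{1}(t) = t\log t - t + 1 \quad \text{(Kullback--Leibler divergence)},
  \qquad
  \phi_{0}(t) = -\log t + t - 1 \quad \text{(reverse Kullback--Leibler divergence)},
\]
so that, e.g., $\alpha=2$ yields the Pearson $\chi^2$-divergence and $\alpha=1/2$ a multiple of the squared Hellinger distance.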
References:
[1] M. Broniatowski, A. Keziou: Minimization of $\phi $-divergences on sets of signed measures. Studia Sci. Math. Hungar. 43 (2006), 403-442. MR 2273419 | Zbl 1121.28004
[3] M. Broniatowski, I. Vajda: Several Applications of Divergence Criteria in Continuous Families. Research Report No. 2257. Institute of Information Theory and Automation, Prague 2009.
[4] I. Frýdlová: Minimum Kolmogorov Distance Estimators. Diploma Thesis. Czech Technical University, Prague 2004.
[5] I. Frýdlová: Modified Power Divergence Estimators and Their Performances in Normal Models. In: Proc. FernStat2010, Faculty of Social and Economic Studies UJEP, Ústí n. L. 2010, 28-33.
[9] I. Vajda: Theory of Statistical Inference and Information. Kluwer, Boston 1989. Zbl 0711.62002