Article
Keywords:
Jackknife estimator; least squares estimator; linear model; estimator of variance-covariance components; consistency
Summary:
Let $\theta^*$ be a biased estimator of the parameter $\vartheta$ based on all observations $x_1, \dots, x_n$ and let $\theta_{-i}^*$ ($i=1,2,\dots,n$) be the same estimator of $\vartheta$ obtained after deletion of the $i$-th observation. If the expectations of the estimators $\theta^*$ and $\theta_{-i}^*$ can be expressed as $$ \begin{aligned} \mathrm{E}(\theta^*)&=\vartheta+a(n)b(\vartheta), \\ \mathrm{E}(\theta_{-i}^*)&=\vartheta+a(n-1)b(\vartheta), \qquad i=1,2,\dots,n, \end{aligned} $$ where $a(n)$ is a known sequence of real numbers and $b(\vartheta)$ is a function of $\vartheta$, then this system of equations can be regarded as a linear model, and the least squares method yields the generalized jackknife estimator. In this way an unbiased estimator of the parameter $\vartheta$ is obtained.
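As a brief illustration (a sketch only, assuming ordinary least squares with uncorrelated, equal-variance errors; the notation $\bar\theta_{(\cdot)}^*$ for the mean of the deleted estimates is introduced here for convenience), the least squares solution of the above system for the intercept $\vartheta$ is $$ \hat\vartheta=\frac{a(n)\,\bar\theta_{(\cdot)}^*-a(n-1)\,\theta^*}{a(n)-a(n-1)}, \qquad \bar\theta_{(\cdot)}^*=\frac{1}{n}\sum_{i=1}^{n}\theta_{-i}^*, $$ and indeed $$ \mathrm{E}(\hat\vartheta)=\frac{a(n)\bigl(\vartheta+a(n-1)b(\vartheta)\bigr)-a(n-1)\bigl(\vartheta+a(n)b(\vartheta)\bigr)}{a(n)-a(n-1)}=\vartheta, $$ so $\hat\vartheta$ is unbiased. For $a(n)=1/n$ it reduces to the classical jackknife estimator $\hat\vartheta=n\theta^*-(n-1)\bar\theta_{(\cdot)}^*$.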