AISM 53, 277-288
© 2001 ISM

Minimum divergence estimators based on grouped data

M. Menéndez1, D. Morales2, L. Pardo3 and I. Vajda4

1Department of Applied Mathematics, Technical University of Madrid, 28040 Madrid, Spain
2Operations Research Center, Miguel Hernández University of Elche, 03202 Elche, Spain
3Department of Statistics & O. R., Complutense University of Madrid, 28040 Madrid, Spain
4Institute of Information Theory, Academy of Sciences of the Czech Republic, CZ-18208 Prague, Czech Republic

(Received September 8, 1997; revised July 9, 1999)

Abstract.    The paper considers statistical models with real-valued observations that are i.i.d. according to $F(x,\theta_{0})$ from a family of distribution functions $(F(x,\theta);\theta\in\Theta)$, $\Theta\subset R^{s}$, $s\geq1$. For random quantizations defined by sample quantiles $(F_{n}^{-1}(\lambda _{1}),\ldots, F_{n}^{-1}(\lambda_{m-1}))$ of arbitrary fixed orders $0<\lambda _{1}<\cdots<\lambda_{m-1}<1$, the paper studies estimators $\theta_{\phi,n}$ of $\theta_{0}$ that minimize $\phi$-divergences between the theoretical and empirical cell probabilities. Under appropriate regularity conditions, all these estimators are shown to be as efficient (first order, in the sense of Rao) as the MLE in the model quantized nonrandomly by $(F^{-1}(\lambda_{1},\theta_{0}),\ldots,F^{-1}(\lambda_{m-1},\theta_{0}))$. Moreover, the Fisher information matrix $I_{m}(\theta_{0},\boldsymbol{\lambda})$ of the latter model with the equidistant orders $\boldsymbol{\lambda}=(\lambda_{j}=j/m: 1\leq j\leq m-1)$ approximates the Fisher information ${\cal{J}}(\theta_{0})$ of the original model arbitrarily closely when $m$ is sufficiently large. Thus random binning by a large number of quantiles of equidistant orders leads to nearly efficient estimates of the type considered above.
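To fix ideas, the following is a minimal sketch of the criterion described in the abstract, written with the standard Csiszár $\phi$-divergence; the cell notation $p_j(\theta)$, $\hat{p}_j$ and the argument order of $D_\phi$ are choices made here for illustration and may differ from the paper's own conventions. With $\lambda_0=0$, $\lambda_m=1$, $F_n^{-1}(\lambda_0)=-\infty$ and $F_n^{-1}(\lambda_m)=+\infty$, the sample quantiles partition the real line into $m$ cells with theoretical probabilities

\[
  p_j(\theta) \;=\; F\big(F_n^{-1}(\lambda_j),\theta\big) - F\big(F_n^{-1}(\lambda_{j-1}),\theta\big),
  \qquad 1 \le j \le m,
\]

while the empirical cell probabilities are $\hat{p}_j = \lambda_j - \lambda_{j-1}$ up to $O(1/n)$ rounding, since about $n(\lambda_j-\lambda_{j-1})$ of the $n$ observations fall in the $j$-th cell. For a convex $\phi\colon(0,\infty)\to R$ with $\phi(1)=0$, the estimator minimizes the $\phi$-divergence of the two cell distributions:

\[
  D_\phi\big(p(\theta),\hat{p}\big) \;=\; \sum_{j=1}^{m} \hat{p}_j\,
  \phi\!\left(\frac{p_j(\theta)}{\hat{p}_j}\right),
  \qquad
  \theta_{\phi,n} \;=\; \arg\min_{\theta\in\Theta} D_\phi\big(p(\theta),\hat{p}\big).
\]

Particular choices of $\phi$ recover familiar criteria, e.g. $\phi(t)=t\log t$ gives a grouped-data likelihood-type criterion and $\phi(t)=(t-1)^2$ a chi-square-type criterion.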

Key words and phrases:    Minimum divergence estimators, random quantization, asymptotic normality, efficiency, Fisher information, optimization.
