### MINIMUM DISPARITY ESTIMATION FOR CONTINUOUS MODELS: EFFICIENCY, DISTRIBUTIONS AND ROBUSTNESS
AYANENDRANATH BASU^{1} AND BRUCE G. LINDSAY^{2}

^{1} *Department of Mathematics, University of Texas at Austin, Austin, TX 78712-1082, U.S.A.*

^{2} *Department of Statistics, Pennsylvania State University, University Park, PA 16802, U.S.A.*
(Received August 9, 1993; revised March 3, 1994)

**Abstract.**
A general class of minimum distance estimators for continuous
models, called minimum disparity estimators, is introduced. The
conventional technique is to minimize a distance between a kernel
density estimator and the model density. A new approach is introduced
here in which the model and the data are smoothed with the same
kernel. This makes the methods consistent and asymptotically normal
independently of the value of the smoothing parameter; convergence
properties of the kernel density estimate are no longer necessary.
All the minimum distance estimators considered are shown to be first
order efficient provided the kernel is chosen appropriately.
Different minimum disparity estimators are compared based on their
characterizing residual adjustment function (*RAF*); this function
shows that the robustness features of the estimators can be explained
by the shrinkage of certain residuals towards zero. The value of the
second derivative of the *RAF* at zero, *A*_{2}, provides the
trade-off between efficiency and robustness. The above properties are
demonstrated both by theorems and by simulations.
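As a rough illustration of the smoothing idea described above, the following sketch applies the same Gaussian kernel to both the data (via a kernel density estimate) and the model, then minimizes the squared Hellinger distance, one member of the disparity class, between the two smoothed densities. The normal model N(μ, 1), the bandwidth `h`, and all numerical choices are illustrative assumptions, not taken from the paper; the Gaussian model is convenient here because its convolution with a Gaussian kernel has a closed form.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# Simulated data from N(2, 1); the model family is N(mu, 1).
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=200)

h = 0.5                          # illustrative kernel bandwidth
grid = np.linspace(-3.0, 7.0, 400)
dx = grid[1] - grid[0]

# Kernel density estimate of the data using a Gaussian kernel of scale h.
f_star = norm.pdf(grid[:, None], loc=data[None, :], scale=h).mean(axis=1)

def smoothed_model(mu):
    # N(mu, 1) convolved with the same Gaussian kernel N(0, h^2) is
    # N(mu, 1 + h^2), so the model is smoothed in closed form.
    return norm.pdf(grid, loc=mu, scale=np.sqrt(1.0 + h**2))

def hellinger_sq(mu):
    # Squared Hellinger distance between the smoothed data density and the
    # smoothed model density, approximated by a Riemann sum on the grid.
    g_star = smoothed_model(mu)
    return np.sum((np.sqrt(f_star) - np.sqrt(g_star)) ** 2) * dx

result = minimize_scalar(hellinger_sq, bounds=(0.0, 4.0), method="bounded")
print(f"minimum Hellinger distance estimate of mu: {result.x:.3f}")
```

Because both densities are smoothed with the same kernel, the estimate targets the true μ regardless of the bandwidth `h`, in line with the claim above that consistency no longer depends on the convergence of the kernel density estimate.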

*Key words and phrases*:
Disparity, Hellinger distance,
Pearson residuals,
*MLE*^{*}, robustness, efficiency, transparent kernels.
