### MINIMUM DISTANCE REGRESSION-TYPE ESTIMATES WITH RATES UNDER WEAK DEPENDENCE
GEORGE G. ROUSSAS^{ 1} AND YANNIS G. YATRACOS^{ 2}

^{1} *Division of Statistics, University of California, Davis, CA 95616-8705, U.S.A.*

^{2} *Département de mathématiques et de statistique, Université de Montréal, C.P. 6128, succursale A, Montréal, Québec, Canada H3C 3J7*

and University of California, Santa Barbara
(Received June 1, 1992; revised June 5, 1995)

**Abstract.**
Under weak dependence, a minimum distance
estimate is obtained for a smooth function and its derivatives in a
regression-type framework. The upper bound of the risk depends on
the Kolmogorov entropy of the underlying space and the mixing
coefficient. It is shown that the proposed estimates have the same
rate of convergence, in the *L*_{1}-norm sense, as in the independent
case.
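As a rough illustration of the minimum distance idea described above (not the paper's actual construction): given regression-type observations, one selects from a finite candidate class — a net over the function space, whose size is governed by the Kolmogorov entropy — the function closest to the data in empirical *L*_{1} distance. The sketch below is a minimal toy version; the candidate class, noise model, and all names are hypothetical, and the weakly dependent errors of the paper are idealized here as i.i.d.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)

# Hypothetical smooth regression function and noisy observations.
f_true = lambda t: np.sin(2 * np.pi * t)
y = f_true(x) + 0.1 * rng.standard_normal(x.size)

# A crude finite candidate class (stand-in for an entropy-controlled net).
candidates = [lambda t, a=a: np.sin(2 * np.pi * a * t) for a in (0.5, 1.0, 2.0)]

def empirical_l1(g):
    """Empirical L1 distance between a candidate g and the observations."""
    return np.mean(np.abs(g(x) - y))

# Minimum distance estimate: the candidate closest to the data in L1.
best = min(candidates, key=empirical_l1)
```

Here `best` recovers the candidate with frequency `a = 1.0`, matching `f_true`; in the paper the rate of convergence of such estimates is tied to the size (entropy) of the net and the mixing coefficient.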

*Key words and phrases*:
Kolmogorov's entropy, minimum
distance estimation, nonparametric regression, *φ*-mixing,
rate of convergence.
