(Received June 1, 1992; revised June 5, 1995)
Abstract. Under weak dependence, minimum distance estimates are obtained for a smooth function and its derivatives in a regression-type framework. The upper bound on the risk depends on the Kolmogorov entropy of the underlying space and on the mixing coefficient. It is shown that the proposed estimates attain the same rate of convergence, in the L1-norm sense, as in the independent case.
Key words and phrases: Kolmogorov entropy, minimum distance estimation, nonparametric regression, phi-mixing, rate of convergence.