AISM 52, 316-331
(Received November 20, 1997; revised October 26, 1998)
Abstract. Following a Markov chain approach, this paper establishes asymptotic properties of the least squares estimator in nonlinear autoregressive (NAR) models. Based on conditions ensuring the stability of the model and allowing the use of a strong law of large numbers for a wide class of functions, our approach improves some known results on strong consistency and asymptotic normality of the estimator. The exact convergence rate is established by a law of the iterated logarithm. Based on this law and a generalized Akaike information criterion, we build a strongly consistent procedure for the selection of NAR models. Detailed results are given for familiar nonlinear AR models such as exponential AR models, threshold models, and multilayer feedforward perceptrons.
Key words and phrases: Nonlinear AR process, least squares estimation, law of the iterated logarithm, model selection, multilayer perceptron.
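As an illustration of the estimation setting described in the abstract, the following sketch simulates an exponential AR(1) process and fits its parameters by nonlinear least squares. The model form X_t = (a + b·exp(-c·X_{t-1}²))·X_{t-1} + ε_t is a standard exponential AR specification; the parameter values, sample size, and use of `scipy.optimize.least_squares` are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Simulate an exponential AR(1) process (hypothetical parameter values):
#   X_t = (a + b * exp(-c * X_{t-1}^2)) * X_{t-1} + eps_t
a_true, b_true, c_true = 0.4, 0.5, 1.0
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = (a_true + b_true * np.exp(-c_true * x[t - 1] ** 2)) * x[t - 1] \
           + rng.normal(scale=0.5)

def residuals(theta, x):
    """One-step-ahead prediction errors of the exponential AR(1) model."""
    a, b, c = theta
    pred = (a + b * np.exp(-c * x[:-1] ** 2)) * x[:-1]
    return x[1:] - pred

# Least squares estimation: minimize the sum of squared prediction errors.
fit = least_squares(residuals, x0=[0.1, 0.1, 0.5], args=(x,))
print("estimates of (a, b, c):", fit.x)
```

Strong consistency of this estimator, in the sense studied in the paper, means the estimates approach the true parameters almost surely as the sample size grows; the law of the iterated logarithm then quantifies the exact rate of that convergence.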