(Received June 28, 1988; revised March 30, 1990)
Abstract. In maximizing a non-linear function G(theta), it is well known that the steepest descent method has a slow convergence rate. Here we propose a systematic procedure for obtaining a one-to-one transformation of the variables theta such that, in the space of the transformed variables, the steepest descent method produces the solution faster. The final solution in the original space is obtained by taking the inverse transformation. We apply the procedure to maximizing the likelihood functions of some generalized distributions which are widely used in modeling count data. It is shown that, for these distributions, the steepest descent method via transformations produces the solutions very rapidly. It is also observed that the proposed procedure can expedite the convergence of other first-derivative-based algorithms, such as the Polak-Ribiere and Fletcher-Reeves conjugate gradient methods.
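The idea of the procedure can be illustrated with a minimal sketch (this toy example is our own, not one of the paper's transformations): steepest ascent is run in a transformed space phi = h(theta), with the gradient mapped through the chain rule, and the maximizer is recovered by the inverse transformation. Here G is an ill-conditioned quadratic with maximum at (1, 2), and a simple rescaling phi = (theta_1, 10*theta_2) makes its contours spherical, so a fixed-step steepest ascent that would diverge in the original coordinates converges immediately in the transformed ones.

```python
# Steepest ascent via a one-to-one variable transformation (illustrative
# sketch; the transformation and objective are assumptions, not the paper's).
# G(theta) = -(t1 - 1)^2 - 100*(t2 - 2)^2 has its maximum at theta = (1, 2)
# and is badly conditioned in the original coordinates.

def grad_G(t1, t2):
    """Gradient of G with respect to the original variables theta."""
    return (-2.0 * (t1 - 1.0), -200.0 * (t2 - 2.0))

# One-to-one transformation phi = (t1, 10*t2) and its inverse. In phi-space
# the objective becomes -(p1 - 1)^2 - (p2 - 20)^2, which is well conditioned.
def to_phi(t1, t2):
    return (t1, 10.0 * t2)

def from_phi(p1, p2):
    return (p1, p2 / 10.0)

def steepest_ascent_in_phi(p1, p2, step=0.5, iters=50):
    """Fixed-step steepest ascent on the transformed variables."""
    for _ in range(iters):
        t1, t2 = from_phi(p1, p2)
        g1, g2 = grad_G(t1, t2)
        # Chain rule: dG/dphi_i = dG/dtheta_i * dtheta_i/dphi_i.
        p1 += step * g1 * 1.0
        p2 += step * g2 * 0.1
    return p1, p2

phi_star = steepest_ascent_in_phi(*to_phi(0.0, 0.0))
theta_star = from_phi(*phi_star)   # inverse transformation recovers theta
print(theta_star)                  # -> (1.0, 2.0)
```

With the same fixed step, steepest ascent applied directly to theta diverges (the update for t2 overshoots by a factor of 100), which is precisely the slow-convergence problem the transformation is meant to remove.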
Key words and phrases: Generalized distributions, log-likelihood functions, steepest descent method, conjugate gradient method.