### A MODIFIED STEEPEST DESCENT METHOD WITH APPLICATIONS TO MAXIMIZING LIKELIHOOD FUNCTIONS
MOHAMED HABIBULLAH^{1} AND S. K. KATTI^{2}

^{1} *Division of Sciences and Mathematics, University of Wisconsin-Superior,*

Superior, WI 54880, U.S.A.

^{2} *Department of Statistics, University of Missouri-Columbia,*

Columbia, MO 65211, U.S.A.
(Received June 28, 1988; revised March 30, 1990)

**Abstract.**
In maximizing a non-linear function *G*(*θ*), it is well known that
the steepest descent method has a slow convergence rate. Here we
propose a systematic procedure for obtaining a one-to-one
transformation of the variables *θ* so that, in the space of the
transformed variables, the steepest descent method produces the
solution faster. The final solution in the original space is obtained
by applying the inverse transformation. We apply the procedure to
maximizing the likelihood functions of some generalized distributions
that are widely used in modeling count data. For these distributions,
the steepest descent method via transformations produces the solutions
very quickly. It is also observed that the proposed procedure can be
used to expedite the convergence rate of first-derivative-based
algorithms, such as the Polak-Ribière and Fletcher-Reeves conjugate
gradient methods.
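The idea summarized above can be illustrated with a minimal sketch (not the authors' procedure): steepest ascent on a Poisson log-likelihood, carried out in a transformed parameter space *φ* = log *θ* and mapped back by the inverse transformation. The data, step size, and choice of transformation here are hypothetical, chosen only for illustration.

```python
# Minimal sketch: steepest ascent in a transformed parameter space.
# Model: counts x_i ~ Poisson(theta); log-likelihood (up to a constant)
# G(theta) = s*log(theta) - n*theta, with s = sum of counts, n = sample size.
# Transformation: phi = log(theta), so theta = exp(phi) (one-to-one).
import math

data = [2, 3, 1, 4, 2, 5, 3, 2]      # hypothetical count data
n, s = len(data), sum(data)

def grad_phi(phi):
    # Chain rule: dG/dphi = (dG/dtheta) * dtheta/dphi
    #           = (s/theta - n) * theta = s - n*exp(phi)
    return s - n * math.exp(phi)

phi, step = 0.0, 0.05                 # start at theta = exp(0) = 1
for _ in range(200):
    phi += step * grad_phi(phi)       # steepest ascent in phi-space

theta = math.exp(phi)                 # inverse transformation back to theta
# The Poisson MLE is the sample mean s/n, which the iteration approaches.
print(theta)
```

In this toy case the transformed gradient *s* − *n* e^*φ* is nearly linear in *φ* close to the maximum, so a fixed step size converges quickly, whereas ascent directly in *θ* would need a more carefully tuned step; this is the kind of conditioning improvement the abstract refers to.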

*Key words and phrases*:
Generalized distributions,
log-likelihood functions, steepest descent method, conjugate
gradient method.
