AISM 53, 262-276

## Kullback-Leibler information consistent estimation for censored data

### Akio Suzukawa, Hideyuki Imai and Yoshiharu Sato

Division of Systems and Information Engineering, Hokkaido University, Kita 13, Nishi 8, Kitaku, Sapporo 060-8628, Japan

(Received March 11, 1998; revised June 29, 1999)

Abstract.    This paper investigates parametric estimation for randomly right censored data. In parametric estimation, the Kullback-Leibler information is used as a measure of the divergence of the true distribution generating the data relative to a distribution in an assumed parametric model ${\cal M}$. When the data are uncensored, the maximum likelihood estimator (MLE) is a consistent estimator of the parameter minimizing the Kullback-Leibler information, even if the assumed model ${\cal M}$ does not contain the true distribution. We call this property minimum Kullback-Leibler information consistency (MKLI-consistency). However, the MLE obtained by maximizing the likelihood function based on censored data is not MKLI-consistent. As an alternative to the MLE, Oakes (1986, Biometrics, 42, 177-182) proposed an estimator termed the approximate maximum likelihood estimator (AMLE) for its computational advantage and potential for robustness. We show MKLI-consistency and asymptotic normality of the AMLE under misspecification of the parametric model. In a simulation study, we investigate the mean square errors of these two estimators and of an estimator obtained by treating a jackknife-corrected Kaplan-Meier integral as the log-likelihood. On the basis of the simulation results and the asymptotic results, we compare these estimators. We also derive information criteria for the MLE and the AMLE under censorship, which can be used not only for selecting models but also for selecting estimation procedures.
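The Kaplan-Meier integral mentioned above — treating $\int \log f(t;\theta)\,d\hat{F}_n(t)$, with $\hat{F}_n$ the Kaplan-Meier estimator, as a log-likelihood — can be sketched numerically. The following is a minimal illustration, not the authors' implementation (it omits the jackknife correction and does not enforce a tie-breaking convention between failures and censorings): it computes the Kaplan-Meier jump weights for right censored data and fits a one-parameter exponential model, for which the weighted score equation has a closed-form solution.

```python
import numpy as np

def km_weights(times, events):
    """Kaplan-Meier jump sizes at the observations (0 at censored points).

    times  : observed times (failure or censoring times)
    events : 1 if the observation is an actual failure, 0 if censored
    Weights are returned aligned with the input order.
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times, kind="stable")
    n = len(times)
    surv = 1.0                     # running value of the KM survivor function
    w = np.zeros(n)
    for rank, i in enumerate(order):
        at_risk = n - rank
        if events[i]:
            w[i] = surv / at_risk  # jump = S(t-) * 1/(number at risk)
            surv *= 1.0 - 1.0 / at_risk
    return w

def exponential_km_fit(times, events):
    """Rate maximizing the KM-weighted exponential log-likelihood
    sum_i w_i * (log(lam) - lam * t_i); setting the weighted score
    to zero gives the closed form lam = sum(w) / sum(w * t)."""
    w = km_weights(times, events)
    return w.sum() / (w * np.asarray(times, dtype=float)).sum()

# Example: the observation at t = 2.5 is censored and carries no mass.
t = np.array([1.0, 3.0, 2.5, 4.0])
d = np.array([1, 1, 0, 1])
print(km_weights(t, d))        # -> [0.25  0.375 0.    0.375]
print(exponential_km_fit(t, d))
```

With no censoring the weights reduce to $1/n$ each, so the fit collapses to the ordinary exponential MLE $1/\bar{t}$; under censoring, mass that would have fallen on censored points is redistributed to later failure times, which is what makes the weighted criterion behave like a likelihood for the underlying failure distribution.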

Key words and phrases:    Approximate likelihood, information criterion, Kaplan-Meier estimator, maximum likelihood estimation.
