AISM 52, 737-752
(Received September 17, 1998; revised May 28, 1999)
Abstract. From a Bayesian point of view, this paper discusses the influence of a subset of observations on the posterior distributions of parameters in a growth curve model with unstructured covariance. The measure used to assess the influence is based on a Bayesian entropy, namely the Kullback-Leibler divergence (KLD). Several new properties of the Bayesian entropy are studied, and analytically closed forms of the KLD measure are established for both the matrix-variate normal distribution and the Wishart distribution. In the growth curve model, the KLD measures for all combinations of the parameters are also studied. For illustration, a practical data set is analyzed using the proposed approach, which shows that the diagnostic measures are useful in practice.
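The KLD between two distributions is the core quantity behind the diagnostics described in the abstract: case influence is gauged by the divergence between the posterior computed from the full data and the posterior computed with a subset of observations deleted. As a minimal illustration (not the paper's own derivation, which concerns matrix-variate normal and Wishart posteriors), the well-known closed form of the KLD between two multivariate normal distributions can be sketched as follows; the function name `kl_mvn` and the example inputs are hypothetical.

```python
import numpy as np

def kl_mvn(mu0, Sigma0, mu1, Sigma1):
    """Closed-form KL divergence KL(N0 || N1) between two
    k-dimensional multivariate normals N(mu0, Sigma0), N(mu1, Sigma1)."""
    k = len(mu0)
    Sigma1_inv = np.linalg.inv(Sigma1)
    diff = mu1 - mu0
    term_trace = np.trace(Sigma1_inv @ Sigma0)          # tr(S1^{-1} S0)
    term_quad = diff @ Sigma1_inv @ diff                # (m1-m0)' S1^{-1} (m1-m0)
    term_logdet = np.log(np.linalg.det(Sigma1) / np.linalg.det(Sigma0))
    return 0.5 * (term_trace + term_quad - k + term_logdet)

# Identical distributions have zero divergence; a unit mean shift
# under an identity covariance gives KLD = 0.5.
mu0, S = np.zeros(2), np.eye(2)
print(kl_mvn(mu0, S, mu0, S))                   # → 0.0
print(kl_mvn(mu0, S, np.array([1.0, 0.0]), S))  # → 0.5
```

In the case-deletion setting, `mu0, Sigma0` would play the role of the full-data posterior moments and `mu1, Sigma1` those of the deletion posterior; a large divergence flags the deleted subset as influential.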
Key words and phrases: Bayesian analysis, case-deletion method, growth curve model, Kullback-Leibler divergence, statistical diagnostics.