AN INFORMATION-THEORETIC FRAMEWORK FOR ROBUSTNESS

STEPHAN MORGENTHALER1 AND CLIFFORD HURVICH2

1 Swiss Federal Institute of Technology, EPFL-DMA, 1015 Lausanne, Switzerland
2 New York University, 735 Tisch Hall, Washington Sq., New York, NY 10003, U.S.A.

(Received December 27, 1988; revised July 13, 1989)

Abstract.    This paper concerns the foundations of robust inference. As a specific example, we consider semiparametric location models that involve a shape parameter. We argue that robust methods result from the selection of a representative shape from a set of allowable shapes. To perform this selection, we need a measure of disparity between the true shape and the shape used in the inference; given such a disparity, we propose to solve a certain minimax problem. The paper discusses in detail the use of the Kullback-Leibler divergence for the selection of shapes. The resulting estimators are shown to have redescending influence functions when the set of allowable shapes contains heavy-tailed members. The paper closes with a brief discussion of the next logical step, namely the representation of a set of shapes by a pair of selected shapes.
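The selection principle sketched in the abstract can be illustrated numerically. The following is a minimal sketch, not taken from the paper: it takes a small, illustrative set of allowable shapes (normal, Laplace, and Student's t with 3 degrees of freedom are our own choices), approximates the Kullback-Leibler divergence D(f || g) by a Riemann sum on a grid, and selects as representative the shape g that minimizes the worst-case divergence over all allowable true shapes f.

```python
import numpy as np
from scipy.stats import norm, laplace, t

def kl_divergence(f_pdf, g_pdf, grid):
    """Approximate D(f || g) = integral of f log(f/g) by a Riemann sum
    on a uniform grid (truncation of the tails introduces a small error)."""
    dx = grid[1] - grid[0]
    f = f_pdf(grid)
    g = g_pdf(grid)
    return float(np.sum(f * np.log(f / g)) * dx)

grid = np.linspace(-10.0, 10.0, 4001)

# Illustrative set of allowable shapes; the t(3) member is heavy-tailed.
shapes = {
    "normal": norm.pdf,
    "laplace": laplace.pdf,
    "t3": lambda x: t.pdf(x, df=3),
}

# Minimax selection: for each candidate g, compute the worst-case
# divergence incurred when the true shape f is any allowable member,
# then pick the g with the smallest worst case.
worst_case = {
    name: max(kl_divergence(f, g, grid) for f in shapes.values())
    for name, g in shapes.items()
}
representative = min(worst_case, key=worst_case.get)
print(representative, worst_case)
```

As one would expect from the abstract's remark on heavy tails, light-tailed candidates pay a large worst-case penalty against the heavy-tailed member, so the minimax selection favors a shape that accommodates heavy tails.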

Key words and phrases:    Robustness, distributional shapes, Kullback-Leibler divergence.
