The 52nd Statistical Machine Learning Seminar (2022.12.22)

Time: December 22 (Thu), 2022. 16:00 - 17:30
Place: Zoom (online)

Speaker: Heishiro Kanagawa (Newcastle University)
Title: Controlling Moments with Kernel Stein Discrepancies

Abstract: Quantifying the deviation of a probability distribution from a target is challenging when the target is defined by a density with an intractable normalizing constant. The kernel Stein discrepancy (KSD) was proposed to address this problem and has been applied to various tasks, including diagnosing approximate MCMC samplers and goodness-of-fit testing for unnormalized statistical models. In this talk, I discuss a convergence-control property of the diffusion kernel Stein discrepancy (DKSD), an instance of the KSD proposed by Barp et al. (2019). The talk presents a result extending that of Gorham and Mackey (2017), who showed that the KSD controls the bounded-Lipschitz metric, to functions of polynomial growth: specifically, the DKSD controls the integral probability metric defined by a class of pseudo-Lipschitz functions, a polynomial generalization of Lipschitz functions. I also give practical sufficient conditions on the reproducing kernel for this property to hold. In particular, we will see that, with an appropriate kernel, the DKSD detects non-convergence in moments.
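
For reference, a minimal sketch of the standard (Langevin) kernel Stein discrepancy, which the DKSD of Barp et al. (2019) generalizes by introducing a diffusion matrix into the Stein operator: for a target density p on R^d with score \nabla \log p, and f ranging over the unit ball of a vector-valued RKHS \mathcal{H}^d,

  (\mathcal{A}_p f)(x) = \langle \nabla \log p(x), f(x) \rangle + \nabla \cdot f(x),

  \mathrm{KSD}(q \,\|\, p) = \sup_{\| f \|_{\mathcal{H}^d} \le 1} \left| \mathbb{E}_{x \sim q}\!\left[ (\mathcal{A}_p f)(x) \right] \right|.

Since only the score \nabla \log p enters, the intractable normalizing constant of p cancels. The metric controlled in the talk is the integral probability metric d_{\mathcal{F}}(q, p) = \sup_{f \in \mathcal{F}} \left| \mathbb{E}_q[f] - \mathbb{E}_p[f] \right|, here with \mathcal{F} a class of pseudo-Lipschitz functions (roughly, functions satisfying |f(x) - f(y)| \le L (1 + \|x\|^s + \|y\|^s) \|x - y\| for some order s \ge 0; the precise class used in the talk may differ).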