第69回統計的機械学習セミナー / The 69th Statistical Machine Learning Seminar (Hybrid)

【Date & Time】
July 31st (Thursday), 2025 16:00 -
Admission Free
【Place】
Seminar Room 4 (3rd floor), The Institute of Statistical Mathematics
Hybrid:
If you would like to join via Zoom, please register at the following link to receive the Zoom link:
https://forms.gle/BgoBY4wR18aP4bcq6
【Speaker】
Zonghao (Hudson) Chen (University College London)
【Title】
(De)-regularized Maximum Mean Discrepancy Gradient Flow
【Abstract】
We introduce a (de)-regularization of the Maximum Mean Discrepancy (DrMMD) and its Wasserstein gradient flow. Existing gradient flows that transport samples from a source distribution to a target distribution using only target samples either lack a tractable numerical implementation (f-divergence flows) or require strong assumptions and modifications, such as noise injection, to ensure convergence (Maximum Mean Discrepancy flows). In contrast, the DrMMD flow can simultaneously (i) guarantee near-global convergence for a broad class of targets in both continuous and discrete time, and (ii) be implemented in closed form using only samples. The former is achieved by leveraging the connection between the DrMMD and the χ²-divergence, while the latter follows from treating the DrMMD as the MMD with a de-regularized kernel. Our numerical scheme uses an adaptive de-regularization schedule throughout the flow to optimally trade off between discretization errors and deviations from the χ²-divergence regime. The potential of the DrMMD flow is demonstrated across several numerical experiments, including a large-scale setting of training student/teacher networks.
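【Background】
As background for the abstract, the following Python sketch (not the speaker's code) shows the standard discrete-time MMD particle flow that DrMMD builds on: particles follow the negative gradient of the MMD witness function between the current particles and the target, using only samples. The Gaussian kernel, step size, and all names below are illustrative assumptions; the actual DrMMD scheme replaces the kernel with a de-regularized one and adapts the regularization along the flow, as the abstract describes.

import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 sigma^2)), computed pairwise over rows.
    sq_dists = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def witness_gradient(x, particles, target, sigma=1.0):
    # Gradient in x of the MMD witness function
    #   f(x) = mean_i k(x, X_i) - mean_j k(x, Y_j),
    # using grad_x k(x, y) = -(x - y) / sigma^2 * k(x, y) for the Gaussian kernel.
    def mean_kernel_grad(points):
        k = gaussian_kernel(x, points, sigma)       # shape (n, p)
        diff = x[:, None, :] - points[None, :, :]   # shape (n, p, d)
        return -(diff * k[..., None]).mean(axis=1) / sigma ** 2
    return mean_kernel_grad(particles) - mean_kernel_grad(target)

def mmd_flow(source, target, n_steps=500, step_size=0.5, sigma=1.0):
    # Forward-Euler discretization of the MMD gradient flow: each particle
    # moves against the witness gradient. The DrMMD flow (per the abstract)
    # would instead use a de-regularized kernel with an adaptive schedule.
    x = source.copy()
    for _ in range(n_steps):
        x = x - step_size * witness_gradient(x, x, target, sigma)
    return x

# Example: transport 2-D Gaussian samples toward a shifted target.
rng = np.random.default_rng(0)
source = rng.normal(loc=-3.0, scale=0.5, size=(200, 2))
target = rng.normal(loc=3.0, scale=0.5, size=(200, 2))
transported = mmd_flow(source, target)

A plain MMD flow like this can stall without modifications such as noise injection; per the abstract, the de-regularization in DrMMD is what restores near-global convergence guarantees while keeping the closed-form, sample-based update.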