第72回統計的機械学習セミナー / The 72nd Statistical Machine Learning Seminar (Hybrid)
- 【Date & Time】
- 16 March, 2026 (Monday) 11:00 - 12:00
Admission Free, No Booking Necessary
- 【Place】
- Seminar Room 4, The Institute of Statistical Mathematics
Online:
Please register from the URL below to get a Zoom link:
https://us06web.zoom.us/meeting/register/DGKMsSrDSDqRW2LlqCYuxQ
- 【Speaker】
- André Uschmajew (Augsburg)
- 【Title】
- Randomized Low-Rank Approximation of Hilbert-Schmidt Operators
- 【Abstract】
-
Low-rank approximation methods for matrices are based on projecting the columns (or rows) onto suitable low-dimensional subspaces. Subspaces
spanned by individual columns, or by random linear combinations of all columns, are common alternatives to the computation of optimal
subspaces via the SVD, especially for large or implicitly given matrices. Well-known results on volume sampling and the randomized SVD show
that such approaches indeed achieve quasi-optimal approximation errors in expectation. In this talk, we discuss generalizations of such results
to low-rank approximation of Hilbert-Schmidt operators between infinite-dimensional Hilbert spaces. In the first part, we consider the
approximation of vector-valued L2 functions in subspaces spanned by point samples, and show the existence of quasi-optimal sample points based
on a continuous version of volume sampling. In the second part, we discuss infinite-dimensional extensions of the randomized SVD as
recently proposed by Boullé and Townsend, for which we present an alternative approach. This also includes a novel extension of the
Nyström approximation for self-adjoint positive semi-definite trace class operators. Based on joint work with D. Kressner, T. Ni, and D.
Persson.
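
For readers unfamiliar with the finite-dimensional starting point of the talk, the following is a minimal NumPy sketch of the classical randomized SVD (range finder plus small SVD) for matrices. All variable names and parameters here are illustrative choices, not taken from the speaker's work; the talk concerns the extension of this idea to Hilbert-Schmidt operators.

```python
import numpy as np

# Sketch of the finite-dimensional randomized SVD (Halko-Martinsson-Tropp
# style). Parameters below (sizes, oversampling) are illustrative only.
rng = np.random.default_rng(0)

# A 200 x 150 test matrix with quickly decaying singular values.
m, n, k = 200, 150, 10
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 2.0 ** -np.arange(n)                      # geometric singular-value decay
A = (U[:, :n] * s) @ V.T

# Randomized range finder: sample k + p random linear combinations of the
# columns of A, then orthonormalize the sampled columns.
p = 5                                         # oversampling parameter
Omega = rng.standard_normal((n, k + p))
Q, _ = np.linalg.qr(A @ Omega)                # basis for the sampled subspace

# Project A onto that subspace and take an SVD of the small factor.
B = Q.T @ A
Ub, sb, Vbt = np.linalg.svd(B, full_matrices=False)
A_k = Q @ (Ub[:, :k] * sb[:k]) @ Vbt[:k, :]   # rank-k approximation of A

# Compare with the optimal rank-k error sigma_{k+1} from the exact SVD:
# the ratio below being close to 1 illustrates quasi-optimality.
opt_err = np.linalg.svd(A, compute_uv=False)[k]
rand_err = np.linalg.norm(A - A_k, 2)
print(rand_err / opt_err)
```

With rapidly decaying singular values as above, a small amount of oversampling already makes the randomized error nearly indistinguishable from the optimal one, which is the phenomenon the quasi-optimality results quantify in expectation.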