
Mathematical Research at the University of Cambridge


Embedding probability distributions into reproducing kernel Hilbert spaces (RKHSs) has enabled powerful non-parametric methods such as the maximum mean discrepancy (MMD), a statistical distance with strong theoretical and computational properties. At its core, the MMD relies on kernel mean embeddings (KMEs) to represent distributions as mean functions in an RKHS. However, it remains unclear whether the mean function is the only meaningful RKHS representation. Inspired by generalised quantiles, we introduce the notion of kernel quantile embeddings (KQEs), along with a consistent estimator. We then use KQEs to construct a family of distances that: (i) are probability metrics under weaker kernel conditions than the MMD; (ii) recover a kernelised form of the sliced Wasserstein distance; and (iii) can be efficiently estimated with near-linear cost. Through hypothesis testing, we show that these distances offer a competitive alternative to the MMD and its fast approximations. Our findings demonstrate the value of representing distributions in Hilbert space beyond simple mean functions, paving the way for new avenues of research.
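As background (this is not the speaker's KQE method, only the two classical quantities the abstract builds on): the squared MMD between samples is estimated from kernel Gram matrices via the kernel mean embeddings, while the sliced Wasserstein distance compares sorted values (empirical quantiles) of random one-dimensional projections. A minimal NumPy sketch, with an RBF kernel, a Monte Carlo slicing loop, and Gaussian test data as illustrative choices:

```python
import numpy as np


def rbf_kernel(X, Y, lengthscale=1.0):
    # Gaussian (RBF) kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 * lengthscale^2)).
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-sq_dists / (2.0 * lengthscale**2))


def mmd_squared(X, Y, lengthscale=1.0):
    # Biased (V-statistic) estimate of MMD^2(P, Q) = ||mu_P - mu_Q||_H^2,
    # where mu_P = E_{x~P}[k(., x)] is the kernel mean embedding.
    # Always non-negative, since it is a squared RKHS norm of empirical embeddings.
    Kxx = rbf_kernel(X, X, lengthscale)
    Kyy = rbf_kernel(Y, Y, lengthscale)
    Kxy = rbf_kernel(X, Y, lengthscale)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()


def sliced_wasserstein(X, Y, n_proj=50, seed=0):
    # Monte Carlo estimate of the sliced Wasserstein-2 distance: project both
    # samples onto random unit directions, then compare sorted (quantile)
    # values in 1D. Assumes equal sample sizes for simplicity.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        px, py = np.sort(X @ theta), np.sort(Y @ theta)
        total += np.mean((px - py) ** 2)
    return np.sqrt(total / n_proj)


rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, size=(200, 1))  # sample from P
X2 = rng.normal(0.0, 1.0, size=(200, 1))  # second sample from P
Y = rng.normal(3.0, 1.0, size=(200, 1))   # sample from a shifted Q

mmd_same, mmd_diff = mmd_squared(X1, X2), mmd_squared(X1, Y)
sw_same, sw_diff = sliced_wasserstein(X1, X2), sliced_wasserstein(X1, Y)
```

Both estimators cost O(n^2) per pair here; the near-linear cost claimed in (iii) for the KQE-based distances is one of the talk's contributions.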

Further information

Time:

May 6th 2025
15:30 to 16:00

Venue:

Seminar Room 1, Newton Institute

Speaker:

Masha Naslidnyk (University College London)

Series:

Isaac Newton Institute Seminar Series