
Mathematical Research at the University of Cambridge


<p><span style="color: rgb(0, 0, 0);">We propose a novel class of change of measure inequalities via a unified framework based on the data processing inequality for f-divergences, which is surprisingly elementary yet powerful enough to yield tighter inequalities. We provide change of measure inequalities in terms of a broad family of information measures, including f-divergences (with the Kullback-Leibler divergence and the $\chi^2$-divergence as special cases), the Rényi divergence, and the $\alpha$-mutual information (with maximal leakage as a special case). A key advantage of our framework is its flexibility: it readily adapts to a range of settings, including generalization error analysis, the conditional mutual information framework, PAC-Bayesian theory, differential privacy mechanisms, and the data memorization problem, with simplified analyses.</span></p>
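For context (this illustration is not part of the abstract), two standard results of the kind the talk builds on are the data processing inequality for f-divergences and a classical change of measure inequality. The former states that for any f-divergence $D_f$ and any Markov kernel (channel) $K$ applied to both distributions,

```latex
D_f(QK \,\|\, PK) \;\le\; D_f(Q \,\|\, P),
```

and a canonical change of measure inequality is the Donsker--Varadhan variational bound for the KL divergence: for $Q \ll P$ and any bounded measurable function $h$,

```latex
\mathbb{E}_{Q}\!\left[h(X)\right] \;\le\; D_{\mathrm{KL}}(Q \,\|\, P) \;+\; \log \mathbb{E}_{P}\!\left[e^{h(X)}\right].
```

Bounds of this shape transfer an expectation under one measure $Q$ into quantities computed under a reference measure $P$ plus a divergence penalty, which is the mechanism underlying the generalization error and PAC-Bayesian applications mentioned above.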

Further information

Time:

May 13th 2026
13:00 to 14:00

Venue:

MR5, CMS Pavilion A

Series:

Information Theory Seminar