Tutorial on Neural OT and Diffusion Models

We gave a tutorial at ICML 2023!


Speakers: Charlotte Bunne and Marco Cuturi

Over the last decade, optimal transport (OT) has evolved from a prize-winning research area in pure mathematics to a recurring theme across all areas of machine learning. Through both its theory and its computational tools, OT has enabled breakthroughs via a multi-pronged approach, blending elements from convex optimization (e.g., linear and quadratic assignment problems, the Sinkhorn algorithm), analysis (partial differential equations (PDEs), links to the Monge-Ampère equation), stochastic calculus (diffusion models, the Schrödinger bridge), statistics (analysis of sampling algorithms, generalized quantiles, generative model fitting), and deep architectures. Because many of these developments have happened in parallel, the field has become increasingly diverse and difficult to grasp for newcomers. The goal of this tutorial is to provide a unifying perspective that underlines the centrality of OT to the developments listed above, to draw connections between these approaches in both algorithms and theory, and to suggest directions in which the field can further evolve to create new machine learning methods grounded in this exciting toolbox.

Recording

Slides and Script

The slides of the tutorial can be downloaded here.

For a summary of the tutorial, further theoretical connections, and references, please find below a script of the tutorial. In case this topic sparked your interest: the script is a chapter of my PhD thesis, which is available for download here.

Further Resources

Extended discussions on theoretical properties and numerical considerations can be found in the following books and review articles:

Further, the methods mentioned above are implemented in various Python libraries.
