Harnessing Low Dimensionality in Diffusion Models: From Theory to Practice: Lecture II: Sampling Theory for Diffusion Models

Qing Qu, Yuxin Chen, Liyue Shen

International Conference on Machine Learning 2025 · Tutorial

This talk, the second lecture in a comprehensive tutorial on diffusion models, covers the mathematical foundations of the **sampling stage** of these generative models. Presented by Yuxin Chen, with co-presenters Qing Qu and Liyue Shen, the lecture demystifies the practical efficacy of two widely adopted sampling algorithms: **Denoising Diffusion Probabilistic Models (DDPM)** and **Denoising Diffusion Implicit Models (DDIM)**. The core objective is to bridge the gap between theoretical convergence guarantees and the strong empirical performance these samplers achieve in practice.
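To make the two samplers concrete, here is a minimal sketch of one reverse step of each. This is not the lecture's code: the linear noise schedule, the placeholder noise predictor `eps_hat`, and all variable names are illustrative assumptions; in practice `eps_hat` would be a trained network. The key structural difference is visible: DDPM injects fresh Gaussian noise at every step, while DDIM is deterministic and can skip steps.

```python
import numpy as np

# Assumed toy noise schedule: alpha_bars[t] is the cumulative signal level at step t.
T = 100
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def eps_hat(x, t):
    # Placeholder noise predictor (a trained score/noise network would go here).
    return 0.1 * x

def ddpm_step(x, t, rng):
    """One stochastic DDPM reverse step: posterior mean plus fresh Gaussian noise."""
    a, ab = alphas[t], alpha_bars[t]
    mean = (x - betas[t] / np.sqrt(1.0 - ab) * eps_hat(x, t)) / np.sqrt(a)
    noise = rng.standard_normal(x.shape) if t > 0 else 0.0
    return mean + np.sqrt(betas[t]) * noise

def ddim_step(x, t, t_prev):
    """One deterministic DDIM step: predict x0, then re-noise it to level t_prev."""
    ab = alpha_bars[t]
    ab_prev = alpha_bars[t_prev] if t_prev >= 0 else 1.0
    e = eps_hat(x, t)
    x0 = (x - np.sqrt(1.0 - ab) * e) / np.sqrt(ab)  # predicted clean sample
    return np.sqrt(ab_prev) * x0 + np.sqrt(1.0 - ab_prev) * e

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
x_ddpm = ddpm_step(x, T - 1, rng)
x_ddim = ddim_step(x, T - 1, T - 11)  # DDIM can jump across coarse time grids
print(x_ddpm.shape, x_ddim.shape)
```

Because DDIM's update is deterministic given `eps_hat`, it supports the accelerated, few-step sampling regimes whose convergence the lecture analyzes.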

AI review

This tutorial lecture presents rigorous convergence theory for DDPM and DDIM sampling under realistic conditions — discrete time, imperfect score estimates, minimal distributional assumptions — and delivers a genuinely satisfying theoretical explanation for one of the field's most persistent empirical puzzles: why diffusion models converge in hundreds of steps when naive ambient-dimension analysis predicts millions. The central result, that iteration complexity scales with intrinsic dimension k rather than ambient dimension D, and that this scaling depends critically on the original…
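The dimensional contrast described in the review can be schematized as follows. The polynomial dependence on the target accuracy $\varepsilon$ is deliberately left unspecified, since the exact rates are not stated in this excerpt; $\widetilde{O}$ hides logarithmic factors:

```latex
% Iteration complexity: intrinsic vs. ambient dimension (schematic)
\underbrace{T \;=\; \widetilde{O}\!\bigl(k \cdot \mathrm{poly}(1/\varepsilon)\bigr)}_{\text{low-dimensional analysis}}
\qquad \text{vs.} \qquad
\underbrace{T \;=\; \widetilde{O}\!\bigl(D \cdot \mathrm{poly}(1/\varepsilon)\bigr)}_{\text{naive ambient-dimension analysis}}
```

For image data, where the intrinsic dimension $k$ can be orders of magnitude smaller than the ambient dimension $D$, this gap is what reconciles the theory with the hundreds of steps observed in practice.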