Beyond Self-Repellent Kernels: History-Driven Target Towards Efficient Nonlinear MCMC on General Graphs

Jie Hu, Yi-Ting Ma, Do-Young Eun

International Conference on Machine Learning 2025 · Oral

At ICML 2025, Jie Hu of NC State University, together with Yi-Ting Ma and Professor Do-Young Eun, presented "Beyond Self-Repellent Kernels: History-Driven Target Towards Efficient Nonlinear MCMC on General Graphs." The paper introduces **History-Driven Target MCMC (HDT-MCMC)**, a framework that substantially improves the efficiency and scalability of MCMC samplers, particularly on complex networks. MCMC methods are indispensable for drawing samples from intricate, high-dimensional probability distributions, providing insight into systems, from social networks to physical phenomena, where exhaustive analysis is computationally intractable.

AI review

HDT-MCMC is a clean, technically motivated contribution that resolves a genuine and previously documented tension in history-aware MCMC: the SRRW (self-repellent random walk) framework from the same group achieved near-zero variance but at O(degree) cost per step, which defeats the whole point of lightweight MCMC. The paradigm shift, moving the history dependence from the kernel to the target, is simple enough to state in one sentence, and its consequences (restored O(1) per-step cost, compatibility with non-reversible samplers, a provable efficiency gain over SRRW via a random change of time) are non-trivial and well-motivated. The…
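To make the "history in the target, not the kernel" idea concrete, here is an illustrative toy: a Metropolis-Hastings random walk on a graph whose target is penalized by the walk's own empirical visit frequencies. This is a hedged sketch, not the paper's algorithm; the function name, the specific reweighting `pi[v] * (x[v]/pi[v])**(-alpha)`, and the unvisited-node workaround are all my own assumptions made for illustration. The point it demonstrates is the cost structure: each step only reads the visit counts of the current and proposed nodes, so the per-step overhead stays O(1) regardless of degree.

```python
import random
from collections import defaultdict

def hdt_style_walk(neighbors, pi, n_steps, alpha=1.0, seed=0):
    """Metropolis-Hastings walk on a graph with a history-reweighted target.

    At step t, node v's effective weight is pi[v] * (x[v]/pi[v])**(-alpha),
    where x[v] is v's empirical visit frequency so far, so frequently
    visited nodes are penalized (alpha > 0 acts self-repellently).
    Only the current and proposed nodes' counts are read per step.
    """
    rng = random.Random(seed)
    nodes = list(neighbors)
    counts = defaultdict(int)
    v = nodes[0]
    counts[v] += 1
    for t in range(1, n_steps + 1):
        w = rng.choice(neighbors[v])       # uniform proposal along an edge
        xv = counts[v] / t                 # empirical visit frequencies
        xw = max(counts[w], 1) / t         # crude floor for unvisited nodes
        # History-reweighted target ratio with the proposal correction
        # for unequal degrees (uniform-neighbor proposal).
        num = pi[w] * (xw / pi[w]) ** (-alpha) * len(neighbors[v])
        den = pi[v] * (xv / pi[v]) ** (-alpha) * len(neighbors[w])
        if rng.random() < min(1.0, num / den):
            v = w
        counts[v] += 1                     # rejected moves count as self-loops
    return counts
```

On a small connected graph with a uniform target, the history penalty pushes the walk to spread its visits across nodes faster than a plain random walk would; the counts dictionary records one visit per step plus the start.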