Modern Methods in Associative Memory: A Universal Language for Associative Memory

Dmitry Krotov, Benjamin Hoover, Parikshit Ram

International Conference on Machine Learning 2025 · Tutorial

This talk, presented by Benjamin Hoover and Dmitry Krotov at ICML 2025, introduces a "universal language" for describing and building a wide array of associative memory (AM) models, from classical Hopfield networks to modern transformer architectures and diffusion models. The central goal is to establish a core set of rules, terminology, and mathematical frameworks that unify seemingly disparate energy-based associative memories. By leveraging concepts like **Legendre transforms** and **Lagrangians**, the speakers demonstrate how diverse activation functions and network dynamics can be expressed within a single, consistent theoretical framework.
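To make the Lagrangian/Legendre idea concrete, here is a minimal sketch (not taken from the tutorial; the function names and the specific example are illustrative). In the Lagrangian formulation of associative memory, an activation function is the gradient of a scalar Lagrangian, and the Legendre transform of that Lagrangian contributes a term to the network's energy. The sketch below uses the log-sum-exp Lagrangian, whose gradient is the softmax activation, and checks that correspondence numerically:

```python
import numpy as np

def lagrangian_logsumexp(x):
    # Candidate Lagrangian: L(x) = log sum_i exp(x_i), computed stably.
    m = x.max()
    return m + np.log(np.exp(x - m).sum())

def softmax(x):
    # The activation that should equal grad L(x) for the Lagrangian above.
    e = np.exp(x - x.max())
    return e / e.sum()

def numerical_grad(f, x, eps=1e-6):
    # Central-difference gradient, used to verify grad L = softmax.
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

x = np.array([0.5, -1.2, 2.0])

# The activation is the gradient of the Lagrangian.
assert np.allclose(numerical_grad(lagrangian_logsumexp, x), softmax(x), atol=1e-4)

# The Legendre transform <x, grad L(x)> - L(x) supplies the
# corresponding term in the energy function.
energy_term = x @ softmax(x) - lagrangian_logsumexp(x)
print(energy_term)
```

Swapping in a different Lagrangian (e.g., a sum of elementwise convex functions) yields a different activation and energy term by the same recipe, which is the sense in which the framework unifies many models.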

AI review

A technically coherent tutorial-style presentation that uses Legendre transforms and Lagrangian duality to unify a broad class of associative memory models—from classical Hopfield networks through modern Transformers and diffusion models—under a single energy-based formalism. The mathematical machinery is real and correctly applied, the Energy Transformer reframing is non-trivial, and the diffusion-model correspondence is a genuinely interesting structural observation. However, this reads substantially as synthesis and repackaging of prior work (Ramsauer et al.'s modern Hopfield…