Learning with Expected Signatures: Theory and Applications

Lorenzo Lucchese, Mikko S. Pakkanen, Almut E. D. Veraart

International Conference on Machine Learning 2025 · Oral

This article examines the theoretical advances and practical implications presented in the ICML 2025 talk "Learning with Expected Signatures: Theory and Applications." Delivered by Lorenzo Lucchese, in collaboration with his former PhD advisors Mikko S. Pakkanen and Almut E. D. Veraart, the presentation addresses critical challenges in applying **expected signatures**—a powerful non-parametric tool for characterizing probability distributions of paths—to real-world machine learning problems. While expected signatures offer a robust mathematical framework, their utility in practice is often hampered by the discrete, finite, and potentially dependent nature of observed data.
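To make the object concrete: the empirical expected signature averages truncated path signatures over a sample of discretely observed paths. The sketch below (a minimal illustration, not the authors' implementation) computes the level-1 and level-2 signature of a piecewise-linear path from its increments, then averages over a sample; the path-array layout and function names are my own assumptions.

```python
import numpy as np

def signature_level2(path):
    """Level-1 and level-2 signature of the piecewise-linear interpolation
    of a discretely observed path.

    path: array of shape (n+1, d), the observed points in R^d.
    Returns (S1, S2): S1 has shape (d,), S2 has shape (d, d).
    """
    inc = np.diff(path, axis=0)              # (n, d) increments
    S1 = inc.sum(axis=0)                     # level 1: total increment
    # Level 2 for a piecewise-linear path:
    #   S2[a, b] = sum_{i<j} inc_i^a inc_j^b + 0.5 * sum_i inc_i^a inc_i^b
    prefix = np.cumsum(inc, axis=0) - inc    # prefix[j] = sum_{i<j} inc_i
    S2 = (np.einsum('ia,ib->ab', prefix, inc)
          + 0.5 * np.einsum('ia,ib->ab', inc, inc))
    return S1, S2

def empirical_expected_signature(paths):
    """Monte Carlo average of truncated signatures over sampled paths."""
    sigs = [signature_level2(p) for p in paths]
    S1 = np.mean([s[0] for s in sigs], axis=0)
    S2 = np.mean([s[1] for s in sigs], axis=0)
    return S1, S2
```

A useful sanity check is the shuffle identity `S2 + S2.T == np.outer(S1, S1)`, which holds exactly for every individual path at this truncation level.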

AI review

Lucchese, Pakkanen, and Veraart provide a rigorous statistical theory for empirical expected signatures under realistic data conditions — discrete observations, finite samples, and dependent (long-span) data. The core deliverables are an error decomposition into infill and estimation components, L2 convergence of the infill error under Hölder regularity, consistency and asymptotic normality under stationarity/ergodicity/strong mixing, and a control variate correction for variance reduction in near-martingale settings. This is careful, honest statistical theory that fills a genuine gap between…
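The control-variate idea can be illustrated generically: given a statistic with known expectation (in the near-martingale setting, a signature coordinate whose mean is known or close to zero), subtract an optimally scaled version of its sampling error from the estimator of interest. The sketch below is the textbook control-variate estimator, not the paper's specific correction; the variable names and the toy data are my own assumptions.

```python
import numpy as np

def control_variate_mean(y, x, x_mean):
    """Estimate E[y] using control variate x with known mean x_mean.

    The optimal coefficient beta = Cov(y, x) / Var(x) minimizes the
    variance of  mean(y) - beta * (mean(x) - x_mean).
    """
    beta = np.cov(y, x)[0, 1] / np.var(x, ddof=1)
    return y.mean() - beta * (x.mean() - x_mean)

# Toy illustration: y is strongly correlated with x, whose mean (0) is
# known exactly, so the correction removes most of the sampling error.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)          # control variate, E[x] = 0 known
y = 2.0 * x + 5.0 + 0.1 * rng.standard_normal(10_000)  # target, E[y] = 5
estimate = control_variate_mean(y, x, x_mean=0.0)
```

The residual error of `estimate` is driven only by the part of `y` uncorrelated with `x`, which is the mechanism behind the variance reduction described in the review.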