Sundial: A Family of Highly Capable Time Series Foundation Models
Yong Liu, Guo Qin, Zhiyuan Shi, Zhi Chen, Caiyin Yang, Xiangdong Huang, Jianmin Wang, Mingsheng Long
International Conference on Machine Learning 2025 · Oral
Yong Liu from Tsinghua University presented **Sundial**, a novel family of **time series foundation models** designed to overcome long-standing challenges in time series forecasting. The talk delves into the inherent difficulties of applying large language model (LLM) paradigms to continuous, multi-dimensional time series data, particularly the issues of unstructured variations, ambiguous semantics, and the pervasive uncertainty in predictions. Traditional deep learning models, often relying on simplified priors like mean squared error, tend to produce over-smoothed, deterministic forecasts that fail to capture the true probabilistic nature of future events, leading to **mode collapse**.
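The mode-collapse point can be illustrated with a tiny NumPy sketch (an illustrative example, not Sundial's code): when the future is genuinely bimodal, the MSE-optimal point forecast is the conditional mean, a value that sits between the modes and is never actually observed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bimodal future: the next value is either -1 or +1
# with equal probability (two plausible forecast "modes").
samples = rng.choice([-1.0, 1.0], size=10_000)

# The MSE-optimal deterministic forecast is the conditional mean...
mse_optimal = samples.mean()

# ...which collapses to ~0.0, a value the series never takes,
# while the residual variance around it stays close to 1.0.
print(round(float(mse_optimal), 2))
print(round(float(np.mean((samples - mse_optimal) ** 2)), 2))
```

A probabilistic (generative) forecaster, by contrast, can place mass on both modes instead of averaging them away.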
AI review
Sundial is a competent and well-executed engineering contribution to the time series foundation model space. The application of flow matching to probabilistic time series forecasting is a reasonable technical choice, patch-wise tokenization is a sensible design decision, and the 1-trillion-point pretraining corpus is a real infrastructure investment. That said, this is an applied systems paper, not a theoretical one — and the theoretical vocabulary it borrows (ARMA inspiration, flow matching, generalization) is deployed descriptively rather than analytically. The 'prior-free' framing is…
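For readers unfamiliar with patch-wise tokenization, the idea can be sketched in a few lines (a hypothetical minimal version; Sundial's actual tokenizer, patch length, and normalization may differ): the continuous series is cut into fixed-length windows, each of which becomes one "token" for the transformer backbone.

```python
import numpy as np

def patchify(series: np.ndarray, patch_len: int) -> np.ndarray:
    """Split a 1-D series into non-overlapping fixed-length patches.

    Each row of the result is one patch (token). Trailing values that
    do not fill a complete patch are dropped in this simplified sketch.
    """
    n_full = (len(series) // patch_len) * patch_len
    return series[:n_full].reshape(-1, patch_len)

# A 12-step series with patch length 4 yields 3 tokens of 4 values each.
tokens = patchify(np.arange(12, dtype=float), patch_len=4)
print(tokens.shape)  # (3, 4)
```

Compared with value-level tokenization, patches shorten the sequence the model attends over and give each token richer local context.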