Navigating Semantic Drift in Task-Agnostic Class-Incremental Learning

Fangwen Wu, Lechao Cheng, Shengeng Tang, Xiaofeng Zhu, Chaowei Fang, Dingwen Zhang, Meng Wang

International Conference on Machine Learning 2025 · Oral

Continual (or incremental) learning is a critical frontier in artificial intelligence: it aims to build models that acquire new knowledge sequentially without forgetting previously learned information. This talk, "Navigating Semantic Drift in Task-Agnostic Class-Incremental Learning," delivered by Hanmo Liu on behalf of the research team, addresses one of the most significant challenges in this domain: catastrophic forgetting. Specifically, the presentation examines the feature-distribution shifts that occur as a model learns new tasks, a phenomenon the authors term "semantic drift."

AI review

This paper addresses catastrophic forgetting in class-incremental learning by decomposing the feature shift into mean and covariance components ("semantic drift") and proposing corrections via Mean Shift Compensation and Covariance Calibration with Mahalanobis distance, on top of a frozen ViT with LoRA modules. The framing is clean and the ablation study is well structured, but the theoretical contribution is thin: the decomposition into first- and second-order statistics is a completely standard move in distribution-shift analysis, and the "novel" loss functions appear to be minor variations on…
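To make the review's description concrete, the following is a minimal sketch of the general idea it summarizes: store per-class means and covariances, translate a stored prototype by an estimated drift vector, and classify by Mahalanobis distance against the calibrated statistics. All names here (`class_stats`, the drift estimate, the regularization constant) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for backbone features: two classes in a 4-D feature space.
feats_a = rng.normal(loc=0.0, scale=1.0, size=(50, 4))
feats_b = rng.normal(loc=3.0, scale=1.0, size=(50, 4))

def class_stats(feats):
    """Per-class first- and second-order statistics -- the two components
    whose shift across tasks the review calls 'semantic drift'."""
    mu = feats.mean(axis=0)
    # Small ridge term keeps the covariance invertible (illustrative choice).
    cov = np.cov(feats, rowvar=False) + 1e-3 * np.eye(feats.shape[1])
    return mu, cov

mu_a, cov_a = class_stats(feats_a)
mu_b, cov_b = class_stats(feats_b)

# Hypothetical mean-shift compensation: after learning a new task the
# backbone's features move; estimate the displacement and translate the
# stored prototype by it.
drift = rng.normal(scale=0.1, size=4)          # stand-in for the true drift
feats_a_new = feats_a + drift                   # features after the new task
delta = feats_a_new.mean(axis=0) - feats_a.mean(axis=0)
mu_a_compensated = mu_a + delta

def mahalanobis(x, mu, cov):
    """Squared Mahalanobis distance of x to the class statistics (mu, cov)."""
    diff = x - mu
    return float(diff @ np.linalg.inv(cov) @ diff)

# Classify a drifted query against compensated vs. uncompensated statistics.
query = feats_a_new[0]
d_a = mahalanobis(query, mu_a_compensated, cov_a)
d_b = mahalanobis(query, mu_b, cov_b)
pred = "a" if d_a < d_b else "b"
```

The point of the sketch is only the structure of the correction: the mean update is a translation of stored prototypes, while covariance calibration (not shown) would adjust the second-order term that the Mahalanobis distance depends on.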