One-Step Generalization Ratio Guided Optimization for Domain Generalization

Sumin Cho, Dongwon Kim, Kwangsu Kim

International Conference on Machine Learning 2025 · Oral

Generalizing to unseen data distributions remains a central challenge in machine learning. This talk, presented by Sumin Cho and Dongwon Kim of Sungkyunkwan University, South Korea, introduces **Genie**, a novel optimizer for **Domain Generalization (DG)**. DG aims to train models on a set of source domains so that they perform robustly on entirely new, unseen target domains, a task frequently hindered by models that latch onto spurious correlations in the training data rather than genuinely generalizable features.

AI review

Genie is a competently executed optimizer for domain generalization that builds a preconditioning scheme around the One-Step Generalization Ratio (OSGR). The core idea, scaling updates by their generalization contribution rather than their gradient magnitude, is sensible and well motivated, and the PAC-Bayes framing provides a theoretical wrapper that at least gestures in the right direction. The work sits comfortably in the space of solid, honest applied theory: it advances a narrow question, the experiments are reasonably broad, and the plug-and-play design lowers friction for adoption…
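To make the preconditioning idea concrete, here is a minimal, speculative sketch — not the paper's actual OSGR definition or algorithm — of an update that scales each coordinate by a crude per-parameter "generalization ratio": the squared mean of per-domain gradients divided by their mean square. Coordinates whose gradients agree across source domains get a ratio near 1 and take a full step; coordinates where domains conflict get a ratio near 0 and are suppressed. All names and the ratio formula are illustrative assumptions.

```python
import numpy as np

def genie_like_step(params, domain_grads, lr=0.1, eps=1e-8):
    """Hypothetical preconditioned update (NOT the paper's OSGR).

    Scales each coordinate of the averaged gradient by a consistency
    ratio in [0, 1]: (mean of per-domain grads)^2 / mean of squared
    per-domain grads. Agreement across domains -> ratio ~ 1;
    sign conflicts across domains -> ratio ~ 0.
    """
    g = np.stack(domain_grads)                      # (num_domains, dim)
    mean = g.mean(axis=0)                           # averaged gradient
    ratio = mean**2 / (np.mean(g**2, axis=0) + eps) # per-coordinate ratio
    return params - lr * ratio * mean               # preconditioned step

# Two source domains: they agree on coordinate 0, conflict on coordinate 1.
params = np.zeros(2)
grads = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]
new_params = genie_like_step(params, grads)
# Coordinate 0 moves (consistent signal); coordinate 1 barely moves.
```

The point of the sketch is the qualitative behavior the review describes: the magnitude of an update is governed by how consistently it helps across domains, not by how large the raw gradient is.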