Near-Optimal Decision Trees in a SPLIT Second
Varun Babbar, Hayden McTavish, Cynthia Rudin, Margo Seltzer
International Conference on Machine Learning 2025 · Oral
This article covers the work presented by Varun Babbar and Hayden McTavish at ICML 2025 in their paper "Near-Optimal Decision Trees in a SPLIT Second." The talk introduces **SPLIT** and **Re-SPLIT**, two algorithms for constructing interpretable decision trees. The research addresses a long-standing challenge in machine learning: matching the performance of globally optimal decision trees without their prohibitive computational cost, while also exploring the rich landscape of near-optimal models.
AI review
SPLIT and Re-SPLIT represent a genuinely useful algorithmic contribution to the interpretable ML literature: a principled hybrid strategy that achieves near-optimal decision tree quality at a fraction of the computational cost of exact methods, with a companion algorithm for practical Rashomon set approximation. The core insight — that greedy-optimal performance gaps decay with depth, enabling a lookahead-then-replace strategy — is clean, non-obvious, and well-supported empirically. The polynomial-time Lickety-SPLIT variant is a meaningful theoretical result. The work earns a 4 rather than a…
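The lookahead-then-complete insight can be illustrated with a toy sketch. The code below is a hypothetical simplification, not the authors' implementation: on binary features, splits within a small lookahead budget are chosen by exact search, and each frontier subproblem is scored by the error of its greedy completion. On an XOR-style dataset, pure greedy stalls (no single split reduces error immediately), while one level of lookahead recovers a perfect tree.

```python
def leaf_error(y):
    """Misclassification count of a majority-vote leaf."""
    return min(sum(y), len(y) - sum(y))

def partition(X, y, f):
    """Split rows on binary feature f."""
    L = [i for i in range(len(y)) if X[i][f] == 0]
    R = [i for i in range(len(y)) if X[i][f] == 1]
    return ([X[i] for i in L], [y[i] for i in L],
            [X[i] for i in R], [y[i] for i in R])

def greedy_error(X, y, depth):
    """Greedy subtree error: pick the split with the best immediate
    (one-step) error, then recurse on that split alone."""
    if depth == 0 or len(set(y)) <= 1:
        return leaf_error(y)
    best_f, best_imm = None, leaf_error(y)
    for f in range(len(X[0])):
        XL, yL, XR, yR = partition(X, y, f)
        if not yL or not yR:
            continue
        imm = leaf_error(yL) + leaf_error(yR)
        if imm < best_imm:
            best_f, best_imm = f, imm
    if best_f is None:  # no split helps immediately: greedy stops here
        return leaf_error(y)
    XL, yL, XR, yR = partition(X, y, best_f)
    return greedy_error(XL, yL, depth - 1) + greedy_error(XR, yR, depth - 1)

def split_error(X, y, depth, lookahead):
    """SPLIT-style search: branch exactly for `lookahead` levels,
    scoring each frontier subproblem by its greedy completion."""
    if lookahead == 0 or depth == 0 or len(set(y)) <= 1:
        return greedy_error(X, y, depth)
    best = leaf_error(y)
    for f in range(len(X[0])):
        XL, yL, XR, yR = partition(X, y, f)
        if not yL or not yR:
            continue
        best = min(best, split_error(XL, yL, depth - 1, lookahead - 1)
                       + split_error(XR, yR, depth - 1, lookahead - 1))
    return best

# XOR labels: no single split reduces error, so greedy never splits.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]
print(greedy_error(X, y, 2))    # 2 errors: greedy is stuck at the root
print(split_error(X, y, 2, 1))  # 0 errors with one level of lookahead
```

This also shows why the gap shrinks with depth: the deeper the exact search reaches before handing off to greedy completion, the less the myopic choices at the frontier can cost.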