28 July 2025 to 1 August 2025
Time zone: Europe/Paris

Optimizer's Information Criterion: Dissecting and Correcting Bias in Data-Driven Optimization

29 July 2025, 11:57
24m
Navier

Speaker

Tianyu Wang

Description

In data-driven optimization, the sample performance of the obtained decision typically incurs an optimistic bias against the true performance, a phenomenon commonly known as the Optimizer's Curse and intimately related to overfitting in machine learning. We develop a general approach that we call the Optimizer's Information Criterion (OIC) to correct this bias. OIC generalizes the celebrated Akaike Information Criterion from the evaluation of model adequacy, used primarily for model selection, to the evaluation of objective performance in data-driven optimization, which is used for decision selection. Our approach analytically approximates and cancels out the bias that arises from the interplay between model fitting and downstream optimization. As such, it avoids the computational cost of repeatedly solving optimization problems in cross-validation, while operating more generally than other bias-approximation schemes. We apply OIC to a range of data-driven optimization formulations comprising empirical and parametric models, their regularized counterparts, and, furthermore, contextual optimization. Finally, we provide numerical validation of the superior performance of our approach on synthetic and real-world datasets.
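As background to the bias the talk addresses, the following minimal sketch (not part of the talk; the newsvendor-style loss, normal demand distribution, and decision grid are illustrative assumptions) simulates the Optimizer's Curse: the in-sample performance of a sample-average-approximation decision is, on average, optimistically lower than its true performance.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(z, d):
    # Hypothetical newsvendor-style piecewise-linear cost: underage costs 2, overage costs 1
    return 2.0 * np.maximum(d - z, 0.0) + 1.0 * np.maximum(z - d, 0.0)

def saa_decision(sample, grid):
    # Sample-average approximation: pick the grid point with the smallest sample-mean loss
    means = np.array([loss(z, sample).mean() for z in grid])
    k = int(np.argmin(means))
    return grid[k], means[k]

grid = np.linspace(0.0, 4.0, 81)
true_pop = rng.normal(1.0, 1.0, size=1_000_000)  # large sample standing in for the true distribution

n, reps = 30, 2000
gaps = []
for _ in range(reps):
    sample = rng.normal(1.0, 1.0, size=n)
    z_hat, in_sample_perf = saa_decision(sample, grid)
    true_perf = loss(z_hat, true_pop).mean()   # out-of-sample evaluation of the same decision
    gaps.append(true_perf - in_sample_perf)

# Positive on average: the sample objective of the sample-optimal decision is optimistically biased
print(f"mean optimistic bias (true - in-sample): {np.mean(gaps):.4f}")
```

A bias-correction criterion such as the one described in the abstract would add an analytical estimate of this gap to the in-sample value, rather than estimating it by repeatedly re-solving the problem in cross-validation.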

Author

Co-authors

Garud Iyengar (Columbia University), Henry Lam (Columbia University)

Presentation materials

No documents.