Speaker
Description
In stochastic programming, solutions are highly sensitive to approximations—whether from sampling, scenario reduction, or parametric perturbations—especially in nonconvex settings. This work investigates how substitute problems, constructed via Rockafellian functions, can provide robustness against such stochastic approximations. Unlike classical stability analysis centered on local perturbations near (local) minimizers, we employ epi-convergence to assess whether approximating problems, derived from stochastic perturbations, converge globally to the true problem. We demonstrate that, under natural assumptions, these substitute problems exhibit well-behaved epi-convergence even when the original stochastic program does not. Furthermore, we quantify convergence rates, which often translate to Lipschitz-type stability for optimal values and solutions under stochastic perturbations. Our framework bridges Rockafellian relaxation techniques with stochastic programming, offering new tools for analyzing robustness in data-driven or distributionally uncertain settings.
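For readers less familiar with the terminology, the following is a minimal sketch of the two central notions; the symbols $\varphi$, $\bar u$, and $\theta$ are illustrative notation, not taken from the talk, and the precise definitions used in the work may differ.

```latex
% A Rockafellian for the problem  min_x f(x)  is an extended-real-valued
% function that embeds f in a family of perturbed objectives, anchored at
% a nominal perturbation \bar u:
\[
  \varphi : \mathbb{R}^n \times \mathbb{R}^m \to \overline{\mathbb{R}},
  \qquad \varphi(x, \bar u) = f(x).
\]
% A substitute problem then optimizes jointly over decisions x and
% perturbations u, with a penalty parameter \theta \ge 0 discouraging
% deviation from the anchor:
\[
  \min_{x,\,u} \; \varphi(x, u) + \theta \,\| u - \bar u \|.
\]
% Epi-convergence of approximations f^\nu to f requires, at every x,
% a liminf bound along all sequences and a limsup bound along some sequence:
\[
  \liminf_{\nu} f^\nu(x^\nu) \ge f(x) \quad \forall\, x^\nu \to x,
  \qquad
  \exists\, x^\nu \to x : \ \limsup_{\nu} f^\nu(x^\nu) \le f(x).
\]
```

Epi-convergence is the natural mode of convergence here because it guarantees that minimizers and optimal values of the approximating problems converge to those of the limit problem, a global property that pointwise convergence does not provide.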