Speaker
Description
We revisit the value of stochastic solutions (VSS) in adaptive stochastic optimization. Given a fixed decision, VSS evaluates its suboptimality relative to an optimal solution made with full knowledge of the underlying probability distribution. For example, for a decision given by the sample average approximation (SAA), VSS quantifies the value of collecting more data for better decision-making. When the distributional knowledge is ambiguous, VSS cannot be computed exactly, so we propose the best-case VSS (VSSB) and the worst-case VSS (VSSW) to provide a confidence interval. Specifically, we model the ambiguity with a Wasserstein ball and show that this confidence interval shrinks linearly as the ball radius diminishes to zero. In addition, we derive tractable, conservative approximations of VSSB and VSSW through Lagrangian relaxation, and we further tighten these approximations through conjugate duality. Notably, the error bounds for these approximations are also linear in the ball radius. Finally, using the newsvendor and production-transportation problems, we demonstrate that SAA decisions attain a small VSS relative to other popular decision-making paradigms, and we illustrate the effectiveness of the proposed approximations.
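To make the VSS of an SAA decision concrete, here is a minimal sketch on the newsvendor problem. All specifics (prices, the gamma demand distribution, sample sizes) are illustrative assumptions, not taken from the talk; the "true" distribution is used only to evaluate the decisions, mimicking the full-knowledge benchmark.

```python
import numpy as np

rng = np.random.default_rng(0)

# Newsvendor: order q units at cost c, sell min(q, D) at price p.
# Expected cost: c*q - p*E[min(q, D)]; the minimizer is the
# (p - c)/p quantile of the demand distribution (critical fractile).
p, c = 10.0, 4.0  # hypothetical price and cost
fractile = (p - c) / p

# "True" demand distribution (known only to the evaluator).
true_demand = rng.gamma(shape=4.0, scale=25.0, size=200_000)
q_opt = np.quantile(true_demand, fractile)  # full-knowledge decision

# SAA decision: the same critical fractile, but of a small sample.
sample = rng.gamma(shape=4.0, scale=25.0, size=50)
q_saa = np.quantile(sample, fractile)

def expected_cost(q, demand):
    """Monte Carlo estimate of the newsvendor cost at order level q."""
    return np.mean(c * q - p * np.minimum(q, demand))

# VSS of the SAA decision: its suboptimality under the true distribution.
vss = expected_cost(q_saa, true_demand) - expected_cost(q_opt, true_demand)
```

A small `vss` indicates little is gained by acquiring more distributional knowledge; under ambiguity, the talk's VSSB and VSSW bracket this quantity from below and above.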