28 July 2025 to 1 August 2025
Time zone: Europe/Paris

Statistical Performance of Subgradient Step-Size Update Rules in Lagrangian Relaxations of Chance-Constrained Optimization Models

29 Jul 2025, 10:45
30m
F202

Contributed talk: Chance-constrained programming

Speaker

Dr Bismark Singh (University of Southampton)

Description

Lagrangian relaxation schemes, coupled with a subgradient procedure, are frequently employed to solve chance-constrained optimization models. Subgradient procedures typically rely on step-size update rules. Although there is extensive research on the properties of these step-size update rules, there is little consensus on which rules are most suitable in practice, especially when the underlying model is a computationally challenging instance of a chance-constrained program. To close this gap, we seek to determine whether a single step-size rule can be statistically guaranteed to perform better than others. We couple the Lagrangian procedure with three strategies to identify lower bounds for two-stage chance-constrained programs. We consider two instances of such models that differ in the presence of binary variables in the second stage. Through a series of computational experiments, we demonstrate, in marked contrast to existing theoretical results, that no significant statistical differences in optimality gaps are detected among six well-known step-size update rules. Nevertheless, our results show that a Lagrangian procedure provides a computational benefit over a naive solution method, regardless of the underlying step-size update rule.
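
As a complement to the abstract, the following minimal Python sketch illustrates the basic ingredients mentioned above: a Lagrangian relaxation whose dual is maximized by a projected subgradient iteration, compared under a few classic step-size update rules (constant, diminishing, and a Polyak-type rule). The toy single-constraint covering problem, the trivial upper bound, and all parameter choices are illustrative assumptions; they are not the two-stage chance-constrained instances or the six rules studied in the paper.

import numpy as np

# Toy covering problem (illustrative only, not the paper's chance-constrained model):
#   min  c^T x   s.t.  a^T x >= d,  x in {0,1}^n
# Dualizing the single constraint with a multiplier lam >= 0 separates the inner problem:
#   L(lam) = lam*d + sum_i min(0, c_i - lam*a_i),  with subgradient g = d - a^T x(lam).

rng = np.random.default_rng(0)
n = 20
c = rng.uniform(1.0, 5.0, n)
a = rng.uniform(0.5, 2.0, n)
d = 0.6 * a.sum()

def inner_solution(lam):
    """Minimizer and value of the Lagrangian for a fixed multiplier lam."""
    x = (c - lam * a < 0).astype(float)
    value = lam * d + np.minimum(0.0, c - lam * a).sum()
    return x, value

def step_size(rule, k, g, dual_val, ub, t0=0.1):
    """Three classic step-size update rules (illustrative choices)."""
    if rule == "constant":
        return t0
    if rule == "diminishing":          # t_k = t0 / (k + 1)
        return t0 / (k + 1)
    if rule == "polyak":               # (UB - L(lam_k)) / ||g_k||^2
        return max(ub - dual_val, 0.0) / max(g * g, 1e-12)
    raise ValueError(rule)

ub = c.sum()                           # trivial upper bound: the all-ones solution is feasible
for rule in ("constant", "diminishing", "polyak"):
    lam, best = 0.0, -np.inf
    for k in range(200):
        x, val = inner_solution(lam)
        best = max(best, val)          # best Lagrangian lower bound found so far
        g = d - a @ x                  # subgradient of the dual function at lam
        lam = max(0.0, lam + step_size(rule, k, g, val, ub) * g)
    print(f"{rule:11s} lower bound = {best:.4f}")

In this sketch every dual evaluation yields a valid lower bound, so the rules are compared by the best bound each produces after a fixed iteration budget, loosely mirroring the optimality-gap comparison described in the abstract.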

This work is published here: https://doi.org/10.1007/978-3-031-47859-8_26.

Author

Dr Bismark Singh (University of Southampton)

Presentation materials

No documents.