Speaker
Description
Least-squares regression is typically formulated as a quadratic program. This talk presents a novel approach that reduces it to a piecewise linear convex minimization problem within the Risk Quadrangle Framework; this problem, in turn, can be solved by linear programming. Crucially, this is not a heuristic step: the linearized formulation is statistically justified and shown to be equivalent to quantile regression estimating the specific quantile that corresponds to the mean. As a result, it inherits many benefits of quantile-based approaches, such as effective handling of heteroskedastic data and robustness to outliers.
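As an illustration of the general principle (not the talk's specific formulation), the classical linear-programming reduction of quantile (pinball) regression shows how a piecewise linear convex regression loss can be minimized with an LP solver. The Python sketch below is a hypothetical example; the function name and the use of SciPy's linprog are assumptions made for illustration only.

import numpy as np
from scipy.optimize import linprog

def quantile_regression_lp(X, y, tau=0.5):
    """Fit y ~ X beta by minimizing the pinball loss at level tau via an LP."""
    n, p = X.shape
    # Decision variables: [beta (free), u_plus >= 0, u_minus >= 0], where the
    # residual y_i - x_i' beta is split as u_plus_i - u_minus_i.
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1.0 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])  # X beta + u_plus - u_minus = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Usage on synthetic heteroskedastic data; tau = 0.5 gives median regression.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 200)
X = np.column_stack([np.ones_like(x), x])                # intercept + slope
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 + 0.3 * x)      # noise grows with x
print(quantile_regression_lp(X, y, tau=0.5))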
Building on this result, we introduce two extensions: the expectile quadrangle and the biased mean quadrangle. While the expectile quadrangle is well established in the literature, the biased mean quadrangle is a novel concept that offers new perspectives for risk management and regression analysis: its risk component can be applied in risk management, while its error component, referred to as the superexpectation error, can be used for biased mean regression.
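For context, a standard definition from the expectile literature (not specific to the quadrangle construction presented in the talk): the tau-expectile of a random variable Y is

e_\tau(Y) = \operatorname*{arg\,min}_{c \in \mathbb{R}} \, \mathbb{E}\!\left[\, \lvert \tau - \mathbf{1}\{Y < c\} \rvert \, (Y - c)^2 \right], \qquad \tau \in (0,1),

so that e_{1/2}(Y) = \mathbb{E}[Y], i.e. the mean is recovered as a special case.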
We conduct extensive numerical experiments to support our theoretical claims and to demonstrate the computational benefits of our method. The results highlight the versatility of the Risk Quadrangle Framework in unifying and extending classical regression methods, opening the door to broader applications and a deeper theoretical understanding.