Séminaire MAD-Stat

Deterministic Equivalents and Scaling Laws for Random Feature Regression

by Theodor Misiakiewicz (Yale University)

Auditorium 3 (Toulouse School of Economics)


Description

In this talk, we consider random feature ridge regression (RFRR), a model that has recently gained renewed interest for investigating puzzling phenomena in deep learning, such as double descent, benign overfitting, and scaling laws. Our main contribution is a general deterministic equivalent for the test error of RFRR. Specifically, under a certain concentration property, we show that the test error is well approximated by a closed-form expression that depends only on the feature map eigenvalues. Notably, our approximation guarantee is non-asymptotic, multiplicative, and independent of the feature map dimension, allowing for infinite-dimensional features. This deterministic equivalent can be used to precisely capture the above phenomenology in RFRR. As an example, we derive sharp excess error rates under standard power-law decay, and tightly characterize the optimal parametrization achieving the minimax rate.
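For readers unfamiliar with the model, the following is a minimal illustrative sketch of random feature ridge regression itself (not of the deterministic equivalent from the talk). All choices here are assumptions for illustration: a ReLU feature map, a linear teacher function, and arbitrary dimensions, sample size, noise level, and regularization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (not from the talk):
# input dimension d, number of random features N, sample size n
d, N, n = 20, 400, 300
lam = 1e-3  # ridge regularization strength

# Assumed target: a simple linear teacher function
beta = rng.standard_normal(d) / np.sqrt(d)
def f_star(X):
    return X @ beta

# Random feature map phi(x) = ReLU(Wx / sqrt(d)), with W fixed after sampling
W = rng.standard_normal((N, d))
def phi(X):
    return np.maximum(X @ W.T / np.sqrt(d), 0.0)

# Training data with additive noise
X_train = rng.standard_normal((n, d))
y_train = f_star(X_train) + 0.1 * rng.standard_normal(n)

# Ridge regression on the random features:
# a = argmin ||Phi a - y||^2 + lam * n * ||a||^2
Phi = phi(X_train)
a = np.linalg.solve(Phi.T @ Phi + lam * n * np.eye(N), Phi.T @ y_train)

# Monte Carlo estimate of the test error on fresh samples
X_test = rng.standard_normal((2000, d))
test_err = np.mean((phi(X_test) @ a - f_star(X_test)) ** 2)
print(f"RFRR test error estimate: {test_err:.4f}")
```

Varying N, n, and lam in a sketch like this is how the double-descent and scaling-law curves mentioned above are typically traced out empirically; the deterministic equivalent in the talk predicts these curves in closed form from the feature map eigenvalues.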

This is based on joint work with Basil Saeed (Stanford), Leonardo Defilippis (ENS), and Bruno Loureiro (ENS).