BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//CERN//INDICO//EN
BEGIN:VEVENT
SUMMARY:Séminaire des doctorant·es SO
DTSTART:20251209T101500Z
DTEND:20251209T111500Z
DTSTAMP:20260425T184300Z
UID:indico-event-14459@indico.math.cnrs.fr
DESCRIPTION:Speakers: Iyad Zekhnini (IMT)\, Lucas Monteiro (IMT & ONERA)\,
  Jeremy Boyer (IMT)\n\nLucas Monteiro (IMT & ONERA)\nReliability-based Sha
 pley effects estimation with Normalizing Flows\nWhen studying critical s
 ystems\, such as those used in nuclear or aerospace engineering\, it is cr
 ucial to understand why and how failures occur. At the intersection of sen
 sitivity analysis and reliability analysis lies reliability sensitivity an
 alysis\, which aims to quantify the role played by each variable in the oc
 currence of a failure. In cases where the variables are correlated\, relia
 bility-based Shapley indices are used. However\, their estimation is lim
 ited to systems of dimension less than 10. We propose a new estimation
  scheme that extends these indices to higher dimensions using Normalizin
 g Flows\, a powerful tool derived from generative modeling. In addition
 \, to manage the dimension\, we use a particular formulation of Shapley
  indices as an expectation over permutations\, which allows approximatio
 n. To quantify the error made by the different approximations\, we propo
 se a procedure providing bounds. Finally\, we illustrate the performance
  of our method on numerical applications.\n\nIyad Zekhnini (IMT)\nHow to
  handle non-convex optimization proble
 ms to obtain concentration guarantees?\nTraining machine learning models
  often amounts to minimizing the error between the observed data and the
  model's predictions. Such a problem is called Empirical Risk Minimizati
 on\, and
  a huge literature provides guarantees that the solutions of this problem 
 get close to the ones of the True Risk (i.e. the expected error over the u
 nknown data distribution) for convex losses. However\, many modern machine
  learning models lead to non-convex risks (e.g. neural networks)\, for whi
 ch very few concentration guarantees exist. Recent works have made progr
 ess in this direction by relying on restrictive assumptions on the geome
 try of the risk functions. In this talk\, we will provide concentration
  results
  for the empirical risk minimizers in the non-convex setting by combining 
 tools from functional analysis and non-parametric statistics.\n\nJeremy Bo
 yer (IMT)\nNon-stationary empirical processes: Detection and Estimation
  of time-dependent mixtures\nIt is standard in inferential statistics to
  ass
 ume that our sample $X_1\, ...\, X_n$ consists of independent and identica
 lly distributed random variables. The objective of this work is to remove 
 the assumption of stationarity of the distribution while preserving indepe
 ndence. We assume that $X_i$ has distribution $\\mu_{i/n}$ and we are inte
 rested\, for a given class of functions $\\mathcal{F}$\, in the empirical 
 measure $P_n(f)=n^{-1}\\sum_{i=1}^n f(X_i)$. Under regularity assumptions 
 on $t \\mapsto \\mu_t$ and entropy conditions for $\\mathcal{F}$\, we obta
 in the weak convergence of the associated empirical process\, $\\Gamma_n=\
 \sqrt{n}(P_n-\\overline{\\mu}_n)$\, centered at $\\overline{\\mu}_n=n^{-1}
 \\sum_{i=1}^n \\mu_{i/n}$\, towards an explicit Gaussian process. The aim 
 of this presentation is to apply these results to the study of time-depend
 ent mixture models. We propose a test to determine the order of the mixtur
 e and a procedure to estimate the mixture coefficients.\n\nhttps://indic
 o.math.cnrs.fr/event/14459/
LOCATION:Salle K. Johnson (1R3\, 1er étage)
URL:https://indico.math.cnrs.fr/event/14459/
END:VEVENT
END:VCALENDAR
