Mitja Briscik: Improvement of variables interpretability in kernel PCA
Kernel methods have proven to be a powerful tool for the integration and analysis of data generated by high-throughput technologies. Kernels offer a nonlinear version of any linear algorithm that relies solely on dot products. The kernelized version of Principal Component Analysis is a valid nonlinear alternative for tackling the nonlinearity of biological sample spaces. However, the black-box nature of kernel PCA calls for new methods to interpret the original features. We propose a novel methodology to obtain a data-driven feature importance based on the kernel PCA representation of the data. The proposed method, kernel PCA Interpretable Gradient (KPCA-IG), provides a computationally fast solution based solely on linear-algebra calculations. It demonstrates its effectiveness in selecting influential variables in high-dimensional, high-throughput datasets, potentially unravelling new biological and medical biomarkers.
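To illustrate the general idea (a minimal sketch in the spirit of the abstract, not the authors' exact KPCA-IG procedure), one can rank features by the magnitude of the gradient of the kernel principal components with respect to the inputs. For an RBF kernel this gradient has a closed form, so the whole computation reduces to linear algebra. The function name, the averaging over training points, and all parameter choices below are illustrative assumptions.

```python
import numpy as np

def kpca_feature_importance(X, gamma=1.0, n_components=2):
    """Illustrative gradient-based feature importance for RBF kernel PCA.

    Ranks features by the average magnitude of the gradient of the
    leading kernel principal components, evaluated at the training points.
    (Sketch only; not the authors' exact KPCA-IG algorithm.)
    """
    n, p = X.shape
    # RBF kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # Centre the kernel matrix in feature space
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # Leading eigenpairs -> normalised dual coefficients alpha
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    alpha = V[:, idx] / np.sqrt(w[idx])
    # Component j at x: f_j(x) = sum_i alpha_ij k(x_i, x), whose gradient
    # for the RBF kernel is sum_i alpha_ij * 2*gamma*(x_i - x)*k(x_i, x)
    importance = np.zeros(p)
    for t in range(n):
        grad_k = 2 * gamma * (X - X[t]) * K[:, t][:, None]   # (n, p)
        grads = alpha.T @ grad_k                             # (n_components, p)
        importance += np.abs(grads).sum(axis=0)
    return importance / n
```

On data whose structure is driven by a few variables, the features with the largest average gradient magnitude are the ones most responsible for the nonlinear embedding.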
Léo Andéol: Confident Object Detection via Conformal Prediction: an Application to Railway Signaling
Deploying deep learning models in real-world certified systems requires the ability to provide confidence estimates that accurately reflect their uncertainty. In this paper, we demonstrate the use of the conformal prediction framework to construct reliable and trustworthy predictors for detecting railway signals. Our approach relies on a novel dataset of images taken from the perspective of a train operator, together with state-of-the-art object detectors. We test several conformal approaches and introduce a new method based on conformal risk control. Our findings demonstrate the potential of the conformal prediction framework for evaluating model performance, and provide practical guidance for achieving formally guaranteed uncertainty bounds.
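As a hedged illustration of the split-conformal idea behind such guarantees (not the paper's conformal-risk-control method or its dataset), one can calibrate a single margin that, when added to every predicted bounding box, covers the ground-truth box with a prescribed probability. All function names and box conventions below are assumptions for the sketch.

```python
import numpy as np

def conformal_box_margin(pred_boxes, true_boxes, alpha=0.1):
    """Split-conformal margin for bounding boxes (illustrative sketch).

    Boxes are (n, 4) arrays in (x1, y1, x2, y2) format. The nonconformity
    score of each calibration pair is the largest per-coordinate error;
    the returned quantile q guarantees that inflating predictions by q
    covers at least a 1 - alpha fraction of calibration boxes.
    """
    n = len(pred_boxes)
    scores = np.max(np.abs(pred_boxes - true_boxes), axis=1)
    # Finite-sample-corrected conformal quantile
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    return np.quantile(scores, level, method="higher")

def inflate(box, q):
    """Enlarge a (x1, y1, x2, y2) box by the conformal margin q."""
    x1, y1, x2, y2 = box
    return np.array([x1 - q, y1 - q, x2 + q, y2 + q])
```

Under the usual exchangeability assumption, a fresh test box is then covered by the inflated prediction with probability at least 1 - alpha; the paper's conformal-risk-control variant targets richer notions of risk than this simple coverage.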
Iain Henderson: Physics-informed random fields. Application to Kriging
Many functions of interest represent physical quantities. As such, they may be constrained by known physical laws, which typically take the form of partial differential equations (PDEs). In parallel, when a function to be estimated is unknown or partially known, it is possible to view it as a realization (or sample path) of some random field, a Gaussian process for instance. A natural question is thus the following: how can we choose a random field prior so that its realizations respect a given PDE constraint as much as possible? Moreover, such a choice should be compatible with the theoretical framework in which the properties of the PDE are usually formulated (e.g. Sobolev spaces and weak formulations). In this poster, we present a result describing the general second-order random fields whose sample paths verify a given general linear PDE, in a precise sense. This result is phrased solely in terms of the first two moments of the process. We then describe explicit covariance functions tailored to the three-dimensional wave equation (in the sense of the result previously mentioned), and present how such kernels can be used for efficient "physics-informed" Kriging (also known as Gaussian process regression).
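For readers unfamiliar with Kriging itself, the following is a minimal generic sketch of Gaussian process regression with a squared-exponential kernel. The poster's wave-equation-tailored covariances are not reproduced here; swapping the `rbf` function below for such a physics-informed kernel is exactly the point of the approach. Names and parameters are illustrative assumptions.

```python
import numpy as np

def rbf(x1, x2, ell=1.0):
    """Squared-exponential covariance on 1-D inputs (unit prior variance)."""
    return np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / ell**2)

def krige(x_train, y_train, x_test, ell=1.0, noise=1e-6):
    """Basic Kriging (GP regression): posterior mean and variance.

    A physics-informed version would replace `rbf` with a covariance
    whose sample paths satisfy the PDE of interest.
    """
    K = rbf(x_train, x_train, ell) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train, ell)
    # Cholesky-based solve for numerical stability
    L = np.linalg.cholesky(K)
    coef = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ coef
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - np.sum(v**2, axis=0)   # prior variance is 1 for this kernel
    return mean, var
```

The computational cost is dominated by the Cholesky factorization of the training covariance matrix; the "efficient" aspect mentioned in the abstract comes from the explicit closed-form structure of the proposed kernels.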