Geometry and Statistics in Data Sciences, Paris
From Monday, September 5, 2022 (08:30) to Friday, December 9, 2022 (18:30)
Monday, September 5, 2022
09:00
High Dimensional Statistics
09:00 - 10:30
CARGESE PRE-SCHOOL
11:00
Geometric Statistics
11:00 - 12:30
CARGESE PRE-SCHOOL
14:00
Topological Data Analysis
14:00 - 15:30
CARGESE PRE-SCHOOL
Tuesday, September 6, 2022
09:00
High Dimensional Statistics
09:00 - 10:30
CARGESE PRE-SCHOOL
11:00
Geometric Statistics
11:00 - 12:30
CARGESE PRE-SCHOOL
14:00
Topological Data Analysis
14:00 - 15:30
CARGESE PRE-SCHOOL
Wednesday, September 7, 2022
09:00
High Dimensional Statistics
09:00 - 10:30
CARGESE PRE-SCHOOL
11:00
Geometric Statistics
11:00 - 12:30
CARGESE PRE-SCHOOL
14:00
Topological Data Analysis
14:00 - 15:30
CARGESE PRE-SCHOOL
Thursday, September 8, 2022
09:00
High Dimensional Statistics
09:00 - 10:30
CARGESE PRE-SCHOOL
11:00
Geometric Statistics
11:00 - 12:30
CARGESE PRE-SCHOOL
14:00
Topological Data Analysis
14:00 - 15:30
CARGESE PRE-SCHOOL
16:00
Topological Data Analysis
16:00 - 17:30
CARGESE PRE-SCHOOL
Friday, September 9, 2022
09:00
High Dimensional Statistics
09:00 - 10:30
CARGESE PRE-SCHOOL
11:00
Geometric Statistics
11:00 - 12:30
CARGESE PRE-SCHOOL
Saturday, September 10, 2022
Sunday, September 11, 2022
Monday, September 12, 2022
Tuesday, September 13, 2022
Wednesday, September 14, 2022
15:30
Welcome coffee, 2nd floor
15:30 - 16:30
Thursday, September 15, 2022
09:00
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (1/9)
09:00 - 10:40
11:00
Wolfgang Polonik - Statistical Topological Data Analysis (1/9)
11:00 - 12:40
Room: Amphitheater Darboux
Friday, September 16, 2022
Saturday, September 17, 2022
Sunday, September 18, 2022
Monday, September 19, 2022
Tuesday, September 20, 2022
Wednesday, September 21, 2022
10:00
Joseph Yukich - Asymptotic Analysis of Statistics of Random Geometric Structures (1/2)
10:00 - 12:00
Room: Amphitheater Darboux
14:00
Joseph Yukich - Asymptotic Analysis of Statistics of Random Geometric Structures (2/2)
14:00 - 16:00
Room: Amphitheater Darboux
Thursday, September 22, 2022
09:00
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (2/9)
09:00 - 10:40
Room: Amphitheater Darboux
11:00
Wolfgang Polonik - Statistical Topological Data Analysis (2/9)
11:00 - 12:40
Room: Amphitheater Darboux
Friday, September 23, 2022
Saturday, September 24, 2022
Sunday, September 25, 2022
Monday, September 26, 2022
Tuesday, September 27, 2022
14:00
Nicolas Charon - A Few Applications of Geometric Measure Theory to Shape Analysis (1/2)
14:00 - 16:00
Room: Amphitheater Darboux
Wednesday, September 28, 2022
10:00
Nicolas Charon - A Few Applications of Geometric Measure Theory to Shape Analysis (2/2)
10:00 - 12:00
Room: Amphitheater Darboux
14:00
Quentin Mérigot - Optimal Transport (1/8)
14:00 - 15:40
Room: Amphitheater Darboux
16:00
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (1/8)
16:00 - 17:40
Room: Amphitheater Darboux
Thursday, September 29, 2022
09:00
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (3/9)
09:00 - 10:40
Room: Amphitheater Darboux
11:00
Wolfgang Polonik - Statistical Topological Data Analysis (3/9)
11:00 - 12:40
Room: Amphitheater Darboux
Friday, September 30, 2022
Saturday, October 1, 2022
Sunday, October 2, 2022
Monday, October 3, 2022
13:45
Introduction
13:45 - 14:00
14:00
Sophie Langer - Overcoming the curse of dimensionality with deep neural networks
14:00 - 15:00
Although the application of deep neural networks to real-world problems has become ubiquitous, the question of why they are so effective has not yet been satisfactorily answered. However, some progress has been made in establishing an underlying mathematical foundation. This talk surveys results on statistical risk bounds of deep neural networks. In particular, we focus on the question of when neural networks bypass the curse of dimensionality. Here we discuss results for vanilla feedforward and convolutional neural networks as well as regression and classification settings.
15:00
Adeline Fermanian - Scaling ResNets in the Large-depth Regime
15:00 - 16:00
Deep ResNets are recognized for achieving state-of-the-art results in complex machine learning tasks. However, the remarkable performance of these architectures relies on a training procedure that needs to be carefully crafted to avoid vanishing or exploding gradients, particularly as the depth $L$ increases. No consensus has been reached on how to mitigate this issue, although a widely discussed strategy consists in scaling the output of each layer by a factor $\alpha_L$. We show in a probabilistic setting that with standard i.i.d. initializations, the only non-trivial dynamics is for $\alpha_L = 1/\sqrt{L}$ (other choices lead either to explosion or to identity mapping). This scaling factor corresponds in the continuous-time limit to a neural stochastic differential equation, contrary to a widespread interpretation that deep ResNets are discretizations of neural ordinary differential equations. By contrast, in the latter regime, stability is obtained with specific correlated initializations and $\alpha_L = 1/L$. Our analysis suggests a strong interplay between scaling and regularity of the weights as a function of the layer index. Finally, in a series of experiments, we exhibit a continuous range of regimes driven by these two parameters, which jointly impact performance before and after training.
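To make the three regimes tangible, here is a minimal NumPy sketch (illustrative only, not the authors' code; the width, depth and tanh activation are arbitrary choices) contrasting $\alpha_L \in \{1,\ 1/\sqrt{L},\ 1/L\}$ under standard i.i.d. initialization:

```python
import numpy as np

rng = np.random.default_rng(0)
d, L = 64, 200  # width and depth (illustrative values)
# Standard i.i.d. Gaussian initialization with variance 1/d per entry.
weights = [(rng.normal(0.0, 1.0 / np.sqrt(d), (d, d)),
            rng.normal(0.0, 1.0 / np.sqrt(d), (d, d))) for _ in range(L)]
x = rng.normal(size=d)

def resnet_forward(x, weights, alpha):
    """Residual updates h_{l+1} = h_l + alpha * V_l @ tanh(W_l @ h_l)."""
    h = x.copy()
    for W, V in weights:
        h = h + alpha * V @ np.tanh(W @ h)
    return h

for alpha in (1.0, 1.0 / np.sqrt(L), 1.0 / L):  # explosion / diffusive / near-identity
    ratio = np.linalg.norm(resnet_forward(x, weights, alpha)) / np.linalg.norm(x)
    print(f"alpha = {alpha:.2e}: ||h_L|| / ||h_0|| = {ratio:.3g}")
```

With $\alpha_L = 1$ the output norm keeps growing with depth, with $\alpha_L = 1/L$ the network stays close to the identity map, and $\alpha_L = 1/\sqrt{L}$ gives the non-degenerate intermediate regime described above.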
16:30
Mikhail Belkin - Neural networks, wide and deep, singular kernels and Bayes optimality
16:30 - 17:30
Wide and deep neural networks are used in many important practical settings. In this talk I will discuss some aspects of width and depth related to optimization and generalization. I will first discuss what happens when neural networks become infinitely wide, giving a general result for the transition to linearity (i.e., showing that neural networks become linear functions of their parameters) for a broad class of wide neural networks corresponding to directed graphs. I will then proceed to the question of depth, showing equivalence between infinitely wide and deep fully connected networks trained with gradient descent and Nadaraya-Watson predictors based on certain singular kernels. Using this connection we show that for certain activation functions these wide and deep networks are (asymptotically) optimal for classification but, interestingly, never for regression. Based on joint work with Chaoyue Liu, Adit Radhakrishnan, Caroline Uhler and Libin Zhu.
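For orientation, the "transition to linearity" refers to the first-order expansion of the network in its parameters,
\[
f(x; w) \;\approx\; f(x; w_0) + \nabla_w f(x; w_0)^{\top}(w - w_0),
\]
becoming exact in the infinite-width limit (the Hessian contribution vanishes), so that gradient descent effectively performs linear regression in a fixed tangent-kernel feature space.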
Tuesday, October 4, 2022
09:00
Clément Berenfeld - Understanding the geometry of high-dimensional data through the reach
09:00 - 10:00
In high-dimensional statistics, and more particularly in manifold learning, the reach is a ubiquitous regularity parameter that captures how well-behaved the support of the underlying probability measure is. Enforcing a reach constraint is, in most geometric inference tasks, a necessity, which raises the question of the estimability of this parameter. We will try to understand how the reach relates to many other important geometric invariants and propose an estimation strategy that relies on estimating the intrinsic metric of the data. (Joint work with Eddie Aamari and Clément Levrard)
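For reference, the reach of a closed set $M \subset \mathbb{R}^D$ is, following Federer,
\[
\tau_M \;=\; \sup\bigl\{\, r \ge 0 \,:\, \text{every } x \text{ with } \mathrm{d}(x, M) < r \text{ has a unique nearest point in } M \,\bigr\},
\]
equivalently the distance from $M$ to its medial axis; a small reach signals either high curvature or a near self-intersection (bottleneck) of the support.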
10:00
Wolfgang Polonik - Topologically penalized regression on manifolds
10:00 - 11:00
We study a regression problem on a compact manifold. In order to take advantage of the underlying geometry and topology of the data, we propose to perform the regression task on the basis of eigenfunctions of the Laplace-Beltrami operator of the manifold that are regularized with topological penalties. We will discuss the approach and the penalties, provide some supporting theory, and demonstrate the performance of the methodology on some data sets, illustrating the relevance of our approach in the case where the target function is "topologically smooth". This is joint work with O. Hacquard, K. Balasubramanian, G. Blanchard and C. Levrard.
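As a schematic of this class of estimators, the sketch below performs penalized least squares on a Laplacian eigenbasis; the generic per-coefficient weights stand in for the topological penalties of the talk, which are not reproduced here.

```python
import numpy as np

def eigenbasis_regression(eigvecs, y, k=20, pen_weights=None, lam=0.1):
    """Regress y on the first k (graph or manifold) Laplacian eigenvectors,
    with a quadratic penalty lam * sum_j w_j * beta_j**2.  The weights w_j
    are a placeholder for the topology-driven penalties of the talk."""
    Phi = eigvecs[:, :k]                      # n x k design matrix
    w = np.ones(k) if pen_weights is None else np.asarray(pen_weights)[:k]
    beta = np.linalg.solve(Phi.T @ Phi + lam * np.diag(w), Phi.T @ y)
    return Phi @ beta, beta                   # fitted values, coefficients
```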
11:30
John Harlim - Leveraging the RBF operator estimation for manifold learning
11:30 - 12:30
I will discuss the radial-basis function pointwise and weak formulations for approximating Laplacians on functions and vector fields based on randomly sampled point cloud data, whose spectral properties are relevant to manifold learning. For the pointwise formulation, I will demonstrate the importance of the novel local tangent estimation that accounts for the curvature, which crucially improves the quality of the operator estimation. I will report the spectral theoretical convergence results of these formulations and their strengths/weaknesses in practice. Supporting numerical examples, involving the spectral estimation of the Laplace-Beltrami operator and various vector Laplacians such as the Bochner, Hodge, and Lichnerowicz Laplacians, will be demonstrated with appropriate comparisons to the standard graph-based approaches.
14:30
Marina Meila - Manifold Learning, Explanations and Eigenflows
14:30 - 15:30
This talk will extend Manifold Learning in two directions. First, we ask if it is possible, in the case of scientific data where quantitative prior knowledge is abundant, to explain a data manifold by new coordinates, chosen from a set of scientifically meaningful functions. Second, we ask how popular Manifold Learning tools and their applications can be recreated in the space of vector fields and flows on a manifold. Central to this approach is the order-1 Laplacian of a manifold, $\Delta_1$, whose eigen-decomposition into gradient, harmonic, and curl components, known as the Helmholtz-Hodge Decomposition, provides a basis for all vector fields on a manifold. We present an estimator for $\Delta_1$, and based on it we develop a variety of applications. Among them are visualization of the principal harmonic, gradient or curl flows on a manifold, smoothing and semi-supervised learning of vector fields, and 1-Laplacian regularization. In topological data analysis, we describe the first-order analogue of spectral clustering, which amounts to prime manifold decomposition. Furthermore, from this decomposition a new algorithm for finding shortest independent loops follows. The algorithms are illustrated on a variety of real data sets. Joint work with Yu-Chia Chen, Samson Koelle, Hanyu Zhang and Ioannis Kevrekidis.
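For context, on 1-forms the operator and the associated Helmholtz-Hodge decomposition read
\[
\Delta_1 = d_0 d_0^{*} + d_1^{*} d_1,
\qquad
\Omega^1(M) = \operatorname{im} d_0 \,\oplus\, \ker \Delta_1 \,\oplus\, \operatorname{im} d_1^{*},
\]
the three summands being the gradient, harmonic, and curl components respectively, where $d_0$ is the differential on functions and $d_1$ the differential on 1-forms.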
15:30
Franck Picard - A Probabilistic Graph Coupling View of Dimension Reduction
15:30 - 16:30
Dimension reduction is a standard task in machine learning, used to reduce the complexity of and represent the data at hand. Many (and more than many!) methods have been proposed for this purpose, among which the seminal principal component analysis (PCA), which approximates the data linearly with a reduced number of axes. In recent years, the field has witnessed the emergence of new nonlinear methods, like the Stochastic Neighbor Embedding method (SNE) and the Uniform Manifold Approximation and Projection method (UMAP), which produce very efficient low-dimensional representations of the observations. Though widely used, these approaches lack clear probabilistic foundations that would enable a full understanding of their properties and limitations. A common feature of these techniques is that they minimize a cost between input and latent pairwise similarities, but the generative model is still missing. In this work we introduce a unifying statistical framework based on the coupling of hidden graphs using cross entropy. These graphs induce a Markov random field dependency structure among the observations in both input and latent spaces. We show that existing pairwise similarity dimension reduction methods can be retrieved from our framework with particular choices of priors for the graphs. Moreover, this reveals that these methods suffer from a statistical deficiency that explains poor performance in conserving coarse-grain dependencies. Our model is leveraged and extended to address this issue, while new links are drawn with Laplacian eigenmaps and PCA.
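As a reminder of the shared cost structure (a minimal symmetric-SNE-style sketch; the graph-coupling framework of the talk generalizes the construction of both similarity matrices):

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def gaussian_affinities(X, sigma=1.0):
    """Pairwise Gaussian similarities, normalized to a probability over pairs."""
    P = np.exp(-squareform(pdist(X, "sqeuclidean")) / (2.0 * sigma**2))
    np.fill_diagonal(P, 0.0)
    return P / P.sum()

def coupling_cost(X, Z, sigma=1.0, eps=1e-12):
    """Cross-entropy between input (P) and latent (Q) pairwise similarities:
    the kind of cost SNE-type embeddings minimize over the embedding Z."""
    P, Q = gaussian_affinities(X, sigma), gaussian_affinities(Z, sigma)
    return float(-(P * np.log(Q + eps)).sum())
```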
17:00
Alexander Cloninger - Learning on and near Low-Dimensional Subsets of the Wasserstein Manifold
17:00 - 18:00
Detecting differences and building classifiers between distributions $\{\mu_i\}_{i=1}^N$, given only finite samples, are important tasks in a number of scientific fields. Optimal transport (OT) has evolved as the most natural concept to measure the distance between distributions, and has gained significant importance in machine learning in recent years. There are some drawbacks to OT: computing OT can be slow, and because OT is a distance metric, it only yields a pairwise distance matrix between distributions rather than embedding those distributions into a vector space. If we make no assumptions on the family of distributions, these drawbacks are difficult to overcome. However, in the case that the measures are generated by push-forwards by elementary transformations, forming a low-dimensional submanifold of the Wasserstein manifold, we can deal with both of these issues on a theoretical and a computational level. In this talk, we'll show how to embed the space of distributions into a Hilbert space via linearized optimal transport (LOT), and how linear techniques can be used to classify different families of distributions generated by elementary transformations and perturbations. The proposed framework significantly reduces both the computational effort and the required training data in supervised settings. Similarly, we'll demonstrate the ability to learn a near isometric embedding of the low-dimensional submanifold. Finally, we'll provide non-asymptotic bounds on the error induced in both the supervised and unsupervised algorithms from finitely sampling the target distributions and projecting the LOT Hilbert space into a finite dimensional subspace. We demonstrate the algorithms in pattern recognition tasks in imaging and provide some medical applications.
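In one dimension the linearized OT embedding is explicit, since the Monge map from a reference measure is the quantile map; the sketch below (illustrative only, not the talk's general algorithm) embeds a translated family and recovers the pairwise $W_2$ distances as Euclidean distances:

```python
import numpy as np

def lot_embedding(samples, ref):
    """LOT embedding of a 1-D empirical measure: the Monge map from the
    reference to the measure, i.e. the quantile map F_mu^{-1} o F_ref,
    evaluated at the reference's quantile levels."""
    levels = (np.arange(len(ref)) + 0.5) / len(ref)
    return np.quantile(samples, levels)

rng = np.random.default_rng(1)
ref = rng.normal(size=500)                                    # reference measure
mus = [rng.normal(loc=m, size=300) for m in (0.0, 1.0, 2.0)]  # translated family
E = np.stack([lot_embedding(s, ref) for s in mus])
# Normalized Euclidean distances between embeddings approximate W_2;
# for pure translations they equal the translation offsets (here ~1 and ~2).
n = E.shape[1]
print(np.linalg.norm(E[0] - E[1]) / np.sqrt(n),
      np.linalg.norm(E[0] - E[2]) / np.sqrt(n))
```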
18:00
Cocktail
18:00 - 20:30
Wednesday, October 5, 2022
09:00
Claudia Strauch - On high-dimensional Lévy-driven Ornstein–Uhlenbeck processes
09:00 - 10:00
We investigate the problem of estimating the drift parameter of a high-dimensional Lévy-driven Ornstein–Uhlenbeck process under sparsity constraints. It is shown that both Lasso and Slope estimators achieve the minimax optimal rate of convergence (up to numerical constants), for tuning parameters chosen independently of the confidence level. The results are non-asymptotic and hold both in probability and conditional expectation with respect to an event resembling the restricted eigenvalue condition. Based on joint work with Niklas Dexheimer.
10:00
Botond Szabo - Linear methods for nonlinear inverse problems
10:00 - 11:00
We consider recovering an unknown function f from a noisy observation of the solution u to a partial differential equation, where for the elliptic differential operator L, the map L(u) can be written as a function of u and f, under Dirichlet boundary condition. A particular example is the time-independent Schrödinger equation. We transform this problem into the linear inverse problem of recovering L(u), and show that Bayesian methods for this problem may yield optimal recovery rates not only for u, but also for f. The prior distribution may be placed on u or its elliptic operator. Adaptive priors are shown to yield adaptive contraction rates for f, thus eliminating the need to know the smoothness of this function. Known results on uncertainty quantification for the linear problem transfer to f as well. The results are illustrated by several numerical simulations. This is a joint work with Geerten Koers and Aad van der Vaart.
11:30
Judith Rousseau - Bayesian nonparametric estimation of a density living near an unknown manifold
11:30 - 12:30
In high dimensions it is common to assume that the data have a lower dimensional structure. In this work we consider observations that are i.i.d. with a distribution whose support is concentrated near a lower dimensional manifold. Neither the manifold nor the density is known. A typical example is that of noisy observations on an unknown low dimensional manifold. We consider a family of Bayesian nonparametric density estimators based on location-scale Gaussian mixture priors and we study the asymptotic properties of the posterior distribution. Our work shows in particular that non-conjugate location-scale Gaussian mixture models can adapt to complex geometries and spatially varying regularity. This talk will also review the various aspects of Gaussian mixtures for density estimation. Joint work with Clément Berenfeld (Dauphine) and Paul Rosa (Oxford).
Thursday, October 6, 2022
09:00
Denis Belomestny - Dimensionality reduction in reinforcement learning by randomisation
09:00 - 10:00
In reinforcement learning an agent interacts with an environment, whose underlying mechanism is unknown, by sequentially taking actions, receiving rewards, and transitioning to the next state. With the goal of maximizing the expected sum of the collected rewards, the agent must carefully balance between exploring in order to gather more information about the environment and exploiting the current knowledge to collect the rewards. In this talk, we are interested in solving this exploration-exploitation dilemma by injecting noise into the agent’s decision-making process in such a way that the dependence of the regret on the dimension of state and action spaces is minimised. We also review some recent approaches towards dimension reduction in RL.
10:00
Gilles Blanchard - Stein effect for estimating many vector means: a "blessing of dimensionality" phenomenon
10:00 - 11:00
Consider the problem of joint estimation of the means for a large number of distributions in R^d using separate, independent data sets from each of them, sometimes also called the "multi-task averaging" problem. We propose an improved estimator (compared to the naive empirical means of each data set) to exploit possible similarities between means, without any related information being known in advance. First, for each data set, similar or neighboring means are determined from the data by multiple testing. Then each naive estimator is shrunk towards the local average of its neighbors. We prove that this approach provides a reduction in mean squared error that can be significant when the (effective) dimensionality of the data is large, and when the unknown means exhibit structure such as clustering or concentration on a low-dimensional set. This is directly linked to the fact that the separation distance for testing is smaller than the estimation error in high dimension, and generalizes the well-known James-Stein phenomenon. An application of this approach is the estimation of multiple kernel mean embeddings, which plays an important role in many modern applications. (This is based on joint work with Hannah Marienwald and Jean-Baptiste Fermanian)
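A schematic of the test-then-shrink idea (illustrative only; the actual thresholds, weights and multiple-testing procedure are those of the paper):

```python
import numpy as np

def test_then_shrink(means, n, d, tau=2.0):
    """Detect neighboring tasks by pairwise testing, then shrink each
    empirical mean toward the average over its detected neighborhood.
    The testing separation ~ d**0.25 / sqrt(n) is much smaller than the
    estimation error ~ sqrt(d / n) in high dimension: that gap is the
    source of the improvement."""
    means = np.asarray(means, dtype=float)     # shape (K, d)
    thr = tau * d**0.25 / np.sqrt(n)
    out = np.empty_like(means)
    for i, mu in enumerate(means):
        neigh = np.linalg.norm(means - mu, axis=1) <= thr   # includes task i
        out[i] = means[neigh].mean(axis=0)
    return out
```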
11:30
Nicolas Verzelen - Optimal Permutation Estimation in Crowd-Sourcing problems
11:30 - 12:30
Motivated by crowd-sourcing applications, we consider a model where we have partial observations from a bivariate isotonic $n\times d$ matrix with an unknown permutation $\pi^*$ acting on its rows. We consider the twin problems of recovering the permutation $\pi^*$ and estimating the unknown matrix. We introduce a polynomial-time procedure achieving the minimax risk for these two problems, this for all possible values of $n$, $d$, and all possible sampling efforts. Along the way, we establish that, in some regimes, recovering the unknown permutation $\pi^*$ is considerably simpler than estimating the matrix. This is based on a joint work with Alexandra Carpentier (U. Potsdam) and Emmanuel Pilliat (U. Montpellier).
14:30
Claire Lacour - On the use of overfitting for estimator selection
14:30 - 15:30
In this talk we consider the problem of estimator selection. In the case of density estimation, we study a method called PCO, which is intermediate between Lepski's method and penalized empirical risk minimization. The key point is the comparison of all the estimators to the overfitted one. We provide some theoretical results which lead to a fully data-driven selection strategy. We will also show the numerical performance of the method. This is a joint work with P. Massart, V. Rivoirard and S. Varet.
15:30
Peter Bartlett - The Dynamics of Sharpness-Aware Minimization
15:30 - 16:30
Optimization methodology has been observed to affect statistical performance in high-dimensional prediction problems, and there has been considerable effort devoted to understanding the behavior of optimization methods and the nature of solutions that they find. We consider Sharpness-Aware Minimization (SAM), a gradient-based optimization method that has exhibited performance improvements over gradient descent on image and language prediction problems using deep networks. We show that when SAM is applied with a convex quadratic objective, for most random initializations it converges to oscillating between either side of the minimum in the direction with the largest curvature, and we provide bounds on the rate of convergence. In the non-quadratic case, we show that such oscillations encourage drift toward wider minima by effectively performing gradient descent, on a slower time scale, on the spectral norm of the Hessian. (Based on joint work with Olivier Bousquet and Phil Long)
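For reference, the SAM iteration analyzed here takes the form
\[
w_{t+1} \;=\; w_t \;-\; \eta\, \nabla L\!\Bigl( w_t + \rho\, \frac{\nabla L(w_t)}{\lVert \nabla L(w_t) \rVert} \Bigr),
\]
with step size $\eta$ and perturbation radius $\rho$; the oscillation described above takes place along the eigenvector associated with the largest eigenvalue of the Hessian.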
Friday, October 7, 2022
09:00
Johannes Schmidt-Hieber - A statistical analysis of an image classification problem
09:00 - 10:00
The availability of massive image databases resulted in the development of scalable machine learning methods such as convolutional neural networks (CNNs) for filtering and processing these data. While the very recent theoretical work on CNNs focuses on standard nonparametric denoising problems, the variability in image classification datasets does, however, not originate from additive noise but from variation of the shape and other characteristics of the same object across different images. To address this problem, we consider a simple supervised classification problem for object detection on grayscale images. While from the function estimation point of view, every pixel is a variable and large images lead to high-dimensional function recovery tasks suffering from the curse of dimensionality, increasing the number of pixels in our image deformation model enhances the image resolution and makes the object classification problem easier. We propose and theoretically analyze two different procedures. The first method estimates the image deformation by support alignment. Under a minimal separation condition, it is shown that perfect classification is possible. The second method fits a CNN to the data. We derive a rate for the misclassification error depending on the sample size and the number of pixels. Both classifiers are empirically compared on images generated from the MNIST handwritten digit database. The obtained results corroborate the theoretical findings. This is joint work with Sophie Langer (Twente).
10:00
Damien Garreau - What does LIME really see in images?
10:00 - 11:00
The performance of modern algorithms on certain computer vision tasks such as object recognition is now close to that of humans. This success was achieved at the price of complicated architectures depending on millions of parameters, and it has become quite challenging to understand how particular predictions are made. Interpretability methods propose to give us this understanding. In this talk, we study LIME, perhaps one of the most popular such methods. On the theoretical side, we show that when the number of generated examples is large, LIME explanations are concentrated around a limit explanation for which we give an explicit expression. We further this study for elementary shape detectors and linear models. As a consequence of this analysis, we uncover a connection between LIME and integrated gradients, another explanation method. More precisely, the LIME explanations are similar to the sum of integrated gradients over the superpixels used in the preprocessing step of LIME.
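To fix ideas, here is a bare-bones LIME-style procedure for images (a sketch under simplifying assumptions: binary on/off superpixel masks, zero replacement for masked superpixels, a Gaussian proximity kernel, plain weighted least squares; the model `f` and the segmentation `segments` are user-supplied):

```python
import numpy as np

def lime_image(f, image, segments, n_samples=1000, kernel_width=0.25):
    """Perturb superpixels on/off, query the model f on each perturbed image,
    and fit a weighted linear surrogate; its coefficients are the
    per-superpixel explanations whose large-sample limit the talk studies."""
    rng = np.random.default_rng(0)
    d = int(segments.max()) + 1                  # number of superpixels
    Z = rng.integers(0, 2, size=(n_samples, d))  # random on/off masks
    preds = np.array([f(image * z[segments][..., None]) for z in Z])
    # Exponential kernel: samples closer to the full image weigh more.
    w = np.exp(-(1.0 - Z.mean(axis=1)) ** 2 / kernel_width**2)
    Zc = np.hstack([np.ones((n_samples, 1)), Z]) # intercept + mask features
    A = Zc.T @ (w[:, None] * Zc)                 # weighted normal equations
    beta = np.linalg.solve(A, Zc.T @ (w * preds))
    return beta[1:]                              # one coefficient per superpixel
```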
11:30
Vasiliki Velona - Learning a partial correlation graph using only a few covariance queries
11:30 - 12:30
In settings where the covariance matrix is too large to even store, we would like to learn the partial correlation graph with as few covariance queries as possible (in a partial correlation graph, an edge exists if the corresponding entry in the inverse covariance matrix is non-zero). In recent work with Gabor Lugosi, Jakub Truszkowski, and Piotr Zwiernik, we showed that it is possible to use only a quasi-linear number of queries if the inverse covariance matrix is sparse enough, in the sense that the partial correlation graph resembles a tree on a global scale. I will explain these results and discuss extensions and applications.
Saturday, October 8, 2022
Sunday, October 9, 2022
Monday, October 10, 2022
13:30
Stefan Sommer - Diffusion means in geometric statistics
13:30 - 14:30
Analysis and statistics of shape variation and, more generally, manifold valued data can be formulated probabilistically with geodesic distances between shapes exchanged with (-log)likelihoods. This leads to new statistics and estimation algorithms. One example is the notion of diffusion mean. In the talk, I will discuss the motivation behind and construction of diffusion means and discuss properties of the mean, including reduced smeariness when estimating diffusion variance together with the mean. This happens both in the isotropic setting with trivial covariance, and in the anisotropic setting where variance is fitted in all directions. I will connect this to most probable paths to data and algorithms for computing diffusion means, particularly bridge sampling algorithms. Finally, we will discuss ways of sampling the diffusion mean directly by conditioning on the diagonal of product manifolds, thereby avoiding the computationally expensive iterative optimization that is often applied for computing means on manifolds.
14:30
Nina Miolane - Geomstats: a Python package for Geometric Machine Learning
14:30 - 15:30
We introduce Geomstats, an open-source Python package for computations and statistics on nonlinear manifolds that appear in machine learning applications, such as: hyperbolic spaces, spaces of symmetric positive definite matrices, Lie groups of transformations, and many more. We provide object-oriented and extensively unit-tested implementations. Manifolds come equipped with families of Riemannian metrics with associated exponential and logarithmic maps, geodesics, and parallel transport. Statistics and learning algorithms provide methods for estimation, regression, classification, clustering, and dimension reduction on manifolds. All associated operations provide support for different execution backends --- namely NumPy, Autograd, PyTorch, and TensorFlow. This talk presents the package, compares it with related libraries, and provides relevant examples. We show that Geomstats provides reliable building blocks to both foster research in differential geometry and statistics and democratize the use of (Riemannian) geometry in statistics and machine learning. The source code is freely available under the MIT license at https://github.com/geomstats/geomstats.
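A small usage sketch (hedged: the constructor signatures below match recent geomstats releases and may differ across versions):

```python
from geomstats.geometry.hypersphere import Hypersphere
from geomstats.learning.frechet_mean import FrechetMean

sphere = Hypersphere(dim=2)                 # the unit sphere S^2
points = sphere.random_uniform(n_samples=50)

mean = FrechetMean(sphere)                  # Riemannian (Fréchet) mean
mean.fit(points)
print(mean.estimate_)                       # a point on S^2

# Exponential/logarithm maps and geodesics come with the metric:
v = sphere.metric.log(points[0], base_point=mean.estimate_)
```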
16:00
Yusu Wang - Weisfeiler-Lehman Meets Gromov-Wasserstein
16:00 - 17:00
The Weisfeiler-Lehman (WL) test is a classical procedure for graph isomorphism testing. The WL test has also been widely used both for designing graph kernels and for analyzing graph neural networks. In this talk, I will describe the so-called Weisfeiler-Lehman (WL) distance we recently introduced, which is a new notion of distance between labeled measure Markov chains (LMMCs), of which labeled graphs are special cases. The WL distance extends the WL test (in the sense that the former is positive if and only if the WL test can distinguish the two involved graphs) while at the same time it is polynomial time computable. It is also more discriminating than the distance between graphs used for defining the Wasserstein Weisfeiler-Lehman graph kernel. Inspired by the structure of the WL distance we identify a neural network architecture on LMMCs which turns out to be universal w.r.t. continuous functions defined on the space of all LMMCs (which includes all graphs) endowed with the WL distance. Furthermore, the WL distance turns out to be stable w.r.t. a natural variant of the Gromov-Wasserstein (GW) distance for comparing metric Markov chains that we identify. Hence, the WL distance can also be construed as a polynomial time lower bound for the GW distance which is in general NP-hard to compute. This is joint work with Samantha Chen, Sunhyuk Lim, Facundo Memoli and Zhengchao Wan.
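As background, the classical 1-WL color refinement that the WL distance extends can be stated in a few lines (a sketch for plain unlabeled graphs; the talk's construction works on labeled measure Markov chains):

```python
from collections import Counter

def wl_histogram(adj, rounds=3):
    """1-Weisfeiler-Lehman refinement: repeatedly hash each vertex's color
    together with the sorted multiset of its neighbors' colors.  Two graphs
    with different color histograms are certified non-isomorphic."""
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        colors = {v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
                  for v in adj}
    return Counter(colors.values())

path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}    # path on 4 vertices
star = {0: [1, 2, 3], 1: [0], 2: [0], 3: [0]}    # star on 4 vertices
print(wl_histogram(path) == wl_histogram(star))  # False: WL separates them
```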
Tuesday, October 11, 2022
09:00
Martin Bauer - Elastic Shape Analysis of Surfaces
09:00 - 10:00
10:00
Eric Klassen - The Square Root Normal Field and Unbalanced Optimal Transport
10:00 - 11:00
The Square Root Normal Field (SRNF) is a distance function on shape spaces of surfaces in R^3. Unbalanced Optimal Transport (UOT) is a variant of Optimal Transport in which mass is allowed to expand and contract as it is transported from one point to another. In this talk (joint work of Bauer, Hartman and Klassen) we discuss an unexpected relation between the SRNF distance for oriented surfaces in R^3 and UOT for Borel measures on S^2.
11:30
Steve Oudot - Optimization in topological data analysis
11:30 - 12:30
This talk will give an overview of the line of work on optimization for topological data analysis, from the initial attempts at differentiating the persistent homology operator, to the recent adaptations of stochastic gradient descent and gradient sampling.
14:30
Omer Bobrowski - Universality in Random Persistence Diagrams
14:30 - 15:30
One of the most elusive challenges within the area of topological data analysis is understanding the distribution of persistence diagrams. Despite much effort, this is still largely an open problem. In this talk we will present a series of conjectures regarding the behavior of persistence diagrams arising from random point-clouds. We claim that, viewed in the right way, persistence values obey a universal probability law, that depends on neither the underlying space nor the original distribution of the point-cloud. We back these conjectures with an exhaustive set of experiments, including both simulated and real data. We will also discuss some heuristic explanations for the possible sources of this phenomenon. Finally, we will demonstrate the power of these conjectures by proposing a new hypothesis testing framework for computing significance values for individual features within persistence diagrams. This is joint work with Primoz Skraba (QMUL).
15:30
Kathryn Hess - Morse-theoretic signal compression and reconstruction
15:30 - 16:30
In this lecture I will present work of three of my PhD students, Stefania Ebli, Celia Hacker, and Kelly Maggs, on cellular signal processing. In the usual paradigm, the signals on a simplicial or chain complex are processed using the combinatorial Laplacian and the resultant Hodge decomposition. On the other hand, discrete Morse theory has been widely used to speed up computations, by reducing the size of complexes while preserving their global topological properties. Ebli, Hacker, and Maggs have developed an approach to signal compression and reconstruction on chain complexes that leverages the tools of algebraic discrete Morse theory, which provides a method to reduce and reconstruct a based chain complex together with a set of signals on its cells via deformation retracts, preserving as much as possible the global topological structure of both the complex and the signals. It turns out that any deformation retract of real degreewise finite-dimensional based chain complexes is equivalent to a Morse matching. Moreover, in the case of certain interesting Morse matchings, the reconstruction error is trivial, except on one specific component of the Hodge decomposition. Finally, the authors developed and implemented an algorithm to compute Morse matchings with minimal reconstruction error, of which I will show explicit examples.
18:30
Cocktail (Tour Zamansky, Jussieu)
18:30 - 20:30
Wednesday, October 12, 2022
09:00
Johannes Krebs - On the law of the iterated logarithm and Bahadur representation in stochastic geometry
09:00 - 10:00
Room: Amphitheater Darboux
We study the law of the iterated logarithm and a related strong invariance principle for certain functionals in stochastic geometry. The underlying point process is either a homogeneous Poisson process or a binomial process. Moreover, requiring the functional to be a sum of so-called stabilizing score functionals enables us to derive a Bahadur representation for sample quantiles. The scores are obtained from a homogeneous Poisson process. We also study local fluctuations of the corresponding empirical distribution function and apply the results to trimmed and Winsorized means of the scores. As potential applications, we think of well-known functionals defined on the k-nearest neighbors graph and important functionals in topological data analysis such as the Euler characteristic and persistent Betti numbers as well as statistics defined on Poisson-Voronoi tessellations.
10:00
Katharine Turner - The Extended Persistent Homology Transform for Manifolds with Boundary
10:00 - 11:00
The Persistent Homology Transform (PHT) is a topological transform which can be used to quantify the difference between subsets of Euclidean space. To each unit vector the transform assigns the persistence module of the height function over that shape with respect to that direction. The PHT is injective on piecewise-linear subsets of Euclidean space, and it has been demonstrably useful in diverse applications. One shortcoming is that shapes with different essential homology (i.e., Betti numbers) have an infinite distance between them. The theory of extended persistence for Morse functions on a manifold was developed by Cohen-Steiner, Edelsbrunner and Harer in 2009 to quantify the support of the essential homology classes. By using extended persistence modules of height functions over a shape, we obtain the extended persistent homology transform (XPHT), which provides a finite distance between shapes even when they have different Betti numbers. I will discuss how the XPHT of a manifold with boundary can be deduced from the XPHT of the boundary, which allows for efficient calculation. James Morgan has implemented the required algorithms for 2-dimensional binary images as a forthcoming R package. This is also joint work with Vanessa Robins.
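Schematically, the transform assigns to each direction the persistence of the corresponding height function,
\[
\mathrm{PHT}(M)\colon S^{d-1} \to \mathrm{Dgm}, \qquad v \mapsto \mathrm{Dgm}\bigl(h_v\bigr), \quad h_v(x) = \langle x, v \rangle \ \text{for } x \in M,
\]
and the XPHT replaces $\mathrm{Dgm}(h_v)$ by the extended persistence diagram, so that essential classes also receive finite (birth, death) pairs.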
11:30
Heather Harrington - The shape of data in biology
11:30 - 12:30
TBA
14:30
Frédéric Barbaresco - Symplectic Foliation Model of Information Geometry for Statistics and Learning on Lie Groups
14:30 - 15:30
We present a new symplectic model of Information Geometry [1,2] based on Jean-Marie Souriau's Lie Groups Thermodynamics [3,4]. Souriau's model was initially described in chapter IV, "Statistical Mechanics", of his book "Structure of dynamical systems" published in 1969. This model gives a purely geometric characterization of Entropy, which appears as an invariant Casimir function in coadjoint representation, characterized by Poisson cohomology. Souriau proved that we can associate a symplectic manifold to coadjoint orbits of a Lie group by the KKS 2-form (Kirillov, Kostant, Souriau 2-form) in the affine case (affine model of coadjoint operator equivariance via Souriau's cocycle) [5], which we have identified with the Koszul-Fisher metric from Information Geometry. Souriau established the generalized Gibbs density covariant under the action of the Lie group. The dual space of the Lie algebra foliates into coadjoint orbits that are also the Entropy level sets; this could be interpreted in the framework of Thermodynamics by the fact that dynamics on these symplectic leaves are non-dissipative, whereas transversal dynamics, given by the Poisson transverse structure, are dissipative. We will finally introduce the Gaussian distribution on the space of Symmetric Positive Definite (SPD) matrices, through Souriau's covariant Gibbs density, by considering this space as the pure imaginary axis of the homogeneous Siegel upper half space, where Sp(2n,R)/U(n) acts transitively. We will also consider the Gibbs density for the Siegel Disk, where SU(n,n)/S(U(n)xU(n)) acts transitively. The Gauss density of SPD matrices is then computed through Souriau's moment map and coadjoint orbits. Souriau's Lie Groups Thermodynamics model will be further explored in the European COST network CaLISTA [6] and the European HORIZON-MSCA project CaLIGOLA [7].
15:30
Victor Patrangenaru - Geometry, Topology and Statistics on Object Spaces
15:30 - 16:30
Thursday, October 13, 2022
09:00
Nicolas Charon - Registration of shape graphs with partial matching constraints
09:00 - 10:00
This talk will discuss an extension of the elastic curve registration framework to a general class of geometric objects which we call (weighted) shape graphs, allowing in particular the comparison and matching of 1D geometric data that are partially observed or that exhibit certain topological inconsistencies. Specifically, we generalize the class of second-order invariant Sobolev metrics on the space of unparametrized curves to weighted shape graphs by modelling such objects as varifolds (i.e. directional measures) and combining geometric deformations with a transformation process on the varifold weights. This leads us to introduce a new class of variational problems, show the existence of solutions and derive a specific numerical scheme to tackle the corresponding discrete optimization problems.
10:00
Irène Kaltenmark - Curves and surfaces. Partial matching in the space of varifolds.
10:00 - 11:00
The matching of analogous shapes is a central problem in computational anatomy. However, inter-individual variability, pathological anomalies or acquisition methods sometimes challenge the assumption of global homology between shapes. In this talk, I will present an asymmetric data attachment term characterizing the inclusion of one shape in another. This term is based on projection on the nearest neighbor with respect to the metrics of varifold spaces. Varifolds are representations of geometric objects, including curves and surfaces. Their specificity is to take into account the tangent spaces of these objects and to be robust to the choice of parametrization. This new data attachment term extends the scope of application of the pre-existing methods of matching by large diffeomorphic deformations (LDDMM). The partial registration is indeed induced by a diffeomorphic deformation of the source shape. The anatomical (topological) characteristics of this shape are thus preserved. This is a joint work with Pierre-Louis Antonsanti and Joan Glaunès.
11:30
Herbert Edelsbrunner - Chromatic Delaunay mosaics for chromatic point data
11:30 - 12:30
The chromatic Delaunay mosaic of s+1 finite sets in d dimensions is an (s+d)-dimensional Delaunay mosaic that represents the individual sets as well as their interactions. For example, it contains a (non-standard) dual of the overlay of the Voronoi tessellations of any subset of the s+1 colors. We prove bounds on the size of the chromatic Delaunay mosaic, in the worst and average case, and suggest how to use image, kernel, and cokernel persistence to get stable diagrams describing the interaction of the points of different colors. Acknowledgements. This is incomplete and ongoing joint work with Ranita Biswas, Sebastiano Cultrera, Ondrej Draganov, and Morteza Saghafian, all at IST Austria.
14:30
Claire Brecheteau - Approximating data with a union of ellipsoids and clustering
14:30 - 15:30
I will introduce a surrogate for the distance function to the support of a distribution, whose sublevel sets are unions of balls or of ellipsoids. I will present different results, including rates of convergence for the approximation of these surrogates by their empirical versions, built from point clouds. I will explain how to use such estimators to cluster data with a geometric structure. The results have been published in the papers [1,2], and further developments are in progress.
15:30
Dominique Attali - Reconstructing manifolds by weighted $\ell_1$-norm minimization
15:30 - 16:30
In many practical situations, the shape of interest is only known through a finite set of data points. Given as input those data points, it is then natural to try to construct a triangulation of the shape, that is, a set of simplices whose union is homeomorphic to the shape. This problem has given rise to many research works in the computational geometry community, motivated by applications to 3D model reconstruction and manifold learning. In this talk, we focus on one particular instance of the shape reconstruction problem, in which the shape we wish to reconstruct is an orientable smooth $d$-manifold embedded in $\mathbb{R}^N$. We reformulate the problem of searching for a triangulation as a convex minimization problem, whose objective function is a weighted $\ell_1$-norm. I will then present the result in \cite{socg2022} which says that, under appropriate conditions, the solution of our minimization problem is indeed a triangulation of the manifold and that this triangulation coincides with a variant of the tangential Delaunay complex. This is a joint work with André Lieutier.
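Schematically (the precise constraints, weights and hypotheses are those of the cited work), the relaxation is a convex program over chains of an ambient simplicial complex $\mathcal{K}$,
\[
\min_{x}\ \sum_{\sigma \in \mathcal{K}} w_\sigma\, \lvert x_\sigma \rvert
\quad \text{subject to linear constraints selecting } d\text{-chains compatible with the data},
\]
whose minimizer is shown, under the stated sampling and reach conditions, to be the indicator chain of the claimed triangulation.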
Friday, October 14, 2022
09:00
Barbara Gris - Defining Data-Driven Deformation Models
09:00 - 10:00
Studying shapes through large deformations allows one to define a metric on a space of shapes from a metric on a space of deformations. When the set of considered deformations is not relevant to the observed data, the geodesic paths for this metric can be deceiving from a modelling point of view. To overcome this issue, the notion of deformation module allows one to incorporate priors coming from the data in the set of considered deformations and the metric. I will present this framework, as well as the IMODAL library, which enables registration through such structured deformations. This Python library is modular: adapted priors can be easily defined by the user, several priors can be combined into a global one, and various types of data can be considered, such as curves, meshes or images. This is a joint work with Benjamin Charlier, Leander Lacroix and Alain Trouvé.
10:00
Laurent Younes - Stochastic Gradient Descent for Large-Scale LDDMM
10:00 - 11:00
11:30
Stephen Preston - Isometric immersions and the waving of flags
11:30 - 12:30
A physical flag can be modeled geometrically as an isometric immersion of a rectangle into space, with one edge fixed along the flagpole. Its motion, in the absence of gravity and wind, can be modeled as a geodesic in the space of all isometric immersions, where the Riemannian metric is inherited from the kinetic energy on the much larger space of all immersions. In this talk I will show how, generically, such an isometric immersion can be described completely by the curve describing the top or bottom edge, which gives a global version of a classical local result in differential geometry. Using this, I will show how to derive the geodesic equation, which turns out to be a highly nonlinear, nonlocal coupled system of two wave equations in one space variable, with tension determined by solving an ODE system. The new model has the potential to describe the motion of cloth with far fewer variables than the traditional method of strongly constraining three functions of two space variables. This is joint work with Martin Bauer and Jakob Moeller-Andersen.
Saturday, October 15, 2022
Sunday, October 16, 2022
Monday, October 17, 2022
Tuesday, October 18, 2022
14:00
Mikhail Belkin - Mathematical Aspects of Deep Learning (1/2)
14:00 - 15:30
Room: Amphitheater Darboux
16:00
Yusu Wang - Some Theoretical Aspects of Graph Neural Networks (and Higher Order Variants) (1/2)
16:00 - 17:30
Room: Amphitheater Darboux
Wednesday, October 19, 2022
09:00
Mikhail Belkin - Mathematical Aspects of Deep Learning (2/2)
09:00 - 10:30
Room: Amphitheater Darboux
11:00
Yusu Wang - Some Theoretical Aspects of Graph Neural Networks (and Higher Order Variants) (2/2)
11:00 - 12:30
Room: Amphitheater Darboux
14:00
Quentin Mérigot - Optimal Transport (2/8)
14:00 - 15:40
Room: Amphitheater Darboux
16:00
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (2/8)
16:00 - 17:40
Room: Amphitheater Darboux
Thursday, October 20, 2022
09:00
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (4/9)
09:00 - 10:40
Room: Amphitheater Darboux
11:00
Wolfgang Polonik - Statistical Topological Data Analysis (4/9)
11:00 - 12:40
Room: Amphitheater Darboux
Friday, October 21, 2022
Saturday, October 22, 2022
Sunday, October 23, 2022
Monday, October 24, 2022
Tuesday, October 25, 2022
Wednesday, October 26, 2022
14:00
Quentin Mérigot - Optimal Transport (3/8)
14:00 - 15:40
Room: Amphitheater Darboux
16:00
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (3/8)
16:00 - 17:40
Sessions of October 26th and November 2nd on Zoom only
Thursday, October 27, 2022
09:00
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (5/9)
09:00 - 10:40
Room: Amphitheater Darboux
11:00
Wolfgang Polonik - Statistical Topological Data Analysis (5/9)
11:00 - 12:40
Room: Amphitheater Darboux
Friday, October 28, 2022
Saturday, October 29, 2022
Sunday, October 30, 2022
Monday, October 31, 2022
Tuesday, November 1, 2022
Wednesday, November 2, 2022
14:00
Quentin Mérigot - Optimal Transport (4/8)
14:00 - 15:40
Room: Amphitheater Darboux
16:00
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (4/8)
16:00 - 17:40
Sessions of October 26th and November 2nd on Zoom only
Thursday, November 3, 2022
09:00
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (6/9)
09:00 - 10:40
Room: Amphitheater Darboux
11:00
Wolfgang Polonik - Statistical Topological Data Analysis (6/9)
11:00 - 12:40
Room: Amphitheater Darboux
Friday, November 4, 2022
Saturday, November 5, 2022
Sunday, November 6, 2022
Monday, November 7, 2022
Tuesday, November 8, 2022
09:00
Welcome
09:00 - 09:15
Room: Amphitheater Darboux
09:15
Introduction to the day (AMIES & GeSDA Semester)
09:15 - 09:30
09:30
Baptiste Labarthe (Metafora) - Topological data analysis: promising applications to cytometry data and medical diagnosis
09:30 - 10:00
10:00
Klervi Le Gall (UN) - Principal Component Analysis on the space of functions valued in the manifold of 3-dimensional rotations: application to assessing ambulatory deficit in multiple sclerosis patients
10:00 - 10:25
10:25
Rémi Perrichon (ENAC) - Statistics and geometry for the analysis of aircraft trajectories
10:25 - 10:50
10:50
Break
10:50 - 11:15
11:15
Round table with Frédéric Barbaresco (Thales), Nicolas Bousquet (EDF) and Stéphanie Allassonnière (Univ. Paris Cité)
11:15 - 12:30
12:30
Buffet
12:30 - 14:00
14:00
Posters
14:00 - 15:00
15:00
Joint presentation of the GeomStats, Gudhi and TTK libraries and platforms
15:00 - 17:00
17:15
GeomStats workshop
17:15 - 18:00
Gudhi workshop
17:15 - 18:00
TTK workshop
17:15 - 18:00
Wednesday, November 9, 2022
14:00
Quentin Mérigot - Optimal Transport (5/8)
14:00 - 15:40
Room: Amphitheater Darboux
16:00
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (5/8)
16:00 - 17:40
Room: Amphitheater Darboux
Thursday, November 10, 2022
09:00
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (7/9)
09:00 - 10:40
Room: Amphitheater Darboux
11:00
Wolfgang Polonik - Statistical Topological Data Analysis (7/9)
11:00 - 12:40
Room: Amphitheater Darboux
Friday, November 11, 2022
Saturday, November 12, 2022
Sunday, November 13, 2022
Monday, November 14, 2022
Tuesday, November 15, 2022
14:30
Kathryn Hess - Topological Approaches to Neuroscience (1/2)
14:30 - 16:00
Room: Amphitheater Darboux
Wednesday, November 16, 2022
10:00
Kathryn Hess - Topological Approaches to Neuroscience (2/2)
10:00 - 11:30
Room: Amphitheater Darboux
14:00
Quentin Mérigot - Optimal Transport (6/8)
14:00 - 15:40
Room: Amphitheater Darboux
16:00
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (6/8)
16:00 - 17:40
Room: Amphitheater Darboux
Thursday, November 17, 2022
09:00
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (8/9)
09:00 - 10:40
Room: Amphitheater Darboux
11:00
Wolfgang Polonik - Statistical Topological Data Analysis (8/9)
11:00 - 12:40
Room: Amphitheater Darboux
Friday, November 18, 2022
Saturday, November 19, 2022
Sunday, November 20, 2022
Monday, November 21, 2022
13:00
Registration
13:00 - 14:00
14:00
Arthur Gretton
14:00 - 15:00
KALE Flow: A Relaxed KL Gradient Flow for Probabilities with Disjoint Support
15:00
Blanche Buet
15:00 - 16:00
A varifold perspective on discrete surfaces
16:30
Christophe Ley
16:30 - 17:30
Advances in statistics via tools from Stein's Method
Tuesday, November 22, 2022
10:00
Théo Lacombe
10:00 - 11:00
TBA
11:30
Bodhisattva Sen
11:30 - 12:30
Distribution-free testing for Multivariate symmetry using Optimal Transport
14:30
Johan Segers
14:30 - 15:30
Graphical and uniform consistency of estimated optimal transport plans
15:30
Gabriel Peyré
15:30 - 16:30
Unbalanced Optimal Transport across Metric Measured Spaces
17:00
Quentin Berthet
17:00 - 18:00
Mirror Sinkhorn: Fast Online Optimization on Transport Polytopes
18:00
Cocktail
18:00 - 20:00
Wednesday, November 23, 2022
10:00
Quentin Paris
10:00 - 11:00
Online learning with exponential weights in metric spaces with the measure contraction property
11:30
Giovanni Peccati
11:30 - 12:30
TBA
14:30
Agnès Desolneux
14:30 - 15:30
A Wasserstein-type distance in the space of Gaussian mixture models
15:30
Nicolas Courty
15:30 - 16:30
Sliced Wasserstein on Manifolds: Spherical and Hyperbolic cases
Thursday, November 24, 2022
13:00
Jérôme Dedecker
13:00 - 14:00
Some bounds for the Wasserstein distance between the empirical measure and the marginal distribution of a sequence of i.i.d. random variables
14:00
Bharath Sriperumbudur
14:00 - 15:00
Spectral regularized kernel two-sample tests
15:30
Jean Feydy
15:30 - 16:30
Computational Optimal Transport: mature tools and open problems
16:30
Thibaut Le Gouic
16:30 - 17:30
An Algorithmic Solution to the Blotto Game using Multi-marginal Couplings
Friday, November 25, 2022
09:00
François-Xavier Vialard
09:00 - 10:00
Statistical estimation of optimal transport potentials
10:00
Olga Mula
10:00 - 11:00
Structured prediction with sparse Wasserstein barycenters
11:30
Elsa Cazelles
11:30 - 12:30
Barycenters for probability distributions based on optimal weak mass transport
Saturday, November 26, 2022
Sunday, November 27, 2022
Monday, November 28, 2022
Tuesday, November 29, 2022
14:00
Stephen Preston - Riemannian Geometry on Lie Groups (1/2)
14:00 - 16:00
Room: Amphitheater Darboux
Wednesday, November 30, 2022
10:00
Stephen Preston - Riemannian Geometry on Lie Groups (2/2)
10:00 - 12:00
Room: Amphitheater Darboux
14:00
Quentin Mérigot - Optimal Transport (7/8)
14:00 - 15:40
Room: Amphitheater Darboux
16:00
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (7/8)
16:00 - 17:40
Room: Amphitheater Darboux
Thursday, December 1, 2022
09:00
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (9/9)
09:00 - 10:40
Room: Amphitheater Darboux
11:00
Wolfgang Polonik - Statistical Topological Data Analysis (9/9)
11:00 - 12:40
Room: Amphitheater Darboux
Friday, December 2, 2022
Saturday, December 3, 2022
Sunday, December 4, 2022
Monday, December 5, 2022
Tuesday, December 6, 2022
Wednesday, December 7, 2022
14:00
Quentin Mérigot - Optimal Transport (8/8)
14:00 - 15:40
Room: Amphitheater Darboux
16:00
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (8/8)
16:00 - 17:40
Room: Amphitheater Darboux
Thursday, December 8, 2022
Friday, December 9, 2022