Geometry and Statistics in Data Sciences, Paris
From Monday, September 5, 2022 (8:30 AM) to Friday, December 9, 2022 (6:30 PM)
Monday, September 5, 2022
9:00 AM
High Dimensional Statistics
9:00 AM - 10:30 AM
CARGESE PRE-SCHOOL
11:00 AM
Geometric Statistics
11:00 AM - 12:30 PM
CARGESE PRE-SCHOOL
2:00 PM
Topological Data Analysis
2:00 PM - 3:30 PM
CARGESE PRE-SCHOOL
Tuesday, September 6, 2022
9:00 AM
High Dimensional Statistics
9:00 AM - 10:30 AM
CARGESE PRE-SCHOOL
11:00 AM
Geometric Statistics
11:00 AM - 12:30 PM
CARGESE PRE-SCHOOL
2:00 PM
Topological Data Analysis
2:00 PM - 3:30 PM
CARGESE PRE-SCHOOL
Wednesday, September 7, 2022
9:00 AM
High Dimensional Statistics
9:00 AM - 10:30 AM
CARGESE PRE-SCHOOL
11:00 AM
Geometric Statistics
11:00 AM - 12:30 PM
CARGESE PRE-SCHOOL
2:00 PM
Topological Data Analysis
2:00 PM - 3:30 PM
CARGESE PRE-SCHOOL
Thursday, September 8, 2022
9:00 AM
High Dimensional Statistics
9:00 AM - 10:30 AM
CARGESE PRE-SCHOOL
11:00 AM
Geometric Statistics
11:00 AM - 12:30 PM
CARGESE PRE-SCHOOL
2:00 PM
Topological Data Analysis
2:00 PM - 3:30 PM
CARGESE PRE-SCHOOL
4:00 PM
Topological Data Analysis
4:00 PM - 5:30 PM
CARGESE PRE-SCHOOL
Friday, September 9, 2022
9:00 AM
High Dimensional Statistics
9:00 AM - 10:30 AM
CARGESE PRE-SCHOOL
11:00 AM
Geometric Statistics
11:00 AM - 12:30 PM
CARGESE PRE-SCHOOL
Saturday, September 10, 2022
Sunday, September 11, 2022
Monday, September 12, 2022
Tuesday, September 13, 2022
Wednesday, September 14, 2022
3:30 PM
Welcome coffee, 2nd floor
3:30 PM - 4:30 PM
Thursday, September 15, 2022
9:00 AM
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (1/9)
9:00 AM - 10:40 AM
11:00 AM
Wolfgang Polonik - Statistical Topological Data Analysis (1/9)
11:00 AM - 12:40 PM
Room: Amphitheater Darboux
Friday, September 16, 2022
Saturday, September 17, 2022
Sunday, September 18, 2022
Monday, September 19, 2022
Tuesday, September 20, 2022
Wednesday, September 21, 2022
10:00 AM
Joseph Yukich - Asymptotic Analysis of Statistics of Random Geometric Structures (1/2)
10:00 AM - 12:00 PM
Room: Amphitheater Darboux
2:00 PM
Joseph Yukich - Asymptotic Analysis of Random Geometric Structures (2/2)
2:00 PM - 4:00 PM
Room: Amphitheater Darboux
Thursday, September 22, 2022
9:00 AM
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (2/9)
9:00 AM - 10:40 AM
Room: Amphitheater Darboux
11:00 AM
Wolfgang Polonik - Statistical Topological Data Analysis (2/9)
11:00 AM - 12:40 PM
Room: Amphitheater Darboux
Friday, September 23, 2022
Saturday, September 24, 2022
Sunday, September 25, 2022
Monday, September 26, 2022
Tuesday, September 27, 2022
2:00 PM
Nicolas Charon - A Few Applications of Geometric Measure Theory to Shape Analysis (1/2)
2:00 PM - 4:00 PM
Room: Amphitheater Darboux
Wednesday, September 28, 2022
10:00 AM
Nicolas Charon - A Few Applications of Geometric Measure Theory to Shape Analysis (2/2)
10:00 AM - 12:00 PM
Room: Amphitheater Darboux
2:00 PM
Quentin Mérigot - Optimal Transport (1/8)
2:00 PM - 3:40 PM
Room: Amphitheater Darboux
4:00 PM
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (1/8)
4:00 PM - 5:40 PM
Room: Amphitheater Darboux
Thursday, September 29, 2022
9:00 AM
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (3/9)
9:00 AM - 10:40 AM
Room: Amphitheater Darboux
11:00 AM
Wolfgang Polonik - Statistical Topological Data Analysis (3/9)
11:00 AM - 12:40 PM
Room: Amphitheater Darboux
Friday, September 30, 2022
Saturday, October 1, 2022
Sunday, October 2, 2022
Monday, October 3, 2022
1:45 PM
Introduction
1:45 PM - 2:00 PM
2:00 PM
Sophie Langer - Overcoming the curse of dimensionality with deep neural networks
2:00 PM - 3:00 PM
Although the application of deep neural networks to real-world problems has become ubiquitous, the question of why they are so effective has not yet been satisfactorily answered. However, some progress has been made in establishing an underlying mathematical foundation. This talk surveys results on statistical risk bounds of deep neural networks. In particular, we focus on the question of when neural networks bypass the curse of dimensionality. Here we discuss results for vanilla feedforward and convolutional neural networks as well as regression and classification settings.
3:00 PM
Adeline Fermanian - Scaling ResNets in the Large-depth Regime
3:00 PM - 4:00 PM
Deep ResNets are recognized for achieving state-of-the-art results in complex machine learning tasks. However, the remarkable performance of these architectures relies on a training procedure that needs to be carefully crafted to avoid vanishing or exploding gradients, particularly as the depth L increases. No consensus has been reached on how to mitigate this issue, although a widely discussed strategy consists in scaling the output of each layer by a factor \alpha_L. We show in a probabilistic setting that with standard i.i.d. initializations, the only non-trivial dynamics is for \alpha_L = 1/\sqrt{L} (other choices lead either to explosion or to identity mapping). This scaling factor corresponds in the continuous-time limit to a neural stochastic differential equation, contrary to a widespread interpretation that deep ResNets are discretizations of neural ordinary differential equations. By contrast, in the latter regime, stability is obtained with specific correlated initializations and \alpha_L = 1/L. Our analysis suggests a strong interplay between scaling and regularity of the weights as a function of the layer index. Finally, in a series of experiments, we exhibit a continuous range of regimes driven by these two parameters, which jointly impact performance before and after training.
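The three depth scalings in the abstract can be illustrated with a toy simulation. The sketch below (the dimension, depth, and N(0, 1/d) initialization are illustrative choices, not taken from the talk) iterates the residual recursion and prints the output norm under each choice of \alpha_L:

```python
import numpy as np

def final_norm(L, alpha, d=64, seed=0):
    """Iterate h_{l+1} = h_l + alpha * W_l h_l for L layers with i.i.d.
    Gaussian weights (entries N(0, 1/d)) and return ||h_L||."""
    rng = np.random.default_rng(seed)
    h = np.ones(d) / np.sqrt(d)  # unit-norm input
    for _ in range(L):
        W = rng.normal(0.0, 1.0 / np.sqrt(d), size=(d, d))
        h = h + alpha * W @ h
    return float(np.linalg.norm(h))

L = 200
print(final_norm(L, 1.0))               # alpha_L = 1: the norm explodes with depth
print(final_norm(L, 1.0 / np.sqrt(L)))  # alpha_L = 1/sqrt(L): non-degenerate output
print(final_norm(L, 1.0 / L))           # alpha_L = 1/L: barely moves off the input
```

The squared norm is multiplied by roughly 1 + \alpha_L^2 per layer in expectation, which is where the three regimes come from.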
4:30 PM
Mikhail Belkin - Neural networks, wide and deep, singular kernels and Bayes optimality
4:30 PM - 5:30 PM
Wide and deep neural networks are used in many important practical settings. In this talk I will discuss some aspects of width and depth related to optimization and generalization. I will first discuss what happens when neural networks become infinitely wide, giving a general result for the transition to linearity (i.e., showing that neural networks become linear functions of parameters) for a broad class of wide neural networks corresponding to directed graphs. I will then proceed to the question of depth, showing equivalence between infinitely wide and deep fully connected networks trained with gradient descent and Nadaraya-Watson predictors based on certain singular kernels. Using this connection we show that for certain activation functions these wide and deep networks are (asymptotically) optimal for classification but, interestingly, never for regression. Based on joint work with Chaoyue Liu, Adit Radhakrishnan, Caroline Uhler and Libin Zhu.
Tuesday, October 4, 2022
9:00 AM
Clément Berenfeld - Understanding the geometry of high-dimensional data through the reach
9:00 AM - 10:00 AM
In high-dimensional statistics, and more particularly in manifold learning, the reach is a ubiquitous regularity parameter that captures how well-behaved the support of the underlying probability measure is. Enforcing a reach constraint is, in most geometric inference tasks, a necessity, which raises the question of the estimability of this parameter. We will try to understand how the reach relates to many other important geometric invariants and propose an estimation strategy that relies on estimating the intrinsic metric of the data. (Joint work with Eddie Aamari and Clément Levrard)
10:00 AM
Wolfgang Polonik - Topologically penalized regression on manifolds
10:00 AM - 11:00 AM
We study a regression problem on a compact manifold. In order to take advantage of the underlying geometry and topology of the data, we propose to perform the regression task on the basis of eigenfunctions of the Laplace-Beltrami operator of the manifold that are regularized with topological penalties. We will discuss the approach and the penalties, provide some supporting theory, and illustrate the performance of the methodology on some data sets, demonstrating the relevance of our approach in the case where the target function is "topologically smooth". This is joint work with O. Hacquard, K. Balasubramanian, G. Blanchard and C. Levrard.
11:30 AM
John Harlim - Leveraging the RBF operator estimation for manifold learning
11:30 AM - 12:30 PM
I will discuss the radial-basis function pointwise and weak formulations for approximating Laplacians on functions and vector fields based on randomly sampled point cloud data, whose spectral properties are relevant to manifold learning. For the pointwise formulation, I will demonstrate the importance of the novel local tangent estimation that accounts for the curvature, which crucially improves the quality of the operator estimation. I will report the spectral theoretical convergence results of these formulations and their strengths/weaknesses in practice. Supporting numerical examples, involving the spectral estimation of the Laplace-Beltrami operator and various vector Laplacians such as the Bochner, Hodge, and Lichnerowicz Laplacians will be demonstrated with appropriate comparisons to the standard graph-based approaches.
2:30 PM
Marina Meila - Manifold Learning, Explanations and Eigenflows
2:30 PM - 3:30 PM
This talk will extend Manifold Learning in two directions. First, we ask if it is possible, in the case of scientific data where quantitative prior knowledge is abundant, to explain a data manifold by new coordinates, chosen from a set of scientifically meaningful functions? Second, we ask how popular Manifold Learning tools and their applications can be recreated in the space of vector fields and flows on a manifold. Central to this approach is the order 1-Laplacian of a manifold, $\Delta_1$, whose eigen-decomposition into gradient, harmonic, and curl, known as the Helmholtz-Hodge Decomposition, provides a basis for all vector fields on a manifold. We present an estimator for $\Delta_1$, and based on it we develop a variety of applications. Among them, visualization of the principal harmonic, gradient or curl flows on a manifold, smoothing and semi-supervised learning of vector fields, 1-Laplacian regularization. In topological data analysis, we describe the 1st-order analogue of spectral clustering, which amounts to prime manifold decomposition. Furthermore, from this decomposition a new algorithm for finding shortest independent loops follows. The algorithms are illustrated on a variety of real data sets. Joint work with Yu-Chia Chen, Samson Koelle, Hanyu Zhang and Ioannis Kevrekidis
3:30 PM
Franck Picard - A probabilistic Graph Coupling View of Dimension Reduction
3:30 PM - 4:30 PM
Dimension reduction is a standard task in machine learning, to reduce the complexity and represent the data at hand. Many (and more than many!) methods have been proposed for this purpose, among which the seminal principal component analysis (PCA), which approximates the data linearly with a reduced number of axes. In recent years, the field has witnessed the emergence of new nonlinear methods, like the Stochastic Neighbor Embedding method (SNE) and the Uniform Manifold Approximation and Projection method (UMAP), that produce very efficient low-dimensional representations of the observations. Though widely used, these approaches lack clear probabilistic foundations to enable a full understanding of their properties and limitations. A common feature of these techniques is to be based on a minimization of a cost between input and latent pairwise similarities, but the generative model is still missing. In this work we introduce a unifying statistical framework based on the coupling of hidden graphs using cross entropy. These graphs induce a Markov random field dependency structure among the observations in both input and latent spaces. We show that existing pairwise similarity dimension reduction methods can be retrieved from our framework with particular choices of priors for the graphs. Moreover this reveals that these methods suffer from a statistical deficiency that explains poor performances in conserving coarse-grain dependencies. Our model is leveraged and extended to address this issue while new links are drawn with Laplacian eigenmaps and PCA.
5:00 PM
Alexander Cloninger - Learning on and near Low-Dimensional Subsets of the Wasserstein Manifold
5:00 PM - 6:00 PM
Detecting differences and building classifiers between distributions $\{\mu_i\}_{i=1}^N$, given only finite samples, are important tasks in a number of scientific fields. Optimal transport (OT) has evolved as the most natural concept to measure the distance between distributions, and has gained significant importance in machine learning in recent years. There are some drawbacks to OT: computing OT can be slow, and because OT is a distance metric, it only yields a pairwise distance matrix between distributions rather than embedding those distributions into a vector space. If we make no assumptions on the family of distributions, these drawbacks are difficult to overcome. However, in the case that the measures are generated by push-forwards by elementary transformations, forming a low-dimensional submanifold of the Wasserstein manifold, we can deal with both of these issues on a theoretical and a computational level. In this talk, we'll show how to embed the space of distributions into a Hilbert space via linearized optimal transport (LOT), and how linear techniques can be used to classify different families of distributions generated by elementary transformations and perturbations. The proposed framework significantly reduces both the computational effort and the required training data in supervised settings. Similarly, we'll demonstrate the ability to learn a near isometric embedding of the low-dimensional submanifold. Finally, we'll provide non-asymptotic bounds on the error induced in both the supervised and unsupervised algorithms from finitely sampling the target distributions and projecting the LOT Hilbert space into a finite dimensional subspace. We demonstrate the algorithms in pattern recognition tasks in imaging and provide some medical applications.
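In one dimension, optimal transport has a closed form via quantile functions, which makes the linearized-OT idea easy to sketch. The example below is a plain NumPy illustration (sample sizes and the quantile grid are arbitrary choices): each sample is embedded by its empirical quantile function, and the Euclidean distance between embeddings approximates the 2-Wasserstein distance, here between two translated Gaussians:

```python
import numpy as np

def lot_embedding(samples, levels):
    """1-D linearized OT: embed a sample by the values of its empirical
    quantile function at fixed levels. Euclidean distance between two
    embeddings approximates the 2-Wasserstein distance in 1-D."""
    return np.quantile(samples, levels)

rng = np.random.default_rng(0)
levels = np.linspace(0.01, 0.99, 99)  # quantile grid of the reference measure

mu = rng.normal(0.0, 1.0, 5000)  # N(0, 1)
nu = rng.normal(2.0, 1.0, 5000)  # N(2, 1): a translation of mu

e_mu, e_nu = lot_embedding(mu, levels), lot_embedding(nu, levels)
w2 = np.sqrt(np.mean((e_mu - e_nu) ** 2))  # ~ the shift, i.e. 2
print(w2)
```

Because the embedding lives in a vector space, linear methods (PCA, linear classifiers) can be applied to whole families of distributions, which is the computational point the abstract makes.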
6:00 PM
Cocktail
6:00 PM - 8:30 PM
Wednesday, October 5, 2022
9:00 AM
Claudia Strauch - On high-dimensional Lévy-driven Ornstein–Uhlenbeck processes
9:00 AM - 10:00 AM
We investigate the problem of estimating the drift parameter of a high-dimensional Lévy-driven Ornstein–Uhlenbeck process under sparsity constraints. It is shown that both Lasso and Slope estimators achieve the minimax optimal rate of convergence (up to numerical constants), for tuning parameters chosen independently of the confidence level. The results are non-asymptotic and hold both in probability and conditional expectation with respect to an event resembling the restricted eigenvalue condition. Based on joint work with Niklas Dexheimer.
10:00 AM
Botond Szabo - Linear methods for nonlinear inverse problems
10:00 AM - 11:00 AM
We consider recovering an unknown function f from a noisy observation of the solution u to a partial differential equation, where for the elliptic differential operator L, the map L(u) can be written as a function of u and f, under Dirichlet boundary condition. A particular example is the time-independent Schrödinger equation. We transform this problem into the linear inverse problem of recovering L(u), and show that Bayesian methods for this problem may yield optimal recovery rates not only for u, but also for f. The prior distribution may be placed on u or its elliptic operator. Adaptive priors are shown to yield adaptive contraction rates for f, thus eliminating the need to know the smoothness of this function. Known results on uncertainty quantification for the linear problem transfer to f as well. The results are illustrated by several numerical simulations. This is a joint work with Geerten Koers and Aad van der Vaart.
11:30 AM
Judith Rousseau - Bayesian nonparametric estimation of a density living near an unknown manifold
11:30 AM - 12:30 PM
In high dimensions it is common to assume that the data have a lower dimensional structure. In this work we consider that the observations are iid with a distribution whose support is concentrated near a lower dimensional manifold. Neither the manifold nor the density is known. A typical example is noisy observations on an unknown low dimensional manifold. We consider a family of Bayesian nonparametric density estimators based on location-scale Gaussian mixture priors and we study the asymptotic properties of the posterior distribution. Our work shows in particular that non-conjugate location-scale Gaussian mixture models can adapt to complex geometries and spatially varying regularity. This talk will also review the various aspects of Gaussian mixtures for density estimation. Joint work with Clément Berenfeld (Dauphine) and Paul Rosa (Oxford)
Thursday, October 6, 2022
9:00 AM
Denis Belomestny - Dimensionality reduction in reinforcement learning by randomisation
9:00 AM - 10:00 AM
In reinforcement learning an agent interacts with an environment, whose underlying mechanism is unknown, by sequentially taking actions, receiving rewards, and transitioning to the next state. With the goal of maximizing the expected sum of the collected rewards, the agent must carefully balance between exploring in order to gather more information about the environment and exploiting the current knowledge to collect the rewards. In this talk, we are interested in solving this exploration-exploitation dilemma by injecting noise into the agent’s decision-making process in such a way that the dependence of the regret on the dimension of state and action spaces is minimised. We also review some recent approaches towards dimension reduction in RL.
10:00 AM
Gilles Blanchard - Stein effect for estimating many vector means: a "blessing of dimensionality" phenomenon
10:00 AM - 11:00 AM
Consider the problem of joint estimation of the means for a large number of distributions in R^d using separate, independent data sets from each of them, sometimes also called the "multi-task averaging" problem. We propose an improved estimator (compared to the naive empirical means of each data set) to exploit possible similarities between means, without any related information being known in advance. First, for each data set, similar or neighboring means are determined from the data by multiple testing. Then each naive estimator is shrunk towards the local average of its neighbors. We prove that this approach provides a reduction in mean squared error that can be significant when the (effective) dimensionality of the data is large, and when the unknown means exhibit structure such as clustering or concentration on a low-dimensional set. This is directly linked to the fact that the separation distance for testing is smaller than the estimation error in high dimension and generalizes the well-known James-Stein phenomenon. An application of this approach is the estimation of multiple kernel mean embeddings, which plays an important role in many modern applications. (This is based on joint work with Hannah Marienwald and Jean-Baptiste Fermanian)
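A crude NumPy sketch of the shrink-toward-neighbors idea follows; the distance-threshold neighbor rule and the 50/50 shrinkage weight are simplifications standing in for the multiple-testing and aggregation steps of the actual method:

```python
import numpy as np

def shrink_to_neighbors(samples, tau):
    """For each task, average the naive empirical mean with the means of
    tasks whose empirical means lie within distance tau (a crude stand-in
    for the multiple-testing step)."""
    naive = np.array([x.mean(axis=0) for x in samples])
    out = np.empty_like(naive)
    for i, m in enumerate(naive):
        nbrs = naive[np.linalg.norm(naive - m, axis=1) <= tau]
        out[i] = 0.5 * m + 0.5 * nbrs.mean(axis=0)  # shrink toward local average
    return naive, out

rng = np.random.default_rng(0)
d, n, T = 50, 10, 40
true = np.zeros((T, d))  # all tasks share the same mean: a favorable case
samples = [true[t] + rng.normal(size=(n, d)) for t in range(T)]

naive, shrunk = shrink_to_neighbors(samples, tau=5.0)
mse_naive = np.mean((naive - true) ** 2)
mse_shrunk = np.mean((shrunk - true) ** 2)
print(mse_naive, mse_shrunk)  # shrinkage wins when the means are clustered
```

In high dimension, empirical means of genuinely close tasks are hard to tell apart by testing but easy to exploit by averaging, which is the gap the abstract refers to.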
11:30 AM
Nicolas Verzelen - Optimal Permutation Estimation in Crowd-Sourcing problems
11:30 AM - 12:30 PM
Motivated by crowd-sourcing applications, we consider a model where we have partial observations from a bivariate isotonic $n\times d$ matrix with an unknown permutation $\pi^*$ acting on its rows. We consider the twin problems of recovering the permutation $\pi^*$ and estimating the unknown matrix. We introduce a polynomial-time procedure achieving the minimax risk for these two problems, this for all possible values of $n$, $d$, and all possible sampling efforts. Along the way, we establish that, in some regimes, recovering the unknown permutation $\pi^*$ is considerably simpler than estimating the matrix. This is based on a joint work with Alexandra Carpentier (U. Potsdam) and Emmanuel Pilliat (U. Montpellier).
2:30 PM
Claire Lacour - On the use of overfitting for estimator selection
2:30 PM - 3:30 PM
In this talk we consider the problem of estimator selection. In the case of density estimation, we study a method called PCO, which is intermediate between Lepski's method and penalized empirical risk minimization. The key point is the comparison of all the estimators to the overfitted one. We provide some theoretical results which lead to some fully data-driven selection strategy. We will also show the numerical performance of the method. This is a joint work with P. Massart, V. Rivoirard and S. Varet.
3:30 PM
Peter Bartlett - The Dynamics of Sharpness-Aware Minimization
3:30 PM - 4:30 PM
Optimization methodology has been observed to affect statistical performance in high-dimensional prediction problems, and there has been considerable effort devoted to understanding the behavior of optimization methods and the nature of solutions that they find. We consider Sharpness-Aware Minimization (SAM), a gradient-based optimization method that has exhibited performance improvements over gradient descent on image and language prediction problems using deep networks. We show that when SAM is applied with a convex quadratic objective, for most random initializations it converges to oscillating between either side of the minimum in the direction with the largest curvature, and we provide bounds on the rate of convergence. In the non-quadratic case, we show that such oscillations encourage drift toward wider minima by effectively performing gradient descent, on a slower time scale, on the spectral norm of the Hessian. (Based on joint work with Olivier Bousquet and Phil Long)
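A minimal sketch of the SAM update on a convex quadratic (the step size, perturbation radius rho, and curvatures are arbitrary illustrative values): the low-curvature coordinate converges while the high-curvature coordinate settles into an oscillation, as the abstract describes.

```python
import numpy as np

def sam_step(w, grad, rho, lr):
    """One SAM update: ascend to the worst-case point within radius rho,
    then take a gradient step evaluated there."""
    g = grad(w)
    w_adv = w + rho * g / (np.linalg.norm(g) + 1e-12)
    return w - lr * grad(w_adv)

# Convex quadratic f(w) = 0.5 * w^T H w with curvatures 1 and 10.
H = np.diag([1.0, 10.0])
grad = lambda w: H @ w

w = np.array([1.0, 1.0])
for _ in range(500):
    w = sam_step(w, grad, rho=0.1, lr=0.05)
print(w)  # first coordinate near 0; second oscillates around the minimum
```

The oscillation amplitude in the largest-curvature direction is set by rho and the curvature, not by the initialization, which is the phenomenon the talk analyzes.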
Friday, October 7, 2022
9:00 AM
Johannes Schmidt-Hieber - A statistical analysis of an image classification problem
9:00 AM - 10:00 AM
The availability of massive image databases resulted in the development of scalable machine learning methods such as convolutional neural networks (CNNs) filtering and processing these data. While the very recent theoretical work on CNNs focuses on standard nonparametric denoising problems, the variability in image classification datasets does not, however, originate from additive noise but from variation of the shape and other characteristics of the same object across different images. To address this problem, we consider a simple supervised classification problem for object detection on grayscale images. While from the function estimation point of view, every pixel is a variable and large images lead to high-dimensional function recovery tasks suffering from the curse of dimensionality, increasing the number of pixels in our image deformation model enhances the image resolution and makes the object classification problem easier. We propose and theoretically analyze two different procedures. The first method estimates the image deformation by support alignment. Under a minimal separation condition, it is shown that perfect classification is possible. The second method fits a CNN to the data. We derive a rate for the misclassification error depending on the sample size and the number of pixels. Both classifiers are empirically compared on images generated from the MNIST handwritten digit database. The obtained results corroborate the theoretical findings. This is joint work with Sophie Langer (Twente).
10:00 AM
Damien Garreau - What does LIME really see in images?
10:00 AM - 11:00 AM
The performance of modern algorithms on certain computer vision tasks such as object recognition is now close to that of humans. This success was achieved at the price of complicated architectures depending on millions of parameters and it has become quite challenging to understand how particular predictions are made. Interpretability methods propose to give us this understanding. In this paper, we study LIME, perhaps one of the most popular. On the theoretical side, we show that when the number of generated examples is large, LIME explanations are concentrated around a limit explanation for which we give an explicit expression. We further this study for elementary shape detectors and linear models. As a consequence of this analysis, we uncover a connection between LIME and integrated gradients, another explanation method. More precisely, the LIME explanations are similar to the sum of integrated gradients over the superpixels used in the preprocessing step of LIME.
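A stripped-down sketch of the surrogate-fitting step that LIME performs over superpixels (this omits LIME's proximity kernel and feature selection; the toy black-box model f below is purely illustrative, not from the paper):

```python
import numpy as np

def lime_weights(predict, d, n_samples=2000, seed=0):
    """Fit a linear surrogate to a black-box model over random on/off
    masks of d superpixels (plain least squares, no proximity kernel)."""
    rng = np.random.default_rng(seed)
    Z = rng.integers(0, 2, size=(n_samples, d)).astype(float)  # binary masks
    y = np.array([predict(z) for z in Z])
    X = np.hstack([np.ones((n_samples, 1)), Z])  # intercept + superpixels
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1:]  # one interpretable weight per superpixel

# A toy "model" that only responds to superpixels 0 and 3.
f = lambda z: 2.0 * z[0] - 1.0 * z[3]
w = lime_weights(f, d=6)
print(np.round(w, 2))  # large weights on superpixels 0 and 3, near zero elsewhere
```

The limit explanations studied in the paper describe what this fitted weight vector concentrates around as the number of perturbed samples grows.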
11:30 AM
Vasiliki Velona - Learning a partial correlation graph using only a few covariance queries
11:30 AM - 12:30 PM
In settings where the covariance matrix is too large to even store, we would like to learn the partial correlation graph with as few covariance queries as possible (in a partial correlation graph, an edge exists if the corresponding entry in the inverse covariance matrix is non-zero). In recent work with Gabor Lugosi, Jakub Truszkowski, and Piotr Zwiernik, we showed that it is possible to use only a quasi-linear number of queries if the inverse covariance matrix is sparse enough, in the sense that the partial correlation graph resembles a tree on a global scale. I will explain these results and discuss extensions and applications.
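The definition of the partial correlation graph in the abstract can be illustrated directly in NumPy (a toy 4-variable example; the point of the work is precisely to avoid materializing and inverting the full covariance matrix):

```python
import numpy as np

# A sparse precision matrix: tridiagonal, so the graph is the path 0-1-2-3.
K = np.array([[ 2., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  2.]])
Sigma = np.linalg.inv(K)  # the covariance is dense even though K is sparse

# Recover the partial correlation graph: edge (i, j) iff K[i, j] != 0.
K_back = np.linalg.inv(Sigma)
edges = {(i, j) for i in range(4) for j in range(i + 1, 4)
         if abs(K_back[i, j]) > 1e-8}
print(edges)  # the path edges (0,1), (1,2), (2,3)
```

Inverting Sigma requires reading all of it; the quasi-linear query bound in the talk exploits the tree-like global structure of the graph to avoid exactly this.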
Saturday, October 8, 2022
Sunday, October 9, 2022
Monday, October 10, 2022
1:30 PM
Stefan Sommer - Diffusion means in geometric statistics
1:30 PM - 2:30 PM
Analysis and statistics of shape variation and, more generally, manifold valued data can be formulated probabilistically with geodesic distances between shapes replaced by (negative log-)likelihoods. This leads to new statistics and estimation algorithms. One example is the notion of diffusion mean. In the talk, I will discuss the motivation behind and construction of diffusion means and discuss properties of the mean, including reduced smeariness when estimating diffusion variance together with the mean. This happens both in the isotropic setting with trivial covariance, and in the anisotropic setting where variance is fitted in all directions. I will connect this to most probable paths to data and algorithms for computing diffusion means, particularly bridge sampling algorithms. Finally, we will discuss ways of sampling the diffusion mean directly by conditioning on the diagonal of product manifolds, thereby avoiding the computationally expensive iterative optimization that is often applied for computing means on manifolds.
2:30 PM
Nina Miolane - Geomstats: a Python package for Geometric Machine Learning
2:30 PM - 3:30 PM
We introduce Geomstats, an open-source Python package for computations and statistics on nonlinear manifolds that appear in machine learning applications, such as: hyperbolic spaces, spaces of symmetric positive definite matrices, Lie groups of transformations, and many more. We provide object-oriented and extensively unit-tested implementations. Manifolds come equipped with families of Riemannian metrics with associated exponential and logarithmic maps, geodesics, and parallel transport. Statistics and learning algorithms provide methods for estimation, regression, classification, clustering, and dimension reduction on manifolds. All associated operations provide support for different execution backends --- namely NumPy, Autograd, PyTorch, and TensorFlow. This talk presents the package, compares it with related libraries, and provides relevant examples. We show that Geomstats provides reliable building blocks to both foster research in differential geometry and statistics and democratize the use of (Riemannian) geometry in statistics and machine learning. The source code is freely available under the MIT license at https://github.com/geomstats/geomstats.
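To give a flavor of the Riemannian primitives that Geomstats packages, here is a plain NumPy sketch (deliberately not using the library itself) of the exponential and logarithm maps on the unit sphere; Geomstats exposes these operations, together with geodesics and parallel transport, for many more manifolds:

```python
import numpy as np

def sphere_exp(v, p):
    """Riemannian exponential on the unit sphere: follow the geodesic
    from base point p along tangent vector v (assumed <v, p> = 0)."""
    t = np.linalg.norm(v)
    if t < 1e-12:
        return p
    return np.cos(t) * p + np.sin(t) * v / t

def sphere_log(q, p):
    """Riemannian logarithm: the tangent vector at p pointing to q."""
    w = q - np.dot(p, q) * p  # project q onto the tangent space at p
    t = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    n = np.linalg.norm(w)
    return t * w / n if n > 1e-12 else np.zeros_like(p)

p = np.array([0.0, 0.0, 1.0])        # north pole of S^2
v = np.array([np.pi / 4, 0.0, 0.0])  # tangent vector at p
q = sphere_exp(v, p)
print(np.allclose(sphere_log(q, p), v))  # log inverts exp
```

Estimators like the Fréchet mean are built by iterating exactly these two maps, which is why a library providing them uniformly across manifolds and backends is useful.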
4:00 PM
Yusu Wang - Weisfeiler-Lehman Meets Gromov-Wasserstein
4:00 PM - 5:00 PM
The Weisfeiler-Lehman (WL) test is a classical procedure for graph isomorphism testing. The WL test has also been widely used both for designing graph kernels and for analyzing graph neural networks. In this talk, I will describe the so-called Weisfeiler-Lehman (WL) distance we recently introduced, which is a new notion of distance between labeled measure Markov chains (LMMCs), of which labeled graphs are special cases. The WL distance extends the WL test (in the sense that the former is positive if and only if the WL test can distinguish the two involved graphs) while at the same time it is polynomial time computable. It is also more discriminating than the distance between graphs used for defining the Wasserstein Weisfeiler-Lehman graph kernel. Inspired by the structure of the WL distance we identify a neural network architecture on LMMCs which turns out to be universal w.r.t. continuous functions defined on the space of all LMMCs (which includes all graphs) endowed with the WL distance. Furthermore, the WL distance turns out to be stable w.r.t. a natural variant of the Gromov-Wasserstein (GW) distance for comparing metric Markov chains that we identify. Hence, the WL distance can also be construed as a polynomial time lower bound for the GW distance which is in general NP-hard to compute. This is joint work with Samantha Chen, Sunhyuk Lim, Facundo Memoli and Zhengchao Wan.
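The WL test itself is short to implement. The sketch below (a generic color-refinement routine, not code from the talk) demonstrates it on the classic example that WL cannot distinguish: a 6-cycle versus two disjoint triangles.

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """Weisfeiler-Lehman color refinement: repeatedly relabel each node by
    its color together with the multiset of its neighbors' colors, and
    return the final color histogram."""
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
               for v in adj}
        relabel = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        colors = {v: relabel[sig[v]] for v in adj}
    return Counter(colors.values())

C6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}  # one 6-cycle
CC3 = {0: [1, 2], 1: [0, 2], 2: [0, 1],                 # two triangles
       3: [4, 5], 4: [3, 5], 5: [3, 4]}
P6 = {i: [j for j in (i - 1, i + 1) if 0 <= j < 6] for i in range(6)}  # a path

print(wl_colors(C6) == wl_colors(CC3))  # True: WL cannot separate these
print(wl_colors(C6) == wl_colors(P6))   # False: WL separates cycle from path
```

Two graphs are WL-indistinguishable exactly when refinement yields the same histograms at every round, which is the "positive iff WL distinguishes" property the WL distance extends to labeled measure Markov chains.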
Tuesday, October 11, 2022
9:00 AM
Martin Bauer - Elastic Shape Analysis of Surfaces
9:00 AM - 10:00 AM
10:00 AM
Eric Klassen - The Square Root Normal Field and Unbalanced Optimal Transport
10:00 AM - 11:00 AM
The Square Root Normal Field (SRNF) is a distance function on shape spaces of surfaces in R^3. Unbalanced Optimal Transport (UOT) is a variant of Optimal Transport in which mass is allowed to expand and contract as it is transported from one point to another. In this talk (joint work of Bauer, Hartman and Klassen) we discuss an unexpected relation between the SRNF distance for oriented surfaces in R^3 and UOT for Borel measures on S^2.
11:30 AM
Steve Oudot - Optimization in topological data analysis
11:30 AM - 12:30 PM
This talk will give an overview of the line of work on optimization for topological data analysis, from the initial attempts at differentiating the persistent homology operator, to the recent adaptations of stochastic gradient descent and gradient sampling.
2:30 PM
Omer Bobrowski - Universality in Random Persistence Diagrams
2:30 PM - 3:30 PM
One of the most elusive challenges within the area of topological data analysis is understanding the distribution of persistence diagrams. Despite much effort, this is still largely an open problem. In this talk we will present a series of conjectures regarding the behavior of persistence diagrams arising from random point-clouds. We claim that, viewed in the right way, persistence values obey a universal probability law that depends on neither the underlying space nor the original distribution of the point-cloud. We back these conjectures with an exhaustive set of experiments, including both simulated and real data. We will also discuss some heuristic explanations for the possible sources of this phenomenon. Finally, we will demonstrate the power of these conjectures by proposing a new hypothesis testing framework for computing significance values for individual features within persistence diagrams. This is joint work with Primoz Skraba (QMUL).
3:30 PM
Kathryn Hess - Morse-theoretic signal compression and reconstruction
3:30 PM - 4:30 PM
In this lecture I will present work of three of my PhD students, Stefania Ebli, Celia Hacker, and Kelly Maggs, on cellular signal processing. In the usual paradigm, the signals on a simplicial or chain complex are processed using the combinatorial Laplacian and the resultant Hodge decomposition. On the other hand, discrete Morse theory has been widely used to speed up computations, by reducing the size of complexes while preserving their global topological properties. Ebli, Hacker, and Maggs have developed an approach to signal compression and reconstruction on chain complexes that leverages the tools of algebraic discrete Morse theory, which provides a method to reduce and reconstruct a based chain complex together with a set of signals on its cells via deformation retracts, preserving as much as possible the global topological structure of both the complex and the signals. It turns out that any deformation retract of real degreewise finite-dimensional based chain complexes is equivalent to a Morse matching. Moreover, in the case of certain interesting Morse matchings, the reconstruction error is trivial, except on one specific component of the Hodge decomposition. Finally, the authors developed and implemented an algorithm to compute Morse matchings with minimal reconstruction error, of which I will show explicit examples.
6:30 PM
Cocktail (Tour Zamansky, Jussieu)
6:30 PM - 8:30 PM
Wednesday, October 12, 2022
9:00 AM
Johannes Krebs - On the law of the iterated logarithm and Bahadur representation in stochastic geometry
9:00 AM - 10:00 AM
Room: Amphitheater Darboux
We study the law of the iterated logarithm and a related strong invariance principle for certain functionals in stochastic geometry. The underlying point process is either a homogeneous Poisson process or a binomial process. Moreover, requiring the functional to be a sum of so-called stabilizing score functionals enables us to derive a Bahadur representation for sample quantiles. The scores are obtained from a homogeneous Poisson process. We also study local fluctuations of the corresponding empirical distribution function and apply the results to trimmed and Winsorized means of the scores. As potential applications, we think of well-known functionals defined on the k-nearest neighbors graph and important functionals in topological data analysis such as the Euler characteristic and persistent Betti numbers as well as statistics defined on Poisson-Voronoi tessellations.
10:00 AM
Katharine Turner - The Extended Persistent Homology Transform for Manifolds with Boundary
10:00 AM - 11:00 AM
The Persistent Homology Transform (PHT) is a topological transform which can be used to quantify the difference between subsets of Euclidean space. To each unit vector the transform assigns the persistence module of the height function over that shape with respect to that direction. The PHT is injective on piecewise-linear subsets of Euclidean space, and it has been demonstrably useful in diverse applications. One shortcoming is that shapes with different essential homology (i.e., Betti numbers) have an infinite distance between them. The theory of extended persistence for Morse functions on a manifold was developed by Cohen-Steiner, Edelsbrunner and Harer in 2009 to quantify the support of the essential homology classes. By using extended persistence modules of height functions over a shape, we obtain the extended persistent homology transform (XPHT), which provides a finite distance between shapes even when they have different Betti numbers. I will discuss how the XPHT of a manifold with boundary can be deduced from the XPHT of the boundary, which allows for efficient calculation. James Morgan has implemented the required algorithms for 2-dimensional binary images as a forthcoming R package. This is also joint work with Vanessa Robins.
11:30 AM
Heather Harrington - The shape of data in biology
11:30 AM - 12:30 PM
TBA
2:30 PM
Frédéric Barbaresco - Symplectic Foliation Model of Information Geometry for Statistics and Learning on Lie Groups
2:30 PM - 3:30 PM
We present a new symplectic model of Information Geometry [1,2] based on Jean-Marie Souriau's Lie Groups Thermodynamics [3,4]. Souriau's model was initially described in chapter IV, "Statistical Mechanics", of his book "Structure of Dynamical Systems" published in 1969. This model gives a purely geometric characterization of Entropy, which appears as an invariant Casimir function in coadjoint representation, characterized by Poisson cohomology. Souriau proved that we can associate a symplectic manifold to the coadjoint orbits of a Lie group by the KKS 2-form (Kirillov, Kostant, Souriau 2-form) in the affine case (affine model of coadjoint operator equivariance via Souriau's cocycle) [5], which we have identified with the Koszul-Fisher metric from Information Geometry. Souriau established the generalized Gibbs density covariant under the action of the Lie group. The dual space of the Lie algebra foliates into coadjoint orbits that are also the Entropy level sets, which could be interpreted in the framework of Thermodynamics by the fact that dynamics on these symplectic leaves are non-dissipative, whereas transversal dynamics, given by the Poisson transverse structure, are dissipative. We will finally introduce the Gaussian distribution on the space of Symmetric Positive Definite (SPD) matrices, through Souriau's covariant Gibbs density, by considering this space as the pure imaginary axis of the homogeneous Siegel upper half space where Sp(2n,R)/U(n) acts transitively. We will also consider the Gibbs density for the Siegel Disk, where SU(n,n)/S(U(n)xU(n)) acts transitively. The Gaussian density of SPD matrices is then computed through Souriau's moment map and coadjoint orbits. Souriau's Lie Groups Thermodynamics model will be further explored in the European COST network CaLISTA [6] and the European HORIZON-MSCA project CaLIGOLA [7].
3:30 PM
Victor Patrangenaru - Geometry, Topology and Statistics on Object Spaces
3:30 PM - 4:30 PM
Thursday, October 13, 2022
9:00 AM
Nicolas Charon - Registration of shape graphs with partial matching constraints
9:00 AM - 10:00 AM
This talk will discuss an extension of the elastic curve registration framework to a general class of geometric objects which we call (weighted) shape graphs, allowing in particular the comparison and matching of 1D geometric data that are partially observed or that exhibit certain topological inconsistencies. Specifically, we generalize the class of second-order invariant Sobolev metrics on the space of unparametrized curves to weighted shape graphs by modelling such objects as varifolds (i.e. directional measures) and combining geometric deformations with a transformation process on the varifold weights. This leads us to introduce a new class of variational problems, show the existence of solutions and derive a specific numerical scheme to tackle the corresponding discrete optimization problems.
10:00 AM
Irène Kaltenmark - Curves and surfaces. Partial matching in the space of varifolds.
10:00 AM - 11:00 AM
The matching of analogous shapes is a central problem in computational anatomy. However, inter-individual variability, pathological anomalies or acquisition methods sometimes challenge the assumption of global homology between shapes. In this talk, I will present an asymmetric data attachment term characterizing the inclusion of one shape in another. This term is based on projection on the nearest neighbor with respect to the metrics of varifold spaces. Varifolds are representations of geometric objects, including curves and surfaces. Their specificity is to take into account the tangent spaces of these objects and to be robust to the choice of parametrization. This new data attachment term extends the scope of application of the pre-existing methods of matching by large diffeomorphic deformations (LDDMM). The partial registration is indeed induced by a diffeomorphic deformation of the source shape. The anatomical (topological) characteristics of this shape are thus preserved. This is a joint work with Pierre-Louis Antonsanti and Joan Glaunès.
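As a toy, Euclidean stand-in for such an asymmetric term (a Chamfer-style one-directional distance on point clouds, not the varifold-based term of the talk), one can average, over the source points, the distance to the nearest target point:

```python
import numpy as np

def inclusion_score(source, target):
    """One-directional mean nearest-neighbor distance: small when every source
    point lies close to the target shape, regardless of whether the target is
    fully covered. Asymmetric by construction."""
    d = np.linalg.norm(source[:, None] - target[None], axis=2)
    return d.min(axis=1).mean()

# A half-circle is "included" in the full circle, but not conversely.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
half = circle[:100]
shifted_half = half + np.array([5.0, 0.0])

print(inclusion_score(half, circle) < 1e-6)   # True: half lies on the circle
print(inclusion_score(half, circle) < inclusion_score(circle, half))  # True: asymmetric
print(inclusion_score(shifted_half, circle) > 3.0)  # True: far-away source scores high
```

The term in the talk plays an analogous role, but the projection is taken with respect to varifold metrics, which account for tangent spaces and are parametrization-invariant.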
11:30 AM
Herbert Edelsbrunner - Chromatic Delaunay mosaics for chromatic point data.
11:30 AM - 12:30 PM
The chromatic Delaunay mosaic of s+1 finite sets in d dimensions is an (s+d)-dimensional Delaunay mosaic that represents the individual sets as well as their interactions. For example, it contains a (non-standard) dual of the overlay of the Voronoi tessellations of any subset of the s+1 colors. We prove bounds on the size of the chromatic Delaunay mosaic, in the worst and average case, and suggest how to use image, kernel, and cokernel persistence to get stable diagrams describing the interaction of the points of different colors. Acknowledgements. This is incomplete and ongoing joint work with Ranita Biswas, Sebastiano Cultrera, Ondrej Draganov, and Morteza Saghafian, all at IST Austria.
2:30 PM
Claire Brecheteau - Approximating data with a union of ellipsoids and clustering.
2:30 PM - 3:30 PM
I will introduce a surrogate for the distance function to the support of a distribution, whose sublevel sets are unions of balls or of ellipsoids. I will present several results, including rates of convergence for the approximation of these surrogates by their empirical versions, built from point clouds. I will explain how to use such estimators to cluster data with a geometric structure. Some of these results have been published in the papers [1,2]; this work is still in progress.
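A crude caricature of the idea (not the estimator from [1,2]): fit k centers to the point cloud with plain Lloyd iterations and use the distance to the nearest center as the surrogate; its sublevel sets are then unions of k balls:

```python
import numpy as np

rng = np.random.default_rng(0)

# Point cloud sampled near two clusters (a crude stand-in for a structured support).
cloud = np.vstack([rng.normal([0, 0], 0.1, (50, 2)),
                   rng.normal([3, 0], 0.1, (50, 2))])

def fit_centers(points, k, iters=20):
    """Plain Lloyd (k-means) iterations: k centers whose balls cover the support."""
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(points[:, None] - centers[None], axis=2),
                           axis=1)
        centers = np.array([points[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers

centers = fit_centers(cloud, k=2)

def surrogate_distance(x, centers):
    """Distance to the nearest center: sublevel sets are unions of k balls."""
    return np.min(np.linalg.norm(centers - x, axis=1))

# Points near the support score lower than points far from it.
print(surrogate_distance(np.array([0.0, 0.0]), centers)
      < surrogate_distance(np.array([10.0, 10.0]), centers))  # True
```

The estimators in the talk replace the balls by ellipsoids and come with convergence rates; clustering then follows by assigning points to the nearest center, much as in the last step above.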
3:30 PM
Dominique Attali - Reconstructing manifolds by weighted $\ell_1$-norm minimization
3:30 PM - 4:30 PM
In many practical situations, the shape of interest is only known through a finite set of data points. Given as input those data points, it is then natural to try to construct a triangulation of the shape, that is, a set of simplices whose union is homeomorphic to the shape. This problem has given rise to many research works in the computational geometry community, motivated by applications to 3D model reconstruction and manifold learning. In this talk, we focus on one particular instance of the shape reconstruction problem, in which the shape we wish to reconstruct is an orientable smooth $d$-manifold embedded in $\mathbb{R}^N$. We reformulate the problem of searching for a triangulation as a convex minimization problem, whose objective function is a weighted $\ell_1$-norm. I will then present the result in \cite{socg2022} which says that, under appropriate conditions, the solution of our minimization problem is indeed a triangulation of the manifold and that this triangulation coincides with a variant of the tangential Delaunay complex. This is a joint work with André Lieutier.
Friday, October 14, 2022
9:00 AM
Barbara Gris - Defining Data-Driven Deformation Models
9:00 AM - 10:00 AM
Studying shapes through large deformations allows one to define a metric on a space of shapes from a metric on a space of deformations. When the set of considered deformations is not relevant to the observed data, the geodesic paths for this metric can be deceiving from a modelling point of view. To overcome this issue, the notion of deformation module allows one to incorporate priors coming from the data into the set of considered deformations and the metric. I will present this framework, as well as the IMODAL library, which enables registration through such structured deformations. This Python library is modular: adapted priors can be easily defined by the user, several priors can be combined into a global one, and various types of data can be considered, such as curves, meshes or images. This is joint work with Benjamin Charlier, Leander Lacroix and Alain Trouvé.
10:00 AM
Laurent Younes - Stochastic Gradient Descent for Large-Scale LDDMM
10:00 AM - 11:00 AM
11:30 AM
Stephen Preston - Isometric immersions and the waving of flags
11:30 AM - 12:30 PM
A physical flag can be modeled geometrically as an isometric immersion of a rectangle into space, with one edge fixed along the flagpole. Its motion, in the absence of gravity and wind, can be modeled as a geodesic in the space of all isometric immersions, where the Riemannian metric is inherited from the kinetic energy on the much larger space of all immersions. In this talk I will show how, generically, such an isometric immersion can be described completely by the curve describing the top or bottom edge, which gives a global version of a classical local result in differential geometry. Using this, I will show how to derive the geodesic equation, which turns out to be a highly nonlinear, nonlocal coupled system of two wave equations in one space variable, with tension determined by solving an ODE system. The new model has the potential to describe the motion of cloth with far fewer variables than the traditional method of strongly constraining three functions of two space variables. This is joint work with Martin Bauer and Jakob Moeller-Andersen.
Saturday, October 15, 2022
Sunday, October 16, 2022
Monday, October 17, 2022
Tuesday, October 18, 2022
2:00 PM
Mikhail Belkin - Mathematical Aspects of Deep Learning (1/2)
2:00 PM - 3:30 PM
Room: Amphitheater Darboux
4:00 PM
Yusu Wang - Some Theoretical Aspects of Graph Neural Networks (and Higher Order Variants) (1/2)
4:00 PM - 5:30 PM
Room: Amphitheater Darboux
Wednesday, October 19, 2022
9:00 AM
Mikhail Belkin - Mathematical Aspects of Deep Learning (2/2)
9:00 AM - 10:30 AM
Room: Amphitheater Darboux
11:00 AM
Yusu Wang - Some Theoretical Aspects of Graph Neural Networks (and Higher Order Variants) (2/2)
11:00 AM - 12:30 PM
Room: Amphitheater Darboux
2:00 PM
Quentin Mérigot - Optimal Transport (2/8)
2:00 PM - 3:40 PM
Room: Amphitheater Darboux
4:00 PM
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (2/8)
4:00 PM - 5:40 PM
Room: Amphitheater Darboux
Thursday, October 20, 2022
9:00 AM
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (4/9)
9:00 AM - 10:40 AM
Room: Amphitheater Darboux
11:00 AM
Wolfgang Polonik - Statistical Topological Data Analysis (4/9)
11:00 AM - 12:40 PM
Room: Amphitheater Darboux
Friday, October 21, 2022
Saturday, October 22, 2022
Sunday, October 23, 2022
Monday, October 24, 2022
Tuesday, October 25, 2022
Wednesday, October 26, 2022
2:00 PM
Quentin Mérigot - Optimal Transport (3/8)
2:00 PM - 3:40 PM
Room: Amphitheater Darboux
4:00 PM
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (3/8)
4:00 PM - 5:40 PM
Sessions of October 26th and November 2nd on Zoom only
Thursday, October 27, 2022
9:00 AM
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (5/9)
9:00 AM - 10:40 AM
Room: Amphitheater Darboux
11:00 AM
Wolfgang Polonik - Statistical Topological Data Analysis (5/9)
11:00 AM - 12:40 PM
Room: Amphitheater Darboux
Friday, October 28, 2022
Saturday, October 29, 2022
Sunday, October 30, 2022
Monday, October 31, 2022
Tuesday, November 1, 2022
Wednesday, November 2, 2022
2:00 PM
Quentin Mérigot - Optimal Transport (4/8)
2:00 PM - 3:40 PM
Room: Amphitheater Darboux
4:00 PM
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (4/8)
4:00 PM - 5:40 PM
Sessions of October 26th and November 2nd on Zoom only
Thursday, November 3, 2022
9:00 AM
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (6/9)
9:00 AM - 10:40 AM
Room: Amphitheater Darboux
11:00 AM
Wolfgang Polonik - Statistical Topological Data Analysis (6/9)
11:00 AM - 12:40 PM
Room: Amphitheater Darboux
Friday, November 4, 2022
Saturday, November 5, 2022
Sunday, November 6, 2022
Monday, November 7, 2022
Tuesday, November 8, 2022
9:00 AM
Welcome
9:00 AM - 9:15 AM
Room: Amphitheater Darboux
9:15 AM
Introduction to the day (AMIES & Gesda semester)
9:15 AM - 9:30 AM
9:30 AM
Baptiste Labarthe (Metafora) - Topological data analysis: promising applications to cytometry data and medical diagnosis
9:30 AM - 10:00 AM
10:00 AM
Klervi Le Gall (UN) - Principal component analysis on the space of functions valued in the manifold of 3-dimensional rotations: application to the assessment of ambulatory impairment in multiple sclerosis patients
10:00 AM - 10:25 AM
10:25 AM
Rémi Perrichon (ENAC) - Statistics and geometry for the analysis of aircraft trajectories
10:25 AM - 10:50 AM
10:50 AM
Break
10:50 AM - 11:15 AM
11:15 AM
Round table with Frédéric Barbaresco (Thales), Nicolas Bousquet (EDF) and Stéphanie Allassonnière (Univ. Paris Cité)
11:15 AM - 12:30 PM
12:30 PM
Buffet lunch
12:30 PM - 2:00 PM
2:00 PM
Poster session
2:00 PM - 3:00 PM
3:00 PM
Joint presentation of the GeomStats, Gudhi and TTK libraries and platforms
3:00 PM - 5:00 PM
5:15 PM
GeomStats workshop
5:15 PM - 6:00 PM
Gudhi workshop
5:15 PM - 6:00 PM
TTK workshop
5:15 PM - 6:00 PM
Wednesday, November 9, 2022
2:00 PM
Quentin Mérigot - Optimal Transport (5/8)
2:00 PM - 3:40 PM
Room: Amphitheater Darboux
4:00 PM
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (5/8)
4:00 PM - 5:40 PM
Room: Amphitheater Darboux
Thursday, November 10, 2022
9:00 AM
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (7/9)
9:00 AM - 10:40 AM
Room: Amphitheater Darboux
11:00 AM
Wolfgang Polonik - Statistical Topological Data Analysis (7/9)
11:00 AM - 12:40 PM
Room: Amphitheater Darboux
Friday, November 11, 2022
Saturday, November 12, 2022
Sunday, November 13, 2022
Monday, November 14, 2022
Tuesday, November 15, 2022
2:30 PM
Kathryn Hess - Topological Approaches to Neuroscience (1/2)
2:30 PM - 4:00 PM
Room: Amphitheater Darboux
Wednesday, November 16, 2022
10:00 AM
Kathryn Hess - Topological Approaches to Neuroscience (2/2)
10:00 AM - 11:30 AM
Room: Amphitheater Darboux
2:00 PM
Quentin Mérigot - Optimal Transport (6/8)
2:00 PM - 3:40 PM
Room: Amphitheater Darboux
4:00 PM
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (6/8)
4:00 PM - 5:40 PM
Room: Amphitheater Darboux
Thursday, November 17, 2022
9:00 AM
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (8/9)
9:00 AM - 10:40 AM
Room: Amphitheater Darboux
11:00 AM
Wolfgang Polonik - Statistical Topological Data Analysis (8/9)
11:00 AM - 12:40 PM
Room: Amphitheater Darboux
Friday, November 18, 2022
Saturday, November 19, 2022
Sunday, November 20, 2022
Monday, November 21, 2022
1:00 PM
Registration
1:00 PM - 2:00 PM
2:00 PM
Arthur Gretton
2:00 PM - 3:00 PM
KALE Flow: A Relaxed KL Gradient Flow for Probabilities with Disjoint Support
3:00 PM
Blanche Buet
3:00 PM - 4:00 PM
A varifold perspective on discrete surfaces
4:30 PM
Christophe Ley
4:30 PM - 5:30 PM
Advances in statistics via tools from Stein's Method
Tuesday, November 22, 2022
10:00 AM
Théo Lacombe
10:00 AM - 11:00 AM
TBA
11:30 AM
Bodhisattva Sen
11:30 AM - 12:30 PM
Distribution-free testing for Multivariate symmetry using Optimal Transport
2:30 PM
Johan Segers
2:30 PM - 3:30 PM
Graphical and uniform consistency of estimated optimal transport plans
3:30 PM
Gabriel Peyré
3:30 PM - 4:30 PM
Unbalanced Optimal Transport across Metric Measured Spaces
5:00 PM
Quentin Berthet
5:00 PM - 6:00 PM
Mirror Sinkhorn: Fast Online Optimization on Transport Polytopes
6:00 PM
Cocktail
6:00 PM - 8:00 PM
Wednesday, November 23, 2022
10:00 AM
Quentin Paris
10:00 AM - 11:00 AM
Online learning with exponential weights in metric spaces with the measure contraction property
11:30 AM
Giovanni Peccati
11:30 AM - 12:30 PM
TBA
2:30 PM
Agnès Desolneux
2:30 PM - 3:30 PM
A Wasserstein-type distance in the space of Gaussian mixture models
3:30 PM
Nicolas Courty
3:30 PM - 4:30 PM
Sliced Wasserstein on Manifolds: Spherical and Hyperbolic Cases
Thursday, November 24, 2022
1:00 PM
Jérôme Dedecker
1:00 PM - 2:00 PM
Some bounds for the Wasserstein distance between the empirical measure and the marginal distribution of a sequence of i.i.d. random variables
2:00 PM
Bharath Sriperumbudur
2:00 PM - 3:00 PM
Spectral regularized kernel two-sample tests
3:30 PM
Jean Feydy
3:30 PM - 4:30 PM
Computational Optimal Transport: mature tools and open problems
4:30 PM
Thibaut Le Gouic
4:30 PM - 5:30 PM
An Algorithmic Solution to the Blotto Game using Multi-marginal Couplings
Friday, November 25, 2022
9:00 AM
François-Xavier Vialard
9:00 AM - 10:00 AM
Statistical estimation of optimal transport potentials
10:00 AM
Olga Mula
10:00 AM - 11:00 AM
Structured prediction with sparse Wasserstein barycenters
11:30 AM
Elsa Cazelles
11:30 AM - 12:30 PM
Barycenters for probability distributions based on optimal weak mass transport
Saturday, November 26, 2022
Sunday, November 27, 2022
Monday, November 28, 2022
Tuesday, November 29, 2022
2:00 PM
Stephen Preston - Riemannian Geometry on Lie Groups (1/2)
2:00 PM - 4:00 PM
Room: Amphitheater Darboux
Wednesday, November 30, 2022
10:00 AM
Stephen Preston - Riemannian Geometry on Lie Groups (2/2)
10:00 AM - 12:00 PM
Room: Amphitheater Darboux
2:00 PM
Quentin Mérigot - Optimal Transport (7/8)
2:00 PM - 3:40 PM
Room: Amphitheater Darboux
4:00 PM
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (7/8)
4:00 PM - 5:40 PM
Room: Amphitheater Darboux
Thursday, December 1, 2022
9:00 AM
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (9/9)
9:00 AM - 10:40 AM
Room: Amphitheater Darboux
11:00 AM
Wolfgang Polonik - Statistical Topological Data Analysis (9/9)
11:00 AM - 12:40 PM
Room: Amphitheater Darboux
Friday, December 2, 2022
Saturday, December 3, 2022
Sunday, December 4, 2022
Monday, December 5, 2022
Tuesday, December 6, 2022
Wednesday, December 7, 2022
2:00 PM
Quentin Mérigot - Optimal Transport (8/8)
2:00 PM - 3:40 PM
Room: Amphitheater Darboux
4:00 PM
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (8/8)
4:00 PM - 5:40 PM
Room: Amphitheater Darboux
Thursday, December 8, 2022
Friday, December 9, 2022