# Geometry and Statistics in Data Sciences, Paris

### Amphitheater Darboux (IHP)

11, Rue Pierre et Marie Curie, 75005 Paris

## Geometry and Statistics in Data Sciences

Thematic quarter program at Institut Henri Poincaré, Paris

September 5th - December 9th, 2022

### Outline

The goal of this thematic quarter is to highlight the rich interactions between Statistics, Probability theory, Geometry and Topology in the field of the Mathematics of AI. It will allow young researchers, Master's students and PhD students to explore cross-disciplinary topics.

### Program

#### Long Courses

• Wednesdays 14:00 - 18:00 | Starting September 28th (Quentin Mérigot: Optimal Transport; Ery Arias-Castro & Eddie Aamari: Embedding for Data Analysis)
• Thursdays 9:00 - 13:00 | Starting September 15th (Eric Klassen: Geometry of Shape Spaces of Curves and Surfaces; Wolfgang Polonik: Statistical Topological Data Analysis)

#### Mini-Courses

• Wednesday, September 21st (AM & PM) | Joseph Yukich (Lehigh University): Asymptotic Analysis of Statistics of Random Geometric Structures
• Tuesday, September 27th (PM) & Wednesday, September 28th (AM) | Nicolas Charon (Johns Hopkins): A Few Applications of Geometric Measure Theory to Shape Analysis
• Tuesday, October 18th (PM) & Wednesday, October 19th (AM) | Mikhail Belkin (UC San Diego): Mathematical Aspects of Deep Learning
• Yusu Wang (UC San Diego): Some Theoretical Aspects of Graph Neural Networks (and Higher Order Variants)
• Tuesday, November 15th (PM) & Wednesday, November 16th (AM)
• Tuesday, November 29th (PM) & Wednesday, November 30th (AM)

### Invited Professors

### Registration

Registering for the quarter (at the bottom of this page) will keep you informed about the various events during the semester, in particular the courses (schedules, Zoom links, etc.). For logistical reasons, registration for the quarter does not automatically register you for the workshops, so please also register separately for each workshop you wish to attend.

### Scientific Committee

• Charles Bouveyron (Université Côte d’Azur)
• Marco Cuturi (Google Brain, ENSAE)
• Gabor Lugosi (Pompeu Fabra University)
• Pascal Massart (Université Paris-Saclay)
• Mathilde Mougeot (Centre Borelli, ENS Paris-Saclay, ENSIIE)
• Xavier Pennec (Inria)
• Sara Van de Geer (ETH Zürich)

### Organizing Committee

• Eddie Aamari (LPSM, CNRS)
• Catherine Aaron (LMBP, Université Clermont Auvergne)
• Frédéric Chazal (LMO, INRIA)
• Aurélie Fischer (LPSM, Université de Paris)
• Marc Hoffmann (CEREMADE, Paris Dauphine)
• Alice Le Brigant (SAMM, Paris 1 Panthéon Sorbonne)
• Clément Levrard (LPSM, Université de Paris)
• Bertrand Michel (LMJL, Ecole Centrale Nantes)

### Support

Registration
Pre-registration for the trimester
### Participants
• Aayush Gautam
• Alessandro Leite
• Alessandro Maria Selvitella
• Alex Buchel
• Alex Delalande
• Alexander Cloninger
• Alexandre Guérin
• Alexandre Miot
• Alexis Bismuth
• Alice Le Brigant
• Amin Mohebbi
• Anafi Jafar
• Andreas Bock
• Andrew Warren
• Andrzej Biś
• Anne Zhao
• Anthony NOUY
• Antonio Ocello
• Anurag Sharma
• Arianna Di Bernardo
• Arianna Salili-James
• Arthur Stéphanovitch
• Aurélie FISCHER
• Aymeric Stamm
• Ayodele Abimbola Idowu
• Baalu Belay KETEMA
• Benjamin Gess
• Benjamin Wandelt
• Benoit Oriol
• Bernard JULIA
• Bernhard Eisvogel
• Bertrand Michel
• Binh Nguyen
• Blanche BUET
• CARLOS ROMERO
• Castro G. HOUNMENOU
• catherine aaron
• Catherine Labruère
• Charles Arnal
• Charles Marchetti
• Charly Andral
• Charly Boricaud
• Christian Houdré
• Christian NGNIE
• Christophe Ley
• Christophe pichon
• Claire Brécheteau
• Clement Levrard
• Cécile Durot
• D Yogeshwaran
• DAMARIS KILANGO
• Damien Garreau
• David Donoho
• Davide Gurnari
• Decock Jérémie
• Diego Varela
• Dominic Dayta
• Dr. Rafael Stekolshchik
• Eddie Aamari
• Elaheh Akbarifathkouhi
• ELENA BORTOLATO
• Elodie Maignant
• Eloi Tanguy
• Emin Durmishi
• Emmanuel Caron
• Emmanuel Hartman
• Eric Klassen
• erika pellegrino
• Etienne Lasalle
• Eunseong Bae
• Fanyu Cui
• Farida Enikeeva
• Fatna ABDEDOU
• Felipe Tobar
• Ferdinand Le Coz
• Florence DA SILVA
• François Petit
• Frederic BARBARESCO
• Frédéric Chazal
• gael kermarrec
• Gauthier Thurin
• Gianni Franchi
• Gilles Blanchard
• Gilles Mordant
• Guillaume Braun
• Guillaume Serieys
• Henrique Goulart
• HORTENCE PHALONNE YIEPNOU NANA
• Houri Ziaeepour
• Houssam Boukhecham
• Hugo Chardon
• Hugo Henneuse
• Imane Rezgui
• Iuri Macocco
• Ixandra Achitouv
• Jacob Bamberger
• Jacob Leygonie
• Jafar Mohammed
• JEAN BAPTISTE PATENOU
• Jean Bernard LASSERRE
• Jean-Jacques GODEME
• Jean-Michel Alimi
• JESUS DE LOERA
• Jianyu MA
• Jisu Kim
• Joan Alexis Glaunès
• Joaquin Diaz-Alonso
• Joel Right Dzokou Talla
• Johan Segers
• John Harvey
• Jose Martin Mijangos Tovar
• Joseph Yukich
• José Gregorio GOMEZ-GARCIA
• Juhyun Park
• Jules Tsukahara
• Julie Mordacq
• Karim Haroun
• Karl John
• Kathryn Hess Bellwald
• Kevin Basita
• Kexin SHAO
• Kimsy Tor
• Koissi Savi
• Komi Afassinou
• Kristóf Huszár
• Lamia Aoudia
• Lamperti Letizia
• laure ferraris
• Lorenzo Audibert
• Luis Carvalho
• Luis Felipe Vargas Rojas
• Magalie Fromont
• Marc Glisse
• marc lambert
• Mariem Abaach
• martin lesourd
• María José Llop
• Mathilde MOUGEOT
• Matteo Pegoraro
• Maxence Noble
• Maxime Guillaud
• Michel Delfour
• Milton Wong
• Mircea Petrache
• Mohamed Ndaoud
• Mohammed Bajja
• Mohan Yang
• nabil MUSTAFA
• Nathan De Carvalho
• Nicolas Berkouk
• Nicolas Charon
• Nicolas Chenavier
• Nicolas CONANEC
• Nicolas Elias Igolnikov
• Nicolas Schreuder
• Nikita Malik
• Niklas Hellmer
• Nimesh Agrawal
• Nimrah Mustafa
• Olga Mula
• Olympio Hacquard
• Omar Rivasplata
• OPEOLUWA OGUNDIPE
• Oskar Laverny
• Ottavio KHALIFA
• Palle Jorgensen
• Parin Chaipunya
• Pascal SUNGU NGOY
• Paul Kagori
• Paul Pegon
• Paulin Tshiunza Tshibuabua
• Paulo Dawid
• pawan kumar
• Perrine Chassat
• Peter Whalley
• Pierre Marion
• pierre orhan
• Pierre Pansu
• Pulkit Gopalani
• Qiang Du
• Qingsong Wang
• Quang-Duc DAO
• Quentin Mérigot
• QUOC-TUNG LE
• Raghav Dev
• Raphael Barboni
• Raphaël Romero
• Rien van de Weygaert
• Romain De Angeli
• Romain Périer
• Ryan Cotsakis
• Samuel Asante Gyamerah
• Sandeep Kumar
• Sandipan Bhattacherjee
• Sandy Frank Kwamou Ngaha
• Selim Soufargi
• Shamoona Jabeen
• Shanqing LIU
• Simon Prunet
• Simone Azeglio
• Smegnsh Yeruk
• Sofiia Minasian
• Sohom Mukherjee
• Sothea Has
• Stephen Preston
• Sylvie Lhermitte
• Tanya Schmah
• Theotime Kures
• Thomas Pierron
• Thomas Wang
• Théo Bertrand
• Théo Dumont
• Théo Lacombe
• Tom Szwagier
• Umut Simsekli
• URIEL NGUEFACK YEFOU
• Vahideh Vahidifar
• Valentina Ros
• Vanitha Mysore Krishna
• VEERENDRA KUMAR
• Victor-Emmanuel Brunel
• Wojciech Reise
• Wolfgang Polonik
• Xiaoyu (Victor) Wang
• Yang Qi
• Yannick KERGOSIEN
• Yikun Zhang
• Yuxiu Shao
### Contact
• Monday, September 5
• High Dimensional Statistics (CARGESE Pre-school)
• Geometric Statistics (CARGESE Pre-school)
• Topological Data Analysis (CARGESE Pre-school)
• Tuesday, September 6
• High Dimensional Statistics (CARGESE Pre-school)
• Geometric Statistics (CARGESE Pre-school)
• Topological Data Analysis (CARGESE Pre-school)
• Wednesday, September 7
• High Dimensional Statistics (CARGESE Pre-school)
• Geometric Statistics (CARGESE Pre-school)
• Topological Data Analysis (CARGESE Pre-school)
• Thursday, September 8
• High Dimensional Statistics (CARGESE Pre-school)
• Geometric Statistics (CARGESE Pre-school)
• Topological Data Analysis (CARGESE Pre-school)
• Topological Data Analysis (CARGESE Pre-school)
• Friday, September 9
• High Dimensional Statistics (CARGESE Pre-school)
• Geometric Statistics (CARGESE Pre-school)

• Wednesday, September 14
• Welcome coffee (2nd floor)
• Thursday, September 15
• Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (1/9)
• Wolfgang Polonik - Statistical Topological Data Analysis (1/9) | Amphitheater Darboux
• Wednesday, September 21
• Joseph Yukich - Asymptotic Analysis of Statistics of Random Geometric Structures (1/2) | Amphitheater Darboux
• Joseph Yukich - Asymptotic Analysis of Statistics of Random Geometric Structures (2/2) | Amphitheater Darboux
• Thursday, September 22
• Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (2/9) | Amphitheater Darboux
• Wolfgang Polonik - Statistical Topological Data Analysis (2/9) | Amphitheater Darboux
• Tuesday, September 27
• Nicolas Charon - A Few Applications of Geometric Measure Theory to Shape Analysis (1/2) | Amphitheater Darboux
• Wednesday, September 28
• Nicolas Charon - A Few Applications of Geometric Measure Theory to Shape Analysis (2/2) | Amphitheater Darboux
• Quentin Mérigot - Optimal Transport (1/8) | Amphitheater Darboux
• Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (1/8) | Amphitheater Darboux
• Thursday, September 29
• Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (3/9) | Amphitheater Darboux
• Wolfgang Polonik - Statistical Topological Data Analysis (3/9) | Amphitheater Darboux
• Monday, October 3
• Introduction | Amphitheater Hermite, IHP

• Sophie Langer - Overcoming the curse of dimensionality with deep neural networks | Amphitheater Hermite, IHP

Although the application of deep neural networks to real-world problems has become ubiquitous, the question of why they are so effective has not yet been satisfactorily answered. However, some progress has been made in establishing an underlying mathematical foundation. This talk surveys results on statistical risk bounds of deep neural networks. In particular, we focus on the question of when neural networks bypass the curse of dimensionality. Here we discuss results for vanilla feedforward and convolutional neural networks as well as regression and classification settings.

• Adeline Fermanian - Scaling ResNets in the Large-depth Regime | Amphitheater Hermite, IHP

Deep ResNets are recognized for achieving state-of-the-art results in complex machine learning tasks. However, the remarkable performance of these architectures relies on a training procedure that needs to be carefully crafted to avoid vanishing or exploding gradients, particularly as the depth $L$ increases. No consensus has been reached on how to mitigate this issue, although a widely discussed strategy consists in scaling the output of each layer by a factor $\alpha_L$. We show in a probabilistic setting that with standard i.i.d. initializations, the only non-trivial dynamics occur for $\alpha_L = 1/\sqrt{L}$ (other choices lead either to explosion or to an identity mapping). This scaling factor corresponds in the continuous-time limit to a neural stochastic differential equation, contrary to a widespread interpretation that deep ResNets are discretizations of neural ordinary differential equations. By contrast, in the latter regime, stability is obtained with specific correlated initializations and $\alpha_L = 1/L$. Our analysis suggests a strong interplay between the scaling and the regularity of the weights as a function of the layer index. Finally, in a series of experiments, we exhibit a continuous range of regimes driven by these two parameters, which jointly impact performance before and after training.
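The three regimes can be checked numerically with a toy linear ResNet (a minimal sketch under illustrative assumptions: linear residual blocks, width 64, i.i.d. Gaussian weights of variance $1/d$; not the talk's exact setting):

```python
import numpy as np

def resnet_forward(x, L, alpha, d=64, rng=None):
    """Toy linear ResNet: h <- h + alpha * W h, with i.i.d. weights
    W ~ N(0, 1/d) entrywise, so that E||W h||^2 = ||h||^2."""
    if rng is None:
        rng = np.random.default_rng(0)
    h = x.copy()
    for _ in range(L):
        W = rng.normal(0.0, 1.0 / np.sqrt(d), size=(d, d))
        h = h + alpha * W @ h
    return h

d, L = 64, 1000
x = np.ones(d) / np.sqrt(d)  # unit-norm input
for alpha in (1.0, 1.0 / L, 1.0 / np.sqrt(L)):
    out = resnet_forward(x, L, alpha, d=d, rng=np.random.default_rng(0))
    print(f"alpha={alpha:.5f}  output norm={np.linalg.norm(out):.3e}")
```

With $\alpha_L = 1$ the output norm blows up, with $\alpha_L = 1/L$ the network remains essentially the identity, and only $\alpha_L = 1/\sqrt{L}$ produces a non-trivial O(1) change, matching the trichotomy in the abstract.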

• Mikhail Belkin - Neural networks, wide and deep, singular kernels and Bayes optimality | Amphitheater Hermite, IHP

Wide and deep neural networks are used in many important practical settings. In this talk I will discuss some aspects of width and depth related to optimization and generalization. I will first discuss what happens when neural networks become infinitely wide, giving a general result for the transition to linearity (i.e., showing that neural networks become linear functions of their parameters) for a broad class of wide neural networks corresponding to directed graphs. I will then proceed to the question of depth, showing an equivalence between infinitely wide and deep fully connected networks trained with gradient descent and Nadaraya-Watson predictors based on certain singular kernels. Using this connection, we show that for certain activation functions these wide and deep networks are (asymptotically) optimal for classification but, interestingly, never for regression.
Based on joint work with Chaoyue Liu, Adit Radhakrishnan, Caroline Uhler and Libin Zhu.
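The interpolating behavior of Nadaraya-Watson predictors with singular kernels, which underlies the equivalence mentioned above, can be illustrated in one dimension (a hypothetical sketch with the kernel $K(u) = |u|^{-a}$, not the talk's exact construction):

```python
import numpy as np

def nw_singular(x_train, y_train, x_query, a=2.0, eps=1e-12):
    """Nadaraya-Watson regression with the singular kernel K(u) = |u|^(-a).
    K diverges at 0, so the weight of an exact training point dominates and
    the estimator interpolates the training labels."""
    d = np.abs(np.asarray(x_query)[:, None] - np.asarray(x_train)[None, :])
    w = np.maximum(d, eps) ** (-a)  # cap the singularity to keep weights finite
    w = w / w.sum(axis=1, keepdims=True)
    return w @ np.asarray(y_train)

x_train = np.linspace(0.0, 1.0, 11)
y_train = np.sin(2 * np.pi * x_train)
pred_at_train = nw_singular(x_train, y_train, x_train)  # reproduces the labels
```

Between training points the predictor is a smooth weighted average, yet at the data points it fits the labels exactly: interpolation without a fitted parameter in sight.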

• Tuesday, October 4
• Clément Berenfeld - Understanding the geometry of high-dimensional data through the reach | Amphitheater Hermite, IHP

In high-dimensional statistics, and more particularly in manifold learning, the reach is a ubiquitous regularity parameter that quantifies how well-behaved the support of the underlying probability measure is. Enforcing a reach constraint is, in most geometric inference tasks, a necessity, which raises the question of the estimability of this parameter.
We will try to understand how the reach relates to many other important geometric invariants, and propose an estimation strategy that relies on estimating the intrinsic metric of the data.
(Joint work with Eddie Aamari and Clément Levrard)

• Wolfgang Polonik - Topologically penalized regression on manifolds | Amphitheater Hermite, IHP

We study a regression problem on a compact manifold. In order to take advantage of the underlying geometry and topology of the data, we propose to perform the regression task on the basis of eigenfunctions of the Laplace-Beltrami operator of the manifold, regularized with topological penalties. We will discuss the approach and the penalties, provide some supporting theory, and illustrate the performance of the methodology on some data sets, demonstrating the relevance of our approach in the case where the target function is "topologically smooth". This is joint work with O. Hacquard, K. Balasubramanian, G. Blanchard and C. Levrard.

• John Harlim - Leveraging the RBF operator estimation for manifold learning | Amphitheater Hermite, IHP

I will discuss the radial-basis-function pointwise and weak formulations for approximating Laplacians on functions and vector fields from randomly sampled point cloud data, whose spectral properties are relevant to manifold learning. For the pointwise formulation, I will demonstrate the importance of a novel local tangent estimation that accounts for curvature, which crucially improves the quality of the operator estimation. I will report spectral-theoretical convergence results for these formulations and their strengths and weaknesses in practice. Supporting numerical examples, involving the spectral estimation of the Laplace-Beltrami operator and various vector Laplacians such as the Bochner, Hodge, and Lichnerowicz Laplacians, will be presented with appropriate comparisons to standard graph-based approaches.

• Marina Meila - Manifold Learning, Explanations and Eigenflows | Amphitheater Hermite, IHP

This talk will extend Manifold Learning in two directions.
First, we ask whether it is possible, in the case of scientific data where quantitative prior knowledge is abundant, to explain a data manifold by new coordinates chosen from a set of scientifically meaningful functions.
Second, we ask how popular Manifold Learning tools and their applications can be recreated in the space of vector fields and flows on a manifold.
Central to this approach is the order-1 Laplacian of a manifold, $\Delta_1$, whose eigen-decomposition into gradient, harmonic, and curl components, known as the Helmholtz-Hodge Decomposition, provides a basis for all vector fields on a manifold. We present an estimator for $\Delta_1$, and based on it we develop a variety of applications. Among them: visualization of the principal harmonic, gradient, or curl flows on a manifold; smoothing and semi-supervised learning of vector fields; and 1-Laplacian regularization. In topological data analysis, we describe the 1st-order analogue of spectral clustering, which amounts to prime manifold decomposition. Furthermore, from this decomposition a new algorithm for finding shortest independent loops follows. The algorithms are illustrated on a variety of real data sets.
Joint work with Yu-Chia Chen, Samson Koelle, Hanyu Zhang and Ioannis Kevrekidis

• Franck Picard - A Probabilistic Graph Coupling View of Dimension Reduction | Amphitheater Hermite, IHP

Dimension reduction is a standard task in machine learning, used to reduce the complexity of the data at hand and to represent it. Many (and more than many!) methods have been proposed for this purpose, among which the seminal principal component analysis (PCA), which approximates the data linearly with a reduced number of axes. In recent years, the field has witnessed the emergence of new nonlinear methods, like the Stochastic Neighbor Embedding method (SNE) and the Uniform Manifold Approximation and Projection method (UMAP), which produce very efficient low-dimensional representations of the observations. Though widely used, these approaches lack clear probabilistic foundations that would enable a full understanding of their properties and limitations. A common feature of these techniques is that they minimize a cost between input and latent pairwise similarities, but the generative model is still missing. In this work we introduce a unifying statistical framework based on the coupling of hidden graphs using cross entropy. These graphs induce a Markov random field dependency structure among the observations in both input and latent spaces. We show that existing pairwise similarity dimension reduction methods can be retrieved from our framework with particular choices of priors for the graphs. Moreover, this reveals that these methods suffer from a statistical deficiency that explains their poor performance in conserving coarse-grain dependencies. Our model is leveraged and extended to address this issue, while new links are drawn with Laplacian eigenmaps and PCA.

• Alexander Cloninger - Learning on and near Low-Dimensional Subsets of the Wasserstein Manifold | Amphitheater Hermite, IHP

Detecting differences and building classifiers between distributions $\{\mu_i\}_{i=1}^N$, given only finite samples, are important tasks in a number of scientific fields. Optimal transport (OT) has evolved as the most natural concept to measure the distance between distributions, and has gained significant importance in machine learning in recent years. There are some drawbacks to OT: computing OT can be slow, and because OT is a distance metric, it only yields a pairwise distance matrix between distributions rather than embedding those distributions into a vector space. If we make no assumptions on the family of distributions, these drawbacks are difficult to overcome. However, in the case that the measures are generated by push-forwards by elementary transformations, forming a low-dimensional submanifold of the Wasserstein manifold, we can deal with both of these issues on a theoretical and a computational level. In this talk, we'll show how to embed the space of distributions into a Hilbert space via linearized optimal transport (LOT), and how linear techniques can be used to classify different families of distributions generated by elementary transformations and perturbations. The proposed framework significantly reduces both the computational effort and the required training data in supervised settings. Similarly, we'll demonstrate the ability to learn a near isometric embedding of the low-dimensional submanifold. Finally, we'll provide non-asymptotic bounds on the error induced in both the supervised and unsupervised algorithms from finitely sampling the target distributions and projecting the LOT Hilbert space into a finite dimensional subspace. We demonstrate the algorithms in pattern recognition tasks in imaging and provide some medical applications.
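In one dimension the LOT embedding has a simple closed form that conveys the idea: represent each distribution by its quantile function on a fixed grid, so that Euclidean distance between embeddings matches the 2-Wasserstein distance (a minimal sketch; the talk's setting covers general families of push-forward measures):

```python
import numpy as np

def lot_embed(samples, n_quantiles=200):
    """Represent a 1-D sample set by its empirical quantile function on a
    fixed grid. For 1-D measures this embedding is isometric: the L2 distance
    between quantile functions equals the 2-Wasserstein distance."""
    qs = (np.arange(n_quantiles) + 0.5) / n_quantiles
    return np.quantile(samples, qs)

def lot_distance(e1, e2):
    """Euclidean (L2) distance between two embeddings."""
    return np.sqrt(np.mean((e1 - e2) ** 2))

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=5000)  # samples from N(0, 1)
b = rng.normal(2.0, 1.0, size=5000)  # samples from N(2, 1); true W2 = 2
w2_hat = lot_distance(lot_embed(a), lot_embed(b))
```

Once each distribution lives in this common vector space, standard linear classifiers and dimension-reduction tools apply directly, which is the computational point made in the abstract.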

• Cocktail
• Wednesday, October 5
• Claudia Strauch - On high-dimensional Lévy-driven Ornstein–Uhlenbeck processes | Amphitheater Hermite, IHP

We investigate the problem of estimating the drift parameter of a high-dimensional Lévy-driven Ornstein–Uhlenbeck process under sparsity constraints. It is shown that both Lasso and Slope estimators achieve the minimax optimal rate of convergence (up to numerical constants), for tuning parameters chosen independently of the confidence level. The results are non-asymptotic and hold both in probability and conditional expectation with respect to an event resembling the restricted eigenvalue condition.
Based on joint work with Niklas Dexheimer.

• Botond Szabo - Linear methods for nonlinear inverse problems | Amphitheater Hermite, IHP

We consider recovering an unknown function f from a noisy observation of the solution u to a partial differential equation, where for the elliptic differential operator L, the map L(u) can be written as a function of u and f, under Dirichlet boundary condition. A particular example is the time-independent Schrödinger equation. We transform this problem into the linear inverse problem of recovering L(u), and show that Bayesian methods for this problem may yield optimal recovery rates not only for u, but also for f. The prior distribution may be placed on u or its elliptic operator. Adaptive priors are shown to yield adaptive contraction rates for f, thus eliminating the need to know the smoothness of this function. Known results on uncertainty quantification for the linear problem transfer to f as well. The results are illustrated by several numerical simulations.
This is a joint work with Geerten Koers and Aad van der Vaart.

• Judith Rousseau - Bayesian nonparametric estimation of a density living near an unknown manifold | Amphitheater Hermite, IHP

In high dimensions it is common to assume that the data have a lower-dimensional structure. In this work we consider i.i.d. observations whose distribution has support concentrated near a lower-dimensional manifold. Neither the manifold nor the density is known. A typical example is noisy observations on an unknown low-dimensional manifold.
We consider a family of Bayesian nonparametric density estimators based on location-scale Gaussian mixture priors and study the asymptotic properties of the posterior distribution. Our work shows in particular that non-conjugate location-scale Gaussian mixture models can adapt to complex geometries and spatially varying regularity. This talk will also review various aspects of Gaussian mixtures for density estimation.
Joint work with Clément Berenfeld (Dauphine) and Paul Rosa (Oxford)

• Thursday, October 6
• Denis Belomestny - Dimensionality reduction in reinforcement learning by randomisation | Amphitheater Hermite, IHP

In reinforcement learning an agent interacts with an environment, whose underlying mechanism is unknown, by sequentially taking actions, receiving rewards, and transitioning to the next state. With the goal of maximizing the expected sum of the collected rewards, the agent must carefully balance between exploring in order to gather more information about the environment and exploiting the current knowledge to collect the rewards. In this talk, we are interested in solving this exploration-exploitation dilemma by injecting noise into the agent’s decision-making process in such a way that the dependence of the regret on the dimension of state and action spaces is minimised. We also review some recent approaches towards dimension reduction in RL.

• Gilles Blanchard - Stein effect for estimating many vector means: a "blessing of dimensionality" phenomenon | Amphitheater Hermite, IHP

Consider the problem of jointly estimating the means of a large number of distributions in R^d using separate, independent data sets from each of them, a problem sometimes called "multi-task averaging".
We propose an improved estimator (compared to the naive empirical means of each data set) to exploit possible similarities between means, without any related information being known in advance. First, for each data set, similar or neighboring means are determined from the data by multiple testing. Then each naive estimator is shrunk towards the local average of its neighbors. We prove that this approach provides a reduction in mean squared error that can be significant when the (effective) dimensionality of the data is large, and when the unknown means exhibit structure such as clustering or concentration on a low-dimensional set. This is directly linked to the fact that the separation distance for testing is smaller than the estimation error in high dimension and generalizes the well-known James-Stein phenomenon. An application of this approach is the estimation of multiple kernel mean embeddings, which plays an important role in many modern applications.
(This is based on joint work with Hannah Marienwald and Jean-Baptiste Fermanian)
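A toy version of the neighbor-shrinkage idea can be sketched as follows (the threshold tau and the fixed shrinkage weight 0.5 are illustrative choices, not the paper's calibrated, test-based procedure):

```python
import numpy as np

def shrunk_means(datasets, tau, weight=0.5):
    """For each task, detect neighbor tasks whose empirical means lie within
    distance tau, then shrink the naive mean toward the local neighbor average."""
    naive = np.array([x.mean(axis=0) for x in datasets])
    out = np.empty_like(naive)
    for i in range(len(naive)):
        nbrs = np.linalg.norm(naive - naive[i], axis=1) <= tau
        out[i] = (1 - weight) * naive[i] + weight * naive[nbrs].mean(axis=0)
    return out

# ten tasks sharing the same (zero) true mean in dimension 50, 10 samples each
rng = np.random.default_rng(0)
d, n_tasks, n = 50, 10, 10
tasks = [rng.normal(size=(n, d)) for _ in range(n_tasks)]
naive = np.array([t.mean(axis=0) for t in tasks])
shrunk = shrunk_means(tasks, tau=6.0)
mse_naive = np.mean(naive ** 2)   # true means are zero
mse_shrunk = np.mean(shrunk ** 2)
```

When the tasks really do share a mean, the local average has much smaller variance than any single empirical mean, so the shrunk estimator cuts the mean squared error substantially, the "blessing of dimensionality" being that in high dimension the neighbor test is reliable.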

• Nicolas Verzelen - Optimal Permutation Estimation in Crowd-Sourcing Problems | Amphitheater Hermite, IHP

Motivated by crowd-sourcing applications, we consider a model where we have partial observations from a bivariate isotonic $n\times d$ matrix with an unknown permutation $\pi^*$ acting on its rows. We consider the twin problems of recovering the permutation $\pi^*$ and estimating the unknown matrix. We introduce a polynomial-time procedure achieving the minimax risk for these two problems, for all possible values of $n$ and $d$ and all possible sampling efforts. Along the way, we establish that, in some regimes, recovering the unknown permutation $\pi^*$ is considerably simpler than estimating the matrix. This is based on joint work with Alexandra Carpentier (U. Potsdam) and Emmanuel Pilliat (U. Montpellier).

• Claire Lacour - On the use of overfitting for estimator selection | Amphitheater Hermite, IHP

In this talk we consider the problem of estimator selection. In the case of density estimation, we study a method called PCO, which is intermediate between Lepski's method and penalized empirical risk minimization. The key point is the comparison of all the estimators to the overfitted one. We provide some theoretical results which lead to some fully data-driven selection strategy. We will also show the numerical performance of the method.
This is a joint work with P. Massart, V. Rivoirard and S. Varet.

• Peter Bartlett - The Dynamics of Sharpness-Aware Minimization | Amphitheater Hermite, IHP

Optimization methodology has been observed to affect statistical performance in high-dimensional prediction problems, and there has been considerable effort devoted to understanding the behavior of optimization methods and the nature of solutions that they find. We consider Sharpness-Aware Minimization (SAM), a gradient-based optimization method that has exhibited performance improvements over gradient descent on image and language prediction problems using deep networks. We show that when SAM is applied with a convex quadratic objective, for most random initializations it converges to oscillating between either side of the minimum in the direction with the largest curvature, and we provide bounds on the rate of convergence. In the non-quadratic case, we show that such oscillations encourage drift toward wider minima by effectively performing gradient descent, on a slower time scale, on the spectral norm of the Hessian. (Based on joint work with Olivier Bousquet and Phil Long)
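The quadratic-case oscillation is easy to reproduce numerically (a sketch with illustrative step size eta and perturbation radius rho):

```python
import numpy as np

def sam_step(x, grad, eta=0.05, rho=0.1):
    """One SAM step: perturb the iterate by a normalized ascent step of
    radius rho (the 'sharpness probe'), then take a gradient step of size
    eta using the gradient at the perturbed point."""
    g = grad(x)
    gn = np.linalg.norm(g)
    if gn < 1e-12:
        return x
    return x - eta * grad(x + rho * g / gn)

H = np.diag([10.0, 1.0])          # convex quadratic f(x) = 0.5 * x @ H @ x
grad = lambda x: H @ x
x = np.array([1.0, 1.0])
traj = [x]
for _ in range(500):
    x = sam_step(x, grad)
    traj.append(x)
traj = np.array(traj)
# late iterates flip sign each step along the top-curvature coordinate
# (index 0), while the low-curvature coordinate converges to the minimum
```

Rather than settling at the minimum, the iterates lock into a period-2 oscillation across it in the direction of largest curvature, exactly the behavior the abstract describes.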

• Friday, October 7
• Johannes Schmidt-Hieber - A statistical analysis of an image classification problem | Amphitheater Hermite, IHP

The availability of massive image databases has resulted in the development of scalable machine learning methods, such as convolutional neural networks (CNNs), for filtering and processing these data. While recent theoretical work on CNNs focuses on standard nonparametric denoising problems, the variability in image classification datasets does, however, not originate from additive noise but from variation of the shape and other characteristics of the same object across different images. To address this problem, we consider a simple supervised classification problem for object detection on grayscale images. While from the function estimation point of view every pixel is a variable, and large images lead to high-dimensional function recovery tasks suffering from the curse of dimensionality, increasing the number of pixels in our image deformation model enhances the image resolution and makes the object classification problem easier. We propose and theoretically analyze two different procedures. The first method estimates the image deformation by support alignment. Under a minimal separation condition, it is shown that perfect classification is possible. The second method fits a CNN to the data. We derive a rate for the misclassification error depending on the sample size and the number of pixels. Both classifiers are empirically compared on images generated from the MNIST handwritten digit database. The obtained results corroborate the theoretical findings.
This is joint work with Sophie Langer (Twente).

• Damien Garreau - What does LIME really see in images? | Amphitheater Hermite, IHP

The performance of modern algorithms on certain computer vision tasks such as object recognition is now close to that of humans. This success was achieved at the price of complicated architectures depending on millions of parameters, and it has become quite challenging to understand how particular predictions are made. Interpretability methods propose to give us this understanding. In this paper, we study LIME, perhaps one of the most popular such methods. On the theoretical side, we show that when the number of generated examples is large, LIME explanations concentrate around a limit explanation for which we give an explicit expression. We further this study for elementary shape detectors and linear models. As a consequence of this analysis, we uncover a connection between LIME and integrated gradients, another explanation method. More precisely, the LIME explanations are similar to the sum of integrated gradients over the superpixels used in the preprocessing step of LIME.
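The empirical procedure whose limit is studied above is short to sketch: sample binary on/off masks over superpixels, query the black-box model on each masked input, and fit a linear surrogate (a simplified sketch; real LIME additionally weights samples by a proximity kernel, omitted here):

```python
import numpy as np

def lime_importances(predict, n_superpixels, n_samples=2000, rng=None):
    """Fit an (unweighted) linear surrogate to a black-box model over random
    on/off superpixel masks; the coefficients rank superpixel importance."""
    if rng is None:
        rng = np.random.default_rng(0)
    Z = rng.integers(0, 2, size=(n_samples, n_superpixels)).astype(float)
    y = np.array([predict(z) for z in Z])
    X = np.column_stack([np.ones(n_samples), Z])  # intercept + mask indicators
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1:]                               # drop the intercept

# toy "model" whose output depends on two of five superpixels
f = lambda z: 3.0 * z[0] - 1.0 * z[1]
imp = lime_importances(f, n_superpixels=5)
```

For a linear black box the surrogate recovers the influential superpixels exactly; the paper's contribution is characterizing what this recovers in the large-sample limit for nonlinear models.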

• Vasiliki Velona - Learning a partial correlation graph using only a few covariance queries | Amphitheater Hermite, IHP

In settings where the covariance matrix is too large to even store, we would like to learn the partial correlation graph with as few covariance queries as possible (in a partial correlation graph, an edge exists if the corresponding entry in the inverse covariance matrix is non-zero). In recent work with Gabor Lugosi, Jakub Truszkowski, and Piotr Zwiernik, we showed that it is possible to use only a quasi-linear number of queries if the inverse covariance matrix is sparse enough, in the sense that the partial correlation graph resembles a tree on a global scale. I will explain these results and discuss extensions and applications.

• Monday, October 10
• 50
Stefan Sommer - Diffusion means in geometric statistics Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

Analysis and statistics of shape variation and, more generally, manifold valued data can be formulated probabilistically with geodesic distances between shapes exchanged with (-log)likelihoods. This leads to new statistics and estimation algorithms. One example is the notion of diffusion mean. In the talk, I will discuss the motivation behind and construction of diffusion means and discuss properties of the mean, including reduced smeariness when estimating diffusion variance together with the mean. This happens both in the isotropic setting with trivial covariance, and in the anisotropic setting where variance is fitted in all directions. I will connect this to most probable paths to data and algorithms for computing diffusion means, particularly bridge sampling algorithms. Finally, we will discuss ways of sampling the diffusion mean directly by conditioning on the diagonal of product manifolds, thereby avoiding the computationally expensive iterative optimization that is often applied for computing means on manifolds.

• 51
Nina Miolane - Geomstats: a Python package for Geometric Machine Learning Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

We introduce Geomstats, an open-source Python package for computations and statistics on nonlinear manifolds that appear in machine learning applications, such as: hyperbolic spaces, spaces of symmetric positive definite matrices, Lie groups of transformations, and many more. We provide object-oriented and extensively unit-tested implementations. Manifolds come equipped with families of Riemannian metrics with associated exponential and logarithmic maps, geodesics, and parallel transport. Statistics and learning algorithms provide methods for estimation, regression, classification, clustering, and dimension reduction on manifolds. All associated operations provide support for different execution backends --- namely NumPy, Autograd, PyTorch, and TensorFlow. This talk presents the package, compares it with related libraries, and provides relevant examples. We show that Geomstats provides reliable building blocks to both foster research in differential geometry and statistics and democratize the use of (Riemannian) geometry in statistics and machine learning. The source code is freely available under the MIT license at https://github.com/geomstats/geomstats.
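As a flavor of the primitives such a library exposes, here is a plain-NumPy sketch of the Riemannian exponential and logarithm on the 2-sphere — illustrative code written for this summary, not Geomstats' own implementation (in the package these would be methods of a metric object attached to a manifold class such as `Hypersphere`):

```python
import numpy as np

def sphere_exp(base, tangent):
    """Riemannian exponential on the unit 2-sphere: walk along the great
    circle from `base` in the direction of the tangent vector."""
    norm = np.linalg.norm(tangent)
    if norm < 1e-12:
        return base
    return np.cos(norm) * base + np.sin(norm) * tangent / norm

def sphere_log(base, point):
    """Riemannian logarithm: the tangent vector at `base` whose exponential
    reaches `point`."""
    cos_angle = np.clip(base @ point, -1.0, 1.0)
    angle = np.arccos(cos_angle)
    if angle < 1e-12:
        return np.zeros_like(base)
    proj = point - cos_angle * base          # component orthogonal to base
    return angle * proj / np.linalg.norm(proj)

base = np.array([0.0, 0.0, 1.0])             # north pole
v = np.array([np.pi / 4, 0.0, 0.0])          # tangent vector at base
p = sphere_exp(base, v)
print(np.allclose(sphere_log(base, p), v))   # True: log inverts exp
```

Statistics on the manifold (e.g. a Fréchet mean) are then built by iterating exactly these two maps, which is what the package's estimators do behind a scikit-learn-like interface.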

• 52
Yusu Wang - Weisfeiler-Lehman Meets Gromov-Wasserstein Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

The Weisfeiler-Lehman (WL) test is a classical procedure for graph isomorphism testing. The WL test has also been widely used both for designing graph kernels and for analyzing graph neural networks. In this talk, I will describe the so-called Weisfeiler-Lehman (WL) distance we recently introduced, which is a new notion of distance between labeled measure Markov chains (LMMCs), of which labeled graphs are special cases. The WL distance extends the WL test (in the sense that the former is positive if and only if the WL test can distinguish the two involved graphs) while at the same time it is polynomial time computable. It is also more discriminating than the distance between graphs used for defining the Wasserstein Weisfeiler-Lehman graph kernel. Inspired by the structure of the WL distance we identify a neural network architecture on LMMCs which turns out to be universal w.r.t. continuous functions defined on the space of all LMMCs (which includes all graphs) endowed with the WL distance. Furthermore, the WL distance turns out to be stable w.r.t. a natural variant of the Gromov-Wasserstein (GW) distance for comparing metric Markov chains that we identify. Hence, the WL distance can also be construed as a polynomial time lower bound for the GW distance which is in general NP-hard to compute.
This is joint work with Samantha Chen, Sunhyuk Lim, Facundo Memoli and Zhengchao Wan.
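For context, the classical 1-dimensional WL test that the WL distance extends iterates color refinement: each vertex repeatedly replaces its color by a hash of its own color and the multiset of its neighbors' colors. A minimal sketch for unlabeled graphs (the function name and the fixed number of rounds are choices made here):

```python
def wl_refine(adj, rounds=3):
    """1-dimensional Weisfeiler-Lehman color refinement.

    adj: adjacency lists of an unlabeled graph. Colors are kept as canonical
    nested tuples, so the sorted color multiset is comparable across graphs:
    different multisets certify non-isomorphism, while equal multisets are
    inconclusive (the test can fail to distinguish some non-isomorphic graphs).
    """
    colors = [0] * len(adj)                  # uniform initial labels
    for _ in range(rounds):
        colors = [(colors[v], tuple(sorted(colors[u] for u in adj[v])))
                  for v in range(len(adj))]
    return sorted(colors)

# A triangle plus an isolated vertex vs. a path on 4 vertices:
triangle_plus = [[1, 2], [0, 2], [0, 1], []]
path4 = [[1], [0, 2], [1, 3], [2]]
print(wl_refine(triangle_plus) != wl_refine(path4))  # True: WL separates them
```

The WL distance of the talk refines this discrete accept/reject procedure into a genuine (polynomial-time computable) metric on labeled measure Markov chains.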

• Tuesday, October 11
• 53
Martin Bauer - Elastic Shape Analysis of Surfaces Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 54
Eric Klassen - The Square Root Normal Field and Unbalanced Optimal Transport Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

The Square Root Normal Field (SRNF) is a distance function on shape spaces of surfaces in R^3. Unbalanced Optimal Transport (UOT) is a variant of Optimal Transport in which mass is allowed to expand and contract as it is transported from one point to another. In this talk (joint work of Bauer, Hartman and Klassen) we discuss an unexpected relation between the SRNF distance for oriented surfaces in R^3 and UOT for Borel measures on S^2.

• 55
Steve Oudot - Optimization in topological data analysis Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

This talk will give an overview of the line of work on optimization for topological data analysis, from the initial attempts at differentiating the persistent homology operator, to the recent adaptations of stochastic gradient descent and gradient sampling.

• 56
Omer Bobrowski - Universality in Random Persistence Diagrams Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

One of the most elusive challenges within the area of topological data analysis is understanding the distribution of persistence diagrams. Despite much effort, this is still largely an open problem. In this talk we will present a series of conjectures regarding the behavior of persistence diagrams arising from random point-clouds. We claim that, viewed in the right way, persistence values obey a universal probability law, that depends on neither the underlying space nor the original distribution of the point-cloud. We back these conjectures with an exhaustive set of experiments, including both simulated and real data.
We will also discuss some heuristic explanations for the possible sources of this phenomenon. Finally, we will demonstrate the power of these conjectures by proposing a new hypothesis testing framework for computing significance values for individual features within persistence diagrams.

This is joint work with Primoz Skraba (QMUL).

• 57
Kathryn Hess - Morse-theoretic signal compression and reconstruction Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

In this lecture I will present the work of three of my PhD students, Stefania Ebli, Celia Hacker, and Kelly Maggs, on cellular signal processing. In the usual paradigm, signals on a simplicial or chain complex are processed using the combinatorial Laplacian and the resulting Hodge decomposition. On the other hand, discrete Morse theory has been widely used to speed up computations by reducing the size of complexes while preserving their global topological properties.
Ebli, Hacker, and Maggs have developed an approach to signal compression and reconstruction on chain complexes that leverages the tools of algebraic discrete Morse theory, which provides a method to reduce and reconstruct a based chain complex, together with a set of signals on its cells, via deformation retracts, preserving as much as possible the global topological structure of both the complex and the signals. It turns out that any deformation retract of real degreewise finite-dimensional based chain complexes is equivalent to a Morse matching. Moreover, for certain interesting Morse matchings, the reconstruction error is trivial, except on one specific component of the Hodge decomposition. Finally, the authors developed and implemented an algorithm to compute Morse matchings with minimal reconstruction error, of which I will show explicit examples.

• 58
Cocktail (Tour Zamansky, Jussieu) Jussieu, Sorbonne Université

#### Jussieu, Sorbonne Université

4, place Jussieu, 75005 PARIS
• Wednesday, October 12
• 59
Johannes Krebs - On the law of the iterated logarithm and Bahadur representation in stochastic geometry Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris

We study the law of the iterated logarithm and a related strong invariance principle for certain functionals in stochastic geometry. The underlying point process is either a homogeneous Poisson process or a binomial process.
Moreover, requiring the functional to be a sum of so-called stabilizing score functionals enables us to derive a Bahadur representation for sample quantiles. The scores are obtained from a homogeneous Poisson process. We also study local fluctuations of the corresponding empirical distribution function and apply the results to trimmed and Winsorized means of the scores.
As potential applications, we think of well-known functionals defined on the k-nearest neighbors graph and important functionals in topological data analysis such as the Euler characteristic and persistent Betti numbers as well as statistics defined on Poisson-Voronoi tessellations.

• 60
Katharine Turner - The Extended Persistent Homology Transform for Manifolds with Boundary Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

The Persistent Homology Transform (PHT) is a topological transform which can be used to quantify the difference between subsets of Euclidean space. To each unit vector, the transform assigns the persistence module of the height function over the shape in that direction. The PHT is injective on piecewise-linear subsets of Euclidean space, and it has been demonstrably useful in diverse applications. One shortcoming is that shapes with different essential homology (i.e., Betti numbers) have an infinite distance between them.
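Concretely, the directional height filtration underlying the PHT can be written as follows (standard notation for the transform, not taken verbatim from the talk): for a shape $M \subset \mathbb{R}^d$ and a direction $v \in S^{d-1}$,

$$\mathrm{PHT}(M)\colon S^{d-1} \to \mathrm{Dgm}, \qquad v \mapsto \mathrm{Dgm}\big(\{M_{v,t}\}_{t \in \mathbb{R}}\big), \qquad M_{v,t} = \{x \in M : \langle x, v\rangle \le t\},$$

so each direction contributes the persistent homology of the sublevel sets of the height function $h_v(x) = \langle x, v\rangle$.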

The theory of extended persistence for Morse functions on a manifold was developed by Cohen-Steiner, Edelsbrunner and Harer in 2009 to quantify the support of the essential homology classes. By using extended persistence modules of height functions over a shape, we obtain the extended persistent homology transform (XPHT) which provides a finite distance between shapes even when they have different Betti numbers.

I will discuss how the XPHT of a manifold with boundary can be deduced from the XPHT of its boundary, which allows for efficient calculation. James Morgan has implemented the required algorithms for 2-dimensional binary images as a forthcoming R package. This is joint work with James Morgan and Vanessa Robins.

• 61
Heather Harrington - The shape of data in biology. Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

TBA

• 62
Frédéric Barbaresco - Symplectic Foliation Model of Information Geometry for Statistics and Learning on Lie Groups Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

We present a new symplectic model of Information Geometry [1,2] based on Jean-Marie Souriau's Lie Groups Thermodynamics [3,4]. Souriau's model was initially described in chapter IV, “Statistical Mechanics”, of his book “Structure of dynamical systems”, published in 1969. This model gives a purely geometric characterization of Entropy, which appears as an invariant Casimir function in coadjoint representation, characterized by Poisson cohomology. Souriau proved that we can associate a symplectic manifold to the coadjoint orbits of a Lie group by the KKS 2-form (Kirillov, Kostant, Souriau 2-form) in the affine case (affine model of coadjoint operator equivariance via Souriau's cocycle) [5], which we have identified with the Koszul-Fisher metric from Information Geometry. Souriau established the generalized Gibbs density covariant under the action of the Lie group. The dual space of the Lie algebra foliates into coadjoint orbits that are also the Entropy level sets, which can be interpreted in the framework of Thermodynamics by the fact that dynamics on these symplectic leaves are non-dissipative, whereas transversal dynamics, given by the Poisson transverse structure, are dissipative. We will finally introduce the Gaussian distribution on the space of Symmetric Positive Definite (SPD) matrices, through Souriau's covariant Gibbs density, by considering this space as the pure imaginary axis of the homogeneous Siegel upper half space, on which Sp(2n,R)/U(n) acts transitively. We will also consider the Gibbs density for the Siegel Disk, where SU(n,n)/S(U(n)xU(n)) acts transitively. The Gauss density of SPD matrices is then computed through Souriau's moment map and coadjoint orbits. Souriau's Lie Groups Thermodynamics model will be further explored in the European COST network CaLISTA [6] and the European HORIZON-MSCA project CaLIGOLA [7].

• 63
Victor Patrangenaru - Geometry, Topology and Statistics on Object Spaces Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• Thursday, October 13
• 64
Nicolas Charon - Registration of shape graphs with partial matching constraints Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

This talk will discuss an extension of the elastic curve registration framework to a general class of geometric objects which we call (weighted) shape graphs, allowing in particular the comparison and matching of 1D geometric data that are partially observed or that exhibit certain topological inconsistencies. Specifically, we generalize the class of second-order invariant Sobolev metrics on the space of unparametrized curves to weighted shape graphs by modelling such objects as varifolds (i.e. directional measures) and combining geometric deformations with a transformation process on the varifold weights. This leads us to introduce a new class of variational problems, show the existence of solutions, and derive a specific numerical scheme to tackle the corresponding discrete optimization problems.

• 65
Irène Kaltenmark - Curves and surfaces. Partial matching in the space of varifolds. Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

The matching of analogous shapes is a central problem in computational anatomy. However, inter-individual variability, pathological anomalies or acquisition methods sometimes challenge the assumption of global homology between shapes.
In this talk, I will present an asymmetric data attachment term characterizing the inclusion of one shape in another. This term is based on projection on the nearest neighbor with respect to the metrics of varifold spaces.
Varifolds are representations of geometric objects, including curves and surfaces. Their specificity is to take into account the tangent spaces of these objects and to be robust to the choice of parametrization.
This new data attachment term extends the scope of application of the pre-existing methods of matching by large diffeomorphic deformations (LDDMM). The partial registration is indeed induced by a diffeomorphic deformation of the source shape. The anatomical (topological) characteristics of this shape are thus preserved.
This is a joint work with Pierre-Louis Antonsanti and Joan Glaunès.

• 66
Herbert Edelsbrunner - Chromatic Delaunay mosaics for chromatic point data. Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

The chromatic Delaunay mosaic of s+1 finite sets in d dimensions is an (s+d)-dimensional Delaunay mosaic that represents the individual sets as well as their interactions. For example, it contains a (non-standard) dual of the overlay of the Voronoi tessellations of any subset of the s+1 colors. We prove bounds on the size of the chromatic Delaunay mosaic, in the worst and average case, and suggest how to use image, kernel, and cokernel persistence to get stable diagrams describing the interaction of the points of different colors.

Acknowledgements. This is incomplete and ongoing joint work with Ranita Biswas, Sebastiano Cultrera, Ondrej Draganov, and Morteza Saghafian, all at IST Austria.

• 67
Claire Brecheteau - Approximating data with a union of ellipsoids and clustering. Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

I will introduce a surrogate for the distance function to the support of a distribution, whose sublevel sets are unions of balls or of ellipsoids. I will present several results, including rates of convergence for the approximation of these surrogates by their empirical versions, built from point clouds. I will explain how to use such estimators to cluster data with a geometric structure. The results have been published in the papers [1,2], and this line of work is still in progress.

• 68
Dominique Attali - Reconstructing manifolds by weighted $\ell_1$-norm minimization Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

In many practical situations, the shape of interest is only known through a finite set of data points. Given as input those data points, it is then natural to try to construct a triangulation of the shape, that is, a set of simplices whose union is homeomorphic to the shape. This problem has given rise to many research works in the computational geometry community, motivated by applications to 3D model reconstruction and manifold learning.

In this talk, we focus on one particular instance of the shape reconstruction problem, in which the shape we wish to reconstruct is an orientable smooth $d$-manifold embedded in $\mathbb{R}^N$. We reformulate the problem of searching for a triangulation as a convex minimization problem whose objective function is a weighted $\ell_1$-norm. I will then present our SoCG 2022 result, which says that, under appropriate conditions, the solution of our minimization problem is indeed a triangulation of the manifold, and that this triangulation coincides with a variant of the tangential Delaunay complex.

This is a joint work with André Lieutier.

• Friday, October 14
• 69
Barbara Gris - Defining Data-Driven Deformation Models Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

Studying shapes through large deformations allows one to define a metric on a space of shapes from a metric on a space of deformations. When the set of considered deformations is not relevant to the observed data, the geodesic paths for this metric can be deceiving from a modelling point of view. To overcome this issue, the notion of deformation module allows one to incorporate priors coming from the data in the set of considered deformations and the metric. I will present this framework, as well as the IMODAL library, which enables one to perform registration through such structured deformations. This Python library is modular: adapted priors can be easily defined by the user, several priors can be combined into a global one, and various types of data can be considered, such as curves, meshes or images.
This is a joint work with Benjamin Charlier, Leander Lacroix and Alain Trouvé.

• 70
Laurent Younes - Stochastic Gradient Descent for Large-Scale LDDMM Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 71
Stephen Preston - Isometric immersions and the waving of flags Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

A physical flag can be modeled geometrically as an isometric immersion of a rectangle into space, with one edge fixed along the flagpole. Its motion, in the absence of gravity and wind, can be modeled as a geodesic in the space of all isometric immersions, where the Riemannian metric is inherited from the kinetic energy on the much larger space of all immersions. In this talk I will show how, generically, such an isometric immersion can be described completely by the curve describing the top or bottom edge, which gives a global version of a classical local result in differential geometry. Using this, I will show how to derive the geodesic equation, which turns out to be a highly nonlinear, nonlocal coupled system of two wave equations in one space variable, with tension determined by solving an ODE system. The new model has the potential to describe the motion of cloth with far fewer variables than the traditional method of strongly constraining three functions of two space variables.

This is joint work with Martin Bauer and Jakob Moeller-Andersen.

• Tuesday, October 18
• 72
Mikhail Belkin - Mathematical Aspects of Deep Learning (1/2) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 73
Yusu Wang - Some Theoretical Aspects of Graph Neural Networks (and Higher Order Variants) (1/2) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Wednesday, October 19
• 74
Mikhail Belkin - Mathematical Aspects of Deep Learning (2/2) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 75
Yusu Wang - Some Theoretical Aspects of Graph Neural Networks (and Higher Order Variants) (2/2) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 76
Quentin Mérigot - Optimal Transport (2/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 77
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (2/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Thursday, October 20
• 78
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (4/9). Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 79
Wolfgang Polonik - Statistical Topological Data Analysis (4/9) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Wednesday, October 26
• 80
Quentin Mérigot - Optimal Transport (3/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 81
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (3/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Thursday, October 27
• 82
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (5/9). Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 83
Wolfgang Polonik - Statistical Topological Data Analysis (5/9) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Wednesday, November 2
• 84
Quentin Mérigot - Optimal Transport (4/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 85
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (4/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Thursday, November 3
• 86
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (6/9). Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 87
Wolfgang Polonik - Statistical Topological Data Analysis (6/9). Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Tuesday, November 8
• 88
Welcome Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 89
Introduction to the day (AMIES & the Gesda quarter).
• 90
Baptiste Labarthe (Metafora) - Topological data analysis: promising applications to cytometry data and medical diagnosis. Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 91
Klervi Le Gall (UN) - Principal component analysis on the space of functions valued in the manifold of 3-dimensional rotations: application to the evaluation of ambulatory deficit in patients with multiple sclerosis. Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 92
Rémi Perrichon (ENAC) - Statistics and geometry for the analysis of aircraft trajectories. Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 10:50 AM
Break Espace Cafétéria, IHP

#### Espace Cafétéria, IHP

• 93
Round table with Frédéric Barbaresco (Thales), Nicolas Bousquet (EDF) and Stéphanie Allassonnière (Univ. Paris Cité). Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 12:30 PM
Buffet lunch Espace Cafétéria, IHP

#### Espace Cafétéria, IHP

• 94
Posters Espace Cafétéria, IHP

#### Espace Cafétéria, IHP

• 95
Joint presentation of the GeomStats, Gudhi and TTK libraries and platforms. Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 96
GeomStats workshop Salle 01, IHP

#### Salle 01, IHP

• 97
Gudhi workshop Salle 05, IHP

#### Salle 05, IHP

• 98
TTK workshop Salle 201, IHP

#### Salle 201, IHP

• Wednesday, November 9
• 99
Quentin Mérigot - Optimal Transport (5/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 100
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (5/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Thursday, November 10
• 101
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (7/9) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 102
Wolfgang Polonik - Statistical Topological Data Analysis (7/9). Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Tuesday, November 15
• 103
Kathryn Hess - Topological Approaches to Neuroscience (1/2). Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Wednesday, November 16
• 104
Kathryn Hess - Topological Approaches to Neuroscience (2/2) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 105
Quentin Mérigot - Optimal Transport (6/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 106
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (6/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Thursday, November 17
• 107
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (8/9). Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 108
Wolfgang Polonik - Statistical Topological Data Analysis (8/9). Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Monday, November 21
• 109
WS3-0 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 110
WS3-0 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 111
WS3-1 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 112
WS3-2 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 113
WS3-3 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• Tuesday, November 22
• 114
WS3-4 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 115
WS3-5 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 116
WS3-6 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 117
WS3-7 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 118
WS3-8 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• Wednesday, November 23
• 119
WS3-10 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 120
WS3-11 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 121
WS3-12 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 122
WS3-13 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• Thursday, November 24
• 123
WS3-14 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 124
WS3-15 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 125
WS3-16 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 126
WS3-17 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 127
WS3-18 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• Friday, November 25
• 128
WS3-18 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 129
WS3-19 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• 130
WS3-20 Amphitheater Hermite, IHP

#### Amphitheater Hermite, IHP

• Tuesday, November 29
• 131
Stephen Preston - Riemannian Geometry on Lie Groups (1/2) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Wednesday, November 30
• 132
Stephen Preston - Riemannian Geometry on Lie Groups (2/2) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 133
Quentin Mérigot - Optimal Transport (7/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 134
Ery Arias-Castro & Eddie Aamari - Embedding for Data Analysis (7/8) Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Thursday, December 1
• 135
Eric Klassen - Geometry of Shape Spaces of Curves and Surfaces (9/9). Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• 136
Wolfgang Polonik - Statistical Topological Data Analysis (9/9). Amphitheater Darboux

### Amphitheater Darboux

#### IHP

11, Rue Pierre et Marie Curie 75005 Paris
• Wednesday, December 7