I will present an approach to iteratively minimize a given objective function using minimizing movement schemes built on general cost functions. Both an implicit and an explicit method will be introduced.
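To fix ideas, here is a minimal sketch of the two schemes, writing $f$ for the objective and $c$ for the cost; the exact formulation in the talk may differ. The implicit step is a minimizing movement with $c$ in place of the usual squared distance, and a natural explicit counterpart linearizes $f$ before minimizing:

```latex
% Implicit step: a minimizing-movement update with a general cost c.
\[
  x_{k+1} \in \operatorname*{arg\,min}_{x} \bigl\{\, f(x) + c(x, x_k) \,\bigr\},
\]
% which reduces to the classical scheme when
% c(x,y) = \tfrac{1}{2\tau}\lVert x - y \rVert^2 with step size \tau > 0.
% Explicit step (one natural variant): minimize the linearization of f
% plus the cost.
\[
  x_{k+1} \in \operatorname*{arg\,min}_{x} \bigl\{\, f(x_k)
    + \langle \nabla f(x_k),\, x - x_k \rangle + c(x, x_k) \,\bigr\}.
\]
```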
In finite dimensions, the explicit method unifies several standard gradient-descent-type methods: gradient descent, mirror descent, Newton’s method, and Riemannian gradient descent. Byproducts of this framework include a new nonsmooth mirror descent method and global convergence rates for Newton’s method.
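As a hedged illustration of this unification (standard computations, not the talk's precise statements): two cost choices and the explicit updates they induce, with step size $\tau > 0$.

```latex
% A quadratic cost gives gradient descent; a Bregman cost gives mirror descent.
\begin{align*}
  c(x,y) &= \tfrac{1}{2\tau}\lVert x - y \rVert^2
    &&\Longrightarrow\; x_{k+1} = x_k - \tau\,\nabla f(x_k)
    && \text{(gradient descent)} \\
  c(x,y) &= \tfrac{1}{\tau}\,D_\phi(x,y)
    &&\Longrightarrow\; \nabla\phi(x_{k+1}) = \nabla\phi(x_k) - \tau\,\nabla f(x_k)
    && \text{(mirror descent)}
\end{align*}
% Here D_\phi(x,y) = \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle
% is the Bregman divergence of a strictly convex potential \phi; the first
% line is the special case \phi = \tfrac{1}{2}\lVert\cdot\rVert^2.
```

Riemannian gradient descent and Newton's method arise from analogous cost choices (e.g. a squared Riemannian distance, or a divergence built from second-order information), though the precise costs are the talk's to specify.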
More generally, rates of convergence will be shown when the energy is convex along *variational c-segments*, an extension of geodesics developed within the framework of spaces with nonnegative cross-curvature (NNCC spaces). Applications to optimization on spaces of measures include extensions of the JKO scheme beyond the squared Wasserstein distance.
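For concreteness, here is a minimal sketch of the classical JKO step and the kind of extension alluded to above, for an energy $\mathcal{F}$ on probability measures; the notation $\mathcal{T}_c$ is ours, not necessarily the talk's.

```latex
% Classical JKO step: implicit Euler for Wasserstein gradient flows,
% with time step \tau > 0.
\[
  \rho_{k+1} \in \operatorname*{arg\,min}_{\rho}
    \Bigl\{\, \mathcal{F}(\rho) + \tfrac{1}{2\tau}\, W_2^2(\rho, \rho_k) \,\Bigr\}.
\]
% Extension beyond the squared Wasserstein distance: replace
% \tfrac{1}{2\tau} W_2^2 by an optimal transport cost built from a general
% ground cost c,
%   \mathcal{T}_c(\mu, \nu) = \inf_{\pi \in \Pi(\mu, \nu)} \int c \, d\pi,
\[
  \rho_{k+1} \in \operatorname*{arg\,min}_{\rho}
    \bigl\{\, \mathcal{F}(\rho) + \mathcal{T}_c(\rho, \rho_k) \,\bigr\}.
\]
```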