Coercive convex functionals with linear growth in the gradient of the argument generically attain their minima in the space of functions of bounded variation, i.e., the gradient of the minimizer is a vector measure. One class of such functionals consists of Rudin-Osher-Fatemi (ROF) type models, known for their applications to image processing. ROF-type functionals consist of a regularizing term of linear growth and a lower-order fidelity term measuring the distance between the argument and a given function (the "noisy image"). In this talk, I will discuss recent results, obtained in collaboration with Z. Grochulska and P. Rybka, on controlling the part of the gradient of the minimizer that is singular with respect to the Lebesgue measure in terms of the datum.
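For orientation, a prototypical instance of this class is the classical ROF model (given here as a standard example, not necessarily the exact functional considered in the talk):

\[
  \min_{u \in BV(\Omega)} \; |Du|(\Omega) + \frac{\lambda}{2} \int_\Omega (u - f)^2 \, dx,
\]

where $f$ is the given noisy image, $\lambda > 0$ weights the fidelity term, and the total variation $|Du|(\Omega)$ is the regularizing term of linear growth in the gradient.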
Paul Pegon