Nicolas Papadakis on "Learning of Wasserstein Generative Models and Patch-based Texture Synthesis":
The problem of WGAN (Wasserstein Generative Adversarial Network) learning is an instance of optimization problems where one wishes to find, among a parametric class of distributions, the one which is closest to a target distribution in terms of an optimal transport (OT) distance. Applying a gradient-based algorithm to this problem requires expressing the gradient of the OT distance with respect to one of its arguments, which can be related to the solutions of the dual problem (Kantorovich potentials). The first part of this talk aims at finding conditions that ensure the existence of such a gradient. After discussing regularity issues that may appear with discrete target measures, we will show that these regularity problems are avoided when using entropy-regularized OT and/or considering the semi-discrete formulation of OT. Then, we will see how these gradients can be exploited in a stable way to address imaging problems where the discrete target measure is reasonably large. Using OT distances between multi-scale patch distributions, this allows estimating a generative convolutional network that can synthesize an exemplar texture faithfully and efficiently.
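As a schematic illustration of the setting described above (not the talk's exact statement), one may write the learning problem and the gradient formula as follows, assuming the generated distribution is the pushforward of a latent distribution $\zeta$ by a generator $g_\theta$, and that the optimal Kantorovich potential $\varphi^{\star}$ is unique (up to a constant) and differentiable; the precise regularity conditions are the subject of the talk.

\[
\min_{\theta}\; \mathrm{OT}_c\bigl((g_\theta)_{\#}\zeta,\, \nu\bigr),
\qquad
\mathrm{OT}_c(\mu,\nu)
= \min_{\pi \in \Pi(\mu,\nu)} \int c(x,y)\,\mathrm{d}\pi(x,y)
= \max_{\varphi(x)+\psi(y)\le c(x,y)} \int \varphi\,\mathrm{d}\mu + \int \psi\,\mathrm{d}\nu .
\]

When an envelope-theorem argument applies, the gradient with respect to the generator parameters involves only the optimal dual potential $\varphi^{\star}$:
\[
\nabla_\theta\, \mathrm{OT}_c\bigl((g_\theta)_{\#}\zeta,\, \nu\bigr)
= \mathbb{E}_{z\sim\zeta}\!\left[\, \partial_\theta g_\theta(z)^{\top}\, \nabla_x \varphi^{\star}\bigl(g_\theta(z)\bigr) \right],
\]
which is why the existence and regularity of $\varphi^{\star}$ (or of its entropy-regularized and semi-discrete counterparts) is the key question for gradient-based learning.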
This is joint work with Antoine Houdard, Arthur Leclaire and Julien Rabin.