Mathematical Finance, Stochastic Analysis, and Machine Learning Seminar by Yu-Jui Huang: GANs as Gradient Flows that Converge

Time:

-

Location:

PH 131

Speaker:

Yu-Jui Huang, University of Colorado Boulder

Title:

GANs as Gradient Flows that Converge

Abstract:

In this talk, we will show that the unsupervised learning problem can be solved by gradient descent in the space of probability measures. The gradient descent is governed by a distribution-dependent ODE whose dynamics involve the density function of the state process as well as the derivative of that density. To construct a solution to the ODE, we take an unconventional route. First, we show that the associated nonlinear Fokker-Planck equation has a unique weak solution, on the strength of the Crandall-Liggett theorem for PDEs in a Banach space. Next, using Trevisan's superposition principle for SDEs, we construct a unique solution to the distribution-dependent ODE from the solution to the Fokker-Planck equation. Our main result shows that, as time tends to infinity, the solution to the ODE converges to an invariant distribution, which turns out to be the underlying data distribution. That is, one can uncover the unknown data distribution by simulating the distribution-dependent ODE. We design an algorithm for simulating the ODE and find that it is equivalent to the training algorithm of the generative adversarial network (GAN), a state-of-the-art method for unsupervised learning. As a result, our developments provide a new theoretical framework for the analysis of GANs, one that sheds new light on their convergence in particular.
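The abstract does not spell out the dynamics, but the kind of simulation it describes can be pictured schematically: evolve N particles by an ODE of the form dX_t/dt = v(X_t, rho_t(X_t), rho_t'(X_t)), where rho_t is the density of the current particle law. The Python sketch below (1-D) is a minimal illustration under assumed dynamics, not the algorithm from the talk: the velocity field used here is the score difference between the data and the particles (a stand-in KL gradient flow), with both densities and their derivatives estimated by Gaussian kernels.

import numpy as np

def kde_and_grad(x, points, h=0.5):
    # Gaussian kernel density estimate at x of the empirical law of `points`,
    # together with its derivative in x (1-D for simplicity).
    d = x - points
    k = np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2.0 * np.pi))
    return k.mean(), (-d / h ** 2 * k).mean()

def simulate(particles, data, dt=0.02, steps=400, eps=1e-8):
    # Euler discretization of a distribution-dependent ODE: each particle's
    # velocity depends on the current density estimate and its derivative.
    # ASSUMED velocity: score(data law) - score(particle law), i.e. a KL
    # gradient flow; the talk's actual velocity field comes from its own ODE.
    for _ in range(steps):
        new = particles.copy()
        for i, x in enumerate(particles):
            rho, drho = kde_and_grad(x, particles)
            rho_d, drho_d = kde_and_grad(x, data)
            new[i] = x + dt * (drho_d / (rho_d + eps) - drho / (rho + eps))
        particles = new
    return particles

rng = np.random.default_rng(0)
data = rng.normal(2.0, 0.5, size=200)        # samples from the unknown data law
particles = rng.normal(-2.0, 1.0, size=200)  # initial generator samples
out = simulate(particles, data)
print(out.mean(), out.std())                 # drifts toward roughly (2.0, 0.5)

Run as-is, the particle mean and spread move from about (-2, 1) toward the data's (2, 0.5), illustrating convergence of the simulated law to the data distribution; reproducing the talk's result would require substituting the paper's specific velocity field.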
