Computational & Applied Math Seminar

Variational Gradient Flow for Deep Generative Learning

  • Speaker: Can YANG (The Hong Kong University of Science and Technology)

  • Time: Oct 23, 2020, 14:30-15:30

  • Location: Zoom (ID 669 417 9735)

Abstract

In this talk, we will discuss a framework for learning deep generative models via Variational Gradient Flow (VGrow) on probability measure spaces. The gradient flow, which asymptotically converges to the target distribution, is determined by a velocity vector field given by the negative gradient of the first variation of the f-divergence loss. Since the vector field depends on the density ratio between the pushforward distribution and the target distribution, we show that this ratio can be estimated via a deep logistic regression model with statistically guaranteed accuracy. The proven estimation error bound improves on those obtained in the existing literature. Connections between our proposed VGrow method and other popular methods, such as VAE, GAN and flow-based methods, are established within this framework, yielding new insights into deep generative learning. We propose VGrow-Pg, which combines VGrow with the progressive growing training technique to address the challenges of generating high-resolution samples. We investigate several commonly used divergences in the VGrow framework, including the Kullback-Leibler, Jensen-Shannon and Jeffreys divergences, as well as our newly discovered “logD” divergence, which serves as the objective function of the logD-trick GAN. Through a comprehensive evaluation of VGrow and related methods on benchmark datasets and a new Portrait dataset that we collected ourselves, we demonstrate that VGrow-Pg is stable and achieves performance competitive with state-of-the-art GANs.
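To make the mechanism described in the abstract concrete, below is a minimal sketch of the particle-based idea for the Kullback-Leibler case: a classifier trained with the logistic loss on real versus generated samples produces a logit that estimates the log density ratio, and each particle is moved along the gradient of that logit. All names (`RatioNet`, `vgrow_step`), network sizes, step sizes, and the toy 2D Gaussian setup are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class RatioNet(nn.Module):
    """Hypothetical 'deep logistic regression' model: its logit d(x)
    estimates the log density ratio log(p_data(x) / p_particle(x))."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def vgrow_step(particles, data, ratio_net, opt, step_size=0.1, inner_iters=20):
    """One outer iteration: fit the density ratio, then move the particles.
    For the KL divergence the velocity field reduces (up to sign and scale)
    to the gradient of the estimated log density ratio."""
    bce = nn.BCEWithLogitsLoss()
    for _ in range(inner_iters):  # logistic regression: real = 1, particles = 0
        opt.zero_grad()
        loss = bce(ratio_net(data), torch.ones(len(data))) + \
               bce(ratio_net(particles), torch.zeros(len(particles)))
        loss.backward()
        opt.step()
    x = particles.detach().requires_grad_(True)
    ratio_net(x).sum().backward()             # gradient of the logit w.r.t. x
    return (x + step_size * x.grad).detach()  # Euler step along the flow

# Toy usage: transport a standard Gaussian toward a shifted Gaussian.
torch.manual_seed(0)
data = torch.randn(512, 2) + torch.tensor([4.0, 0.0])
particles = torch.randn(512, 2)
net = RatioNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for t in range(100):
    particles = vgrow_step(particles, data, net, opt)
```

In the full method the particles are replaced by the pushforward of a deep generator, and other f-divergences change only the function of the estimated ratio that defines the velocity field; this sketch keeps the bare two-step loop of ratio estimation followed by transport.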
