Paper Reading

This paper delves into the optimization landscapes of Generative Adversarial Networks (GANs) through a detailed analysis of rotation techniques, local Nash equilibrium, and locally stable stationary points. It investigates various optimization solutions and tools used in GANs, providing insights into the dynamics of training GAN models. The study also examines the discriminator behaviors in different GAN variants, shedding light on the complex interactions within these networks.

  • GANs
  • Optimization Landscapes
  • ICLR2020
  • Rotation Techniques
  • Local Nash Equilibrium


Presentation Transcript


  1. Paper Reading, ICLR 2020: A Closer Look at the Optimization Landscapes of Generative Adversarial Networks. Presented by Dachao Lin.

  2. Outline: rotation in GAN optimization; LNE and LSSP; optimization solutions for GANs; visualization tool; experiments.

  3. Rotation in GAN optimization. Discriminators considered: vanilla GAN and WGAN. Dirac GAN example: the real sample is at x = 0, the generator has a single parameter θ and generates data at x = θ.

  4. Rotation in GAN optimization. Dirac GAN: the gradient flow conserves θ² + φ², so the joint (generator, discriminator) dynamics rotate around the equilibrium instead of converging to it.
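
To make the rotation concrete, here is a minimal sketch (my own toy code, not the paper's; the bilinear loss f(θ, φ) = θ·φ and the learning rate are illustrative assumptions) of simultaneous gradient descent-ascent on the Dirac GAN:

```python
# Minimal sketch of the Dirac GAN with a WGAN-style bilinear loss
# f(theta, phi) = theta * phi: real data at x = 0, generator outputs
# x = theta, linear discriminator D(x) = phi * x.
import numpy as np

def simultaneous_gda(theta, phi, lr=0.1, steps=200):
    """Simultaneous gradient descent (generator) / ascent (discriminator)."""
    traj = [(theta, phi)]
    for _ in range(steps):
        g_theta = phi   # df/dtheta, descended by the generator
        g_phi = theta   # df/dphi, ascended by the discriminator
        theta, phi = theta - lr * g_theta, phi + lr * g_phi
        traj.append((theta, phi))
    return np.array(traj)

traj = simultaneous_gda(theta=1.0, phi=0.0)
radius = np.hypot(traj[:, 0], traj[:, 1])
# The continuous-time gradient flow conserves theta^2 + phi^2 (pure rotation);
# the discrete updates even spiral outward (the radius grows by a factor of
# sqrt(1 + lr^2) per step), so the iterates never reach the equilibrium (0, 0).
print(radius[0], radius[-1])  # 1.0 -> noticeably larger
```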

  5. Rotation in GAN optimization: WGAN vs. WGAN-GP dynamics.

  6. Local Nash Equilibrium (LNE): a Nash equilibrium that holds only locally, i.e. each player's parameters are a local minimum of its own loss with the other player's parameters held fixed. Being a DNE is not necessary for being an LNE: a local Nash equilibrium may have Hessians that are only positive semi-definite. NE are commonly used in GANs to describe the goal of the learning procedure, but the notion does not take the interaction between the two networks (the off-diagonal blocks of the game Jacobian) into account.

  7. Locally Stable Stationary Point (LSSP). Intuition: at a stationary point ω* of the game vector field v (i.e. v(ω*) = 0), the dynamics are locally stable when all eigenvalues of the Jacobian ∇v(ω*) have positive real part, so trajectories started near ω* converge to it.

  8. Locally Stable Stationary Point (LSSP)

  9. LSSP vs. LNE: the two notions do not coincide; in particular, training can converge to an LSSP that is not an LNE (see the sketch below).
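
A minimal sketch of how the distinction can be checked numerically (the 2×2 Jacobian below is a toy example of mine, not from the paper): stability is read off the eigenvalues of the Jacobian of the game vector field, while the LNE property additionally requires each player's own Hessian block to be positive semi-definite.

```python
# Minimal sketch: stability vs. Nash at a stationary point of a toy
# 2-parameter game (one generator parameter theta, one discriminator
# parameter phi).
import numpy as np

# Jacobian of the game vector field v = (dL_G/dtheta, dL_D/dphi) at the
# stationary point.  The diagonal entries are each player's own Hessian block.
J = np.array([[-0.1, 1.0],    # d^2 L_G / dtheta^2 is negative ...
              [-1.0, 0.3]])

eigs = np.linalg.eigvals(J)
is_lssp = np.all(eigs.real > 0)          # all eigenvalues in the right half-plane
is_lne = J[0, 0] >= 0 and J[1, 1] >= 0   # each player at its own local minimum

print(eigs)             # ~0.1 +/- 0.98i: stable but strongly rotational
print(is_lssp, is_lne)  # True False -> an LSSP that is not an LNE
```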

  10. Visualization tool

  11. Visualization tool

  12. Path-angle for NSGAN (top row) and WGAN-GP (bottom row)
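
A minimal sketch of the path-angle diagnostic (the function name and the toy vector field are assumptions of mine; the paper computes it on trained GANs): interpolate linearly between an initial point w0 and the final point w1, slightly past both ends, and record the cosine between the game vector field and the path direction.

```python
# Minimal sketch of the path-angle diagnostic.
import numpy as np

def path_angle(v, w0, w1, alphas):
    """Cosine between the game vector field v and the unit path direction
    along the linear interpolation w(alpha) = (1 - alpha) * w0 + alpha * w1."""
    d = (w1 - w0) / np.linalg.norm(w1 - w0)
    cosines = []
    for a in alphas:
        g = v((1 - a) * w0 + a * w1)
        cosines.append(g @ d / (np.linalg.norm(g) + 1e-12))
    return np.array(cosines)

# Toy field with strong rotation and weak attraction around (0, 0).
v = lambda w: np.array([0.1 * w[0] + w[1], -w[0] + 0.1 * w[1]])
alphas = np.linspace(-0.2, 1.2, 8)  # extend slightly past both endpoints
print(path_angle(v, np.array([1.0, 1.0]), np.array([0.0, 0.0]), alphas))
# Near-zero cosine with a sharp sign switch at alpha = 1: strong rotation;
# a pronounced bump around alpha = 1 would instead indicate attraction.
```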

  13. Eigenvalues of the Jacobian of the game for NSGAN (top row) and WGAN-GP (bottom row)
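
A minimal sketch of how such eigenvalues can be obtained (toy quadratic losses of my own; the paper does this for full GANs), forming the Jacobian of the game with autograd:

```python
# Minimal sketch: eigenvalues of the Jacobian of the game via autograd.
import torch

n = 2  # parameters per player

def game_field(w):
    """Joint vector field v(w) = (dL_G/dtheta, dL_D/dphi) for toy losses."""
    t, p = w[:n], w[n:]
    f = (t * p).sum()                    # bilinear coupling term
    loss_g = f + 0.05 * (t ** 2).sum()   # toy generator loss
    loss_d = -f + 0.05 * (p ** 2).sum()  # toy discriminator loss
    g_t = torch.autograd.grad(loss_g, w, create_graph=True)[0][:n]
    g_p = torch.autograd.grad(loss_d, w, create_graph=True)[0][n:]
    return torch.cat([g_t, g_p])

w = torch.randn(2 * n)
J = torch.autograd.functional.jacobian(game_field, w)
print(torch.linalg.eigvals(J))
# Expected here: 0.1 +/- 1j (each twice) -- small positive real parts
# (stability) with large imaginary parts (rotation), mimicking the
# NSGAN-like picture in the scatter plots above.
```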

  14. Summary: the rotational component is clearly visible. The complex eigenvalues for NSGAN seem to be much more concentrated on the imaginary axis, whereas WGAN-GP tends to spread the eigenvalues towards the right of the imaginary axis.

  15. Top-k eigenvalues of the Hessian of each player
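
A minimal sketch of this measurement on a toy loss (exact eigendecomposition, with a generator loss I made up; for real networks one would instead use Hessian-vector products with an iterative eigensolver):

```python
# Minimal sketch: extreme eigenvalues of the generator's own Hessian block.
import torch

phi = torch.randn(3)  # discriminator parameters, held fixed

def generator_loss(theta):
    # Toy generator objective; a real GAN loss would go here.
    return (theta * phi).sum() + 0.5 * (theta ** 3).sum()

theta = torch.randn(3)
H = torch.autograd.functional.hessian(generator_loss, theta)
eigs = torch.linalg.eigvalsh(H)  # sorted ascending
k = 2
print(eigs[-k:])  # top-k eigenvalues
print(eigs[:k])   # bottom-k: negative values here are the "generator at a
                  # saddle point, not a local minimum" signature
```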

  16. Summary: the generator never reaches a local minimum but instead finds a saddle point. The algorithm converges to an LSSP which is not an LNE. NSGAN converges to a solution with very large positive eigenvalues compared to WGAN-GP. The gradient penalty acts as a regularizer on the discriminator and prevents it from becoming too sharp.

  17. Discussion: GANs do not converge to local Nash equilibria, and the optimization landscapes of GANs typically have rotational components. Do we need a Nash equilibrium to obtain a generator with good performance in GANs? Methods for handling strong rotational components: extragradient, iterate averaging, and gradient-penalty-based regularization (a sketch of extragradient follows).
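
As a concrete instance of the last point, a minimal sketch (not the paper's code) of the extragradient update, which counters rotation by evaluating the gradient at a lookahead point:

```python
# Minimal sketch of one extragradient step on a joint parameter vector.
import numpy as np

def extragradient_step(w, v, lr=0.1):
    """w: joint (generator, discriminator) parameters; v: game vector field."""
    w_look = w - lr * v(w)      # extrapolation (lookahead) step
    return w - lr * v(w_look)   # update from the lookahead gradient

# On the bilinear Dirac GAN field, plain simultaneous GDA spirals outward,
# while extragradient contracts toward the equilibrium at (0, 0).
v = lambda w: np.array([w[1], -w[0]])
w = np.array([1.0, 0.0])
for _ in range(200):
    w = extragradient_step(w, v)
print(np.linalg.norm(w))  # well below 1
```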
