Awesome Open Source


Various GANs with Chainer


Requirements

  • Chainer==1.24.0
  • OpenCV


By default, all models are tested on the CelebA dataset. You can find the training results in the corresponding folders.

Gradient Penalty

Most recent GANs (WGAN-GP, CramerGAN, DRAGAN) include a gradient norm regularization term, which has been shown to stabilize GAN training.
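As a hedged sketch of what this regularizer looks like, the snippet below evaluates the WGAN-GP-style penalty λ·E[(‖∇ₓD(x̂)‖₂ − 1)²] at random interpolates x̂ between real and fake samples. Pure NumPy, not the repo's Chainer code; a linear critic D(x) = w·x is assumed so its input gradient is simply w, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
w = np.array([0.6, 0.8])   # hypothetical linear critic weights; ||w|| = 1
lam = 10.0                 # penalty weight lambda from the WGAN-GP paper

def gradient_penalty(real, fake):
    # Random interpolates x_hat between real and fake samples.
    alpha = rng.uniform(size=(len(real), 1))
    x_hat = alpha * real + (1 - alpha) * fake
    # For the linear critic D(x) = w . x, grad_x D(x_hat) is just w.
    grad = np.broadcast_to(w, x_hat.shape)
    norms = np.linalg.norm(grad, axis=1)
    # Penalize deviation of the gradient norm from 1.
    return lam * np.mean((norms - 1.0) ** 2)

real = rng.standard_normal((8, 2))
fake = rng.standard_normal((8, 2))
print(gradient_penalty(real, fake))  # near 0, since ||w|| = 1
```

With a real (nonlinear) critic the gradient depends on x̂ and must come from the framework's autodiff, which is exactly where the higher-order-derivative issue below comes in.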

The current version of Chainer (1.24.0) does not support higher-order derivatives, so a workaround is to manually implement the backward procedure using auto-differentiable chainer.functions. (Refer to the WGAN-GP code for details.)
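To show the idea behind that workaround, here is a minimal NumPy sketch (not actual Chainer code): the critic's backward pass is written out of ordinary differentiable operations, so a framework could backpropagate through the resulting gradient again when computing the penalty. The two-layer critic D(x) = w₂·relu(W₁x) and all shapes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # first-layer weights (assumed shapes)
w2 = rng.standard_normal(4)        # output-layer weights

def critic(x):
    # Forward pass: D(x) = w2 . relu(W1 x)
    return w2 @ np.maximum(W1 @ x, 0.0)

def grad_x(x):
    # Manual backward pass: dD/dx = W1^T (w2 * relu'(W1 x)).
    # Every op here (matmul, mask, elementwise multiply) is itself a
    # differentiable primitive, so this gradient could be differentiated
    # once more -- which is what the penalty term needs.
    h = W1 @ x
    return W1.T @ (w2 * (h > 0).astype(float))

x = rng.standard_normal(3)
# Sanity-check the manual gradient against central finite differences.
eps = 1e-6
num = np.array([(critic(x + eps * e) - critic(x - eps * e)) / (2 * eps)
                for e in np.eye(3)])
print(np.allclose(grad_x(x), num, atol=1e-4))  # True
```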

  • L.Linear, L.Convolution2D, L.Deconvolution2D, F.leaky_relu, F.relu, F.sigmoid, F.tanh, and L.LayerNormalization are implemented.
  • Some GAN papers suggest using LayerNormalization instead of BatchNormalization in the discriminator when a gradient penalty is used.
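The reason for that suggestion can be sketched numerically: batch normalization computes statistics across the batch, so each sample's output (and hence its input gradient) depends on every other sample, which interferes with a per-sample gradient penalty; layer normalization computes statistics per sample, across features. A pure-NumPy illustration, with illustrative names:

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Statistics over the batch axis: couples samples together.
    mu = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def layer_norm(x, eps=1e-5):
    # Statistics over the feature axis: each sample is independent.
    mu = x.mean(axis=1, keepdims=True)
    var = x.var(axis=1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

x = np.random.default_rng(2).standard_normal((4, 6))
# Perturb every sample except the first; layer norm of sample 0 is unchanged.
x2 = x.copy()
x2[1:] *= 5.0
y1 = layer_norm(x)[0]
y2 = layer_norm(x2)[0]
print(np.allclose(y1, y2))  # True
```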

Special thanks to mattya for the idea and reference codes.


Some DRAGAN results:


