
Automatic Line Art Colorization


This repository explores automatic line art colorization with deep learning. In addition to colorizing from line art alone, it aims to colorize with several types of hints. There are three main types of hints:

  • Atari: Colorization with hints that include some lines in desired color (ex. PaintsChainer)
  • Tag: Colorization with tags (ex. Tag2Pix)
  • Reference: Colorization with reference images (ex. style2paints V1)

Line extraction method

There are many kinds of line extraction methods, such as XDoG or SketchKeras. If we train the model on only one type of line art, the trained model overfits and cannot colorize other types of line art properly. Therefore, like Tag2Pix, various kinds of line art are used as input to the neural network.

I mainly use three types of line art:

  • XDoG

    • Line extraction using the difference of two Gaussian filters with different standard deviations
  • SketchKeras

    • Line extraction using a UNet. Lines obtained by SketchKeras look like pencil drawings.
  • Sketch Simplification

    • Line extraction using a fully convolutional network. Lines obtained by Sketch Simplification look like digital drawings.

Examples produced by these line extraction methods are shown below.
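For illustration, the difference-of-Gaussians idea behind XDoG can be sketched in a few lines. The function and its parameter defaults below are illustrative, not taken from this repository:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def xdog(gray, sigma=0.5, k=1.6, gamma=0.98, eps=-0.1, phi=200.0):
    """Sketch of XDoG-style line extraction: subtract two Gaussian blurs
    whose standard deviations differ by a factor k, then soft-threshold.
    `gray` is a float image in [0, 1]; returns a uint8 line-art image."""
    g1 = gaussian_filter(gray, sigma)
    g2 = gaussian_filter(gray, sigma * k)
    d = g1 - gamma * g2
    # soft threshold: white where the response is above eps,
    # a tanh ramp toward black elsewhere
    out = np.where(d >= eps, 1.0, 1.0 + np.tanh(phi * (d - eps)))
    return (out * 255).clip(0, 255).astype(np.uint8)
```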

Moreover, I apply two types of data augmentation to the line arts in order to avoid overfitting:

  • Random morphological transformations, to handle various line thicknesses
  • Random RGB values for the lines, to handle various line intensities
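A minimal sketch of these two augmentations, assuming a grayscale line art on a white background. The function name and parameter ranges are illustrative, not the repository's actual code:

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def augment_line_art(line, rng=np.random):
    """Sketch of the two augmentations: random morphology (varies line
    thickness) and a random intensity shift (varies line darkness).
    `line` is an HxW uint8 image, dark lines on a white background."""
    if rng.rand() < 0.5:
        # grey erosion takes local minima, so dark lines grow thicker
        line = grey_erosion(line, size=(2, 2))
    else:
        # grey dilation takes local maxima, so dark lines become thinner
        line = grey_dilation(line, size=(2, 2))
    # lift near-black line pixels toward gray by a random amount
    shift = rng.randint(0, 64)
    line = np.clip(line.astype(np.int32) + shift * (line < 128), 0, 255)
    return line.astype(np.uint8)
```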

Experiments without hints


First of all, I needed to confirm that neural-network-based methods can colorize precisely and diversely without any hints. Learning a mapping from line art to color images is difficult because of the variation in plausible colors. Therefore, I hypothesized that neural networks trained without hints would collapse to a single color in every region. In addition to a content loss, I tried an adversarial loss, because it enables neural networks to match the data distribution adequately.
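The combined objective can be sketched as a content (L1) term plus an adversarial term, in the style of pix2pix. This is a schematic NumPy version, not the training code used here, and the weighting `lam` is an illustrative default:

```python
import numpy as np

def generator_loss(fake_logits, fake_img, real_img, lam=100.0):
    """Sketch of a pix2pix-style generator objective: an adversarial term
    (the generator wants the discriminator to score its output as real)
    plus an L1 content term against the ground-truth color image.
    `fake_logits` are discriminator outputs on generated images."""
    # non-saturating adversarial loss: -log(sigmoid(D(G(x)))),
    # written as softplus(-logits) for numerical stability
    adv = np.mean(np.log1p(np.exp(-fake_logits)))
    # content loss: L1 distance between generated and real image
    content = np.mean(np.abs(fake_img - real_img))
    return adv + lam * content
```

With `fake_logits = 0` (discriminator undecided) and a perfect reconstruction, the loss reduces to the adversarial term `log(2)`.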


  • [x] pix2pix
  • [x] pix2pixHD
  • [x] bicyclegan


| Method | Result |
| --- | --- |
| pix2pix & pix2pixHD | (result image) |

Experiment with atari


Considering application systems for colorization, we need to colorize with user-designated colors. Therefore, I tried several methods that take this kind of hint, called atari, as input to the neural network.
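One common way to feed an atari hint to a network is channel-wise concatenation with the line art. The sketch below uses assumed shapes and names, not necessarily this repository's exact input format:

```python
import numpy as np

def make_atari_input(line, hint_rgb, hint_mask):
    """Sketch: build the network input by stacking the line art with the
    user's color strokes and a mask marking where strokes exist.
    line: HxW grayscale, hint_rgb: HxWx3 strokes, hint_mask: HxW in {0, 1}."""
    x = np.concatenate([
        line[..., None],                   # 1 channel: line art
        hint_rgb * hint_mask[..., None],   # 3 channels: color strokes
        hint_mask[..., None],              # 1 channel: stroke locations
    ], axis=-1)
    return x  # HxWx5 array fed to the colorization network
```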


  • [x] userhint
  • [ ] userhint v2
  • [x] whitebox
  • [x] spade


| Method | Result |
| --- | --- |
| userhint | (result image) |

Experiment with reference


I also consider taking a reference image as a hint to the neural network. At first, I tried to implement style2paints V1. However, I had difficulty reproducing the results because training collapsed. Therefore, I decided to seek substitutes for style2paints V1.


  • [x] adain
  • [x] scft
  • [x] video
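Of these, AdaIN transfers the reference image's feature statistics onto the content features. A minimal NumPy sketch of the operation, assuming feature maps of shape `(C, H, W)`:

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Sketch of Adaptive Instance Normalization: normalize the content
    features per channel, then re-scale and re-shift them with the
    channel-wise mean and std of the reference (style) features."""
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True) + eps
    return s_std * (content - c_mean) / c_std + s_mean
```

After this operation, each output channel has (approximately) the same mean and standard deviation as the corresponding channel of the reference features.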


| Method | Result |
| --- | --- |
| adain | (result image) |

