Colorizing with iDeepColor

Interactive Deep Colorization in action.

This is a Low-Quality Unprocessed™ scan of an old Brownie photo (a small 1:1 print) my grandfather took in Guam while he was stationed there during WWII.

To invoke it, I specify the image file so the program doesn't waste time processing the bundled sample image:

    python2 ideepcolor.py --cpu_mode --image_file ~/Downloads/store.jpeg

The program begins by automatically colorizing your image (deep colorization):

As you can see, it is well trained on sky and cloud formations, with a bit of green tinge added to the vegetation.

One goes about clicking at points on the grey image, assigning colors from either a suggested palette or from recently used colors.

Here’s the Qt4 UI hard at work.

A hint network is generated and sent off to the Caffe-based neural network. During installation of iDeepColor, a trained model set about 0.5 GB in size is downloaded. Processing, which happens with each color selection, takes about 10 seconds on a single CPU core of a 4 GHz i5. Build Caffe with OpenMP to speed this up. Building Caffe with CUDA and cuDNN enables Nvidia devices of compute capability 3.0 and higher. I also tried building the OpenCL branch but have yet to test it on an OpenCL GPU.
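For the curious, the project’s DemoInteractiveColorization.ipynb notebook shows what those hints actually are: two sparse ab-channel planes plus a binary mask marking where you clicked. Here is a minimal sketch of one hint and one forward pass, assuming the ColorizeImageCaffe interface used in that notebook; the hint location, ab values, and model paths below are purely illustrative.

    import numpy as np
    from data import colorize_image as CI  # module shipped with ideepcolor

    # Load the Caffe-backed model; per the demo notebook, gpu_id=-1 selects CPU mode.
    model = CI.ColorizeImageCaffe(Xd=256)
    model.prep_net(-1, './models/reference_model/deploy_nodist.prototxt',
                   './models/reference_model/model.caffemodel')
    model.load_image('store.jpeg')  # resized internally to 256x256

    # The user "hints": sparse ab color values plus a mask of where they apply.
    input_ab = np.zeros((2, 256, 256))  # a and b channels of the chosen colors
    mask = np.zeros((1, 256, 256))      # 1 where a hint exists, 0 elsewhere

    # Place one 7x7 sky-blue hint at (row, col) = (40, 128).
    r, c, p = 40, 128, 3
    input_ab[:, r-p:r+p+1, c-p:c+p+1] = np.array([-5, -35])[:, None, None]
    mask[:, r-p:r+p+1, c-p:c+p+1] = 1

    # One forward pass through the network; this is the ~10 second step on a CPU.
    model.net_forward(input_ab, mask)
    result = model.get_img_fullres()  # colorized RGB at the original resolution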

I didn’t do a very careful job, but here is the result anyway:

Moral of the story
Using it is pretty fun; installation, not so much. Sure, a professional colorizer can perform magic with GIMP and the like, but that’s work. It also makes you want one of those zillion-CUDA-core GPUs.

So what do you think of the technology? Make your opinion known. Vote now!

  • Bring on the AI
  • Let me do it by hand


Citation:

Richard Zhang, Jun-Yan Zhu, Phillip Isola, Xinyang Geng, Angela S. Lin, Tianhe Yu, and Alexei A. Efros. “Real-Time User-Guided Image Colorization with Learned Deep Priors.” ACM Transactions on Graphics (TOG) 9(4), ACM, 2017.


It is similar to the “colorize [interactive]” filter of G'MIC.

That’s super cool!

I will grant that the interface is similar. I notice Colorize [Interactive] produces a lot of artifacts on photos, and it is not trained on things like “Don’t paint the tires the same color as the car” and “Don’t paint clouds the same color as the sky”. Deep Colorization is particularly good at automatically colorizing portrait photos. Both methods are far from flawless, and I could only get either of them to work on Linux. iDeepColor works well on Ubuntu and should be possible on macOS; I tried Debian, but there are some errors still to resolve.


Really cool!!!

Btw, if anyone manages to get ideepcolor working on their system, please do share your details, experiences, and results. The poll results suggest there might be some interest.

P.S. Happy New Year!

Now it seems more possible to make an actually convincing colorization of a black-and-white photo. Not quite perfect, but it’s a start. The cars are somewhat missing reflection colors.

“The cars are somewhat missing reflection colors.”

I’d like to see a future iteration of the AI colorizer further trained in specularity enhancement and estimating light source location in 3D.

This is a really interesting project!
I gave it a quick try on Friday evening. I didn’t manage to compile the Caffe dependency on my Fedora notebook; I need to look into it a little further. Maybe I should try it on my desktop anyway, since it has a capable Nvidia GPU, unlike the notebook.

After a long and painful attempt, I succeeded in making it work.

After following all the steps from the project’s GitHub page, it didn’t work for me. I learned (the hard way) that compatibility with Python 3 was not ideal :sweat:.

The make distribute step is not really optional (at least in my experience).

So after switching back to Python 2.7, I had to add a symlink from $HOME/caffe/distribute/python/caffe to $HOME/.local/lib/python2.7/site-packages (I don’t know why the PYTHONPATH method was not working on my computer, Linux Mint 18).
I also had to update my LD_LIBRARY_PATH, and I added a couple of symlinks for some other libraries in some system folder that I didn’t keep track of.

Most of my error messages were solved with a Google search, but I had to search a lot :sweat:.

I was finally able to get ideepcolor running, and it’s really impressive. But after such a long journey to get it running, I would say the process killed most of the fun.

I hope you will not have such a painful experience getting it to run. (I think this one was definitely out of my league.)


@Thomas Thanks so much for the tips. I was able to get ideepcolor to run on python3/debian9 with some modifications to the scripts.
https://github.com/junyanz/interactive-deep-colorization/issues/40

In my case, setting PYTHONPATH was essential, while make distribute, LD_LIBRARY_PATH, and symlinking were not needed.
export PYTHONPATH=~/caffe/build/install/python worked and made its way into my .bashrc

It has been a tribulation, but like all things tribulatory, the learning was invaluable. My family is flabbergasted by the few results I’ve had with some old family b/w photos, which has made the trouble well worth it.


With the included Python notebook DemoGlobalHistogramTransfer.ipynb, I took my T-Max 100 cat photo and a small picture of a tiger sitting on concrete as input to recreate the tabby coloration. I was expecting more of a tiger look but was pleasantly surprised by the resulting tabby.


Looks like some black magic at work. It’s a great time we live in when we can do something like that.

Looks good. Is there any way to add to the training data set to increase reliability?

Also, if I wanted to do frames in a video, could I create a “decisions” object on one frame and apply it to multiple frames?

The training model should be released in the Near Future :registered:.
https://github.com/junyanz/interactive-deep-colorization/issues/28

The deep colorizing engine itself is interfaced via Python and the command line, and there are IPython notebooks which demonstrate the Python interface. You might be able to build a video frame handler around that interface: interactive-deep-colorization/DemoInteractiveColorization.ipynb at master · junyanz/interactive-deep-colorization · GitHub
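To make that concrete, here is a rough sketch of what such a frame handler might look like: one set of hints, saved from a reference frame, replayed over every frame of a shot. It assumes the same ColorizeImageCaffe interface as the notebook; the .npy hint files and frame paths are hypothetical, and since hints are tied to pixel locations this would only hold up while the scene stays put.

    import glob
    import numpy as np
    from data import colorize_image as CI

    model = CI.ColorizeImageCaffe(Xd=256)
    model.prep_net(-1, './models/reference_model/deploy_nodist.prototxt',
                   './models/reference_model/model.caffemodel')

    # The "decisions" object: hint colors plus their mask, saved once from a
    # reference frame (hypothetical file names).
    input_ab = np.load('hints_ab.npy')  # 2x256x256 ab values
    mask = np.load('hints_mask.npy')    # 1x256x256 binary mask

    for frame in sorted(glob.glob('frames/*.png')):
        model.load_image(frame)
        model.net_forward(input_ab, mask)
        out = model.get_img_fullres()
        # save `out` (an RGB array) with your image library of choice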
Histogram Transfer is a different beast, which does not utilize a user-defined hint network, only its modeled training: interactive-deep-colorization/DemoGlobalHistogramTransfer.ipynb at master · junyanz/interactive-deep-colorization · GitHub

More examples of iDeepColor: raw processed in RawTherapee, colorized image retouched using GIMP.
My grandfather took these of the Spirit of 1776 Freedom Train in Grand Island, Nebraska on May 17, 1948.



iDeepColor would not pick up on colorizing the fence, but that was easily handled with a single click of the magic wand in GIMP.


Installation is still somewhat confusing, but I added my 2 cents on the GitHub page in case it helps:

https://github.com/junyanz/interactive-deep-colorization/issues/10#issuecomment-434748966