"Differentiable Programming for Image Processing and Deep Learning in Halide"

NOT MY WORK!

So I came across this paper that extends the Halide programming language with “general reverse-mode automatic differentiation (AD), and the ability to automatically optimize the implementation of gradient computations. This enables automatic computation of the gradients of arbitrary Halide programs, at high performance, with little programmer effort.”
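For anyone else who hasn't seen reverse-mode AD before: the idea is that the compiler records the operations your program performs and then walks them backwards, applying the chain rule, to get derivatives of the output with respect to every input. Here's a tiny plain-Python sketch of that idea (my own illustration, nothing to do with Halide's actual API; a real implementation would topologically sort the graph instead of recursing, but this works for simple expression trees):

```python
class Var:
    """A value that records how it was computed, so we can differentiate."""

    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent Var, local derivative)
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    __radd__ = __add__
    __rmul__ = __mul__

    def backward(self, seed=1.0):
        # Accumulate d(output)/d(self), then push it to the inputs
        # via the chain rule.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)


# f(x) = x*x + 3*x  =>  df/dx = 2x + 3, which is 7 at x = 2
x = Var(2.0)
y = x * x + 3 * x
y.backward()
print(x.grad)  # 7.0
```

The paper's contribution (as I read it) isn't this mechanism itself, which is standard, but doing it for whole Halide pipelines and then automatically scheduling the resulting gradient code so it runs fast.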

So uh… something technical about automatically computing and optimizing gradients? It looks really cool, but I honestly don’t quite follow:

http://gradient.halide.ai/

Point is, I do understand the part where they sped up their image filters by 10x-100x over handwritten CUDA implementations, while writing more expressive and concise code.

I figured there might be people here who would be able to understand the material described, and maybe even get excited by it. So that’s why I’m posting it.

Also, that enhanced AHD demosaicing filter on page 9 is interesting; I wonder whether other algorithms could be improved the same way?