There’s a nice night sky photo over on /r/Photography, a large part of which was due to using the dehaze slider in Lightroom to remove light pollution. This got me wondering how I could get a similar effect with open source raw editors, especially since I live in a place with a lot of light pollution.
Does anyone have any suggestions for achieving a similar effect to the Dehaze slider in darktable, RawTherapee, etc.?
If you provide a raw file along with two JPEGs: one neutrally processed, and another differing only by the use of this “dehaze” filter, then we’ll be able to answer you.
The “Dehaze set to +74” shot in that link is obviously not telling us the whole truth, unless the dehaze filter also removes phone cables and phone cable reflections…
I think it’s interesting to see the haloing that seems to occur during the “dehaze” application. I actually prefer a more aggressive curve on the second image to the “dehazed” version.
Here is the difference of the basic toned image to the dehaze equivalent. You can clearly see the haloing from the algorithm on the edge of the house/tree and some inverted haloing (darker) along the top and upper corners.
I’ll have a play a bit (or defer to @houz or @Morgan_Hardwood or @David_Tschumperle ) and see if there’s something along the same lines we can possibly get…
Unfortunately I don’t have Lightroom (if I did, I probably wouldn’t be asking this question).
I did some Googling and it seems like a few people have taken stabs at this before in darktable. I need to read their solutions in more depth when I’m not at work… That thread includes this raw file: https://plus.google.com/+DavidLaCivita/posts/guT6gwyi3Vk which is pretty hazy. Perhaps someone with Lightroom can make some before-and-after dehazing JPEGs?
Also, I don’t know if it helps, but this research paper apparently describes the method. Also, this guy has some Lightroom presets that are supposed to give the same effect as Adobe’s slider, so I assume it’s possible to develop something reasonably similar in darktable or RawTherapee.
If LR uses what that PDF describes (a tool that is very sophisticated under the hood), then something identical is currently not possible in RawTherapee (see issue 2853), though you could emulate the result to some degree using RT’s Tone Mapping tool (which uses edge-preserving decomposition) and the new Wavelet tool.
My suspicion when I saw the results of the tool, given the haloing that it sometimes causes, is that it employs a nonlocal-means (nonlocal-mins?) like method to determine the darkest thing in the neighborhood (a very large neighborhood, perhaps 1/5 of the picture height), assumes that that should be black, and then subtracts the newly generated value from that pixel.
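For what it’s worth, that guess can be sketched in a few lines. This is just the generic “dark channel” idea with made-up parameters (window size, strength); it is not Adobe’s actual algorithm, and it will halo exactly the way described above.

```python
# Rough sketch of the guessed approach: treat the darkest value in a large
# neighbourhood as the local haze level, assume it should be black, and
# subtract it out. Window size and strength are placeholders, not Adobe's.
import numpy as np
from scipy.ndimage import minimum_filter

def naive_dehaze(img, window=101, strength=0.8):
    """img: float array in [0, 1], shape (H, W, 3)."""
    # Dark channel: per-pixel minimum over colour channels, then the
    # minimum over a big spatial window.
    dark = minimum_filter(img.min(axis=2), size=window)
    # Haze model with white airlight: I = J*t + (1-t), t = 1 - strength*dark.
    t = 1.0 - strength * dark[..., None]
    return np.clip((img - (1.0 - t)) / np.maximum(t, 1e-6), 0.0, 1.0)

# A flat grey "hazy" frame gets pulled down toward black:
hazy = np.full((64, 64, 3), 0.5, dtype=np.float32)
dehazed = naive_dehaze(hazy)
```

The hard minimum over a big box is exactly why a black eave or tree trunk nearby convinces the algorithm there is no haze in that whole region.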
In the above starfield image, it’s thrown off by the black foreground eaves and the tree, which make the nearby starfield think “oh, there’s something close by that’s basically black, so there’s no haze here”.
That paper looks like it uses something a little more sophisticated called soft matting to prevent the edge effects, but I’m not clear on how it does that (it might be described in another paper?).
Honestly, dehazing is likely the wrong tool for this job. Dehazing algorithms generally need to deal with varying levels of haze caused by variations in depth. Now, while the variations in distance between different stars and the Milky Way are gigantic, they are pretty much irrelevant, because most of that distance is empty space.
If you want to take Milky Way shots, IMO stacking is your biggest friend. This will give you a relatively clean image to work with, which can then be white-balanced, pushed using curves and some form of local contrast enhancement, denoised, and of course given a saturation boost.
This is the result I got doing pretty much what was described above, using a fairly cheap APS-C camera (a6000) and a non-exotic lens (24/1.8).
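To illustrate why stacking helps so much: averaging N registered frames knocks random noise down by roughly sqrt(N). A toy sketch with synthetic frames (real stackers also align the frames first; that step is skipped here):

```python
# Averaging 16 noisy copies of the same scene should cut the noise to
# roughly a quarter of a single frame's noise (1/sqrt(16)).
import numpy as np

rng = np.random.default_rng(0)
truth = np.full((32, 32), 0.2)                   # faint "sky" signal
frames = [truth + rng.normal(0.0, 0.05, truth.shape) for _ in range(16)]

stacked = np.mean(frames, axis=0)

single_noise = float(np.std(frames[0] - truth))  # about 0.05
stacked_noise = float(np.std(stacked - truth))   # about 0.05 / 4
```

That cleaner signal is what gives curves and saturation pushes room to work without amplifying grain.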
Now back on the actual dehazing thing. It’s something I wanted to do for quite some time. I take a lot of shots when paragliding or generally in the mountains and haze is pretty much always an issue. I currently solve it using either Lab curves or local white balance for the color part and curves + some form of local contrast for the luminance part of the equation, trying to reconstruct depth using painted and parametric masks. But that’s dumb manual labor.
What I want to try when I find some time is basically to use the local contrast (variance) as an estimate of depth/haze, filter it with some edge-preserving filter (a guided filter or bilateral blur, or something similar), and then use that together with an auto-detected (or user-provided) haze color to subtract the haze.
I lack the time and some familiarity with the tools to rapidly prototype the idea, so if someone else wants to give it a shot, be my guest.
However, in the darktable version I used a parametric mask, in which I attempted to capture the haze by varying the L and b channels. The foreground is relatively unchanged in that version (at least when compared to the original rather than the dehazed versions). I also managed to recover some colour using the colour correction module; the result looks somewhat similar to the dehaze +50 setting:
It merely moderates the overall brightness of the hazy area, lowering it mildly, while increasing local contrast, rather than explicitly subtracting out brightness.
It doesn’t really remove the haze at all, but it should be more like what you actually perceived when looking at the scene.
(settings: drama of 89, and adjusted blackpoint and whitepoint to taste)
Yeah, the color shift is the biggest problem; otherwise darktable’s equalizer is already fairly good. If it allowed chroma control on the residual, it would work much better, I think.
I gave it a quick whack in GIMP with the Wavelet Decompose plugin and a simple mask based on similarity to the haze color.
(Left is the raw as I exported it from DT; right is dehazed in GIMP with some simple masking at the sky and around the trees, mostly because there was some ugly haloing caused by the non-edge-avoiding wavelets the plugin presumably uses.)
IMO the Lightroom results still look the best, and that with a simple slider.
It would be really interesting to try to do something a bit more advanced with darktable’s edge-optimized wavelets. But I guess it would be sufficiently painful to extract that code into something usable outside of darktable.
Gave it another whack, trying to fake what would happen if you could remove the residual in darktable’s equalizer:
I still would not use dehazing on the night sky. Just mask out the night sky and apply curves + whitebalance and you are mostly there, without any of the ugly artifacts introduced by more local techniques.
Another way is to create a really blurry version of the image, so you basically only have the haze, then subtract that. Subtracting the low-pass component essentially acts as a high-pass filter. You will need to readjust the brightness using some curves, but this works too.
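The blur-and-subtract trick is only a few lines. The blur radius and subtraction amount here are arbitrary, to taste, and the min/max rescale is a crude stand-in for the curves readjustment:

```python
# Blur heavily so only the smooth "haze" component remains, subtract a
# fraction of it (a high-pass operation), then rescale the tonal range.
import numpy as np
from scipy.ndimage import gaussian_filter

def subtract_haze(img, sigma=30, amount=0.7):
    """img: float 2-D array in [0, 1]."""
    blur = gaussian_filter(img, sigma=sigma)   # the estimated "haze"
    out = img - amount * blur                  # detail minus smooth component
    out -= out.min()                           # crude stand-in for the
    out /= out.max()                           # curves readjustment
    return out

# A smooth gradient (the "haze") is flattened; the fine detail survives:
x = np.linspace(0.2, 0.8, 128)
img = np.tile(x, (128, 1))
img[::8, :] += 0.1                             # some fine "stars"
out = subtract_haze(np.clip(img, 0.0, 1.0))
```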
Getting rid of the haze/light pollution isn’t so much the problem; it’s getting enough signal that’s hard. And this is where stacking can help a lot, as it gives you much better data to work with.
@Michael_Moreau a tip for the next night sky shot: leave the shutter open for a shorter duration; you have massive star trailing in the image.