How to disable pink on blown highlights in images?

I don’t know why Darktable likes to draw pink everywhere; Affinity and Lightroom don’t do this, and I just find it distracting:

Why do I need to see pink in the previews here? (And being a beginner, despite me posting the question in two different threads earlier, I must admit that I still don’t understand why I need to see this at all… sure it’s interesting to see when you have the exposure/highlights tool open, but in any other case I just want to see what my actual image looks like, not pink!)

You have been prompted at least two times to provide an appropriately licensed raw sample, or at least tell us what camera you’re using. It could be just that the white level is not (or cannot be) detected properly for your camera.


It seems you’re not displaying the embedded JPEGs but the processed raw, where you didn’t do a proper highlight reconstruction.
While other tools might apply some automatic operations, the scene-referred workflow in darktable requires you to decide how to deal with those clipped highlights.


Here you go:

L1082846.DNG (51.3 MB)


Thanks. This is what Leica provide, and what darktable uses:

WhiteLevel                      : 16383

and that does seem to correlate with all 4 CFA channels having a maximum around that value.

Not sure why the default 1.0 clipping threshold of the highlights module doesn’t work OOTB, but reducing it to 0.95 or so resolves the issue (for the default inpaint opposed method). Looking at the data in more depth, while there are indeed some saturated pixels at 16380, there is also another peak for all channels at around 15500 (i.e. approx. 0.95*16380), suggesting that the white level is indeed not quite right for this camera…
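The threshold arithmetic can be illustrated with a toy NumPy sketch. This is synthetic data mimicking the numbers in this thread, not darktable's actual clipping code: the metadata white level is 16383, but the saturated pixels pile up near 15500, so a clip threshold of 1.0 flags nothing while a lower threshold catches the whole saturated peak.

```python
import numpy as np

# Numbers from the thread: the DNG metadata claims a white level of 16383,
# but the raw histogram shows a second (saturation) peak near 15500.
WHITE_LEVEL = 16383
ACTUAL_SATURATION = 15500

# Synthetic raw channel: unclipped values below 15000, plus a pile of
# saturated pixels stuck at the actual saturation point.
rng = np.random.default_rng(0)
raw = np.concatenate([rng.integers(0, 15000, size=10_000),
                      np.full(500, ACTUAL_SATURATION)])

def clipped_fraction(raw, white_level, threshold):
    """Fraction of pixels at or above threshold * white_level."""
    return float(np.mean(raw >= threshold * white_level))
```

With the default threshold of 1.0 none of the saturated pixels reach 16383, so nothing is flagged as clipped; at around 0.94 (about 15400 counts) the entire saturated peak is caught, which matches the observation that lowering the highlights module threshold makes the pink go away.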

[histogram of the raw CFA channel values]


To clarify, this pink is not a feature or function - it’s what you naturally see on images with blown highlights, when they have not been reconstructed properly.

If you have the highlight reconstruction module turned on, but the pink is still showing, you must have one of the few models of camera where the white point is not set correctly in dt, and you have to adjust the threshold manually.

Once you have it right, you can save it as a preset in that module so it’s easy to apply to other images - or you can copy and paste the setting in lighttable to as many images as you want.


And, with most cameras, the thumbnails will show the embedded jpg preview, not a raw file, until you make adjustments in darkroom - you could try checking this setting in preferences:
Setting it to never will force dt to show the embedded preview - if there is one, and until you make any edits.

@kmilos I think you are fully right here. I just had a look at other files I used to check the HLR algorithms, from a TL2 camera, and they all have the white point that high. I don’t know if that is technically correct (sensor saturation and electronics), but for us it’s not optimal.

There are many cameras that set it “a bit too high”, so most of the HLR algorithms use a threshold slightly lower than 1.0, but that is not enough here.

@eobet

  1. There is not a single knob to use.
  2. If you don’t see the “issue” in other apps, there can be many reasons for that. Some are as bad as darktable’s old clip algorithm: straight gray in the highlights, but all possible detail lost.
  3. In darktable we try to find the “best” way, so you will (have to) learn to use what we call styles and presets. In this case you will probably do best to define a preset for the highlights module, specific to your camera’s raw files, with a clip of ~95%. (Maybe there will be suggestions from other users for a preset in white point/raw black. I wouldn’t do so, for many reasons…)

Thank you for the information and thank you @kmilos as well.

I’ll look into how to make the preset and hopefully have it apply automatically to everything I import from that camera. (I’m actually very disappointed in how much secret sauce Adobe and Leica put into their software to correct, in the raw files, the shortcomings of this camera’s hardware… Darktable is the only application I’ve found apart from Lightroom which can deal with the heavy fringing the lens produces, for example… guess that’s the reason it was their cheapest camera.)


I wouldn’t say so. Actually I think the lens correction via the embedded data is very good (I might be able to get this even better).

The problem with the “fringing” - i see what you mean with those cables hanging around. I think the problem here is very few photosites having a dark signal within an all-blown-out sky. Very difficult to get that correctly demosaiced.

Did you try to reduce exposure on such images? I would be surprised if that problem wouldn’t go away. I have seen other test images with the same lens → just great.


Interesting! I’ve mailed both the developers at Skylum and Affinity about the fringing and they said that without Adobe’s collaboration what they can do is very limited (and it shows in their respective applications).

Great to hear. I’ve been spoiled by Lightroom handling everything “automatically”, so I’m just going to have to roll up my sleeves and take my time to figure things out here!

I felt like the DNG from @eobet was missing some information, so I downloaded a sample from DPReview… It has the same issue in the blown highlights, which is completely fixed by setting the raw white point to 15500. I think the value being used by DT is just too high, so the area is not considered “blown”. Many Canons have a white point this low or lower. DT doesn’t even mark these areas as blown until you lower the white point to around 15500, so there certainly won’t be correct HLR… I think a preset for this camera at around 15500 would solve most of this for DT. I don’t think other software reports what it uses, so it’s hard to know what LR or others are doing.
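The “not even marked as blown” point can be sketched in a few lines of NumPy (the patch and function here are made up for illustration, not darktable code): a patch saturated at 15500 counts is invisible to a 16383 white point but fully flagged once the white point is corrected.

```python
import numpy as np

# Toy "blown sky" patch: the sensor saturated at ~15500 counts, although
# the DNG metadata claims a white level of 16383.
patch = np.full((4, 4), 15500)

def blown_mask(raw, white_point):
    """Pixels counted as blown: at or above the white point."""
    return raw >= white_point

# With the metadata white level nothing is flagged, so highlight
# reconstruction never touches this area; with the corrected white
# point the whole patch is flagged and HLR can do its job.
```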

View Leica TL2 sample gallery from DPReview.


How do you generate that nice looking histogram for the raw values?

Affinity uses lensfun and doesn’t contribute back, if we are talking about TCA fringing… So the best they can do is take from the community without contributing back :wink:


Octave/MATLAB. Same can be done in Python.
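For reference, a Python version might look like the sketch below. The histogram helper is plain NumPy; the commented usage relies on rawpy (a real library whose `raw_image` and `raw_colors` arrays expose the raw counts and CFA channel indices), with the file name taken from this thread. This is a sketch of the approach, not the exact script that produced the plot above.

```python
import numpy as np

def cfa_histograms(raw_values, raw_colors, bins=256):
    """Per-CFA-channel histograms of raw sensor counts.

    raw_values: 2-D array of raw counts (e.g. rawpy's raw_image)
    raw_colors: 2-D array of CFA channel indices (e.g. rawpy's raw_colors)
    Returns {channel_index: (counts, bin_edges)}.
    """
    hists = {}
    for ch in np.unique(raw_colors):
        vals = raw_values[raw_colors == ch]
        hists[int(ch)] = np.histogram(vals, bins=bins)
    return hists

# Reading the actual DNG and plotting would look something like this:
#
#   import rawpy, matplotlib.pyplot as plt
#   with rawpy.imread("L1082846.DNG") as raw:
#       hists = cfa_histograms(raw.raw_image, raw.raw_colors)
#   for ch, (counts, edges) in hists.items():
#       plt.plot(edges[:-1], counts, label=f"channel {ch}")
#   plt.legend(); plt.show()
```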

Octave I have used a few times. Thanks.

One “tip” (at least from my perspective) with images with bright/blown areas would be to disable filmic rgb and try sigmoid instead - it has a different way of handling bright areas which I often prefer. It can actually make things look more blown out, but in combination with tone equalizer works very well - again, IMHO :slightly_smiling_face:

This is aside from the highlight reconstruction issue, but sometimes sigmoid will actually cover up the pink anyway… not sure if this is useful as it’s only hiding the issue…

Edit: I just tried on the image you uploaded (thanks) and sigmoid doesn’t cover up the pink in this case. But as others have said, adjusting the HLR threshold does remove it properly. I did a quick-ish edit on the image just in case you find it interesting - you can apply the .xmp via lighttable (in overwrite mode) if you want to see what modules I used and so on.

L1082846.DNG.xmp (13.0 KB)


Thanks for sharing this picture and the challenges it presents. You have received very good and comprehensive answers here. I downloaded the image and can confirm that in lighttable view the embedded JPG doesn’t have the purple. I feel the best solution offered for your camera is creating a preset of 15500 for the white level and seeing how that goes for the majority of your images.

I also compared the look created by filmic V6, Sigmoid and then filmic V5. I feel sigmoid or filmic V5 does a much nicer job of transitioning the highlights especially on the sign in the foreground. Filmic V6 created artefacts or at least obvious unpleasant transitions in the highlights.

I looked at the chromatic aberration/fringing problems. I applied raw chromatic aberrations and the chromatic aberrations module, which both really helped. But then I also applied the deprecated ‘defringe’ module and that really did wonders on the wires in the sky. @hannoschwalm since you are one of the developers, I would like to suggest this image shows for me why the defringe module should not be deprecated. There are times when it works the right amount of magic to defringe an image when the ‘better’ modules have not succeeded.


A preset for the white point would require that the white level doesn’t depend on ISO.

Thanks for that information. As a humble user I didn’t realise this. Please take a look at how the defringe module works really well on the wires of this image. I really don’t want to see the defringe module removed from darktable, as for some images it works so well, and this image shows that. I also found this image very interesting for comparing filmic V5 and V6 plus sigmoid. I love that we have choices in DT, and this image shows the value of those choices.