Something WRONG with darktable 4.1.0~git272.5a1a1845-1 clipped highlights

As shown in that rawspeed link, exiv2 can read unknown tag locations without a problem (add the -u option); you just need to know the tag numbers (from e.g. exiftool, indeed)…

I don’t.

You know this: xkcd: Dependency?
rawspeed depends on the spare time the main developer can spend on it …


ahah! :slight_smile:

I just see a few PRs waiting on rawspeed, but no commits or other activity in rawspeed since 04 Jun.

Thanks Pehar, I understand now! I think I must have done something wrong in loading the .xmp as it looks different too. Always learning :face_with_hand_over_mouth:

The xmp uploaded by @Rajkhand does not show the artefacts. To make them visible you have to open color balance rgb and set perceptual brilliance grading → global brilliance to something around 35% or higher.

Hmmm. :thinking: I’m still not seeing it. I’ve just re-downloaded the xmp, applied it, then set the global brilliance to around 90% - nothing! Except what you’d expect the brilliance slider adjustment to do. Maybe my version (4.0) is doing something differently?
A thought: I’m not using OpenCL…

I played a bit more with @Rajkhand 's image, starting from scratch. As this was just “playing”, I may not have noted all the relevant settings with proper scientific rigor. All this was done with dt 4.0 and OpenCL active (Nvidia GTX 1060).

The “blooming” effect appears to be rather sensitive to the exact settings of the brilliance sliders: changing the setting of “highlights” by 1 or 2% suppresses it for any particular spot, but can make blooming appear in other spots. Setting ‘structure ↔ texture’ to 100% texture completely suppressed the blooming effect. Not sure if it’s relevant, but the setting of the filmic reconstruct threshold didn’t seem to have any influence whatsoever.

And I needed some extra exposure (+1EV) and a fairly low white relative value in filmic (~3.5EV). But even without the blooming, there were black artifacts in a number of the highlight spots.

Also, I could reproduce the blooming with a few percent of extra highlight brilliance and +7.7EV exposure…

With all that said, could someone explain what I’m supposed to see that needs such an extreme value of highlights brilliance grading in a basically dark image? What I see is an increase in brightness over the whole image (the whole histogram seems scaled to higher values), which completely destroys my careful setting of middle gray in the exposure module, and white and black references in filmic…

Glad it’s not just me! Actually, at one point in my own playing around I saw a small ‘bloom’ in a different spot.
And for what it’s worth, I never use the brilliance sliders. But that’s most likely just because I don’t know what to use them for!

Makernotes are not (always) just a tag to read. Sometimes they are a tag with binary data inside of it, which needs to be decoded to get values out of it. So it depends on the camera make and model.
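To illustrate the difference: a plain EXIF tag decodes to a value directly, while a makernote payload can be an opaque binary blob that only makes sense once you know the vendor's layout. Here is a minimal Python sketch with a purely hypothetical layout (white-balance gains stored as three big-endian 16-bit integers scaled by 1024) — not any real camera's format:

```python
import struct

# Hypothetical makernote payload: 3 big-endian uint16 WB gains, scaled by 1024.
# (Invented layout for illustration; real vendors each use their own scheme.)
payload = struct.pack(">3H", 2048, 1024, 1536)

def decode_wb_gains(blob):
    """Decode the assumed layout: three big-endian uint16, divided by 1024."""
    r, g, b = struct.unpack(">3H", blob)
    return r / 1024, g / 1024, b / 1024

print(decode_wb_gains(payload))  # (2.0, 1.0, 1.5)
```

Without knowing that layout, the raw bytes of the tag are just six meaningless bytes — which is why simply reading the tag (as exiv2 -u can) is not the same as decoding it.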

That doesn’t mean reading certain unknown tags couldn’t be used for a quick fix, of course, but that depends on what the plan is for rawspeed.


Check AP’s latest video explaining LPHLR. It’s long, but in the last 10 minutes he reproduces this while editing the image, and comments that it can happen in the gamut mapping when certain invalid values are generated…
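A toy sketch of why invalid values matter (this is not darktable's actual code, just an assumption-laden illustration): if a gamut operation leaves a channel negative, a later fractional-power tone lift (a "brilliance"-like operation) has no real-valued result there, and the NaN that falls out can render as a blob-like artifact:

```python
import math

def brilliance_lift(c, power=0.8):
    """Naive tone lift: raise a channel to a fractional power.
    Only defined for c >= 0; a negative (out-of-gamut) channel
    has no real result, which is one way NaN artifacts appear."""
    try:
        return math.pow(c, power)
    except ValueError:  # math.pow rejects negative base with fractional exponent
        return float("nan")

pixel = [1.2, -0.05, 0.9]  # negative green: an invalid, out-of-gamut value
print([brilliance_lift(c) for c in pixel])  # second channel becomes nan
```

Once a NaN enters the pipeline it propagates through subsequent arithmetic, so a single bad pixel can smear into a visible blob after blurring or diffusion steps.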

As an end user who is not an expert in mathematics or physics, I want my editing software to behave predictably: if I crank up the exposure (call it brightness, brilliance, luma, etc.) to brighten the picture, maybe at some point the whole picture becomes white, but I will say something is wrong when a tiny spot becomes a giant white blob.

I request the developer and experts to analyse the problem and solve it in such a manner that the complexity is hidden from the end user.

Thanks

Can the OP or someone else open a GitHub issue, attaching the examples?
Otherwise nothing will happen, I guess.

That might be where things went off the rails.

Those terms each have a specific meaning within darktable, and the modules (mostly) use the terms in that meaning.
So you should not use brilliance to increase the overall lightness of the image, that’s the job of the exposure module (unless, of course, you have a specific reason to do that). That the “perceptual brilliance” sliders seem to have the same effect does not mean they act the same way.

The sequence described in the manual (exposure, filmic, color balance rgb) uses those modules in that order for a good reason. That doesn’t mean that you should always follow that recipe blindly, but it’s a good starting point.

A bit like using other tools: you can force a screw in the wood with a hammer (most of the time), but the result will not be quite what you expect (and probably wanted)…

I can reproduce this behavior and have written a bug report:

If anyone wants to participate they are welcome to do so.


I totally agree with this. An average user doesn’t know much about color science. I have told my photographer friends to use open-source software, especially darktable, but both of them said it can’t be used as professional software. One reason is learning new software, and they don’t have time for that (darktable is not an easy one); the second was ugly colors that they were unable to fix. If I remember correctly, the problems were with fixing highlights. Making highlights visually “pleasing”, as Lightroom can do with one slider, was too difficult for them, as they had to use multiple modules and know how to use them.

A “professional photographer” is not necessarily a better photographer; it’s just a person who gets paid for his/her pictures.
In most cases a professional photographer needs to develop hundreds or thousands of pictures in the shortest possible time, with decent quality, for purely economic reasons.
For this purpose the “one slider” approach of LR is the best: fast and decent quality. Not excellent.

Instead, DT is for enthusiasts, people who practice photography for pure personal pleasure, who don’t mind studying a bit and taking some time to get beyond “good enough” quality, aiming at excellence.
DT is almost on par with the image-processing techniques used by the cinema industry, which is the only one with the budget needed to hire true image-processing experts.
Photography? No, that is far behind, as there is simply too little money in it.


Aurélien Pierre is going to be happy reading that :stuck_out_tongue: Afaik, he is a professional photographer…

The average professional will not work at the ‘excellence’ level, if only because the market at the price levels associated with excellence is very small. Then again, how many professionals dealing with hundreds or thousands of images use raw? (Starting from JPEGs, a lot of the advantages of dt are already lost.)

The need for retraining and the perceived lack of support are probably more important reasons to stay away from dt. Whether those reasons are valid? No idea.

Another case where the blob appears:
Increase the brilliance to the point where the white blob appears, then decrease it just until the blob disappears. Now open the diffuse or sharpen module, select “fast sharpen”, and try increasing the iterations by one; the blob will appear again. In my case it was at around 3.

Also, in some cases just zooming in will make the blobs appear.

Once again: you are pushing areas which are already clipped in the raw data higher and higher using several modules. Several ways to avoid this have been suggested (masking clipped areas, using curve modules instead of filmic, setting filmic’s parameters to avoid its reconstruction…). And the reason for the observed behaviour (artefacts) has been explained in the GitHub issue mentioned/opened by @s7habo.
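The core problem with boosting clipped areas can be sketched in a few lines (toy numbers, not real sensor data): once the sensor clips a channel at its ceiling, the true ratio between channels is lost, and any later boost just amplifies the broken data:

```python
# Toy model: raw channels clip at 1.0; later exposure boost can't recover them.
true_pixel = [3.0, 1.5, 0.8]                 # what the scene actually was
clipped = [min(c, 1.0) for c in true_pixel]  # what the sensor recorded
boosted = [c * 2 ** 2 for c in clipped]      # +2 EV exposure applied afterwards

print(clipped)  # [1.0, 1.0, 0.8] -- the 3.0 : 1.5 ratio is gone
print(boosted)  # [4.0, 4.0, 3.2] -- a wrong, flat colour, now very bright
```

This is why the suggestions above (mask the clipped areas, or keep filmic from reconstructing them) all amount to the same thing: don't push values the sensor never measured.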

Driving a car with 200 km/h against a wall will cause artefacts on the car. :wink:
