Colorspace clarification....

The real point is that the offending blue-to-purple shift isn't there in the Lab-tonemapped image (but it is in darktable).


Please don’t quote me as if I said that “the entire direction that darktable is taking is wrong”. I’m far from convinced that this is true - I was paraphrasing you.


Sorry for my outburst. The new darktable philosophy, "everything should be done in linear RGB or with the RGB-ratio-preserving method, and everything else sucks", is something I really don't understand.
There are definitely benefits sometimes, and disadvantages at other times.

These are still very general statements you're making, @age, but we are asking for some reference material to back them up. "It's obvious" or "go find it yourself" aren't acceptable replies here.

If you don’t have the time to find supporting evidence, just say so. We are all busy. But continually doubling down on vague statements does not advance the discourse.


It was my impression that as long as you want to perform a digital version of a physical operation (e.g. blending), you must obey the physics. Since physics works in ‘linear space’, your operation must be performed in ‘linear space’ (there may be more accurate terminology here…). If you’re happy to throw physics out of the window for something that feels more intuitive or gets you to eye-pleasing results faster, then this is not wrong, it’s just different.
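A minimal sketch of what I mean, in Python with numpy (the helper names are mine, not anything from darktable): averaging two sRGB pixels directly on the encoded values gives a darker result than averaging them in linear light, and only the latter matches what physically superimposing the two lights would produce.

```python
import numpy as np

def srgb_to_linear(c):
    """Invert the sRGB transfer function (IEC 61966-2-1)."""
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(c):
    """Apply the sRGB transfer function."""
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

red = np.array([1.0, 0.0, 0.0])
green = np.array([0.0, 1.0, 0.0])

# Naive 50/50 blend of the encoded values: a muddy dark olive.
naive = 0.5 * red + 0.5 * green

# Physical blend: decode, average the light, re-encode. Noticeably brighter,
# as two superimposed lights would actually look.
physical = linear_to_srgb(0.5 * srgb_to_linear(red) + 0.5 * srgb_to_linear(green))

print(naive)     # [0.5 0.5 0. ]
print(physical)  # approx. [0.735 0.735 0.   ]
```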

Edit: oh and I don’t think I need to remind anyone on the forum that color science can be a tricky subject. There’s a lot of research going on to improve processing methods, accurate color rendition under all sorts of circumstances, etc. It seems a little foolhardy to stay with the CIELAB 1976 recommendation when, e.g. we have a pretty well-established CIECAM16 (2016) color appearance model to work with as well.


I should clarify that @anon41087856’s original article on the subject (here – disclaimer, I helped edit the English translation of this article) was pretty convincing to me and judging by my own results, every edit I have done using the scene-referred modules has been a vast improvement on similar edits done using the old Lab/display-referred modules. Either scene-referred is better, or @anon41087856’s modules are better despite being scene-referred. (I will of course admit there’s a possibility I’ve just improved my process over the years)

I need a pretty strong demonstration if I’m to believe that all of this was a load of rubbish, and I’ve not seen anything like such a demonstration.


So you are proposing to perform USM/local contrast in the CIECAM16 color space? I would say that nobody will see any difference compared to CIELAB, and as such CIELAB could be used, because it is faster.

Close it, I don’t care. Still, USM and local contrast in darktable could be used with the scene-referred workflow, because they work in the range 0–inf.
This thread was started from that assertion.

Sorry, where is your portfolio again?

Here is mine: Portfolio - Aurélien PIERRE, Photographe. 100% practice since 2011. Theory came later, to understand why I couldn’t get the advertised results out of the DR beast that was the Nikon D810 at the time.

  1. Blur comes from lenses,
  2. Lenses process light,
  3. Linear RGB is proportional to light emission.

How does perceptual fit into that? Light doesn’t have a gamma or a cube root, I’m sorry.
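To make the lens argument concrete, here is a sketch (Python with numpy/scipy; the function names are mine): the same Gaussian blur applied to gamma-encoded codes versus linear light. Only the linear version behaves like an out-of-focus lens, because the convolution then averages light, not code values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# sRGB transfer pair (IEC 61966-2-1), as in the earlier sketch.
def srgb_to_linear(c):
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(c):
    return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

def lens_like_blur(srgb_img, sigma=3.0):
    """Blur the way a lens would: decode to linear light, convolve, re-encode.

    Expects an (H, W, 3) float array of sRGB values in [0, 1]. Blurring the
    encoded values instead systematically darkens edge transitions, because
    the convolution then averages gamma-encoded codes rather than photons.
    """
    linear = srgb_to_linear(srgb_img)
    blurred = gaussian_filter(linear, sigma=(sigma, sigma, 0))
    return linear_to_srgb(blurred)
```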

Proprietary LUTs are tuned for FPS, not for color consistency.

???

You know, a shitload of papers actually come from an era where RGB had to be stored as 8-bit integers, which leaves little choice but to use gamma-encoded RGB. That says strictly nothing about linear RGB.
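A quick numerical illustration of that constraint (Python/numpy; the 1% shadow cutoff is just a round number I picked): quantising scene-linear values straight to 8 bits leaves almost no codes for the shadows, which is why those papers had to work on gamma-encoded RGB in the first place.

```python
import numpy as np

# With only 256 codes, quantising scene-linear light starves the shadows,
# while a perceptual-ish 1/2.2 gamma spreads the codes far more evenly
# over perceived lightness. Hence 8-bit-era papers worked on gamma RGB.
linear = np.linspace(0.0, 1.0, 100001)
dark = linear <= 0.01  # the darkest 1% of scene-linear light

codes_linear = np.unique(np.round(linear[dark] * 255))
codes_gamma = np.unique(np.round(linear[dark] ** (1 / 2.2) * 255))
print(len(codes_linear), len(codes_gamma))  # roughly 4 vs. 32 distinct codes
```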

And again… back to the lens model and blur formation.

So perhaps the USM is the problem here? Care to try diffuse or sharpen?

Because RGB ratios fall back to exposure changes, which preserve hue and chroma. Independent per-channel RGB, by contrast, is a clusterfuck of unpredictable color shifts, all the more so if the RGB is not linear.
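Here is a sketch of the difference (Python/numpy; the Reinhard-style curve and the max norm are stand-ins, darktable offers several norms): applying a tone curve per channel changes the R:G:B ratios, while deriving a single scale factor from a norm and applying it to all three channels is, per pixel, exactly an exposure change.

```python
import numpy as np

def tone_curve(x):
    """Stand-in tone curve (Reinhard-style); any compressive curve will do."""
    return x / (1.0 + x)

pixel = np.array([2.0, 0.5, 0.1])  # scene-linear RGB, a warm highlight

# Per-channel: each channel gets a different compression factor,
# so the R:G:B ratios (and therefore hue and chroma) drift.
per_channel = tone_curve(pixel)

# Ratio-preserving: derive one factor from a norm (max here) and scale
# all channels by it: an exposure change for this pixel.
norm = pixel.max()
ratio_preserving = pixel * tone_curve(norm) / norm

print(per_channel / per_channel.max())            # [1.    0.5   0.136] -> shifted
print(ratio_preserving / ratio_preserving.max())  # [1.    0.25  0.05 ] -> intact
```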


If you don’t want to participate in this thread then don’t. It’s that easy. Nobody said anything about closing it; we are just trying to understand your point. I personally don’t understand what you’re getting at, and it seems quite a few others don’t either. We are trying to gain clarity.

Great, but the truth is that it doesn’t preserve hue, and after filmic the user has to restore the saturation with only six saturation sliders and a mask (vibrance, global chrominance, shadow saturation, midtone saturation, highlights saturation and highlights brilliance); without this long step, every image looks like a post-apocalypse movie.
The new sharpen module just adds two minutes to every rendering and 90% of the time isn’t better than the old unsharp mask.
Good work.

Let us take a break from the discussion and refrain from ‘you’ and ‘I’ statements. It is okay to disagree, but using disparaging words or expressions is not helpful.


I like some of the subjects, full stop.
Image editing is not for you.

No, my remark about CIECAM16 was merely to point out that newer systems may account for more diverse color phenomena better than older ones. That is something to consider when talking about editing in a certain color space / model. In particular, if I understand correctly, CAM16-UCS, JzAzBz and Oklab were designed to be much more uniform along certain axes (e.g. keeping hue steady while saturation or lightness changes) than CIELAB was. If that’s true, and there are no major downsides to those newer models compared with the old one, I don’t see any reason why you wouldn’t prefer using them.
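For the curious, here is a sketch of the Oklab forward transform in Python/numpy, with the coefficients from Björn Ottosson’s published Oklab write-up (the function name is mine; assumes linear sRGB input):

```python
import numpy as np

# Forward Oklab transform, coefficients from Björn Ottosson's published
# Oklab write-up. Input is linear sRGB in [0, 1].
M1 = np.array([[0.4122214708, 0.5363325363, 0.0514459929],
               [0.2119034982, 0.6806995451, 0.1073969566],
               [0.0883024619, 0.2817188376, 0.6299787005]])
M2 = np.array([[0.2104542553,  0.7936177850, -0.0040720468],
               [1.9779984951, -2.4285922050,  0.4505937099],
               [0.0259040371,  0.7827717662, -0.8086757660]])

def linear_srgb_to_oklab(rgb):
    lms = M1 @ np.asarray(rgb, dtype=np.float64)
    return M2 @ np.cbrt(lms)  # -> (L, a, b)

# Hue is just atan2(b, a); the design goal is that changing chroma or
# lightness keeps that hue angle steadier than CIELAB does.
L, a, b = linear_srgb_to_oklab([0.0, 0.0, 1.0])  # pure blue
print(L, np.degrees(np.arctan2(b, a)))
```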

As for performing USM, I have never investigated thoroughly how it works. Yet from a quick inspection of Wikipedia I see it is actually a physical darkroom technique. In that case, I would presume you would need to “obey the physics” in the digital darkroom as well, and doing that in anything other than linear light seems wrong (note: seems, not is, because I simply don’t know).
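For completeness, here is roughly what a “physics-obeying” digital USM could look like (Python with numpy/scipy; parameter values and names are my own invention, not darktable’s implementation): decode to linear light, subtract a blurred (“unsharp”) copy to form the mask, add it back scaled, re-encode.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# sRGB transfer pair (IEC 61966-2-1), as in the earlier sketches.
def srgb_to_linear(c):
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(c):
    return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

def unsharp_mask_linear(srgb_img, sigma=2.0, amount=0.5):
    """USM the darkroom way, on an (H, W, 3) sRGB float image in [0, 1]:
    decode to linear light, subtract a blurred ("unsharp") copy to get the
    mask, add the mask back scaled by `amount`, and re-encode."""
    linear = srgb_to_linear(srgb_img)
    mask = linear - gaussian_filter(linear, sigma=(sigma, sigma, 0))
    sharpened = np.clip(linear + amount * mask, 0.0, 1.0)
    return linear_to_srgb(sharpened)
```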


I did not mean to drag everyone into an adversarial debate. In my initial post I was honestly interested in @age’s comments about things being superior… it was not a challenge but an inquiry.

I think there is room to differ in views on approaches when focusing on speed, practicality, simplicity, implementation, and targeted use cases for a technical application or topic.

Having said that, there is a need to generate examples that support or refute any such assertions of inferiority or superiority, demonstrating the extent to which they may be somewhat subjective or soundly grounded in math and physics (which might still be a topic of debate even after all that). This may also reveal whether a claim applies to specific use cases or to broader, more general application.

I respect that this is very time-consuming and that people are likely not that interested in investing time in screenshots, article links and full-blown explanations… but it is more constructive than shots across the bow…

Sorry you all got dragged down the rabbit hole


Let’s take some time to gather ourselves and remember that civility is a requirement in this space.
