[Solved] Why do CAT and Filmic render clipped areas so weirdly

Then you simply never hit the limits of a display-referred workflow - no halos, no weird artifacts, no colour problems… you were simply lucky :wink:

I must admit I didn’t quite catch that - maybe as non native English speaker I need more plain language. What do you mean by being under “extreme threat” here and what is that tool fired in the end? Photoshop to convert to CMYK, right?

Well… Why should I go back to a display-referred workflow where I have to fight a lot harder to control contrast and (highlight) detail in more difficult images? I use fewer modules and less time to get a satisfying edit with filmic and associated tools than with the older display-referred workflow. And that’s for images where nothing is clipped in the raw data, so basically all the necessary information is present.

Agree with what you say, but the heat of the debate on this thread is caused by the same issues as tech companies face when introducing new versions. I am only suggesting that the way to reduce the heat and anger is to focus on the positives of the new release.

Keep up the good (and FOSS) work !!!

New display technology is emerging that will really take advantage of this…

1 Like

The silly thing about all this is that you still have a choice in DT, so why such a fuss? Use the path that suits you and move on… use tone curves, use whatever. It would be different if all the traditional tools were gone. And if you want or like the newer workflow, then experiment, or not… the choice is there. There is way too much time and vitriol wasted on this. If it is not apparent from the start that DT is not all wrapped up in a bow like LR, then it should be, so learn to be a ground-up editor or move to something that gives that solution… This is not a company that has just changed software and is forcing everyone to learn how to use it… take more photos and bicker less about shadows and highlights… :slight_smile:

Having skipped a lot of this thread, isn’t the answer that if you are having issues with artifacts in the highlights, you should turn off highlight reconstruction? Because then most issues are fixed. The real issue is when you really, really want to recover the very last bit of highlight information possible (instead of clipping it out); then you have to try a lot to get rid of the artifacts. But then you are trying to recover more than most other software allows.

Oh, and put charlyw64 on your ignore list. Seems to improve the forum a lot.

3 Likes

The problem was a bit more complex, but you’re mostly right - turning off Highlight reconstruction was an important part of the solution :slight_smile:

Moreover, the edits provided by others, the ideas and explanations given, as well as Aurélien’s detailed teaching, have given me precious understanding, thanks to which I was able to set up my new quick and high-quality workflow.

5 Likes

I know I am quoting answers that are not all that related, but I think you will get the point. I do not think the reason for @charlyw64’s continuing “ramblings” (maybe that’s too harsh, but I cannot find a better word – I am not a native speaker) is that he dislikes darktable; rather, he likes it very much and sees more and more of his old habits fade away as more and more modules of the display-referred section of the pipe are deprecated and/or get less attention/fixes/additions than in the past.

This or he’s just another troll.

However, assuming the first, there is a way to get results: prove that the drawbacks of the scene-referred paradigm are serious enough to require another paradigm (and ideally start coding a solution). That’s what @anon41087856 did for the switch away from the display-referred paradigm: he was able to convince darktable folks to continue with the new approach despite lots of questions and initial objections (and despite his way of talking/writing being perceived as rude by many people), by showing the technical limitations and implementing a better alternative.

And that’s what Elle did regarding Gimp and colour management. And that’s what @patdavid did regarding the proliferation of FLOSS image processing documentation, support, tutorials and forums (probably ongoing process).

IMHO my examples show (at least to me) that this is the preferred method for fundamental changes. YMMV. However, at least some proof of where scene-referred fails and display-referred does not, or where the latter has serious benefits, would be much better than continuing “ramblings”.

In case the troll assumption is correct, sorry for feeding :wink:.

7 Likes

No, no, no and no.

The only difference between scene and display workflows is where the view transform (aka tone curve, aka tone mapping) sits in the pipeline: at the end vs. at the beginning. All the rest works essentially the same; it’s the state of the color data that changes a bit: 100% can’t be assumed to be white, 50% can’t be assumed to be grey, so it’s only the bounds that change. What’s hard is that algorithms as well as GUIs have taken shortcuts that don’t work anymore in scene-referred.
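A toy sketch of what “only the position of the view transform changes” means. This is hypothetical illustration code, not anything from darktable; a simple Reinhard-style curve stands in for the real view transform:

```python
# Toy model: the pipeline is the same set of operations, only the
# position of the view transform differs.

def view_transform(x):
    # Stand-in for a real view transform (e.g. filmic): maps
    # unbounded scene-referred values into display range [0, 1).
    return x / (1.0 + x)

def exposure(x, ev):
    # A scene-referred edit: multiply light by 2^ev.
    return x * 2.0 ** ev

scene_pixel = 4.0  # scene-referred: values above 1.0 are legal

# Display-referred: transform first, then edit on bounded data.
display_first = exposure(view_transform(scene_pixel), 1.0)

# Scene-referred: edit on unbounded data, transform last.
scene_first = view_transform(exposure(scene_pixel, 1.0))

print(display_first)  # 1.6 -> pushed past display white, clips
print(scene_first)    # ~0.889 -> still within display range
```

The point of the sketch: the exposure operation itself is identical in both cases; only doing it before the view transform keeps the result inside what the display can show.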

Also, the non-chroma-preserving way of manipulating RGB channels independently in the context of lightness changes has created aesthetic expectations that are nothing more than habit (desaturating highlights, saturating shadows). What people forget is that the chroma/hue changes have been deferred to specialized tools that allow a better handling of color in perceptual frameworks (color balance), so the tone mapping is asked to “please honour the user’s color grading intent” at all costs.
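A small numeric illustration of the difference (hypothetical code, with a toy gamma curve standing in for any tone curve): applying a curve per channel drags the channel ratios toward 1, which is exactly the habitual highlight desaturation, while a chroma-preserving approach applies the curve to a norm and keeps the ratios intact.

```python
def curve(x):
    # Toy tone curve (simple gamma); any non-linear curve shows the effect.
    return x ** (1 / 2.2)

rgb = [0.9, 0.5, 0.1]  # a warm, bright pixel

# Non-chroma-preserving: apply the curve to each channel independently.
per_channel = [curve(c) for c in rgb]

# Chroma-preserving: apply the curve to a norm (here: max), rescale all
# channels by the same factor so their ratios are untouched.
norm = max(rgb)
scale = curve(norm) / norm
preserved = [c * scale for c in rgb]

# Channel ratios relative to the strongest channel (a crude hue/chroma proxy)
print([c / per_channel[0] for c in per_channel])  # ratios drift toward 1 -> desaturated
print([c / preserved[0] for c in preserved])      # ratios identical to the input
```

Neither result is “wrong”; the per-channel look is just a side effect that viewers learned to expect, which is why the chroma-preserving path hands hue/saturation decisions back to dedicated color-grading tools.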

But the scene-referred workflow is not enough of a revolution to be held responsible for all this. The same concepts apply, the same operations are performed; it’s just that you now need to care about stuff you never heard of before (even though it was already there).

7 Likes

That’s because camera sensors only started to get much higher dynamic range around 2013 or so. Before that, the dynamic ranges were mostly equivalent along the whole pipeline. Now, inputs are much larger than what outputs can display. When you make an inkjet print, you get a DR between 5 and 6 EV. So when you come to that from a 12 EV raw picture shot at 64 ISO, it’s more than halved.
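To make the arithmetic concrete (each EV, or stop, doubles the amount of light, so the contrast ratio is 2^EV):

```python
# EV (stops) -> contrast ratio between brightest and darkest usable tone
def contrast_ratio(ev):
    return 2 ** ev

print(contrast_ratio(12))   # 4096:1 for a 12 EV raw file
print(contrast_ratio(5.5))  # ~45:1 for a typical 5-6 EV inkjet print
```

So "more than halved" in EV terms (12 EV down to 5-6 EV) means the print can only show roughly a 45:1 contrast ratio out of a 4096:1 capture: about two orders of magnitude of compression in linear terms.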

2 Likes

Aurélien, I have no doubt regarding the benefits of a scene-referred paradigm; I am a happy user of it. The point I made was that ongoing complaints won’t change anything, and proving the claims is the only way to convince anybody. So your answer is heading in the wrong direction.

Therefore I cannot accept your multiple no’s: if the guy complaining can show/prove that there is a serious issue with a late view transform, fair enough, we have to deal with it. I personally doubt that there are serious issues that justify a step back to the old paradigm. But that’s my personal opinion, which seems to coincide with yours. Scientific experience, however, tells us that there is always a chance that a better/more accurate theory/model supersedes the current state of the art.

6 Likes

everybody’s darling is everybody’s fool
If someone wants to use the ‘old’ display-referred workflow with all of its pitfalls, darktable has had a whole bunch of tools for that for a long time. Furthermore, there’s no lack of alternatives. So there’s no need to argue about the new scene-referred stuff.
Most of those complaining about the scene-referred tools never proved anything - they write long letters of complaint instead. Their main argument is the ‘intuitivity’ of the mass market, never realizing that the quantity of users is not the driver for darktable developers, but the quality they can achieve (at the price of mass-market intuitivity).
if darktable doesn’t fit a user’s needs, there are two options:

  1. use a tool that fits your needs
  2. fork it and implement the missing stuff yourself, or pay someone to do it for you

If there are design flaws: just prove it - not with long essays but with examples, issue reports, …

You can’t expect developers who spent a lot of time finding the best way to cope with the pitfalls of common tools to change their priorities just because some people don’t understand this way and don’t want to change their habits (and also don’t want to convince the developers by pouring sackfuls of money over them).
darktable is not lightroom - neither the business model nor the functionality

6 Likes

We’ve even put that right at the top of the README on github:

darktable is not a free Adobe® Lightroom® replacement.

11 Likes

Sorry, but somebody had to do the BW version :slight_smile:

marco

3 Likes

Sorry I’m late, but it’s taken me a while to adjust it. That’s all I know how to do.

Regards


_MG_5096.CR2.xmp (12.1 KB)

Hee hee.
I see it all of the time in period music groups.

They also have the gall to get vaccines, drive a car, and use electric lighting… all things that weren’t around during the “period” of their performed music.

The violinist is also using a modern bow, and is probably using modern strings.

Just no commitment to their art :crazy_face:

(I’m joking, in case some of you are getting the wrong idea).

That said, we can compare playing period instruments to shooting film. It was how it was done before, film usually has a higher learning curve (at least in the beginning), and there is something different about the look, unless the photographer is highly skilled in either medium. And shooting film can teach us something valuable about photography, especially if we use some of the digital workflow as a kind of crutch.

Shooting film, until you figure it out, seems a little unpredictable. So I guess I can understand people looking at baked-in JPEGs as “film like”… not in look, but rather in workflow. For most photographers in the film days, it was kinda what you did: only a small number of photographers developed their own film. Most highly paid professionals shooting pretty women sent their film to a lab. So you just “got it right in camera”.

Around here, we are most like the crazy black and white photographers who brew their own developer like it is the newest, craziest drug… we can just do the same thing now with color.

Most musicians use modern instruments because they are easier to play (especially easier to play in tune), more reliable, longer lasting, and less expensive. Ya’ know, that progress thing. There are a few period musicians who are very committed to it, and the result is beautiful. Listen to Rachel Brown if you want to hear someone who has “conquered” the baroque flute.

I also know musicians who make the same timbre and aesthetic (well, 98% the same)… with modern instruments. But most musicians, like most photographers, are too… lazy? (might be too harsh a word) to really conquer the tool they use to make their art. Instead, the tool dictates the art, which, IMHO, is the tail wagging the dog.

Like this color argument, it just depends on:
–your intent
–your skill
–your ability to use the tools to make your intent meet the forms available.

On film, ya gotta take the photo with more intent. In digital, we can “save” more; there is more flexibility. But it is a double-edged sword… and there are limits. Color used to be a decision you made when you loaded the film. In digital, we can treat color more like the ability to dodge and burn B&W film.

My philosophy always asks: did you mean to do that? Or did you let the camera do it for you? Other people think differently. It is art, remember. I also suck at Jazz. Sometimes you get to wear sneakers when you are playing recorder. I wear a Hakama when I play Shakuhachi… not everyone does.

Some people want color they can control with intention… others are more “free spirits”… they want the software to keep them from doing crazy things, yet find halos to be “artsy”. We want precision, but most of us would jump at the chance to shoot Kodachrome… one last time.

However, that is no excuse to say things that are just incorrect (you know who you are). This group has the technical background to slap you back with math. Your incorrect technobabble might work on other forums, but here, you get a well deserved “fuck off” (gotta love the French when they get irritated).

All I know is that, with this new workflow, I get more of what I want. And unlike when changing cameras and lenses, people (especially customers) can actually see the difference, and they prefer it.

An odd bunch, us photographers are. One day, we might agree on how things should look. I hope not, as that would be a sad day.

(hops off soapbox)

6 Likes

This pic is a screenshot on a Mac. I haven’t done anything, just pressed the spacebar on the image file in Finder (the file manager), which shows the file rendered by the internal display algorithm.

Looks like the embedded JPEG preview from my camera :slight_smile:

The in-camera JPEG engines are not that bad at all - and a Mac also just displays the embedded JPEG by default :wink:

1 Like