Creating blacker blacks on a grey day from a flat image

There is already a grain module in DT, so a new algorithm might go there if it got traction… it might first be worth figuring out what it would add compared to what is already present…

Glad you like it!
Very happy to share more details, it is fully algorithmic and in python for now (not very fast, a few seconds for 24MP images).
It is kind of mimicking the Poisson/binomial process of the silver halide particles in every pixel but uses the beta distribution function from scipy as a continuous approximation. Then I use opensimplex to generate spatially correlated noise patterns that are then distorted by the beta distribution.
My main addition compared to state-of-the-art and darktable grain is also to add artistic knobs to change the apparent color of grain (like in more salty or more peppery), and the overall shape of the noise distribution (aka to have some control on variance, skewness, and kurtosis of the noise distribution with parameters).
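To give a flavour of the idea, here is a stripped-down sketch (not my actual code: the function and parameter names are made up for illustration, and the mapping from noise field to grain is heavily simplified):

```python
import numpy as np
import opensimplex
from scipy.stats import beta, norm

def grain_sketch(img, grain_size=1.5, strength=0.08, a=2.0, b=5.0, seed=42):
    """Sketch only: spatially correlated noise (OpenSimplex) reshaped through
    a beta distribution to control the asymmetry ("salty" vs "peppery") of the
    grain. img is a single-channel float array in [0, 1]."""
    h, w = img.shape
    opensimplex.seed(seed)

    # Spatially correlated noise field; grain_size sets the correlation
    # length, i.e. the apparent grain diameter.
    ys = np.arange(h) / grain_size
    xs = np.arange(w) / grain_size
    field = opensimplex.noise2array(xs, ys)          # shape (h, w)

    # Map the field to roughly uniform [0, 1] via a normal CDF, then warp it
    # through the beta inverse CDF: a and b shape the distribution
    # (skewness/kurtosis knobs in disguise).
    u = norm.cdf(field / field.std())
    g = beta.ppf(np.clip(u, 1e-6, 1 - 1e-6), a, b)

    # Centre the grain and weight it towards the midtones
    # (a very crude stand-in for the halide response).
    g = (g - beta.mean(a, b)) * strength
    return np.clip(img + g * np.sqrt(img * (1.0 - img)) * 4.0, 0.0, 1.0)
```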

I made a quick and dirty gui for testing using napari and magicgui, too.
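The test rig is roughly this shape (again just a sketch, reusing the grain_sketch function from above; the real GUI exposes many more parameters):

```python
import napari
import numpy as np
from magicgui import magicgui

# Quick-and-dirty test rig: two sliders driving the grain function,
# with the result shown live in a napari image layer.
image = np.random.rand(512, 512)  # stand-in for a loaded photo

viewer = napari.Viewer()
layer = viewer.add_image(image, name="grained")

@magicgui(auto_call=True,
          strength={"widget_type": "FloatSlider", "min": 0.0, "max": 0.3},
          grain_size={"widget_type": "FloatSlider", "min": 0.5, "max": 10.0})
def update(strength: float = 0.08, grain_size: float = 1.5):
    layer.data = grain_sketch(image, grain_size=grain_size, strength=strength)

viewer.window.add_dock_widget(update)
napari.run()
```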


Right now there are definitely a lot of parameters :laughing:

I could make a full post if people are interested. I am still experimenting with a few things, though (it's a bit of a rabbit hole). :grin:

5 Likes

that’s very cool!

Definitely too many options for mere mortals (thinking about a possible implementation in darktable); maybe you can reduce it with some presets and/or 3-4 ‘knobs’ max?

Quoting Todd @priort, I also think it would be nice to see a comparison: one straight image, one with grain added using the current grain module in darktable, and another using a couple of different parameter sets, chosen to your taste, that could emulate different kinds of grain (if you’ve managed to isolate the parameters that could emulate, for example, Tri-X pushed to 1600 or some other typical film – that’s the look I always aim for, because on the wall just next to my monitor I have a panel of scattered prints, and the ones I still love are old shots taken with Tri-X).

EDIT: my good friend @Claes reminded me of this thread where Andrea and other knowledgeable contributors have already discussed grain and algorithms… I guess my little advice above is totally unnecessary – Andrea has clearly already figured this all out – so I would just say: it would be nice to see the new grain generator implemented somehow in darktable, and perhaps a follow-up to that thread with a few examples?

It’s funny because I completely forgot that I also added a comment to that thread above, almost 3 yrs after the initial post… and here I am all excited again for some grainy noise to be added to my too clean images!!

2 Likes

I’ve been thinking about film grain in my spare time for a while. Not sure why it is such a special interest of mine.

I’ll think about reducing the number of knobs. My dream interface would be something like 3 sliders plus 2-3 curves, like in the contrast equalizer, to fine-tune the look.

I haven’t developed presets matching specific film stocks, but with good reference images it might be possible. For now, I have just tried to match my results to a few analog photos from photographers I follow.

After my vacation I’ll definitely make a separate post with a few examples, explaining the idea in more detail, so we don’t go off-topic in this thread.

1 Like

Thank you for your input.

I understand how to download these IMG_1454.CR2.xmp files, but could someone tell me how to import them into darktable so I can see the editing process of each one individually?

Thanks for giving me a clue of where to go with this.
From your suggestion I found this in the 4.0 manual
‘Do not use the black level correction to add more density in blacks as this can clip near-black colors out of gamut by generating negative RGB values.
This can cause problems with some modules later in the pixelpipe.
Instead, use a tone mapping curve to add density to the blacks.
For example, you can use the relative black exposure slider on the scene tab of the filmic rgb module, or establish a deeper toe in the base curve module.’
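If I understand the warning right, it boils down to something like this (toy numbers only, and a simplified stand-in for what the exposure module actually does – roughly, the black level is subtracted from each channel):

```python
import numpy as np

# A dark, slightly bluish pixel in linear RGB (illustrative values only).
pixel = np.array([0.0030, 0.0045, 0.0100])
black = 0.004            # black level correction used to "add density"

shifted = pixel - black
print(shifted)           # [-0.001   0.0005  0.006 ]  -> one channel is now negative

# Negative RGB values are out of gamut; clipping them later in the pixelpipe
# distorts the hue of near-black colours, which is what the manual warns about.
```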

In lighttable view you will see, on the right, history… load a sidecar file… that’s where you would do it. My advice would be to create a duplicate and load the sidecar onto that, so you can compare it with your own edit…

1 Like

Yes, that works well, thanks.
Using the scene-referred pathway, should one use the Lab or RGB color space?

Bit dated but likely a better resource than me…

Edit: maybe browse this too, so as to get some background on the terminology and approach…

https://darktable-org.github.io/dtdocs/en/overview/workflow/process/

1 Like

Impressed with the GIMP cloning - did you just use the clone tool for that?

Yep, all clone. If you pay attention to adjacent patterns, you can make lots of things disappear…

1 Like

Yes, you can cause problems using that slider. But in some cases it’s the fastest way to get where you want to be. In the image here with the cows, 0.004 or even a bit less did the trick.

If you need a fairly aggressive correction in the blacks, anything using an EV scale will need a very large correction, since an EV adjustment is a multiplication rather than an addition like the black level correction in the exposure module.
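To put rough numbers on it (a toy example, not darktable’s actual maths):

```python
# An EV adjustment is a multiplier; a black level correction is (roughly) an offset.
near_black = 0.008

# Additive: a tiny offset moves a near-black value a long way, relatively speaking.
print(near_black - 0.004)        # 0.004 -> half the original value

# Multiplicative: the same relative change already needs a full -1 EV,
# and pulling the value anywhere near zero needs a huge negative EV.
print(near_black * 2 ** -1)      # 0.004
print(near_black * 2 ** -8)      # ~0.00003 - still not zero, even at -8 EV
```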

1 Like

Replying to myself; sunk to a new low… :crazy_face:

Additional information on how I clone. First, I worked your image using the JPEG rendition; normally I’d save a full-sized linear TIFF, work on that, export a full-sized TIFF (or JPEG, depending on my mood…) from GIMP, and then use that intermediate rendition for various sizes and/or crops. In GIMP, I set the patch size to something just big enough to capture patterns, but not so big that it incorporates complex collections of patterns. I set the blur to “not much” unless the clone patches are mostly uniform, that is, with no patterns.

Most of the things I clone out tend to be long, straight things, like power lines and microphone stands. For those, I just work from one end to the other, picking an adjacent area for the patch and wiping it over the offending object. If there’s a definite pattern to replicate, I don’t wipe, I click. Doing it in small chunks makes it easier to undo the last one if it doesn’t go well. Finally, I go back and take a good look at the whole thing, as it’s easy to think the wipe you’re staring at went well when it really didn’t remove every vestige of the object. I didn’t do that for your image, and there are a few spots I’d redo on further examination.

For uniform surfaces it’s easy to make splotches of cloning that look “cloned”, so for those I’ll usually go over the whole area with the heal tool in circular wipes, using a somewhat larger patch size.

FWIW…

2 Likes