This looks great. And the math sounds very good. Is there a paper describing your method?
indeed!
would love to see this make its way into darktable, at present I have to “finish” all my images off with G’MIC to add satisfactory grain, but what you’ve got here is even better, well done.
sorry to necro this old thing, but i think the relevant references would be:
Alasdair Newson, Julie Delon, and Bruno Galerne. 2017. A Stochastic Film Grain Model for Resolution-Independent Rendering.
https://onlinelibrary.wiley.com/doi/10.1111/cgf.13159
and follow-up work:
https://dl.acm.org/doi/abs/10.1145/3592127
but this essentially amounts to deriving a Gaussian filter to express the correlation of (monochromatic) grain influence between filtered pixels. @arctic’s current model is more sophisticated.
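for intuition, here is a rough sketch of that idea: filter white noise with a Gaussian kernel so that neighbouring pixels share grain influence. the kernel width and grain strength below are made-up illustrative numbers, not values derived in either paper, and this is not @arctic’s model, just the plain “correlate noise with a Gaussian filter” trick:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correlated_grain(shape, sigma_px=1.5, strength=0.1, seed=0):
    """Monochromatic grain whose pixel-to-pixel correlation is imposed
    by a Gaussian filter. sigma_px and strength are illustrative
    parameters, not numbers taken from the papers above."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(shape)        # i.i.d. white noise
    noise = gaussian_filter(noise, sigma_px)  # Gaussian filtering induces the correlation
    noise /= noise.std()                      # renormalise to unit variance
    return strength * noise

# usage: add grain to a [0, 1] luminance image `img`
# grainy = np.clip(img + correlated_grain(img.shape), 0.0, 1.0)
```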
I somehow missed the second paper. I cannot get access to the PDF, but I watched the talk. Interesting!
It makes sense to model the statistics of the pixels. Grain particles are tiny, and unless extreme magnification is needed, it is usually a good assumption that single particles cannot be resolved.
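For intuition, a quick toy calculation (my own back-of-the-envelope, not from the papers): if a pixel averages N grain sites, each developed independently with probability p, the pixel density has mean p and standard deviation sqrt(p(1-p)/N), so individual particles wash out into signal-dependent noise long before you could resolve them.

```python
import numpy as np

def pixel_grain_stats(p, n_sites, n_pixels=100_000, seed=0):
    """Toy binary grain model: each pixel averages `n_sites` grain sites,
    each developed independently with probability `p`. Returns the
    empirical mean/std of pixel coverage alongside the analytic values
    p and sqrt(p*(1-p)/n_sites). All names and values are illustrative."""
    rng = np.random.default_rng(seed)
    developed = rng.random((n_pixels, n_sites)) < p   # binary grain sites
    coverage = developed.mean(axis=1)                 # per-pixel density
    return coverage.mean(), coverage.std(), p, np.sqrt(p * (1 - p) / n_sites)

# e.g. pixel_grain_stats(0.3, 500) -> empirical mean ~0.30, std ~0.020
```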
What I miss in [Newson2017] and the follow-up is a more extensive comparison to “real data”, i.e. statistics of density measured on negatives or prints.
Also, the extension to color photography is a bit rushed, especially because the main assumption of a binary model is no longer valid: in the color photography process the silver is dissolved and dye clouds remain, which have a transmittance > 0.
I love this work on grain, though.