I am trying to measure my camera's signal-to-noise ratio (SNR) using linear-gamma files converted from RAW with darktable and RawTherapee (RT).
I photographed a plain card with a Canon 5D Mark IV, making sure the card was well out of focus. I shot it at ISO 100, spot metering the card and using in-camera bracketing to give me a 7-shot sequence from -3 to +3 EV. That was to test linearity.
I used the spot meter on the card to shoot a sequence at ISO 100, 200, 400, 800, 1600, 3200 to test SNR.
I then wrote a program to calculate SNR as the mean divided by the standard deviation of an 800 × 800 pixel section at the center of the image.
To test linearity, I took the mean of the same 800 × 800 center section from each image in the bracketed sequence.
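For reference, here is a minimal sketch of that calculation (not my exact program; it assumes the converters' TIFF output is readable with numpy and tifffile, and the file name is just a placeholder):

```python
import numpy as np
import tifffile  # assumes the linear TIFFs can be read with tifffile

def center_crop(img, size=800):
    """Return the central size x size region of an (H, W, 3) array."""
    h, w = img.shape[:2]
    top, left = (h - size) // 2, (w - size) // 2
    return img[top:top + size, left:left + size]

def channel_stats(path):
    """Per-channel mean and SNR (mean / std) of the central 800 x 800 crop."""
    img = tifffile.imread(path).astype(np.float64)
    if img.max() > 1.0:        # 16-bit integer TIFF (e.g. dcraw output):
        img /= 65535.0         # normalize so all converters are comparable
    crop = center_crop(img).reshape(-1, 3)
    mean = crop.mean(axis=0)
    return mean, mean / crop.std(axis=0)

# placeholder file name
mean_rgb, snr_rgb = channel_stats("iso100_0ev_darktable.tif")
print("mean:", mean_rgb, "SNR:", snr_rgb)
```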
I made sure the settings I used for both darktable and RT were compatible with linear-gamma output to a 32-bit-per-component TIFF file. I also did a conversion with dcraw, outputting to a 16-bit-per-component TIFF.
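The dcraw conversion was along these lines (shown as a sketch via subprocess; as I understand the flags, -4 asks for linear 16-bit output and -T writes a TIFF, and the file name is just a placeholder):

```python
import subprocess

# -T: write TIFF instead of PPM; -4: linear 16-bit output (no gamma curve).
# File name is a placeholder; dcraw writes IMG_0001.tiff next to the raw file.
subprocess.run(["dcraw", "-T", "-4", "IMG_0001.CR2"], check=True)
```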
Here is the linearity plot, which shows that yes, I am getting values that are linearly related to exposure time (all at ISO 100).
The strange thing is that while RT and darktable are both linear, they give very different values: RT is much brighter.
Note that the Red, Green, and Blue channels are plotted separately, but the Red channel is hidden beneath one of the others.
Here are the SNR plots, which are even more problematic.
The problem is that RT has an SNR of about 100 at ISO 100, whereas darktable has an SNR of about 50 at ISO 100.
Obviously one should not get different SNR from two different RAW converters on the same file. Why is this happening?
I checked and no noise reduction was used in either one.
Both use the AMaZE demosaicing algorithm.
I assume that the reason has something to do with darktable being so much darker.
The shots were taken with the spot meter and 0 EV exposure compensation, so in principle that should yield a “middle gray”, not a super dark tone.
If I take the 800 × 800 center sections and reduce them all the way down to a single mean per channel, darktable gives me {0.0886, 0.0884, 0.0875} for RGB expressed as floats. That is very dark.
RT gives {0.305, 0.305, 0.307}.
dcraw gives {0.0350, 0.0350, 0.0400}.
So RT is giving me roughly 3.4× higher values than darktable, and 8.7× higher than dcraw (with the blue channel off by more).
darktable is giving me roughly 2.5× higher values than dcraw (but again the blue channel is off by more, so there is a color shift).
A multiplicative factor does not affect SNR: it scales the mean and the standard deviation by the same factor.
An additive offset, however, does change SNR, because it shifts the mean while leaving the standard deviation unchanged. So basically RT seems to be adding a constant to the values, while darktable does not.
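To illustrate with a toy example (synthetic data, purely to show the effect of a gain versus an offset on mean/std):

```python
import numpy as np

rng = np.random.default_rng(0)
patch = rng.normal(loc=0.09, scale=0.001, size=100_000)  # fake flat-field patch

def snr(x):
    return x.mean() / x.std()

print(snr(patch))          # baseline SNR
print(snr(3.4 * patch))    # pure gain: SNR essentially unchanged
print(snr(patch + 0.2))    # added offset: mean rises, std does not, SNR goes up
```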
Can anybody help with suggestions of what settings to use? Or offer an explanation?
My guess is that there is some setting I need to adjust in darktable (or in RT) to bring the two at least into the same ballpark.