I am interested in “practical camera SNR”: a measure of the baseline noise that has a clear relationship to the noise I can expect to see in an actual photo.
I am not interested in a theoretical sensor metric that has little to do with a real photo.
Here is a practical example that motivated this. We know that a properly exposed ISO 100 shot will have less noise than a properly exposed shot of the same subject at, say, ISO 400 or ISO 1600.
How much more noise? My method will find that.
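One simple way to get such a number (a minimal sketch of the idea, not necessarily the exact method used here) is to shoot a flat, defocused target, crop a central patch from one raw channel, and take mean over standard deviation. The function and the synthetic Poisson "flat frame" below are illustrative assumptions:

```python
import numpy as np

def patch_snr(img, size=200):
    """Estimate SNR from a central patch of a flat (defocused, evenly lit) frame.
    img: 2D array of linear pixel values from a single raw channel."""
    h, w = img.shape
    patch = img[h//2 - size//2 : h//2 + size//2,
                w//2 - size//2 : w//2 + size//2]
    return patch.mean() / patch.std()

# Synthetic flat frame: pure shot noise around a mean of 1000 (arbitrary linear units)
rng = np.random.default_rng(0)
flat = rng.poisson(1000, size=(1000, 1000)).astype(float)
print(patch_snr(flat))  # shot-noise-limited SNR, roughly sqrt(1000) ~ 31.6
```

Repeating the same measurement on real flat frames at ISO 100, 400, 1600, etc. gives the per-ISO noise comparison directly.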
We also know that if we average several shots of the same subject we should get an improvement in noise. How much? Theory tells us the SNR should improve roughly like Sqrt[N], where N is the number of shots averaged. But it would be nice to verify that. Again, my method will measure it.
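The Sqrt[N] behaviour is easy to check on synthetic data before trusting it on real frames. This sketch (shot noise only, arbitrary mean level of 1000) averages N simulated flat frames and reports the SNR relative to a single shot:

```python
import numpy as np

rng = np.random.default_rng(1)
signal = 1000.0  # mean level of the flat frame, arbitrary linear units

for n in (1, 2, 4, 8, 16):
    # n independent "shots" of the same flat subject, Poisson shot noise only
    shots = rng.poisson(signal, size=(n, 500, 500)).astype(float)
    avg = shots.mean(axis=0)
    snr = avg.mean() / avg.std()
    print(n, snr / signal**0.5)  # SNR relative to one shot; should track sqrt(n)
```

With real frames the gain can fall short of Sqrt[N] if there is correlated noise (fixed-pattern noise, banding), which is part of why measuring it is worthwhile.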
Putting these together: how many ISO 1600 shots do you have to average to get the same noise (i.e. the same SNR) as one ISO 100 shot?
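Given measured noise at the two ISOs, the answer follows directly: averaging N shots divides the noise by Sqrt[N], so you need the squared noise ratio. The numbers below are hypothetical placeholders, not real measurements:

```python
# Hypothetical measured noise (std dev of the flat patch at the same mean level);
# real values would come from the per-ISO measurement on actual frames.
sigma_iso100 = 2.0
sigma_iso1600 = 8.0

# Averaging N shots divides the noise by sqrt(N), so matching ISO 100 requires:
n_needed = (sigma_iso1600 / sigma_iso100) ** 2
print(n_needed)  # 16.0 for this hypothetical 4x noise ratio
```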
That is the answer I want. I have it for most raw conversion software, but there are weird anomalies with RT.
I am re-running without the lens corrections, but unless there is a bug in the lens database or the associated code, they should not matter much for a completely out-of-focus image of an evenly lit plain white card. Plus, the lens is a Zeiss macro lens, not some wild fisheye.