Hi,
For over a year I’ve been running a script that uses 16-bit precision for all calibration and switches to 32-bit only for the final stack, to save a bit of disk space during stacking, since I shoot many short exposures.
I recently switched back to 32-bit processing throughout, because in some cases I noticed a bit more noise in the final result when using 16-bit calibration. However, with 32-bit calibration the background values in my final stack, which were usually <=~1% before, now vary a fair bit from stack to stack, often from 18% up to about 50%. So when unstretched, the background appears medium grey.
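To make concrete what I mean by those percentages, here is a small sketch of my own (the function name is mine, and it assumes the usual conventions: 16-bit data spans 0–65535, 32-bit float data is normalized to 0.0–1.0, and the reported background is the mean as a fraction of full scale):

```python
# Sketch: what a background "percentage" means at each bit depth.
# Assumes 16-bit full scale = 65535 ADU and 32-bit float full scale = 1.0.

def background_percent(mean_value, bit_depth):
    """Return a mean background level as a percent of full scale."""
    full_scale = 65535.0 if bit_depth == 16 else 1.0
    return 100.0 * mean_value / full_scale

# A ~1% background in 16-bit terms is roughly 655 ADU:
print(background_percent(655, 16))   # ~1%
# A 32-bit float stack whose mean lands at 0.18 reads as 18%,
# which is why it looks medium grey before any stretch:
print(background_percent(0.18, 32))  # ~18%
```

So the grey appearance just reflects where the mean sits in the normalized range; what puzzles me is why the calibrated 32-bit stacks land so much higher than before.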
Upon autostretching and processing, things look good, so I’m not sure if this is material, but I thought I would ask:
Is this unusual, or something I should be concerned about? I’ve used SIRIL for over 2 years and don’t remember my stacks ever looking grey when unstretched, but perhaps my memory is faulty.
For the record, I shoot with a cooled ASI2400MC, in this case with the L-Ultimate filter, gain 300 or 350, offset 50, and short 6–8 second exposures, so it’s natural to expect the histogram to have a very dark background. I shoot flats and biases but not darks (very short exposures with a modern cooled camera really don’t need darks), so I calibrate both my flats and my lights against biases only. That said, I have tested with darks using an unmodified version of the built-in OSC_Preprocessing script and get the same results. I’m on Windows running SIRIL 1.4.0B3, but I tested with the older 1.2.6 and the older scripts and get similar results: a grey background.
Thanks for any enlightenment!
Steven