How does the "normalized real" checkbox calculate the displayed values? I have been using PM to divide images and noticed instances where the normalized statistics window displays results larger than 1.
How does the linear display mode calculate the maximum brightness level it displays, i.e. is the maximum based on the 32-bit integer range?
Statistics can show values greater than 1 simply because we do not constrain values to the range [0, 1]. There is an option in Pixel Math to do it if you prefer: Pixel Math — Siril 1.2.0 documentation
I see that option, thanks, I will use it. But my question could have been stated better: how can the statistics window report a normalized maximum greater than 1?
Because this is just normalized to the theoretical maximum, i.e. 65535 in a 16-bit image.
Values above 1 indicate data that could be truncated by conversion to TIFF or JPEG, …
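To illustrate the point (a minimal Python sketch, not Siril's actual code, with made-up pixel values): when each channel is normalized against the theoretical maximum of 65535, dividing one channel by another can easily produce ratios above 1, and those are exactly the values that would clip on export to an integer format.

```python
# Minimal sketch (not Siril's code): why a channel ratio can exceed 1
# in normalized statistics, and what clipping on export would do.

MAX_16BIT = 65535  # theoretical maximum used for normalization

# Hypothetical raw 16-bit pixel values for extracted channels
green_raw = [30000, 52000, 10000]
red_raw   = [15000, 40000, 20000]

# Normalize each channel to [0, 1] against the theoretical maximum
green = [v / MAX_16BIT for v in green_raw]
red   = [v / MAX_16BIT for v in red_raw]

# Pixel-math division: the ratio exceeds 1 wherever green > red,
# so the "normalized" maximum of the result can be well above 1
ratio = [g / r for g, r in zip(green, red)]
print(ratio)  # first two pixels exceed 1.0

# Exporting to an integer format (TIFF/JPEG) clips anything above 1.0
exported = [min(max(v, 0.0), 1.0) for v in ratio]
print(exported)
```

So a maximum of 2+ just means some green pixels were more than twice as bright as the corresponding red pixels; nothing is wrong with the data until it is forced back into [0, 1].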
Thank you so much for your responses. I am using FITS format and the GUI indicates I am in 32-bit mode, although I'm using PM. The images are mono channels extracted from RGB. I see a max value of 2+ when dividing an extracted green by an extracted red channel. Since I'm now limiting the result via PM to a max of 1.0, it's not an issue.
Nonetheless, I am still curious what could lead to a max value of 2+ in normalized mode. But I'll consider this one of those mysteries in life one must accept and move on. Thanks again for all you do for this forum.