Maybe @afre is saying the same thing and I’m misinterpreting. But it seems to me that, since scanning at lower than the maximum optical resolution simply skips readouts, there is no binning going on during the actual scan; the scanner is just skipping over data points/readouts.
So for the best signal-to-noise ratio, I’d suggest scanning at the scanner’s maximum optical resolution and highest bit depth. Then, as @afre suggests, use a command-line program to downsize, binning the pixels to reduce noise while producing a smaller image file.
@snibgo or @afre or other ImageMagick experts: what command would downsize by “binning” (plain averaging of pixel blocks) rather than the usual detail-preserving resize? Also, is there a way to tell ImageMagick not to apply any “gamma” correction of its own? At this point, before making a scanner profile, I don’t know what native “gamma” the scanner might (but hopefully doesn’t) bake into even its “raw” output.
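For what it’s worth, here is my current guess at the commands, to be corrected by the experts. My understanding (an assumption to be confirmed) is that ImageMagick’s `-scale` does plain area averaging, which is effectively binning, while `-resize` applies a reconstruction filter; and that `-set colorspace RGB` relabels the data as linear without converting it, so no gamma transform gets applied. The filenames are placeholders; on ImageMagick 6 the command is `convert` rather than `magick`.

```shell
# Binning via plain area averaging (my assumption: -scale averages
# each block of source pixels into one output pixel):
magick scan.tif -depth 16 -scale 50% binned.tif

# Should be equivalent, spelled out with an explicit box filter:
magick scan.tif -depth 16 -filter box -resize 50% binned.tif

# To keep ImageMagick from applying any gamma handling of its own,
# tag the data as linear RGB (-set relabels without converting):
magick scan.tif -set colorspace RGB -depth 16 -scale 50% binned.tif
```

If that’s right, the averaging then happens on the values exactly as the scanner delivered them, which is what I want before profiling.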