Thanks for the confirmation. The CN user who raised the question went back to look at one of their own images from a 12-bit camera, and saw the same thing. So false alarm.
I also re-ran pre-processing on a stack with the preferences set to 32-bit float instead of 16-bit unsigned for the average-stacking operations, and that did substantially smooth out the auto-stretched histogram. Why didn’t I do that before? I guess my day job trained me to avoid creating “illusory” precision. Storing values that were measured with only 4096 possible levels in more than 4,000,000,000 bins sounded like over-precision.
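The reason float helps is that averaging creates fractional values between the original 12-bit levels, and an integer output format throws those fractions away. Here is a small illustrative sketch (simulated data, not any particular stacking program's internals) comparing the number of distinct levels you get when the mean of a stack is stored as 16-bit unsigned versus 32-bit float:

```python
import numpy as np

rng = np.random.default_rng(0)

# 16 simulated noisy frames of the same scene, 12-bit range 0..4095
scene = rng.integers(0, 4096, size=(256, 256))
frames = np.clip(scene + rng.normal(0, 8, size=(16, 256, 256)), 0, 4095)
frames = frames.astype(np.uint16)

# Rounding the per-pixel mean back to integers discards the fractional
# part that averaging just created...
avg_u16 = frames.mean(axis=0).astype(np.uint16)

# ...while a float32 result keeps that sub-ADU precision, which is what
# fills in the comb-like gaps in a stretched histogram.
avg_f32 = frames.mean(axis=0, dtype=np.float32)

print(len(np.unique(avg_u16)), "distinct levels as uint16")
print(len(np.unique(avg_f32)), "distinct levels as float32")
```

With 16 frames the float mean can land on sixteenths of a level, so it produces many more distinct output values than the truncated integer version.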
Of course, the point is not to make pretty histograms. I am re-processing my current project in 32-bit float - we’ll see if I can notice a difference in the result. If nothing else, it might help stimulate the economy a bit by getting me to buy another SSD.