@geldo, you’ve made me rethink my histograms, and I think it comes down to two regards:
- First regard: the data bounds. That’s not just a simple max/min; it’s where the “clump” begins and ends, because things like specular highlights can vex the simple statistics. There are a surprising number of factors to consider in the bounds, the first being how the exposure was captured, but also where the data sits in the container. For instance, if you’re capturing 14-bit raws, you have a couple of stops of leeway as you mangle your data toward the 16-bit integer bound, assuming your software uses 16-bit integers internally. That’s one of the reasons floating point is good: that boundary doesn’t exist there. (See the first sketch after this list.)
- Second regard: the clump’s “topology”, i.e. how the tones are spread between min and max. Data as measured by the camera is decidedly “left-leaning” due to the linear relationship of the light measurements: fully half of a linear encoding’s range covers only the brightest stop, so most tones pile up at the low end. Once you start scaling that data out of linear, the shape of the clump starts to, for lack of a better term, “lean to the right” as the tones move toward a scale better suited to display and printing. This lean can come from curves, gamma transforms in ICC profiles, OCIO look LUTs, anything that rescales the data away from its original linear relationships. (See the second sketch after this list.)
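
To make the first regard concrete, here is a minimal sketch of outlier-resistant bounds-finding: instead of a naive min/max, it uses percentiles so a handful of specular-highlight pixels can’t stretch the reported range. The function name, percentile thresholds, and test data are all hypothetical illustrations, not anyone’s actual tool:

```python
import numpy as np

def clump_bounds(pixels, lo_pct=0.1, hi_pct=99.9):
    """Return (naive, robust) tonal ranges for the data "clump".

    pixels         -- image data, any numeric dtype; flattened internally
    lo_pct, hi_pct -- percentiles defining where the clump begins and ends;
                      0.1/99.9 trims stray specular highlights and dead pixels
    """
    flat = np.asarray(pixels, dtype=np.float64).ravel()
    naive = (flat.min(), flat.max())                    # vexed by outliers
    robust = tuple(np.percentile(flat, [lo_pct, hi_pct]))
    return naive, robust

# Example: simulated 14-bit linear capture with a few blown specular pixels.
rng = np.random.default_rng(42)
data = rng.exponential(scale=600.0, size=100_000)       # "left-leaning" linear data
data[:25] = 16383                                       # speculars pinned at the 14-bit max
naive, robust = clump_bounds(data)
print("naive min/max:", naive)     # max reported at the container bound, 16383
print("robust clump :", robust)    # the clump actually ends far below that
```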
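
And a sketch of the second regard: the same pixel values histogrammed before and after a display transform. A plain 1/2.2 power function stands in here for whatever curve, ICC gamma, or OCIO look is actually in a given pipeline; the names and parameters are again just for illustration:

```python
import numpy as np

def histogram_lean(linear, bins=16):
    """Compare the tone distribution in linear space vs. after a gamma lift."""
    norm = linear / linear.max()                  # normalize to 0..1
    gamma = norm ** (1.0 / 2.2)                   # stand-in display transfer
    lin_counts, _ = np.histogram(norm, bins=bins, range=(0.0, 1.0))
    gam_counts, _ = np.histogram(gamma, bins=bins, range=(0.0, 1.0))
    return lin_counts, gam_counts

rng = np.random.default_rng(7)
scene = rng.exponential(scale=0.15, size=200_000)   # linear capture: clump at left
scene = np.clip(scene, 0.0, 1.0)
lin, gam = histogram_lean(scene)
print("linear :", lin)    # counts piled into the low bins
print("gamma'd:", gam)    # the clump migrates rightward, toward display tones
```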
I too have found this thread to be quite insightful, particularly with regard to constructing my histogram tools…