Quite serious. Maybe “fortuitous” was not the best choice of adjective, but my point is that the association is born of expediency, not of a sufficiently deep consideration of its inconsistency and, more importantly, its implications.
Not buying that, yet. I did a cursory review of some of the “authoritative” sources of definitions and didn’t find any that related the word “exposure” to the post-processing modification of the measured values, the operation we’re talking about here.
Here’s the thing. The “exposure” operation in software looks like this:
//Exposure Compensation
//
//Multiplies each R, G, and B value of each pixel by 2**ev
//
void gImage::ApplyExposureCompensation(double ev, int threadcount)
{
	double mult = pow(2.0, ev);
	#pragma omp parallel for num_threads(threadcount)
	for (unsigned x = 0; x < w; x++) {
		for (unsigned y = 0; y < h; y++) {
			unsigned pos = x + y*w;
			image[pos].r *= mult;
			image[pos].g *= mult;
			image[pos].b *= mult;
		}
	}
}
ref: https://github.com/butcherg/rawproc/blob/master/src/gimage.cpp#L4053C1-L4070C2
It’s essentially multiplying each R, G, and B value of each pixel by a number that represents a “stop”, a term owing to the old detents on lens aperture rings that reliably incremented the amount of light let in by doubles or halves. Fancier software might add embellishments like highlight rolloff, but the above is the essential operation.
The key point is that the above IS NOT THE SAME THING AS CHANGING THE APERTURE AND/OR SHUTTER SPEED WHEN YOU TAKE THE PICTURE. Sure, it adjusts the values on the same scale, but the implications are different. Changing the exposure settings before you take the shot changes the amount of energy to which the sensor is EXPOSED. Open the aperture a bit and more light reaches the sensor, particularly in the shadows, so there’s less ambiguity in their measurement, resulting in less noise. Changing the “exposure” slider in your favorite software does not change the measured energy; whatever shot noise you captured at the scene is just multiplied, and a lot easier to see, dang…
I know why such a tool is in software; it’s one of the “carry-over” paradigms from film photography. Staring at an underexposed negative in the darkroom (I’ve done this, BTW), you think through how to print it so it looks okay. It’s all ass-backwards there: your underexposed negative is quite thin, so you back off on the print-paper exposure time so those thin middle tones resolve to gray-ish. Cripes, probably have all that backwards, let me know. Deal is, you have a tool in the darkroom to adjust sub-optimal exposure decisions. It still carries the same implications: you’re not changing what the negative captured; you are changing an exposure, but it’s the exposure of the print paper…
My concern is this, and I’ve seen it play out in all sorts of situations: a person sees the “exposure” tool in their software and concludes that if they underexpose their images at capture they can simply slide this thing up to correct it. And then they wonder why such pictures look terrible…
In rawproc, I named that tool “exposure compensation”. I originally called it just “exposure” like everyone else, but I never felt comfortable with that. The more I think about it, the more I like my ultimate choice: adding “compensation” to the title gives it the proper context.
We can cogitate on all the justifications for attaching the title “exposure” to the corresponding tool, but it really isn’t clear communication. I’m a bit sensitive to such; I’ve previously stated my former day job in general terms, “aerospace engineer”. Where I actually practiced that was with The Boeing Company, 20 years as a senior engineer. We all became very sensitive to the need for clear communication in our recent travails…