I’m not at all sure that this is an exposure issue.
I think a reasonably good modern digital camera can capture this without much trouble. The hardware we use to show images in their full glory, however, seems to be out of reach (technically and/or financially).
The best monitors that I know of are 10-bit wide gamut. I'm not technically proficient enough in this area, but I do believe that not even that is enough to correctly display all the information a (very) good modern digital camera can produce.
I do hope somebody proves me wrong and points me to a somewhat affordable monitor that can handle modern RAWs in all their lustre.
Yes, as the others have suggested, it is not an exposure issue. Here is a screenshot of three waveforms. Left: raw, no demosaic, everything else off. Middle: raw + WB (D65), no demosaic, everything else off. Right: raw + WB (D65), VNG4 demosaic, everything else off.
The red channel overexposure only appears post-demosaic, so it is easily fixed in the raw converter by reducing exposure, since the data is all there (this was the bit that confused my edit, as in the dt workflow it is so rare that I ever have to reduce exposure). Instead of doing it in post, you could do what Carvac suggests and leave two stops of headroom at capture, but wouldn't that produce a slightly noisier image? If you wanted a nice OOC JPEG, that is how it would have to be done.
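To make the "only post-WB/demosaic" point concrete, here is a minimal numpy sketch with made-up numbers (not taken from this raw file): the raw channel values sit safely below clipping, but multiplying by typical daylight white balance coefficients pushes red past 1.0, and a small negative exposure compensation in the converter brings it back.

```python
import numpy as np

# Hypothetical normalized raw values for one pixel region (0.0 = black, 1.0 = clip).
# None of the channels is clipped in the raw data itself.
raw = np.array([0.55, 0.40, 0.30])   # R, G, B before white balance

# Illustrative daylight-ish white balance multipliers; real values depend
# on the camera and are read from the raw metadata.
wb = np.array([2.0, 1.0, 1.5])

after_wb = raw * wb                  # -> [1.10, 0.40, 0.45]
print("after WB:", after_wb, "clipped:", after_wb > 1.0)

# Reducing exposure in the raw converter is just a global scale factor,
# here roughly -0.2 EV, which brings red back under the clip point.
ev = -0.2
after_exposure = after_wb * 2.0 ** ev
print("after exposure comp:", after_exposure, "clipped:", after_exposure > 1.0)
```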
Do the bits matter here? I thought this was purely a gamut issue, and the only way to see this image properly would be on a wide gamut monitor. My understanding is that the main benefit of higher-bit monitors is greater clarity around banding. That is, if you see banding on an 8-bit monitor, you can't be 100% sure whether the banding is in the image or due to the screen's lack of bits to display the gradient properly (which would be extremely rare, as 8 bits is fine for 99% of real-world scenarios). But if you have a 10-bit monitor and see banding, you can be sure it is in the image, not the screen's bit depth. How wide does a gamut have to be before 10 bits starts becoming a requirement? I am looking to buy a new monitor soon, and I will be prioritising gamut size over number of bits. True 10-bit monitors (not 8 bit + 2 bit dithering) are probably priced out of my range anyway.
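As a rough illustration of the banding point (a sketch, not tied to any particular monitor): quantizing a smooth gradient to 8 vs 10 bits shows how many more distinct steps 10 bits gives you, which is where the extra bits help, independently of how wide the gamut is.

```python
import numpy as np

# A smooth gradient of linear values in [0, 1].
gradient = np.linspace(0.0, 1.0, 4096)

def quantize(values, bits):
    """Round to the nearest representable level for the given bit depth."""
    levels = 2 ** bits - 1
    return np.round(values * levels) / levels

for bits in (8, 10):
    q = quantize(gradient, bits)
    steps = len(np.unique(q))
    # The step size is how large a jump each potential band represents.
    print(f"{bits}-bit: {steps} distinct levels, step size {1 / (2**bits - 1):.6f}")
```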
I am under the impression that a wide gamut monitor is always 10-bit (or better). I automatically assumed that the bits would matter. But as I stated before, my tech knowledge isn't up to spec in this case.
If Adobe RGB counts as "wide gamut", then there are wide gamut monitors that only have 8-bit output. 10 bits per channel is important for gradation, but it seems not to matter in this case, which is mainly gamut related.
Is it, though? You are correct, of course: it is wider than Rec.709, which seems to be the measure, but for whatever reason I don't put Adobe RGB into the WCG range (Rec.2020, ProPhoto: yes).
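For a rough sense of how much "wider" each space actually is, here is a small sketch comparing the area of each space's primaries triangle in the CIE 1931 xy chromaticity diagram. This is only a crude 2D proxy for gamut size (it ignores the 3D shape of the gamut, and ProPhoto's primaries even lie partly outside the spectral locus), but it shows Adobe RGB sitting well above Rec.709 yet well below Rec.2020 and ProPhoto.

```python
# Standard published xy chromaticities of the RGB primaries.
PRIMARIES = {
    "Rec.709 / sRGB": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "Adobe RGB":      [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)],
    "Rec.2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
    "ProPhoto RGB":   [(0.7347, 0.2653), (0.1596, 0.8404), (0.0366, 0.0001)],
}

def triangle_area(pts):
    """Shoelace formula for the area of the primaries' triangle."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ref = triangle_area(PRIMARIES["Rec.709 / sRGB"])
for name, pts in PRIMARIES.items():
    area = triangle_area(pts)
    print(f"{name:15s} area {area:.4f} ({area / ref:.2f}x Rec.709)")
```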
But the answers given above clearly point out that this isn't an "X bits matters" issue, as I had assumed. Thanks for clearing that up!
It might be interesting to know that this is a semi-old problem.
Here is what I wrote five years ago. In those days, it was yellow petals
that were problematic… They probably still are.
@Claes: It definitely is an old issue, and you see it in the red/orange/yellow range; flowers seem to particularly like to mess with you in this regard. I have plenty of examples lying around that made for interesting editing moments…
BTW: The filebin link expired in the old post. Any chance you can upload the RAW here again? I’m asking 'cause I’m curious to see how Morgan tackles that problem.