Not necessarily. Bit depth is one thing, the TRC (tone reproduction curve) is another beast entirely. More bits generally means smoother gradients, as a rule, but 10 bit usually isn't enough when the values are encoded linearly. IIRC, HDR10 is technically limited to a maximum of 10,000 nits peak brightness (though common HDR10 content is mastered with a peak brightness of 1,000 to 4,000 nits). The Dolby Vision standard allows 12 bpc.
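
Here's a minimal sketch of that point (Python with numpy, both my own choices; the PQ constants are the published SMPTE ST 2084 ones, the rest is just illustration): spread 10 bits linearly over 0–10,000 nits and a single code step is already ~10 nits, which is enormous down in the shadows, while the PQ curve used by HDR10 makes the near-black steps tiny and lets them grow towards the highlights.

```python
# Why 10 bits are not enough when spent linearly: one 10-bit linear step
# over a 0..10,000 nit range is ~10 nits, while the PQ curve (SMPTE ST 2084)
# used by HDR10 keeps the steps near black tiny.
import numpy as np

def pq_eotf(e):
    """PQ (ST 2084) decode: code value 0..1 -> luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = np.asarray(e, dtype=np.float64)
    p = e ** (1 / m2)
    return 10000.0 * (np.maximum(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

codes = np.arange(1024) / 1023.0            # all 10-bit code values (full range)

linear_nits = codes * 10000.0               # naive linear encoding
pq_nits = pq_eotf(codes)                    # PQ encoding

print(f"linear 10-bit step near black: {linear_nits[1] - linear_nits[0]:.3f} nits")
print(f"PQ     10-bit step near black: {pq_nits[1] - pq_nits[0]:.6f} nits")
print(f"PQ     10-bit step near 10000: {pq_nits[-1] - pq_nits[-2]:.1f} nits")
```

The big PQ step up near 10,000 nits is fine, because what matters perceptually is the step relative to the brightness you're already at, not the absolute step in nits.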
But with low bit depth, besides brightness issues, we also get poor tonal smoothness. JPEG's TRC exists because 8 bits are far too few for quality image reproduction if spent linearly. So the developers exploited a feature of human vision (roughly logarithmic sensitivity to light) to push the code density towards the darker zones, leaving the lighter ones sparse. As a result, in JPEG images you often see banding in the sky, because it sits in the lighter, sparser zone. And please don't forget that sRGB and JPEG are standards from the CRT era. Viewing hardware has made a huge leap since then, but on the software side we are still holding onto those ancient formats…
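
To put a rough number on that (again just a Python/numpy sketch of the idea, nothing authoritative): decode all 256 sRGB codes back to linear light and look at where they land.

```python
# How an 8-bit gamma-encoded TRC distributes its 256 codes over linear light:
# most codes land in the dark half; the bright half (where a sky gradient
# lives) gets comparatively few.
import numpy as np

def srgb_to_linear(v):
    """sRGB decoding, code value and result both in 0..1."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

codes = np.arange(256) / 255.0          # all 8-bit code values
lin = srgb_to_linear(codes)             # their linear-light equivalents

dark_half = np.count_nonzero(lin < 0.5)    # codes covering 0..50% linear
light_half = 256 - dark_half               # codes covering 50..100% linear
print(f"codes in the dark half of linear light:  {dark_half}")
print(f"codes in the light half of linear light: {light_half}")

# Linear-light distance between neighbouring codes: the step near white is
# far bigger than near black, which is where smooth bright gradients band.
steps = np.diff(lin)
print(f"step near black: {steps[1]:.6f}   step near white: {steps[-1]:.6f}")
```

Roughly 190 of the 256 codes end up covering the darker half of the linear range, leaving only about 70 for the brighter half, and the linear step near white comes out around 30 times bigger than near black — exactly the sparse zone a smooth sky gradient has to squeeze into.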