Yesterday, on a whim, I tried out ten-bit-per-channel display on my desktop, since I have an Nvidia card and a monitor that accepts ten bits and presumably dithers down to its 8-bit panel (I'm not sure whether the dithering is temporal, like on older Dell monitors I've used, or spatial, like on my laptop with its 6-bit panel).
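For anyone who wants to try the same thing: on the Nvidia proprietary driver, 30-bit mode is enabled by setting the screen depth in xorg.conf. Roughly this (the `Screen0`/`Device0` identifiers are placeholders; match them to your own config):

```
Section "Screen"
    Identifier   "Screen0"
    Device       "Device0"     # your Nvidia GPU's Device section
    DefaultDepth 30            # 10 bits per color channel
    SubSection "Display"
        Depth 30
    EndSubSection
EndSection
```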
I first tried Openbox, assuming it would have fewer issues, but it didn't work at all.
Then I tried normal Unity, which surprisingly worked almost perfectly, except for the window decorations: they assumed there were 8 bits of alpha rather than the two bits left over from 32-(3*10), so they came out blue and wonky.
xwininfo reported that everything was running in 30-bit mode, but nothing was actually using it. I had read somewhere that Krita supports 30 bits per pixel, but it showed no less banding when I generated gradients and then blurred them (to rule out the gradient tool itself as the culprit).
ImageMagick's display likewise didn't benefit.
Qt doesn't seem to be able to display 30-bit images. I half-heartedly tried to make Filmulator's pipeline output a proper 30-bit QImage, but it somehow had horrible posterization (maybe I made wrong assumptions when writing my bit twiddling), and zooming in really far on the pixels (effectively making gradients) led to the same banding as in 24-bit mode.
Has anyone tried this before and gotten it working?
If even one app benefits I would be willing to use it, even with broken window decorations.
This will also matter for app developers in the future, since higher-dynamic-range monitors and ten-bit color are supposedly coming to consumers shortly. Those will involve different color spaces and gamma curves, though, not just more precision within the same gamma curve.