true 10-bit color depth

I have a GPU that supports true 10-bit color depth and I’m looking to buy a monitor that supports it as well.

Does darktable support true 10-bit color depth?

I want to make sure darktable supports true 10-bit color depth before buying, since the monitor is expensive and I won’t get the benefits otherwise.

Thanks for any help I can get. I tried to find the answer myself but could not so far.

many years ago i wrote a gtk3 prototype to output 10-bit colour through cairo and x11 (that’s what dt is based on). it was very hard indeed to teach cairo 10 bits. it was either secretly doing 8-bit (you could tell by the banding in gradients) or it was very slow, like 30s for one refresh. i don’t know if this has changed in the meantime; my guess is that efforts would rather have gone into gtk4+friends.
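
For context, here is roughly what such a prototype has to do on the X11 side: find a depth-30 TrueColor visual, create a window with it, and hand it to cairo. A minimal sketch under those assumptions (window management and error handling omitted); whether cairo then keeps all 10 bits end-to-end is exactly the problem described above:

```c
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <cairo/cairo-xlib.h>
#include <stdio.h>

int main(void) {
  Display *dpy = XOpenDisplay(NULL);
  if (!dpy) return 1;

  /* ask the server for a 30-bit (10 bits/channel) TrueColor visual */
  XVisualInfo tmpl = { .depth = 30, .class = TrueColor };
  int n = 0;
  XVisualInfo *vi = XGetVisualInfo(dpy, VisualDepthMask | VisualClassMask, &tmpl, &n);
  if (!vi || n == 0) { fprintf(stderr, "no depth-30 visual\n"); return 1; }

  /* a window using a non-default visual needs its own colormap */
  Window root = DefaultRootWindow(dpy);
  XSetWindowAttributes attr = {
    .colormap = XCreateColormap(dpy, root, vi->visual, AllocNone),
    .border_pixel = 0,
  };
  Window win = XCreateWindow(dpy, root, 0, 0, 512, 512, 0, vi->depth,
      InputOutput, vi->visual, CWColormap | CWBorderPixel, &attr);
  XMapWindow(dpy, win);

  /* wrap the drawable in a cairo surface bound to the 30-bit visual */
  cairo_surface_t *surf = cairo_xlib_surface_create(dpy, win, vi->visual, 512, 512);
  /* ... draw a smooth gradient with cairo here and look for banding ... */

  cairo_surface_destroy(surf);
  XFree(vi);
  XCloseDisplay(dpy);
  return 0;
}
```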

i’m now using vulkan for my image processing, and that does support true 10-bit output. with dithering enabled in 8-bit mode, i can’t say 10 bits make much of a difference to my old eyes.
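
For the curious, the core of 10-bit output in Vulkan is simply asking the surface for a 10-bit swapchain format. A minimal sketch (this is not vkdt’s actual code; the function name and fallback policy are mine):

```c
#include <vulkan/vulkan.h>
#include <stdlib.h>

/* pick a 10-bit-per-channel surface format if the driver offers one,
   falling back to whatever the surface lists first otherwise */
VkSurfaceFormatKHR pick_10bit_format(VkPhysicalDevice pdev, VkSurfaceKHR surf) {
  uint32_t count = 0;
  vkGetPhysicalDeviceSurfaceFormatsKHR(pdev, surf, &count, NULL);
  VkSurfaceFormatKHR *fmts = malloc(count * sizeof *fmts);
  vkGetPhysicalDeviceSurfaceFormatsKHR(pdev, surf, &count, fmts);

  VkSurfaceFormatKHR chosen = fmts[0];
  for (uint32_t i = 0; i < count; i++) {
    if (fmts[i].format == VK_FORMAT_A2B10G10R10_UNORM_PACK32 ||
        fmts[i].format == VK_FORMAT_A2R10G10B10_UNORM_PACK32) {
      chosen = fmts[i];  /* 10 bits each for r, g, b; 2 bits alpha */
      break;
    }
  }
  free(fmts);
  return chosen;  /* feeds VkSwapchainCreateInfoKHR.imageFormat/imageColorSpace */
}
```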


Also, 10-bit support is only part of it; these days one also has to take care of compositing for HDR (e.g. UI elements at a particular brightness level while the image surface remains unlimited…), and I don’t think anyone is quite there yet.

And yes, 8-bit vs 10-bit doesn’t really make a huge difference at 100-200 nits (SDR), but it makes a real difference on >1000 nit HDR displays.
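
A quick back-of-the-envelope computation illustrates why. Compare the relative luminance step between adjacent code values on a gamma-2.2 SDR display peaking at 200 nits with a PQ (SMPTE ST 2084) display near 1000 nits; steps much above roughly 1% contrast tend to show as banding. The display models here are my simplifying assumptions, not measurements:

```c
#include <math.h>
#include <stdio.h>

/* SMPTE ST 2084 (PQ) EOTF: encoded value e in [0,1] -> luminance in nits */
static double pq_eotf(double e) {
  const double m1 = 2610.0/16384.0, m2 = 2523.0/4096.0*128.0;
  const double c1 = 3424.0/4096.0, c2 = 2413.0/4096.0*32.0, c3 = 2392.0/4096.0*32.0;
  double p = pow(e, 1.0/m2);
  return 10000.0 * pow(fmax(p - c1, 0.0) / (c2 - c3*p), 1.0/m1);
}

/* a simple gamma-2.2 SDR display peaking at 200 nits */
static double sdr_eotf(double e) { return 200.0 * pow(e, 2.2); }

int main(void) {
  /* SDR, 8-bit: relative step between adjacent codes at mid-grey */
  double s0 = sdr_eotf(128.0/255.0), s1 = sdr_eotf(129.0/255.0);
  printf("SDR  8-bit step @ %6.1f nits: %.2f%%\n", s0, 100.0*(s1 - s0)/s0);

  /* HDR (PQ): codes 192/255 and 770/1023 both decode to ~1000 nits */
  double h0 = pq_eotf(192.0/255.0), h1 = pq_eotf(193.0/255.0);
  double t0 = pq_eotf(770.0/1023.0), t1 = pq_eotf(771.0/1023.0);
  printf("PQ   8-bit step @ %6.0f nits: %.2f%%\n", h0, 100.0*(h1 - h0)/h0);
  printf("PQ  10-bit step @ %6.0f nits: %.2f%%\n", t0, 100.0*(t1 - t0)/t0);
  return 0;
}
```

The 10-bit PQ step lands under 1% contrast, while the 8-bit PQ step is roughly four times coarser, which is why banding that is tolerable in SDR becomes obvious on bright HDR displays.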


I have 10-bit support from an Nvidia Quadro GPU and an EIZO CG279X wide-gamut monitor with 10-bit support per channel.

I can get X11 to support 30 bits, but not the Linux applications: GIMP, RawTherapee, and darktable.

30-bit support on Linux GNOME is lost somewhere between X11 and the final render in the program. I suspect, but do not know, that this is due to GNOME/GTK/Cairo. @hanatos suspects Cairo.

I am on Debian 11 with GNOME and GTK3; GTK4 is unknown to me.

Gnome and GIMP and 10 bit

GIMP dev mail list:
https://www.mail-archive.com/gimp-developer-list@gnome.org/msg09813.html

I have not tested recently, so my feedback is a bit dated.

It seems one needs Vulkan or Windows/macOS/Photoshop to get 10 bits per channel to work.

fwiw vkdt renders 10 bits per channel / 30 per pixel just fine. have not tested on bright displays… but it seems to me a sufficiently fine resolution + dithering at 8 bits would still make it indistinguishable from true 10 bits. might test one of these days.
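
The dithering in question is easy to sketch. A minimal example (not vkdt’s actual implementation): quantize a [0,1] float to 8 bits with triangular-pdf noise, so the rounding error becomes fine grain instead of visible bands:

```c
#include <stdint.h>
#include <stdlib.h>

/* quantize a linear value in [0,1] to 8 bits with triangular-pdf dither:
   two uniform noises in [-0.5,0.5) sum to noise spanning one full LSB,
   which decorrelates the rounding error from the signal, so smooth
   gradients show as fine grain rather than bands */
static uint8_t dither_quantize_8bit(float v) {
  float noise = ((float)rand() / RAND_MAX - 0.5f)
              + ((float)rand() / RAND_MAX - 0.5f);
  float q = v * 255.0f + noise + 0.5f;   /* +0.5 for round-to-nearest */
  if (q < 0.0f)   q = 0.0f;
  if (q > 255.0f) q = 255.0f;
  return (uint8_t)q;
}
```

In practice one would use a cheap hash or blue-noise texture per pixel instead of rand(), but the principle is the same.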

I just ran some experiments on my system. X11 reports a 30-bit pipeline: xdpyinfo shows both 8- and 10-bit depths, and xwininfo gives 30- or 32-bit depth. However, the real output is still truncated to 8 bits somewhere, because in dt, gimp,… I can see ugly banding on smooth tiff gradients.
By contrast, a vkdt-generated 10-bit ramp looks smooth, so the vkdt->X11->GPU->display pipe is indeed 30-bit. I suppose one needs Vulkan rendering to get real 30-bit output. Now I’m puzzled how to make X11 render stuff via zink instead of direct radeonsi rendering.
I also think that GLES2/EGL is preferable to the old GLX pipeline in this regard. So far no success, and I could not find any relevant info on how it should work, either. Any ideas?
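
For anyone wanting to reproduce the check, the probes described above look roughly like this (the grep patterns, and the Mesa override on the last line, are my suggestions rather than anything confirmed above):

```
xdpyinfo | grep -i depth    # screen depths; a 30-bit setup should list 30
xwininfo | grep Depth       # then click the app window; prints its depth

# on Mesa, this env var is one way to try forcing the zink driver:
MESA_LOADER_DRIVER_OVERRIDE=zink glxinfo | grep "renderer string"
```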


A little update on this.

  1. After a massive internet search I managed to get zink working by switching from xf86-video-amdgpu to the X server’s built-in modesetting driver in xorg.conf (see the snippet below).
  2. amdgpu turns out to have a deep_color parameter, which defaults to… right, 0. So one needs to add a line to modprobe.d (in case amdgpu is loaded as a module) or add amdgpu.deep_color=1 to the kernel command line (if it is built into the kernel).

At the moment I can’t investigate this further due to lack of time, but it seems that 10-bit output is an achievable target.
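
For reference, the two changes look roughly like this (file and section names are mine; adapt them to your setup):

```
# /etc/X11/xorg.conf.d/20-gpu.conf : use the built-in modesetting driver
Section "Device"
    Identifier "gpu0"
    Driver     "modesetting"
EndSection

# /etc/modprobe.d/amdgpu.conf : when amdgpu is loaded as a module
options amdgpu deep_color=1

# when amdgpu is built into the kernel, append to the kernel command line instead:
#   amdgpu.deep_color=1
```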