Which is why I said HDR in monitor lingo. When you buy an “HDR” monitor, it’s a premium product carrying industry certifications (namely DisplayHDR from VESA) that guarantee minimum luminance, gamut coverage, etc etc.
Also, to get a wider color gamut, you generally need more bits to prevent posterization. Adobe RGB you can get away with, but for DCI P3 and especially Rec 2020 you’re gonna have a bad time.
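A rough way to see why: the same number of code values gets stretched over a larger slice of the visible colors, so each 8-bit step covers a bigger perceptual jump. A quick back-of-envelope sketch comparing the chromaticity triangle areas of the sRGB and Rec 2020 primaries (xy coordinates from the respective specs):

```python
# Rough illustration: the wider the gamut, the farther apart adjacent
# code values land, so 8 bits posterizes sooner in Rec 2020 than in sRGB.

def triangle_area(primaries):
    """Shoelace formula over three (x, y) chromaticity points."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE 1931 xy primaries (R, G, B) from the respective specs
srgb    = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

ratio = triangle_area(rec2020) / triangle_area(srgb)
print(f"Rec 2020 covers ~{ratio:.1f}x the chromaticity area of sRGB")
# Same 2^8 code values per channel spread over ~1.9x the area means
# coarser steps between neighboring colors -- hence 10 bits for wide gamut.
```

This is only a chromaticity-area argument, not a full perceptual-uniformity analysis, but it shows the direction: roughly double the area with the same code budget.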
It sounds like the gamut was responsible for the beautiful colours here, not the brightness or contrast ratio. (And even if it was the brightness and contrast ratio, those things would be due to your screen
Brightness /does/ impact contrast ratio. Generally when HDR content plays, your display goes into full brightness to increase contrast ratio. I hope I don’t have to explain why that is the case. They also might activate things like local dimming to further improve contrast ratio.
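The arithmetic here is simple: contrast ratio is peak luminance divided by black level, so maxing the backlight raises the numerator and local dimming lowers the denominator. A sketch with hypothetical LCD numbers (not measurements of any particular panel):

```python
# Back-of-envelope: contrast ratio = peak luminance / black level.
# Pushing the display to full brightness raises the numerator;
# local dimming lowers the denominator. Numbers below are made up
# for illustration, not measured from a real panel.

def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

sdr_mode = contrast_ratio(300, 0.30)    # typical SDR brightness
hdr_mode = contrast_ratio(1000, 0.30)   # same panel, backlight maxed
hdr_dim  = contrast_ratio(1000, 0.05)   # plus local dimming engaged
print(round(sdr_mode), round(hdr_mode), round(hdr_dim))
# 1000:1 -> ~3333:1 -> 20000:1
```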
When HDR TVs and such play SDR content, they usually run at a lower luminance than their maximum (unless you specifically override it somehow) so things don’t look too bright, because SDR content was mastered assuming the white point sits at 100-300 nits or thereabouts.
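You can see how low SDR white sits in the HDR signal range by running those luminance values through the PQ (SMPTE ST 2084) inverse EOTF, which maps absolute nits onto a [0, 1] signal. A sketch using the constants from the spec:

```python
# PQ (SMPTE ST 2084) inverse EOTF: absolute luminance in nits -> [0, 1]
# signal value. Constants straight from the spec.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits):
    y = nits / 10000.0           # PQ is defined up to 10,000 nits
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

for nits in (100, 300, 1000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_encode(nits):.3f}")
# 100-nit SDR white lands around 0.51 -- only about half the signal range,
# which is why SDR content shown at an HDR display's peak looks blown out.
```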
they would not be embedded in the video.
They literally are. This is where my knowledge falls short, but there’s a lot of content out there that says “alright, we want this to be displayed at 600 nits” or whatever (along with a bunch of other metadata, such as a built-in tonemap in case your display can’t hit 600 nits, and even a LUT to move between color spaces). This is called HDR metadata. I believe even still pictures can be embedded with this kind of info. Check out colorist: https://github.com/joedrago/colorist, which aims to be a CLI app to get that done.
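For a concrete feel of what static HDR10 metadata looks like, here’s a sketch that builds the SMPTE ST 2086 “mastering display” string in the format encoders like x265 accept: chromaticities in units of 0.00002, luminance in units of 0.0001 nit. The specific values below are an assumption for illustration (a Display P3 mastering monitor, D65 white, 1000-nit peak):

```python
# Sketch: build an x265-style `master-display` metadata string (SMPTE
# ST 2086 units: chromaticity x50000, luminance x10000). The chosen
# primaries/luminance are illustrative assumptions, not universal values.

def master_display(r, g, b, wp, max_nits, min_nits):
    c = lambda xy: (round(xy[0] * 50000), round(xy[1] * 50000))
    gx, gy = c(g); bx, by = c(b); rx, ry = c(r); wx, wy = c(wp)
    return (f"G({gx},{gy})B({bx},{by})R({rx},{ry})WP({wx},{wy})"
            f"L({round(max_nits * 10000)},{round(min_nits * 10000)})")

# Assumed: Display P3 primaries, D65 white point, 1000 / 0.0001 nit range
s = master_display(r=(0.680, 0.320), g=(0.265, 0.690), b=(0.150, 0.060),
                   wp=(0.3127, 0.3290), max_nits=1000, min_nits=0.0001)
print(s)
# G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)
```

Alongside this, content usually carries MaxCLL/MaxFALL (brightest pixel / brightest average frame), which is the “display this at N nits” part.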
Often, the HDR metadata needs to be sent from the app to the OS through the GPU driver to the HDR-compatible display for everything to click together and work. Most HDR displays work in sRGB compatibility mode (AFAIK) if you don’t pass this information, in fact. Which is my concern. How the heck do I create an output raster that will trigger all the good stuff and make it look nice on my OLED phone?
edit: to add, there were plenty of indications that the HDR content flipped a bunch of switches on my phone and put it into a special mode. The brightness became locked at 100% even though I tried to pull it down, and the color gamut definitely looked expanded: there were regions that, when viewed on an sRGB screen like my laptop, looked deepfried, with no separation between medium-bright saturated and full-luminance saturated colors. All automatic. I’m sure it’s because the video had embedded HDR metadata, which I know is a thing (look it up). HDR seems to be well-built for video, and that ecosystem is way more mature.
And if they were due to your screen, it would be the same for all content viewed on that screen, not unique to that video.) As stated above, to take advantage of this, set ‘working profile’ to something wide. Rec 2020, ACES and ProPhoto are all wider than DCI P3.
If the output profile is sRGB, darktable will convert from the working profile to the output profile using a rendering intent like “perceptual” or “relative colorimetric” (I’m not even sure which one they use, because there is no way to set it). Oh, by the way, the default working profile in dt is linear Rec 2020. So yeah, almost everyone works in wide gamut mode.
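That conversion step is exactly where the “deepfried, no separation” look comes from if it’s done naively. A sketch of the problem, converting a pure Rec 2020 green to linear sRGB with the standard D65 matrices:

```python
# What a naive wide-to-narrow gamut conversion faces: linear Rec 2020
# -> XYZ -> linear sRGB via the standard D65 matrices. A saturated
# Rec 2020 color lands outside [0, 1] in sRGB and must be clipped or
# remapped -- clipping is what erases separation between "very
# saturated" and "maximally saturated" colors.

REC2020_TO_XYZ = [(0.6370, 0.1446, 0.1689),
                  (0.2627, 0.6780, 0.0593),
                  (0.0000, 0.0281, 1.0610)]
XYZ_TO_SRGB = [( 3.2406, -1.5372, -0.4986),
               (-0.9689,  1.8758,  0.0415),
               ( 0.0557, -0.2040,  1.0570)]

def mat_vec(m, v):
    return tuple(sum(row[i] * v[i] for i in range(3)) for row in m)

green_2020 = (0.0, 1.0, 0.0)                      # pure Rec 2020 green
srgb = mat_vec(XYZ_TO_SRGB, mat_vec(REC2020_TO_XYZ, green_2020))
print([round(c, 3) for c in srgb])                # roughly [-0.59, 1.13, -0.10]
```

Rendering intents like “perceptual” exist precisely to remap these out-of-gamut values gracefully instead of hard-clipping them.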
I am hoping that setting the output profile to DCI P3 is enough to let me display with better capabilities, but considering your other inaccuracies I’m a bit worried about whether this will actually work. This seems to be a path that not many have gone down. Semi-related tangent: remember the Android wallpaper brick incident? It was caused by a software bug that was triggered by a DCI P3 image: after multiplying with a luminance matrix to convert to grayscale, floating point rounding errors caused an array index to overflow. Whoops. Looks like these HDR images are a new thing.
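A heavily simplified reconstruction of that class of bug (this is not Android’s actual code, just a sketch of the failure mode): luminance gets computed in floating point, rounded, and used to index a 256-entry histogram. After a color space conversion, a channel can come out slightly above 255, the rounded luminance hits 256, and the index runs off the end of the array.

```python
# Simplified sketch of the wallpaper-crash failure mode (NOT the real
# Android code): float luminance rounded and used as a histogram index.
# If a channel exceeds 255 slightly after a color space conversion,
# the rounded luminance can reach 256 and overflow a 256-entry array.

def luminance_index(r, g, b):
    return round(0.2126 * r + 0.7152 * g + 0.0722 * b)  # Rec 709 weights

histogram = [0] * 256
r = g = b = 255.6        # hypothetical out-of-range value post-conversion
try:
    histogram[luminance_index(r, g, b)] += 1
except IndexError:
    print("boom: index 256 out of range")
```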
If you are specifically outputting something to the DCI P3 space, then you can either set that as working profile, or set one of the others mentioned, with DCI P3 set as export/output profile. And for accuracy it would of course be preferable to edit and view on a monitor that covers (as close as possible to) 100% DCI P3.
I don’t have my DCI P3 monitor yet. I am hoping that everything works well and I get something with a wide dynamic range, but I’m a little worried about whether 10 bit output and the appropriate HDR metadata passing are actually supported end to end.
I was hoping that someone with a bit more knowledge of the ecosystem, or someone with prior experience, could share a bit more insight. Again, repeating my original question: what file format do I use to get 10 bit output and the appropriate HDR metadata passed through to the monitor with good compatibility? Does it even exist?