Help me understand the choices behind "color zones"

I am a heavy user of the color zones module in darktable.

In the current implementation it sits in the display-referred part of the pipeline, after filmic / base curve / sigmoid. I have followed with interest darktable's transition towards a solid scene-referred workflow.

My question is: where is the optimal position for it in the pipe? Which color space is optimal?
Is it already in the “best” spot?

Right now color zones uses the CIE LCh color space [darktable 3.6 user manual - color zones]. Is this the best choice?

Recently I have been reading more about color models and spaces. In particular, my interest was sparked by @anon41087856's post about the JzAzBz color space in the revamped color balance module [rebooting color balance - #143 by pass712, The sRGB book of color - Aurélien PIERRE Engineering, https://www.osapublishing.org/oe/fulltext.cfm?uri=oe-25-13-15131&id=368272]. JzAzBz is designed for high-dynamic-range applications, and thus it seems to fit a scene-referred workflow.

Today I stumbled upon a very interesting blog post by Björn Ottosson that discusses blending of colors in sRGB/linear/perceptual color spaces [How software gets color wrong].
Have a look in particular at the color blending comparison section:
[figure: color blending comparison, from How software gets color wrong]

My naive conclusion from this post: it confirms sRGB as a bad choice for smoothing/color blending; linear behaves much better and in a physically accurate way; and a perceptual space seems more predictable. Look in particular at the transitions involving blue shades.
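To make the sRGB-vs-linear difference concrete, here is a minimal Python sketch (my own illustration, not darktable code) that averages two values naively in encoded sRGB and then properly in linear light, using the standard sRGB transfer function:

```python
def srgb_to_linear(c):
    """Decode an encoded sRGB value (0..1) to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode a linear-light value (0..1) back to sRGB."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend_naive(a, b):
    """Average directly on the encoded values -- what 'bad' software does."""
    return (a + b) / 2

def blend_linear(a, b):
    """Decode, average the actual light intensities, re-encode."""
    return linear_to_srgb((srgb_to_linear(a) + srgb_to_linear(b)) / 2)

# 50/50 mix of black and white:
print(blend_naive(0.0, 1.0))   # 0.5 -- too dark
print(blend_linear(0.0, 1.0))  # ~0.735 -- the physically correct mix of the two light fluxes
```

The naive blend lands on mid-encoded gray, while the linear blend correctly mixes the actual photon fluxes, which is exactly the difference visible in the blog's comparison strips.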

Also, what do you think about using color zones?
Any comment is much appreciated!


I can’t answer much of the technical stuff, but color zones is placed in the optimal spot by default. It uses a display-referred space to do its work.

Using it is just fine; I use it a lot for selective desaturation, and it works really well.


Yes, JzAzBz should be good for HDR, but I’m unsure what you mean by “it fits a scene-referred workflow”.

JzAzBz is designed to be a perceptually uniform colorspace. A scene-referred colorspace will not be perceptually uniform.

The blog post is interesting, especially in the choice of OSA-UCS for the perceptual colorspace, as that doesn’t have a closed-form transformation to CIEXYZ, so using it for image editing would be messy and slow.

Perceptual colorspaces are fun, because there are so many to choose from.

The blog doesn’t address the question of what primaries should be used for colorspaces using RGB models. There have been discussions on that issue in this forum.

I use it a lot, especially to enhance or hue-shift colours separately, and have set up some presets, e.g. for “Blue Sky”, “Green Leaves”, etc.

Imho it’s spot on to use: it has an easy-to-understand user interface and simply does what you expect.

Indeed, I am also unsure of my own wording. In my limited understanding, one requirement of a scene-referred workflow is dealing with the full dynamic range of the scene.
Then we should make changes that are believable in a physical sense: since we are dealing with amounts of photons, we should work in a space that mimics the behavior of mixing real photon fluxes.

So I guess working in a linear scene-referred color space implies being in a non-perceptually-uniform space.
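A quick way to see this non-uniformity (my own illustration): CIE L* lightness is roughly a cube root of linear luminance Y, so doubling the light does not double the perceived lightness:

```python
def lstar(y):
    """CIE L* lightness from relative luminance Y (0..1, relative to white)."""
    return 116 * y ** (1 / 3) - 16 if y > 0.008856 else 903.3 * y

print(lstar(0.18))  # ~49.5: 18% gray already sits near the middle of the lightness scale
print(lstar(0.36))  # ~66.5: doubling the linear light adds only ~17 L*, far from doubling
```

Equal steps in linear light are therefore very unequal steps perceptually, which is why a space can be physically linear or perceptually uniform, but not both.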

In general I see photo-editing divided in two steps:

  • fixing the raw light data so that they are represented on our screens in a believable way (best done in a linear scene-referred workflow), or in an alternate reality that is still physically believable.
  • artistic manipulation of colors in order to achieve a look, matching analog film colors for example.

The color zones module fits the second step.
This second step is more convenient in a perceptual color space.
And the optimal way to do so is after converting linear data to non-linear (scene- to display-referred transform).
Did I get it right?

Then, what do you think about the Oklab perceptual space from the same blog, A perceptual color space for image processing? He compares it against OSA-UCS there as well.

[figure: plots of the Munsell data, from A perceptual color space for image processing]
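For reference, unlike OSA-UCS, Oklab has a simple closed-form transform from linear sRGB. A minimal sketch below uses the matrices published in Ottosson's blog post (constants copied from there; verify against the post before relying on them):

```python
def linear_srgb_to_oklab(r, g, b):
    """Convert linear sRGB to Oklab (matrices from Ottosson's blog post)."""
    # Linear map into an LMS-like cone space
    l = 0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b
    m = 0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b
    s = 0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b
    # Cube-root non-linearity: the perceptual compression step
    l_, m_, s_ = l ** (1 / 3), m ** (1 / 3), s ** (1 / 3)
    # Linear map to L (lightness), a (green-red), b (blue-yellow)
    return (
        0.2104542553 * l_ + 0.7936177850 * m_ - 0.0040720468 * s_,
        1.9779984951 * l_ - 2.4285922050 * m_ + 0.4505937099 * s_,
        0.0259040371 * l_ + 0.7827717662 * m_ - 0.8086757660 * s_,
    )

print(linear_srgb_to_oklab(1.0, 1.0, 1.0))  # white -> approximately (1, 0, 0)
```

Just two matrix multiplies and a cube root, so it is cheap enough for per-pixel image editing, which is part of its appeal compared to OSA-UCS.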

Indeed I also think it is very convenient for fine tuning of colors, especially foliage and skies. And it is pretty intuitive to play around.

But in my limited experience I have a hard time figuring out how this tool compares to state-of-the-art alternatives in other pieces of software (and recently published tools).

Generally, I try to keep the curves as smooth as possible, because I remember getting weird outputs when fancy slopes are applied. This might be more of a misuse of the tool than an issue with the color space or the technicalities of the module.

For example, I feel quite bad about these examples in the manual. I wouldn’t do this to my images. :upside_down_face:
[example screenshots from the manual]

Is this also your view?

Also, I don’t know how color-accurate the GUI of the module is, but I see hue shifts with saturation, for example.
[screenshot of the module GUI]

Anyway, I like that others appreciate the module as much as I do. :slight_smile:

So I guess, working in a linear scene-referred color space implies to be in a non-perceptually uniform space.

Yes.

In general I see photo-editing divided in two steps:

As a simplification, that’s fair enough. Where would you put denoising? And sharpening?

My editing involves four classes of colorspace:

  1. Camera RGB, before and after demosaicing. This is a linear RGB, possibly plus an offset. I denoise before demosaicing, although the degree (if any) of denoising is an aesthetic decision.

  2. Scene-referred linear RGB, which is additive (so there is no offset), using some standard primaries.

  3. Perceptual (lightness, chroma, hue). Some people denoise here.

  4. Output-referred, either additive RGB or subtractive CMYK.

I don’t have enough experience editing with different perceptual colorspaces to say which is the best for my own editing, let alone anyone else’s.

I agree they are quite convoluted examples. Feel free to propose an alternative :slight_smile:
