ART - White Balance & Vectorscope

How do you find the right white balance if there is no neutral reference in the image?

In every good video editor there is a vectorscope and a reference line for skin tones. Does ART have this too or is the Hue-Chroma Vectorscope completely different there?

Here is the vectorscope from Kdenlive with its I/Q lines, where the I line is the decisive one for skin tone.

The interesting thing is that every person’s skin tone lies exactly on this line; only the brightness and saturation vary. This is because skin is basically colorless and the color of blood is the same in all people.
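The claim is easy to probe numerically. Here is a minimal sketch, assuming the commonly published approximate NTSC YIQ coefficients and a skin-tone RGB sample I made up for illustration (this is not Kdenlive’s code):

```python
import math

def rgb_to_iq(r, g, b):
    """Convert gamma-encoded R'G'B' (0..1) to the NTSC I and Q components
    using the commonly published approximate coefficients."""
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return i, q

# Hypothetical skin-tone sample (sRGB 200, 150, 125), chosen for illustration.
i, q = rgb_to_iq(200 / 255, 150 / 255, 125 / 255)
angle = math.degrees(math.atan2(q, i))
print(f"I={i:.3f} Q={q:.3f} angle from +I axis: {angle:.1f} deg")
```

For this one sample the point lands within roughly ten degrees of the +I axis, which is consistent with the claim, but of course a single made-up sample proves nothing.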

But if ART’s vectorscope is the same as that of a video editor, why is the color wheel rotated? I’m curious what you know about this and can tell me.

We’ve had this same discussion over in darktable land… And what you’re saying is certainly not true. I’ll leave it at that.

Yeah, I think even though it is often explained that way, if it were that simple then Google wouldn’t be assigning so many resources and using machine learning to try to do a better job :slight_smile:

I think you can even access the database from there to see the sort of test images and process they use…

Here I plotted the 10 spots from the above-mentioned Monk scale in the CIE xy chromaticity diagram (a Caucasian portrait is loaded; that’s the white pixel scatterplot in the background).

Yeah, certainly not a line, except maybe at the very pale end, but I think it still supports the notion that skin colour is all the same.

Hello @paperdigits
What exactly do you mean is not true?
Are you saying that the skin tone line (I) does not really indicate the correct color for skin? At least that is how it works in every video editor.

It is a great pity that we find the most basic of color plots re-named to something stupid like “Vectorscope”.

Here is @Micha’s crop opened in ColorThink as a 2D CIELAB plot … look familiar?:

And the other plot is simply a projection of Hue and Chroma in the HSL/HSV chromaticity plane and is not a “color wheel” per se.

As to hue rotation, the Wiki plot above is correct in showing red (0 deg) on the right.

Your scientific contributions are far beyond my comprehension. Not that I don’t want to learn, but I’m just looking for a useful tool so that I don’t have to leave the white balance to chance or to however my eye happens to be calibrated on a given day. My question is: how can you get the white balance right if there is no neutral object in the photo?
Here is the picture and now the question: How do you find the right color?

There isn’t one color for human skin. It certainly can’t be represented by one line in the vector scope.

Hello @paperdigits
Why don’t you check this photo with the vectorscope?


With ART’s vectorscope, all four skin tones are pretty much on the I line.
With a video editor, the result is exactly the same.

1 Like

Open the image in RawTherapee in the Color tab. Select ‘White balance’ → Method: ‘Automatic & Refinement’ → ‘Temperature correlation’.

Do not click in the image area before noting the parameter values below the ‘Pick’ button.

If you don’t use RawTherapee, most editors have an ‘Auto WB’ but may not tell you all of the above.

I don’t know what you mean by “find the right color”, sorry.

I am the one who created the (A)RT vectorscopes. The short answer to your question is the dashed line between the red and yellow lines in the Hue-Saturation vectorscope. However, as others have pointed out, the idea of a skin tone line should be taken cautiously.

The other vectorscopes you mention are UV vectorscopes that display the I and Q axes of the NTSC system. While the I axis points in the general direction of skin tones, I am not aware of any evidence that shows rigorous efforts to design the I axis as the average skin tone. Scientific data show that skin tones do vary and can deviate significantly from the I axis. If your goal is to have accurate skin tones, then you must only use the line as a rough estimate. Consider how much your subject’s skin tone deviates from the I axis and in which direction and make sure the data shown in the vectorscope reflects that.

There is a vast amount of literature saying that the positive I axis is the skin tone line. Just because many people say it doesn’t mean it is true. I’m sure it gets repeated often only because it looks reasonable and very few people verify the claim. In all the research that I did while developing the vectorscopes, I did not find one credible piece of evidence that ties the I axis to skin tones.

With this background in mind, the (A)RT Hue-Saturation vectorscope dashed line is the positive I axis of the I Q color system if your output profile is sRGB. The I axis does not map cleanly to the Hue-Chroma vectorscope because the Hue-Chroma vectorscope is in a perceptual color space. The dashed line here is an approximation of the I axis using a single color that I unscientifically chose using the hue of the I axis and a representative saturation and lightness of skin colors. Once more to be clear, the dashed lines in the (A)RT vectorscopes represent the positive I axis and are therefore technically not a skin tone line.
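To make the “positive I axis” concrete: one can take a point with Q = 0 and I > 0 and convert it back to R’G’B’ using the commonly published approximate NTSC inverse coefficients, then look at its hue. This is a sketch with values I chose myself, not (A)RT’s actual code:

```python
import colorsys

def yiq_to_rgb(y, i, q):
    """Approximate NTSC YIQ -> gamma-encoded R'G'B' (textbook coefficients)."""
    r = y + 0.956 * i + 0.621 * q
    g = y - 0.272 * i - 0.647 * q
    b = y - 1.106 * i + 1.703 * q
    return r, g, b

# A point on the positive I axis (Q = 0) at middle luma, chosen arbitrarily.
r, g, b = yiq_to_rgb(0.5, 0.2, 0.0)
h, s, v = colorsys.rgb_to_hsv(r, g, b)
print(f"R'G'B' = ({r:.3f}, {g:.3f}, {b:.3f}), hue = {h * 360:.1f} deg")
```

The result is an orange hue in roughly the 20–30 degree range, i.e. in the general neighborhood of skin hues, which helps explain why the line looks plausible even though it was not designed as a skin tone line.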

1 Like

Of course I know this function: Auto in ART, in RT and in darktable. The results are sometimes better than the camera’s “As Shot”, but sometimes not. In any case, the automatic mode can’t always deliver a satisfactory result; the faces are often too red or too red-blue. At least in the example image, there is usually some green missing, which I then adjust by hand.

As I said, many people use the vectorscope for video editing. And since ART also has one, I would like to find out whether you can use it for this purpose, provided, of course, that you choose a small section with “Crop”.

In RT, one can ‘pick’ a small section [32x32], and poking around in your image anywhere except the gray area at top right results in some horrendous color balances, especially on the skin.

Therefore we are talking at cross-purposes.

Thanks @Lawrence37, I understood your comments very well. My previous attempts also show that the I line cannot simply be used as a guarantee for skin tones. Unlike with video (Kdenlive), this is probably due to the different color spaces you described (UV and NTSC?).

It’s a pity that there is no reliable tool for this in photo editing. But it’s good for me to know that, based on your description, I shouldn’t place my hopes in the vectorscope from (A)RT. I’ll leave it, because I notice that if I place the colors of a face completely on the line in a portrait, the tone becomes too yellow. So I have to rely on my feeling and my eyes, and practice a lot. And if the light is problematic, you should definitely photograph a neutral object.

But, Lawrence37, tell me, why did you create the vectorscope and why this line if you can’t use it 100% for skin tones?

Hello @cedric
Any contribution can be helpful, but of course I chose an example where there is hardly any neutral gray. As I said, if neither “As Shot” nor “Automatic” nor “Pick” leads to perfect results, I wonder if there are any other methods that guarantee a good white balance.

I must confess to having never looked at the Vectorscope in RT:

I am not certain that the term “White balance” is appropriate to this discussion, preferring instead to use “Color balance” if we continue to talk about skin tone.

In the above diagram, the group of color coordinates can be moved around to taste with the sliders at right.

I did find this in Rawpedia as to “the line”:

Additionally, you can see a diagonal line at the top right. This line indicates the average Caucasian skin hue. In a portrait, hovering the mouse pointer over a medium skin tone, the graph should mark the pixel around this line. Otherwise, there is a color cast on the skin that you would be interested in removing.

In the above, note the black dot in the group, which resulted from clicking on the cheek skin. If your subject’s skin is “average Caucasian”, the dot is out of place and should be moved toward ‘the line’ with the sliders. All colors in the image will be affected, but the skin will now be “correct”.

Hopefully this qualifies as an “other method”…

I think sometimes all the terms get thrown around and then, technically, some statement is violated. I think you could trust the line in the vectorscope to identify the hue family of skin tones, which at least one reference put at something like 13 to 32 degrees for a sample of “Dutch Europeans”, so it may be different still for the entire population. The vectorscope is not going to give you the skin color or tone, as that also requires information it doesn’t show. But as a reference, the base hue range of skin is roughly in that range and could at times serve as a coarse reference point to correct a white balance issue, or at the very least compensate for lighting effects on skin. That would also act on a color cast and might make the image look “white balanced”…

https://www.researchgate.net/publication/232229641_Comprehensive_candidate_gene_study_highlights_UGT1A_and_BNC2_as_new_genes_determining_continuous_skin_color_variation_in_Europeans
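That hue-family claim is easy to sanity-check with a few swatches. The RGB values below are hypothetical light/medium/dark skin samples I made up for illustration; they are not data from the cited study:

```python
import colorsys

# Hypothetical light / medium / dark skin swatches (sRGB, illustrative only).
swatches = [(244, 208, 177), (198, 134, 94), (96, 65, 52)]

hues = []
for r, g, b in swatches:
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    hues.append(h * 360)
    print(f"({r},{g},{b}) -> hue {h * 360:.1f} deg, sat {s:.2f}, val {v:.2f}")
```

All three made-up swatches land in a narrow hue band around the high teens to high twenties of degrees, while their saturation and value differ far more, which matches the idea of a common hue family rather than a common color.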

1 Like

Not quite. The dashed line of the RT Hue-Saturation vectorscope is exactly the positive I axis if your output profile is sRGB, which is likely the case. If a particular image shows colors along the I axis in Kdenlive, it will also show colors along the dashed line in RT.

Allow me to clarify some points.

  • UV color space: The fact that Kdenlive and other video editors use UV vectorscopes and RT uses Hue-Saturation affects where the hues are shown in the vectorscope, but it has absolutely no effect on accuracy of the I axis. If you have an image where the skin tones line up exactly on the I axis in Kdenlive and open the same image in RT, those skin tones will line up exactly on the dashed line too.
  • NTSC: This is an old video system and the reason I brought it up is to explain where the I Q lines in vectorscopes come from. In all the trustworthy literature I have read, it is explained that the I axis was chosen to represent the color axis that humans are most sensitive to. This is important because when broadcasting video, sending more data for the I axis compared to the Q axis maximizes color fidelity. The takeaway is that the I axis in any vectorscope (RT, Kdenlive, etc.) shows part of a color system that probably did not have anything to do with skin tones in the first place.
  • If you find the Kdenlive vectorscope to be reliable for skin tones, then (A)RT can also be reliable. You just need to make sure that
    • you are using the Hue-Saturation vectorscope, not the Hue-Chroma vectorscope which you have shared in your screenshot
    • the output profile is set to sRGB (RTv4_sRGB for example) when you are looking at the vectorscope
  • I added the vectorscope and the dashed line due to popular demand. I trust that users know (or will eventually figure out) that the line is useful as a sanity check and that skillful users can place skin tones on or slightly offset from the line depending on the subject’s true skin tone. I reiterate that the problem of using the I axis as a skin tone line is inherent to the origins of the I Q axes and the nature of real skin tones. It doesn’t matter which software you use: the I axis is never 100% usable for skin tones. darktable excluded any form of a skin tone line and I understand their decision. I included the dashed line because people wanted it.
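For anyone who wants to poke at the I Q system themselves: the commonly published forward and inverse NTSC matrices are mutually consistent, so any color’s position relative to the I axis can be computed directly. A quick numerical round-trip check, using approximate textbook coefficients rather than any application’s actual code:

```python
# Commonly published approximate NTSC matrices (textbook values).
FWD = [[0.299, 0.587, 0.114],     # Y
       [0.596, -0.274, -0.322],   # I
       [0.211, -0.523, 0.312]]    # Q
INV = [[1.0, 0.956, 0.621],       # R
       [1.0, -0.272, -0.647],     # G
       [1.0, -1.106, 1.703]]      # B

def matvec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

rgb = [0.8, 0.6, 0.5]             # arbitrary test color
roundtrip = matvec(INV, matvec(FWD, rgb))
err = max(abs(a - b) for a, b in zip(rgb, roundtrip))
print(f"round-trip error: {err:.4f}")
```

The round-trip error stays well below 0.01 despite the rounded coefficients, so the published matrices are good enough for checking where a pixel sits relative to the I axis.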
5 Likes

You cannot color balance an image, i.e. “get the white balance right”, without knowing the true color of at least one object [and its illuminant] in the scene, so that you can match that color in your review image by whatever means. Anything else, e.g. Auto, is just guesswork.

I don’t usually intervene in forums that are not related to RawTherapee. Nevertheless, since @cedric presented a screenshot of the ‘Temperature correlation’ algorithm, here are some simplified explanations and a link to RawPedia (technical).

The white balance problem is complex. If you look at academic publications, you’ll see that many algorithms have been tried; most of them try to find a middle gray. About ten years ago, I tested 5 or 6 of these algorithms, but I wasn’t convinced. But how do you do that when there’s no gray?

Then I saw an article (in 2017, but I don’t remember where…), not at all documented, which gave as a lead the correlation of colors between those of the image and reference spectral data.

For several years I’ve been working on this (complex) algorithm, which involves a large volume of code and a lot of spectral data.
There are 429 reference spectral colors, divided between skin tones, sky tones, near neutrals and, of course, all the colors at the limits of the CIE xy diagram.
For analyzing the image to be treated, I use a maximum of 236 of these colors.
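To give outsiders a feel for the general family of approaches (this toy is NOT the RT algorithm, which correlates image colors against spectral reference data; the numbers below are made up): score candidate channel gains by how neutral a known-neutral patch becomes after correction, and keep the best-scoring candidate.

```python
# Toy illustration of candidate-scoring WB, with made-up data.
# Pixels sampled from a (hypothetically) neutral patch under a warm cast.
patch = [(0.80, 0.62, 0.45), (0.78, 0.60, 0.44), (0.82, 0.63, 0.46)]

def neutrality_error(pixels, r_gain, b_gain):
    """Mean deviation from R = G = B after applying the candidate gains."""
    err = 0.0
    for r, g, b in pixels:
        r, b = r * r_gain, b * b_gain
        m = (r + g + b) / 3
        err += abs(r - m) + abs(g - m) + abs(b - m)
    return err / len(pixels)

# Coarse grid search over candidate red/blue gains.
candidates = [(rg / 100, bg / 100)
              for rg in range(50, 151, 5) for bg in range(50, 151, 5)]
best = min(candidates, key=lambda c: neutrality_error(patch, *c))
print(f"best gains: R x {best[0]:.2f}, B x {best[1]:.2f}")
```

For this warm-cast patch the search chooses a red gain below 1 and a blue gain above 1, as expected. The real algorithm replaces the naive “make the patch gray” score with a correlation against hundreds of spectral references, which is what lets it work without any gray in the image.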

For more information, here’s a link to Rawpedia
Auto Temperature correlation

To use all the algorithm’s functions, check the box
Preferences > Color Management > Show White balance Auto temperature correlation settings.

The technical text explains the system’s limitations. The main problem is the illuminant, which must be close to a Daylight illuminant between 4100 K and 15000 K, or a Blackbody illuminant between 2000 K and 4100 K. However, it also works when the CRI (Color Rendering Index, which reflects the proximity of an illuminant to Daylight) is “correct”.
The second problem is that an image often contains parts in the sun and others in the shade. In this case, you can perform a chromatic adaptation (with Selective Editing) of the part concerned.
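As an aside on where temperature numbers like these come from: the correlated color temperature of a chromaticity can be estimated with McCamy’s textbook approximation. This is a generic formula for illustration, not the method RT uses internally:

```python
# Estimating CCT from CIE xy chromaticity with McCamy's approximation.
def mccamy_cct(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 white point (x=0.3127, y=0.3290) should come out near 6504 K.
cct = mccamy_cct(0.3127, 0.3290)
print(f"estimated CCT: {cct:.0f} K")
```

The approximation is only trustworthy for chromaticities near the Planckian/Daylight locus, which echoes the illuminant limitation described above.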

Another very important point is that we correct the usual blue-red axis, but also the green-magenta axis.

And excuse my bad English :wink:

Jacques

6 Likes