WB and RAW processing question

Dear RAW processing experts!

I hope I can find an answer here. My question:
Which of the common white balance presets on cameras produces values closest to the real sensor data?
In other words, which one modifies the values the least?

I need to know this without digging into source code, if possible.

I want to build a kind of spectrum analyser from a modified camera, and I need to set a predefined white balance so I can measure how much red, green, and blue (and also UV and IR) light is coming in when using external cutoff filters.

For repeatable measurements I have to fix the WB, and I would rather use one of the predefined presets than a custom one, since custom calibration can be tricky depending on the light source.

I record video of my subjects, so RAW processing is not an option, and a lot of usable data can be lost if the sensor values have to be rescaled from a low-bitrate video later.

So I need to know which WB preset (flash, cloudy, tungsten, etc.) is closest to the raw sensor readings, i.e. adjusts the red, green, and blue values the least.

I hope this makes sense :wink:
thanks!
AMG.

So, if I have this correct, you need to do a spectral characterization of a video camera, with only the already-whitebalanced, already-demosaiced output. Yes, you do need to back out the white balance as it messes up the energy relationship of the three channels, and the best way to back it out is to divide the channel values by the multipliers the camera used to apply white balance, which should be in the container metadata. I don’t know video metadata, so this might be problematic.
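The scaling step described above is just a per-channel division. A minimal sketch, using made-up multipliers and pixel values (the real gains would come from the file's metadata, if present):

```python
import numpy as np

# Hypothetical WB multipliers read from the container metadata
# (order R, G, B; real values are camera- and preset-specific).
wb_multipliers = np.array([2.0, 1.0, 1.5])

# A few already-white-balanced RGB pixels (linear values, 0..1).
balanced = np.array([
    [0.80, 0.40, 0.60],
    [0.20, 0.10, 0.15],
])

# Back out the white balance: divide each channel by its multiplier
# to recover values proportional to the raw sensor response.
unbalanced = balanced / wb_multipliers
print(unbalanced)
```

This only works on linear data, which is why the transfer-curve question below matters: the curve has to be inverted before dividing out the gains.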

What most folk know about camera presets is usually obtained from image metadata. exiftool is your friend in this, but its website seems to be down at the moment. You can view the list of supported formats at its Wikipedia page:

https://en.wikipedia.org/wiki/ExifTool

Most cameras don’t record white balance settings. I think Sonys might put it into a proprietary metadata channel (I forget the name of it at the moment), but even then it’s inconsistent. Sometimes it’s even hard to tell which transfer curve the camera used. (for example, S-Log2 vs. the default curve, which isn’t sRGB for almost any camera - the default curve almost always has an S-shaped tone curve applied before the sRGB transform)

That’s something else the OP will need to take into account - determining the transfer curve of the camera, since it’s probably not linear, and while it may be labeled as Rec709, it’s more likely Rec709 plus an arbitrary, undocumented tone curve.
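For what it's worth, inverting the *documented* part of Rec.709 is straightforward; the piecewise curve below is the standard one from the spec, and the caveat is exactly what's said above - the camera's undocumented tone curve sits on top of it:

```python
import numpy as np

def rec709_oetf_inverse(v):
    """Invert the nominal Rec.709 transfer function (OETF) to get
    linear light. A real camera almost certainly adds its own tone
    curve on top of this, so this is only the documented part."""
    v = np.asarray(v, dtype=float)
    return np.where(v < 0.081,
                    v / 4.5,                           # linear toe segment
                    ((v + 0.099) / 1.099) ** (1 / 0.45))  # power segment

# Sanity check: forward curve at linear 0.5, then back again.
encoded = 1.099 * 0.5 ** 0.45 - 0.099
print(round(float(rec709_oetf_inverse(encoded)), 6))  # -> 0.5
```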

Don’t forget that almost every camera not only performs white balance scaling, but also applies a matrix transform from the sensor-native colorspace to the output colorspace.
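To see why the matrix matters, here's a toy pipeline with invented numbers (real gains and matrices are camera-specific). The point is that after the 3x3 transform, every output channel mixes all three raw channels, so dividing out the WB gains alone no longer recovers the raw values:

```python
import numpy as np

# Sketch of the in-camera color pipeline, all numbers made up:
# raw sensor RGB -> white balance scaling -> 3x3 camera-to-output matrix.
raw = np.array([0.30, 0.50, 0.20])   # hypothetical raw triplet
wb  = np.diag([2.0, 1.0, 1.5])       # hypothetical WB gains

# Hypothetical camera->output matrix; the negative off-diagonal terms
# are typical, and they are what mixes the channels together.
m = np.array([[ 1.60, -0.40, -0.20],
              [-0.30,  1.50, -0.20],
              [ 0.00, -0.50,  1.50]])

out = m @ (wb @ raw)
print(out)

# To get back to raw you must invert the matrix as well as the gains:
recovered = np.linalg.solve(m @ wb, out)
print(recovered)  # matches raw
```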

One could probably measure the SSF including white balance for the camera using an approach like yours - but it’s gonna look weird, since the matrix is going to induce crosstalk between channels that might even include negative values in the SSF for some channels. That’s likely to be more problematic than simple linear white balance scaling would be.


Yours is a much more comprehensive consideration of the problem’s challenges. Thinking through your list, I’m discouraged regarding its do-ability…

That’s the plan: to set a manual, predefined white balance.
I just need to know which one is closest to the sensor values - the one with the least scaling, and therefore the least interpolation data lost.
Flash, sun, cloudy, tungsten, etc.?
I just have to find which one is the most standard, so I can use it for all the video I record.
I think it should be easy to determine from RAW processing: list the raw sensor values for pixels from a sample picture’s RAW file, develop it into TIFFs with different WB settings, and see which WB produces RGB values that match the raw pixel values most closely.
Of course there is crosstalk because of the Bayer pattern, but a developer who knows the sources could easily inspect the raw sensor values during the develop process.
The outputs are easier to check.
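Once the per-preset multipliers have been extracted from sample files (e.g. with exiftool), the "least scaling" comparison is simple: rank the presets by how far their gains sit from unity. A sketch with entirely made-up multipliers - the real values differ per camera, so the winner here means nothing:

```python
import numpy as np

# Hypothetical per-preset WB multipliers (R, G, B). Real values are
# camera-specific and must be read out of actual files.
presets = {
    "daylight": [2.0, 1.0, 1.6],
    "cloudy":   [2.2, 1.0, 1.4],
    "tungsten": [1.3, 1.0, 2.4],
    "flash":    [2.1, 1.0, 1.5],
}

def distance_from_unity(mults):
    """How far the gains are from (1, 1, 1), i.e. from 'no scaling'."""
    return float(np.linalg.norm(np.asarray(mults) - 1.0))

ranked = sorted(presets, key=lambda p: distance_from_unity(presets[p]))
print(ranked[0])  # the preset that alters the raw values least
```

Note this only ranks the WB scaling; as discussed above, the camera matrix and tone curve still sit between any preset and the true sensor values.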