new camera input luts available

From what I understand this doesn't clamp anything because:

  1. camera rgb doesn't have negative values (cameras can't record negative light)

  2. with this "rb chromaticity color space" it's impossible to get values higher than 1
    With the formula above, r = r/(r+g+b), …, the result r is always < 1.
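
Something like this, just to illustrate the normalisation (the function name is made up, not from the actual lut code):

  # rb chromaticity: camera rgb -> 2D coordinates, each below 1
  # (illustration only; assumes non-negative rgb with r+g+b > 0)
  def to_chroma(r, g, b):
      s = r + g + b          # overall brightness gets divided out
      return r / s, b / s    # each coordinate stays below 1

  # any global exposure scale cancels:
  print(to_chroma(0.2, 0.5, 0.3))   # (0.2, 0.3)
  print(to_chroma(2.0, 5.0, 3.0))   # (0.2, 0.3)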


Is the assumption correct that the outermost locus is the most spectrally pure? And the closer one gets to the center of those loci the closer to Illuminant E? In the D700 clut visualization, are there certain green values overlapping with less spectrally pure green values (upwards in the direction of blues)?

EDIT: I'm so sorry for my manners! This looks fantastic and just reinforces my assumption that SSFs are a must for color-sensitive stuff (so all things photography). Great great work from @ggbutcher and @hanatos. I need to get a GPU to test vkdt.


right. it's pretty much that, only with blue instead of green (flips the triangle, forgot why i did it this way, something with white balancing where i kept green constant or so).

the range of the lut is all the physically possible values. it covers the whole spectral locus and scales with brightness of the input (because it divides out r+g+b, so the global scale introduced by exposure does not matter) (what @age said).

the outermost coordinates in the quad mean something random depending on camera response function. mostly unphysical stuff that never happens (except for noise below black point maybe). the pure spectral colours are close to the outermost ring in the amoeba shaped pointcloud (i think this ring is at 95% distance or so).

hm. the rings scale towards illuminant E, so you can see a bit of structure in the pointcloud around that. the colouration is rec2020, so it renders (1, 1, 1) white where it means D65.

hehe yeah the D700 shows the metameric ambiguity between camera rgb and cie/rec2020: a certain shade of green camera rgb coordinate could potentially have been another spectral stimulus, which, to the eye, would appear as a way more saturated shade of green. given just the camera, there's no way you will be able to tell them apart. browsing through the different models, this happens quite a bit.


But that means they are supposed to be used before exposure changes, right? Because you can't apply them to rgb data that has values > 1.0.

Just looking to see if I can use this in DT, or if I can learn something from it about making better use of other ICC files as profiles in DT.

no. think about a spectrum between 380 and 830 nm. if you scale it up, it still has the same shape. you can do stuff with it in a normalised space and then multiply the overall scale back in the end. this is also explained in the amazing dcamprof docs. in general, the apparent brightness of a certain spectrum is different in camera rgb and CIE XYZ, that's why it's a 2D->3D table.
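
roughly like this, as a sketch (made-up names, not the actual vkdt implementation):

  # hypothetical application of a 2D chroma -> 3-channel lut
  # (sketch only, not the real code path)
  import numpy as np

  def apply_clut(rgb_cam, clut):
      """rgb_cam: numpy array of camera rgb; clut: callable (r', b') -> 3 output channels."""
      s = rgb_cam.sum()                    # brightness, divided out before the lookup
      r_c, b_c = rgb_cam[0] / s, rgb_cam[2] / s
      out = np.asarray(clut(r_c, b_c))     # the 2D lookup returns 3 values
      return out * s                       # multiply the overall scale back in at the end

  # toy lut just to show the call
  print(apply_clut(np.array([0.2, 0.5, 0.3]), lambda r, b: (r, 0.5, b)))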

(re: icc and such… maybe i should put together a list of things wrong with the dt processing pipeline. but then again maybe not).


Do it :wink: I say this not as someone wishing to piss on dt, but as a user of the software who wants to better understand.

(also probably most of the more grave things are my fault in the first place)


That's what I was trying to get at. Thanks!
I assume you generated the "rings" by sampling the camera SSFs with spectrally broader and broader virtual lights? Maybe even gaussian shaped? If so, would it be possible to plot lines of constant illuminant center-wavelength within that clut space?

I'm looking forward to seeing what effect this has on real-world gradients, specifically how gradients from spectrally pure light sources are affected, even more so compared to matrix solutions, and which CFAs are better and which are worse. Super interesting stuff.


(writing this on my cell phone, with dodgy Internet in the woods…)

+1 to the dcamprof docs! Even if you never use the software, Anders has compiled a treasure trove of information on camera profiling. I've just spent the morning with a local copy, trying to better understand the mechanics of luts in this application. Every time I do such, I learn something new…


I would welcome your comments on this, and if it would promote any discussion that could lead to improvements, everybody wins…


well integrating the same ssf against sampled spectra, yes.

only spectral colours (and from there towards white) could be gaussian. the purple line is more like 1-gaussian (needs two peaks or a dip). dcamprof has some xy → gaussian spectra upsampling routine that i didn't look into in detail yet. i'm using this method just because it's so dead simple. i think real spectra would be better, but these are harder to sample densely in xy space. certainly room for improvement here.
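
a minimal sketch of that integration, in case it helps (the ssf arrays here are placeholders, not real data):

  # integrate a camera ssf against gaussian test spectra
  # (illustration only; real ssf curves would replace the placeholders)
  import numpy as np

  lam = np.arange(380.0, 831.0, 5.0)     # wavelength grid in nm
  dl = 5.0                               # grid spacing
  ssf_r = np.ones_like(lam)              # placeholder spectral sensitivities
  ssf_g = np.ones_like(lam)
  ssf_b = np.ones_like(lam)

  def camera_rgb(center, width):
      """camera response to a gaussian spectrum with given center/width in nm."""
      spec = np.exp(-0.5 * ((lam - center) / width) ** 2)
      return (np.sum(spec * ssf_r) * dl,
              np.sum(spec * ssf_g) * dl,
              np.sum(spec * ssf_b) * dl)

  # broader and broader lights around one center wavelength walk towards white,
  # which is what produces the rings in the pointcloud
  for w in (5.0, 20.0, 80.0):
      print(camera_rgb(550.0, w))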

reflectance spectra with mode/peak at constant wavelength? yes:


(all the way to the right would be the spectral colour, but it stops displaying stuff at boundaries of adobergb)
this is what the colour module uses for hue-preserving gamut mapping and changing chroma (following mizokami and webster 2012 if that is close to the idea you are proposing).
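
just for intuition, the straight-line simplification of that chroma change (the real mapping follows the curved constant-wavelength lines above, and the white point coordinates below are only assumed for the example):

  # crude chroma scaling towards/away from the white point in the 2D chroma plane
  # (straight-line simplification, not the actual colour module)
  def scale_chroma(p, white, k):
      """p, white: (r', b') coordinates; k < 1 desaturates, k > 1 saturates."""
      return tuple(w + k * (c - w) for c, w in zip(p, white))

  print(scale_chroma((0.4, 0.1), white=(1/3, 1/3), k=0.5))   # halfway towards white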

i have to necro this because i found this incredible resource of (predicted) camera responsivities that could be turned into input device transforms: https://github.com/COLOR-Lab-Eilat/Spectral-sensitivity-estimation/tree/main/data/predictions


Ooh, had to take a break from my other stuff to do this…

Plotted their predicted Nikon D7000 against rawtoaces monochromator-measured D7000:

(plot: predicted Nikon D7000 SSFs, normalized, vs. rawtoaces-measured Nikon D7000)

Blue overall sensitivity is divergent, but the overall band separation is consistent. I'd say there'd probably be a bit of white balance difference using their data, but the color hue reproduction should be close. When I get a bit more time I can run it through dcamprof for a max dE.

Edit: Forgot to mention, I had to normalize their data to 0.0-1.0 for the plot.
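
If the normalization was a single global scale, it could look something like this (file name and column layout here are guesses, not the actual data format):

  # normalize predicted ssf curves to 0.0-1.0 before plotting
  # (hypothetical file name and column order: wavelength, r, g, b)
  import numpy as np
  import matplotlib.pyplot as plt

  data = np.loadtxt("predicted_nikon_d7000.csv", delimiter=",")
  lam, rgb = data[:, 0], data[:, 1:4]
  rgb = rgb / rgb.max()          # one global scale keeps the relative channel heights

  for i, c in enumerate("rgb"):
      plt.plot(lam, rgb[:, i], color=c, label=f"predicted {c}")
  plt.xlabel("wavelength [nm]")
  plt.ylabel("relative sensitivity")
  plt.legend()
  plt.show()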

nice, curious to hear your opinion on this… i created a couple luts and compared to image renditions with dng matrix and mkssf lut. seemed to me the wb is more on the blue side, but it's not a simple multiplication by D65 or something. at least this seems to be the case for my canon 5d2 and the fuji x100t. maybe it's D55? :man_shrugging:

i mean at the very least i could use the reference ground truth spectra they collected and augment the PCA model used in mkssf to get a better grip on some newer sensors that aren't well reflected in this model so far. running mkssf + mkclut is not too much trouble compared to just running mkclut on their data directly.
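
roughly what i mean by augmenting the basis (a toy sketch, not the actual mkssf model):

  # toy PCA basis built from a bank of known ssf curves; new ground-truth
  # measurements would simply be added to the bank (placeholder data below)
  import numpy as np

  ssf_bank = np.random.rand(20, 91)            # rows: known curves on a common wavelength grid
  mean = ssf_bank.mean(axis=0)
  _, _, vt = np.linalg.svd(ssf_bank - mean, full_matrices=False)
  basis = vt[:4]                               # keep the first few principal components

  def project(ssf):
      """express a curve in the low-dimensional basis and reconstruct it."""
      coeff = basis @ (ssf - mean)
      return mean + coeff @ basis

  print(project(ssf_bank[0]).shape)            # (91,)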


I used dcamprof to make profiles for each of the datasets.

Predicted:

  D02 DE 0.00 DE LCh +0.00 +0.00 +0.00 (gray 80%)
  D06 DE 0.07 DE LCh +0.02 -0.06 +0.03 (gray 20%)
  D03 DE 0.15 DE LCh -0.02 -0.04 -0.14 (gray 70%)
  D04 DE 0.16 DE LCh -0.02 -0.06 -0.15 (gray 50%)
  D05 DE 0.20 DE LCh -0.01 -0.03 -0.20 (gray 40%)
  D01 DE 0.83 DE LCh -0.14 -0.75 -0.34 (white)
  A01 DE 0.95 DE LCh -0.06 +0.45 +0.84 (dark brown)
  A04 DE 0.96 DE LCh -0.66 +0.62 -0.32 (yellow-green)
  A06 DE 1.14 DE LCh +0.28 -0.72 +0.83 (light cyan)
  B04 DE 1.22 DE LCh +1.03 -0.14 +0.62 (dark purple)
  A03 DE 1.23 DE LCh +0.87 -0.91 -0.48 (purple-blue)
  B03 DE 1.23 DE LCh +0.98 -0.30 -0.67 (red)
  A02 DE 1.36 DE LCh +0.48 -0.09 -1.27 (red)
  A05 DE 1.36 DE LCh +1.13 -0.25 +0.61 (purple-blue)
  B02 DE 1.83 DE LCh +1.53 -1.59 -0.90 (purple-blue)
  C01 DE 1.84 DE LCh +1.50 -1.63 -1.43 (dark purple-blue)
  C02 DE 2.07 DE LCh -0.30 -1.77 +1.03 (yellow-green)
  B01 DE 2.15 DE LCh -0.35 -1.31 -1.66 (strong orange)
  B06 DE 2.30 DE LCh -0.69 -2.12 -0.55 (light strong orange)
  C03 DE 2.45 DE LCh +2.17 -0.39 -1.06 (strong red)
  C06 DE 2.50 DE LCh +2.06 -0.38 +1.37 (blue)
  C05 DE 2.71 DE LCh +2.11 -0.68 +1.56 (purple-red)
  B05 DE 3.29 DE LCh -1.09 -2.95 +0.93 (light strong yellow-green)

rawtoaces:

  D02 DE 0.00 DE LCh +0.00 +0.00 +0.00 (gray 80%)
  D06 DE 0.09 DE LCh +0.02 -0.08 +0.03 (gray 20%)
  D03 DE 0.11 DE LCh -0.01 -0.07 -0.09 (gray 70%)
  D04 DE 0.12 DE LCh -0.01 -0.02 -0.12 (gray 50%)
  D05 DE 0.16 DE LCh -0.00 -0.04 -0.15 (gray 40%)
  A01 DE 0.41 DE LCh -0.04 +0.26 +0.32 (dark brown)
  D01 DE 0.69 DE LCh -0.13 -0.64 -0.22 (white)
  A04 DE 0.80 DE LCh -0.64 +0.43 -0.23 (yellow-green)
  A06 DE 1.05 DE LCh +0.24 -0.60 +0.82 (light cyan)
  A03 DE 1.10 DE LCh +0.84 -0.76 -0.28 (purple-blue)
  A02 DE 1.18 DE LCh +0.53 +0.12 -1.05 (red)
  B03 DE 1.24 DE LCh +1.14 -0.07 -0.49 (red)
  B04 DE 1.35 DE LCh +0.82 -0.87 +0.61 (dark purple)
  C02 DE 1.50 DE LCh -0.32 -1.37 +0.51 (yellow-green)
  A05 DE 1.55 DE LCh +1.06 -0.89 +0.41 (purple-blue)
  B02 DE 1.59 DE LCh +1.47 -1.05 -0.88 (purple-blue)
  B06 DE 1.73 DE LCh -0.85 -1.45 -0.44 (light strong orange)
  C01 DE 1.76 DE LCh +1.59 -0.94 -1.18 (dark purple-blue)
  B01 DE 1.85 DE LCh -0.40 -0.86 -1.59 (strong orange)
  C03 DE 2.27 DE LCh +1.91 -0.60 -1.08 (strong red)
  B05 DE 2.41 DE LCh -1.11 -2.00 +0.75 (light strong yellow-green)
  C05 DE 2.42 DE LCh +1.97 -0.71 +1.21 (purple-red)
  C06 DE 2.69 DE LCh +2.04 -1.08 +1.35 (blue)
  C04 DE 2.76 DE LCh -0.77 -2.62 -0.36 (light vivid yellow)

nice, thanks for doing these! this seems to confirm the white balance hypothesis… dcamprof normalises wb to D02 (apparent by the 0 DE, right?). with that out of the way it's ballpark the same… a bit worse, which is expected, but not by much (3.29 vs 2.76 on the worst patch).

so i guess i'm looking for the right illuminant here…


For the above runs, I used the dcamprof default D50 illuminant. I just ran StdA, and it dropped the max dE to 3.40, not a significant difference.

well okay, you shine a different light on the patch, go through the observer, and then you'll get slightly different metameric behaviour here. but to compare to the reference, the first thing it'll do is white balance to make the D02 80% patch match the reference values. this transform doesn't count as error.

in my case i'd like to characterise this absolutely, without the wb step, so i can create a mapping camera → cie observer that only depends on the cfa curves (one mapping for each given illuminant). and using the monochromator curves vs. the predicted ones here gives me a colour cast (that i assume dcamprof normalises out in the wb step).
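
to make the difference concrete, the kind of wb normalisation i mean is something like this (a sketch, not dcamprof's actual code):

  # white balance normalisation to a gray patch, which gets factored out before dE
  # (illustration only; the patch values are made up)
  import numpy as np

  def wb_to_gray(cam_rgb, gray_cam, gray_ref):
      """scale camera rgb per channel so the gray patch matches its reference."""
      scale = np.asarray(gray_ref) / np.asarray(gray_cam)
      return np.asarray(cam_rgb) * scale

  # a global colour cast between measured and predicted ssfs disappears in this step,
  # which is why the two profiles end up looking so similar afterwards
  print(wb_to_gray([0.30, 0.42, 0.55], gray_cam=[0.60, 0.62, 0.70], gray_ref=[0.62, 0.62, 0.62]))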


(but now i'm all confused again with all the different steps and errors. could it be the issue that they use the abridged range of 400-700 nm only?)

I wouldn't think so; the camera responses through the CFA go to zero at about these limits.