The Quest for Good Color - 2. Spectral Profiles "On The Cheap"

A dual-illuminant profile can also have two LUTs that are interpolated between.


I’ve been picking at variations, looking for why the red SSF separates from the rawtoaces reference. It would seem my assumption that “all tungsten-halogen lamps are created equal” is okay as an approximation, but doesn’t really account for all of the variation in spectral power distribution, or SPD.

Here’s a link to a nice typical SPD plot; scroll down in the article to see it:

Of note is that these lights put the majority of their energy in the IR part of the spectrum, outside the range we consider for the SSF endeavor. And, of more significant note, the SPD in the visible range drops sharply from the upper end to the lower end of the visible spectrum. This is why power compensation is needed; without it, the per-wavelength sensitivity measurements wouldn’t be comparable to each other.
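A minimal sketch of what that power compensation does, with entirely made-up numbers: the per-wavelength camera response is divided by the source’s per-wavelength power, so measurements at different wavelengths become comparable.

```python
import numpy as np

# Toy illustration of power compensation. The SPD tilts redward like a
# tungsten-halogen lamp; the "true" sensitivity is a Gaussian at 550 nm.
wl = np.arange(380, 781, 10)                      # nm
spd = np.linspace(0.1, 1.0, wl.size)              # toy SPD: rises toward red
sensitivity = np.exp(-((wl - 550) / 80.0) ** 2)   # toy "true" sensor sensitivity
raw = sensitivity * spd                           # what the spectrum shot records

ssf = raw / spd                                   # power compensation

# The raw curve peaks long of 550 nm because the SPD tilts it redward;
# dividing by the SPD recovers the true 550 nm peak.
print(wl[np.argmax(raw)], wl[np.argmax(ssf)])
```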

I created a few synthetic SPDs to see where the right data might reside. Since the DedoLight data I have is normalized from a value of 1 at 780nm down to 380nm, I plotted linear progressions from the same point. With the synthetic data, I couldn’t get SSFs that aligned throughout the spectrum; when low was good, high was off, and vice versa. That indicates there’s significance in both the low-end toe and the upper-end rolloff of the nominal SPD. Here’s a plot to illustrate:
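For the curious, here’s roughly how such synthetic SPDs could be built: the linear ramp anchored at 1.0 at 780nm as described above, plus an illustrative logistic S-curve for comparison. The specific parameters are made up, not the ones used in the plots.

```python
import numpy as np

wl = np.arange(380.0, 781.0)   # nm

def linear_spd(low_end):
    """Linear ramp from low_end at 380 nm up to 1.0 at 780 nm."""
    return low_end + (1.0 - low_end) * (wl - 380.0) / 400.0

def scurve_spd(center=580.0, width=80.0):
    """Logistic S-curve, renormalized so it also hits 1.0 at 780 nm."""
    s = 1.0 / (1.0 + np.exp(-(wl - center) / width))
    return s / s[-1]

lin, sc = linear_spd(0.05), scurve_spd()
# Both anchor to 1.0 at 780 nm but diverge mid-spectrum, which is where the
# low-end toe and upper-end rolloff of the measured SPDs would matter.
print(lin[-1], sc[-1], round(abs(lin - sc).max(), 3))
```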

The red Reference line is the synthetic SPD that gets good low alignment. This plot also shows the S-curve shape of the other measured SPDs, curvy enough to be relevant.

So, now I’m cogitating on alternatives to actually measure the SPD of the spectrum light source. You’ll see a lot of cheap spectrometer setups using webcams and such, but using them for power measurement doesn’t work because of the Bayer filter. There are folk who’ve uttered, ‘just add the r,g,b numbers’, but that doesn’t ring right if you look at any of the SSF plots; still, I’ll try it to confirm how wrong I think it might be. What I’m also considering is scraping the Bayer filters off the sensor of a cheap webcam I have; there are good results reported from using photoetch resist (from printed circuit board manufacturing) to dissolve the filters. It’s a cheap camera I was going to throw away, so it’ll go down in fame… :smiley: It’s just a 740-pixel-wide sensor, but I don’t need 1nm wavelength resolution for this job; about 20nm should work fine. The goal is to get a consistent light measurement across the spectrum, unfiltered.
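A toy demonstration of why summing the Bayer r,g,b values isn’t a power measurement: each channel weights wavelengths differently, so two lights of equal radiant power can produce quite different r+g+b totals. All of the curves below are made up for illustration, not real SSFs.

```python
import numpy as np

wl = np.arange(400.0, 701.0, 10.0)   # nm
dw = 10.0                            # sample spacing, nm

def gauss(center, width):
    return np.exp(-((wl - center) / width) ** 2)

# Made-up channel sensitivities, roughly SSF-shaped
r, g, b = gauss(600, 40), gauss(540, 45), gauss(460, 35)

# Two narrow-band test lights with EQUAL radiant power
light_blue = gauss(460, 5)           # sits on the blue channel's peak
light_cyan = gauss(500, 5)           # falls between the channel peaks
power_blue = light_blue.sum() * dw
power_cyan = light_cyan.sum() * dw

def rgb_sum(light):
    """Sum of the three channel responses to a given light."""
    return sum((ch * light).sum() * dw for ch in (r, g, b))

# Equal true power, quite different channel sums, because the combined
# r+g+b sensitivity is nowhere near flat in wavelength.
print(power_blue, power_cyan, rgb_sum(light_blue), rgb_sum(light_cyan))
```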

I’ll report progress here…


Actually, I completely forgot about a potential influence on the SPD, and I don’t find a mention in your writings either: the grating itself usually doesn’t have perfectly uniform efficiency by wavelength. Did you get/find any information on that for your grating?


This. You should measure the SPD at the same point of the optical path where the camera will be. This will take into account any non-flat contribution of the optics you use in the system. You could even leave the DSLR lens in front of the webcam and take into account its contribution.


So a simple linear-vs-wavelength relationship brings your SSF to better match the ‘reference’ SSF? Interesting. And tempting to ‘just’ bend your data to the other dataset… with all the possible downsides that could have.

But isn’t this what he is doing already? He kind of needs the SPD without the grating to see the effect of the grating (and whether this accounts for the differences to the Open Film Tools people).

Unit cost on these is about $3US. Unless there’s generic characterization for holographic gratings I haven’t already found, I don’t think they spend the time finding it. Its primary purpose is to shine colors on grade school walls…

Only at the low end. The high end is bolloxed. I briefly considered crafting a SPD to fit, as the reference data is anchored to measured SPD, but that only helps me, doesn’t help make this a viable method.

No, actually I was operating on the assumption that tungsten-halogen lamp physics would be close enough to use another lamp’s SPD. Really, it seems to work well enough to produce an image with overall color rendition consistent with that from a matrix profile. I’d use the profile I’ve already built to mitigate extreme colors from my D7000, and I’d use the same workflow to make a Z6 profile without benefit of a reference data set. Just want to take “cheap” as far as it can reasonably go…


I found a cheap (~$26US) monochrome camera that connects to a Raspberry Pi:

With it and one of my Pi boards, I’m going to make a spectrometer that can do SPDs. While still “cheap”, this sort of vexes my intent to make a way for average folk to do SSFs, as the whole RPi thing is a bit “computer-geekish”. What I’m thinking of doing with it is to measure a collection of plain old tungsten incandescent and tungsten-halogen light bulbs to see if there is validity in developing a representative SPD dataset.
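A hypothetical sketch of the reduction step such a Pi spectrometer would need: average a monochrome spectrum image down to a 1-D profile, then map columns to wavelengths. This assumes the spectrum is dispersed horizontally and the wavelength-vs-column mapping is linear between two known calibration lines (e.g. a CFL’s mercury emission lines); all the numbers are made up.

```python
import numpy as np

def image_to_spd(img, col_a, nm_a, col_b, nm_b):
    """Reduce a monochrome spectrum image to (wavelengths, profile).
    (col_a, nm_a) and (col_b, nm_b) are two columns whose wavelengths
    are known from calibration lines; dispersion is assumed linear."""
    profile = img.mean(axis=0)                    # average down each column
    cols = np.arange(img.shape[1])
    nm_per_col = (nm_b - nm_a) / (col_b - col_a)  # linear dispersion
    wavelengths = nm_a + (cols - col_a) * nm_per_col
    return wavelengths, profile

# Toy 8-row x 740-column frame with one bright "line" at column 300
img = np.zeros((8, 740))
img[:, 300] = 1000.0
wl_out, spd_out = image_to_spd(img, col_a=100, nm_a=436.0, col_b=500, nm_b=546.0)
print(round(float(wl_out[300]), 3), int(spd_out.argmax()))
```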

I also got the IT8 target yesterday, so I’m going to set up a session on the south-facing front walk of my house to shoot it in straight-up daylight with all my cameras, to start working on the next thread…


Been loosely following. A thought came to mind just now: I have a halogen lamp that seems to change in colour and intensity with age. It is also on a dimmer.

That is rather normal. The halogen is in the lamp bulb to mitigate this, but it cannot completely prevent the metal filament from slowly evaporating onto the bulb itself. This likely changes the transmission of the generated light to the outside and also the temperature of the filament windings themselves (a different radiation equilibrium is reached inside to compensate for the loss of transmission to the outside).

Point is would this denaturing affect @ggbutcher’s quest?

With respect to intensity, this is why an SPD measurement really should be taken at each SSF measurement. I’m pretty sure the generic SPD data sets I’ve used to date are the cause of the differences between my data and the rawtoaces data, which was adjusted with an associated power measurement taken at the time of the spectrum measurement.

With respect to “color”, I’d be willing to bet that is a reduction in the black-body temperature.
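Planck’s law makes that guess concrete: as filament temperature drops, the blue end of the SPD falls off faster than the red end, which reads as a warmer color. The temperatures and wavelengths below are just illustrative, not measurements of any particular lamp.

```python
import numpy as np

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck(wl_nm, T):
    """Black-body spectral radiance at wavelength wl_nm (nm), temperature T (K)."""
    wl = wl_nm * 1e-9
    return (2.0 * h * c**2 / wl**5) / np.expm1(h * c / (wl * k * T))

# Blue-to-red energy ratio at a nominal halogen temperature vs a cooler one:
# the ratio shrinks as the filament cools, i.e. the light gets redder.
for T in (3200.0, 2800.0):
    blue_vs_red = planck(450.0, T) / planck(650.0, T)
    print(T, round(float(blue_vs_red), 3))
```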

That is my guess as well. :slight_smile: Weather is nice: don’t stay in your workshop too long. :stuck_out_tongue_closed_eyes:

Ha, I’m outside right now, fixing a sprinkler leak… :tired_face:

Regarding the discrepancy between the two sets of SSF, I wonder if this could be caused by the fact that Nikon “raw” data actually has black point scaling applied during the A/D conversion according to the white balance set in camera. So, if the in camera white balance setting was different for the two measurements then the color primary offsets would differ slightly. In this case it appears that the green primary is the same, while the R/G/B components of the red (615-650) and blue (440-460) are all boosted.

I’d think the rawtoaces measurements would have had the same bias. My starting point was the out-of-file data, with only a demosaic and a horizontal flip.

I’ve modeled some synthetic power calibration data, and it looks like that’s the likely culprit. I’m putting together a way to measure the light’s power using a monochrome camera, so I should be able to implicate or exonerate that this week. I’m also going to look at my wavelength calibration algorithm, as that might account for the premature downslope in the red curve.
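One way to sanity-check a two-point linear wavelength calibration is to fit a quadratic through three known emission lines and see how far it drifts from the straight line out at the red end, where the premature downslope shows up. The pixel columns below are made up; the wavelengths are common CFL mercury/phosphor lines.

```python
import numpy as np

cols = np.array([120.0, 410.0, 655.0])   # hypothetical pixel columns of known lines
nm = np.array([436.0, 546.0, 612.0])     # Hg 436/546 nm, europium red ~612 nm

lin = np.polyfit(cols, nm, 1)            # least-squares straight line
quad = np.polyfit(cols, nm, 2)           # exact quadratic through the 3 points

# Compare the two mappings at a column out past the last calibration line;
# a several-nm gap there would skew the red end of the measured SSF.
red_col = 700.0
print(round(float(np.polyval(lin, red_col)), 1),
      round(float(np.polyval(quad, red_col)), 1))
```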


Be careful with this, as “monochrome” sensors (basically a silicon diode) have a sensitivity that is very wavelength dependent. You need a properly calibrated sensor for this.

I got one of these:

Essentially, I’m going to put it in the same optical path as the camera-under-measurement, and collect spectrum and calibration images. There are no Bayer filters, but there’s an IR filter I may remove in a subsequent test. Right now, I’m just going to see if I can build a power calibration file from the raw measurements of this sensor, see if they come close to the results from the Dedolight files from the Open Film Tools project.
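A hypothetical sketch of that calibration idea: shoot a source with a known SPD (e.g. the Dedolight data) through the same optical path, and the ratio of counts to known SPD captures everything wavelength-dependent in the chain (sensor QE, grating efficiency, IR filter) in one correction curve. All curves below are made up.

```python
import numpy as np

wl = np.arange(400.0, 701.0, 20.0)                # nm
known_spd = np.linspace(0.2, 1.0, wl.size)        # toy reference SPD
chain = np.exp(-((wl - 520.0) / 180.0) ** 2)      # toy QE + optics response
counts = known_spd * chain                        # what the monochrome sensor reports

calib = counts / known_spd                        # per-wavelength correction curve

# Applying the curve to a shot of an UNKNOWN source recovers its SPD,
# because the same chain response divides back out.
unknown_spd = 1.0 - 0.5 * (wl - 400.0) / 300.0    # toy unknown source
measured = unknown_spd * chain
recovered = measured / calib
print(bool(np.allclose(recovered, unknown_spd)))
```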

Any more progress? Been thinking more about the differences between your measurements and RAW to ACES, and I’m really thinking that it might at least partially have something to do with Nikon’s pre-A/D processing. I can’t think of anything else that would cause that really odd spike in green/blue response AND a decreased red response at the same time. I believe that Nikon is amplifying the signal for the green and blue channels at the red color space primary position (via a matrix at some point in the analog-to-digital pipeline) to even out the noise floor across all three color primaries, purifying the primary red response after black point subtraction. It looks like they’re lifted to about the same point as the minimum response at the other color primaries. Also, it looks like the minimum response point of the curves at the far end of the blue spectrum is also artificially lifted to the same level. What do you think? Maybe you could figure out the camera’s native/uni white balance and try shooting the spectral image with the camera set to that?

With all the messing-around with variables I’ve done, the emerging culprit looks like some sort of filtration commencing at the peak of the R (or C, in some circles) response curve, like an IR cutoff. I really don’t think it’s anything in the camera processing, as that would have also affected the rawtoaces data. My initial suspicion, power calibration, has more to do with where the B (or A) and R (or C) curves peak, and that still needs a bit of work.

So, the culprits for IR filtration that are unique to my optical chain would be:

  1. The diffraction grating. Cheap plastic hologram grating, no direct data on its spectral attenuation. I might go looking for generic materials data…
  2. The camera lens. I really don’t think this is the problem, as I’d surmise the lens designer wouldn’t want to put IR attenuation in their mix that trumps the sensor IR filter.

This is pushing me toward spending the money on a lab-grade diffraction grating…

All that said, I’m still messing with power measurement, which I’ll write about here in a bit. With the cheap monochrome camera and some web code I wrote for the Raspberry Pi, I’ve conjured up a nascent spectrometer with its own set of challenges. There’s a lot written about these devices, and that material has been a real go-to for my design trades. Thing is, making such a device in addition to an SSF spectrometer starts to become more challenging for the average photographer. So, I shot my initial set of IT8 target images this past weekend, which will be the topic of my next thread; this may yet be the more reasonable approach for the majority of photographers.