The Quest for Good Color - 2. Spectral Profiles "On The Cheap"

Note: This post is a follow-up to The Quest for Good Color - 1. Spectral Sensitivity Functions (SSFs) and Camera Profiles. A lot of terminology was established in that post, so reading it first will help in understanding the missive below…

The economics of using spectral data for camera profiles is rather vexing. The predominant research, and the general opinion of the major practitioners, is that individual captures of monochromator-sourced single-wavelength light are the best way to measure spectral sensitivity. That is an expensive proposition, as the minimum assemblage of equipment to do this would cost somewhere north of $2000US. Having access to an optics lab with the requisite equipment is a boon, but you won't find that at the typical makerspace. The alternate approach, a single image capture of a complete spectrum, has challenges that can compromise the quality of the profile: alignment (in a lot of places), light leaks, and the fundamental act of teasing wavelength measurements out of an array of pixels; a lot of things have to line up right to make it work. But the payoff is in the price: even with optical-grade components, the setup can be had for a couple hundred dollars US.

But me, I'm even more cost-averse; I don't want to spend any more than I have to. Problem is, with a single-image spectrometer it's hard to tell how much I'd need to spend. So, I set out to start with the absolute cheapest setup I could imagine, and upgrade from there if needed. You'll see that thinking in my device; I tried to consider what various upgrades would require, and designed to accommodate them as best I could. Looking over the various setups, the single-image spectrometer stood out as the cheapest, with room to grow into better parts and support.

Of all the setups, the Open Film Tools device appeared to be the most “constructible”. Here’s the optical schematic:

Essentially, a light is shined on a diffuser, shaped by a slit, and diffracted by the grating. The first-order diffracted spectrum goes off at a 30-degree angle from the originating light path, so that’s where the camera is placed. One fundamental limitation of this setup is that the upper (IR) end of the first-order visible spectrum is overlapped by the lower (UV) end of the second order, so wavelengths below about 400nm are not usable without some sort of precision mitigation, well out of my reach. Ideally, we’d want usable spectrum down to 380nm, but we’re going cheap here…

They even published the files to 3D print a box to hold it all, and upon which to mount the camera. But, they used optical-grade components for everything in the optical path. Here’s that BOM:

  • Diffusion Filter: LEE Filters LE216, $7US
  • Slit: Leybold 461-62, 1mm, $??
  • Diffraction Grating: Edmund Optics 830 Grooves, 50mm Sq, 29.9° Blaze Angle Grating, $200US

Still a lot cheaper than the monochromator. Thing is, I didn't want to spend even that much on a scheme that had a chance of not performing sufficiently. So, I started with the cheapest alternatives for each, with the idea to see how bad it was and make informed decisions about what to upgrade. Here's what I used instead:

  • Broadband Light: What's needed is a light that puts usable power into as much of the visible spectrum as possible. Of note is that one of the researchers who did the CIE 1931 color matching experiments used a tungsten automobile headlamp. A "common" tungsten household light bulb would do, if such a thing still existed. Contemporary household lighting is very uneven in visible spectrum coverage, as I'll show you in a bit. After thinking about it, I decided to spend a bit on a good tungsten-halogen accent light, one I'd be able to use in slide copying and other studio applications. I procured a LowellPro Focusing Floodlight, B&H link, $96US.
  • Calibration Light: What's needed here is a lamp producing uneven light, with power spikes at defined wavelengths. I struggled a bit with this, considering all manner of LED schemes, when the solution was literally right in front of me every morning I navigated the basement toward my office. In the closet with our network devices was a goose-neck clamp light, with a standard CFL lightbulb. On a whim, I grabbed the lamp and shined it at the box I was building, and surprise, it rendered a really clean three-spike spectrum. So I went alooking for CFL bulb spectral measurements, and Wikipedia turned out to be my friend again. I'll show you the images and data further down in the post.
  • Diffusion Foil: Before I actually paid money for such a thing, I rummaged the parts boxes for a suitable substitute. Well, that was even easier than I thought, staring at my notepad one day, it dawned on me that the paper was sufficiently thin to pass a lot of light, and its white color led me to think it'd not be spectrally uneven. So, a bit of white paper it would be for the first attempt.
  • Slit: There are all sorts of neat slit schemes, the one I like best is two razor blades magnetically mounted to a holder. However, the slit for this particular application doesn't need to be terribly small, 1mm is sufficient, so I decided to start with a sheet of cardboard with the slit razor-blade cut in it.
  • Diffraction Grating: This is the "money maker" part, where the spectrum is teased from the slit illumination. Such gratings go back a long way; the first noted observation of the phenomenon was apparently a gas street lamp viewed through a gentleman's silk handkerchief. Precision gratings come in either transmissive or reflective styles, with various spacings of the etched rulings. The literature leans to gratings with 1000 lines per millimeter for our application. Gratings cut from optical glass are pricey, but I ran across a very cheap alternative in holographic printed gratings for educational use. I procured 1000 line/mm gratings from here: Arbor Scientific, $3.75US for a pack of 5, $10US shipping. Geesh... This was probably my greatest worry with respect to performance; I just imagined all sorts of distortion from the plastic thing in a slide mount. This consideration drove some aspects of the box design, to accommodate a better grating if this first attempt didn't cut it.
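As an aside, the 30-degree camera angle mentioned earlier and the second-order overlap problem both fall out of the grating equation, d·sinθ = m·λ. Here's a quick sketch of that arithmetic; this is my own back-of-envelope check, not part of any of the published designs, assuming the 1000 line/mm grating:

```python
import math

def diffraction_angle(wavelength_nm, lines_per_mm=1000, order=1):
    """Diffraction angle from the grating equation d*sin(theta) = m*lambda."""
    d_nm = 1e6 / lines_per_mm           # groove spacing in nm (1000 lines/mm -> 1000nm)
    s = order * wavelength_nm / d_nm
    if s > 1.0:
        return None                     # that order doesn't exist for this wavelength
    return math.degrees(math.asin(s))

# Mid-visible green lands right around the 30-degree camera position:
print(round(diffraction_angle(550), 1))   # 33.4
# Second-order 400nm lands exactly where first-order 800nm does -- the overlap:
print(round(diffraction_angle(400, order=2), 1), round(diffraction_angle(800), 1))
```

So green light at 550nm diffracts to roughly 33 degrees, close enough to the 30-degree face, and the UV end of the second order really does sit on top of the IR end of the first.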

So, on to the box. I went to download the 3D printer files in the Open Film Tools slide deck, but that link wasn't available. They now publish a revised design on OnShape; I decided not to pursue this, mainly because the on-going COVID-19 "stay home, dangit" thing would make finding a place to print it challenging. I have an extensive collection of scrap wood in a rack in the garage, so I thought I could easily fabricate a decent approximation without leaving the house. As I built the box, I tried to keep in mind "build-ability" by folk without many woodworking resources or skills. Accordingly, I think I have a design that can be built with hand tools, though a power "chop saw" is really handy for getting a decent 30-degree cut. Here are some pictures:

The overall setup:

The tungsten-halogen lamp is illuminated and shining on the slit; the calibration lamp is off to the right on the gooseneck. The camera is looking into the cardboard 30-degree offset face, and is connected to the tablet on the left. Note the spectrum displayed on the tablet screen; that's coming off the slit-diffraction grating. Ignore the ruler on the box, that's just a light blocker… :slight_smile:

Now, with the cover off, here’s the internal layout:

You’re looking at the box from the slit end. The diffraction grating is at the other end, and the camera portal is just past that offset 30 degrees. The inside is painted flat black; notice the sheen from the reflected light. This’ll be the topic of baffles in the next iteration.

Here’s the 1mm slit:

It’s cut in a piece of cardstock that slides into a holder in the box fabricated from 1/4" wood.

And here’s the diffraction grating:

It’s mounted in a holder fabricated from cardstock. Here’s a better picture of it:

Some box design considerations:

  • Square-Ness: Alignment of the parts is crucial, especially the alignment between the slit and the lines of the diffraction grating. Accordingly, I made sure the base was as flat as I could make it, and I used a carpenter's square to lay out the positioning of the slit and grating holders.
  • Light Leaks: This was a tough one, as that tungsten-halogen lamp is BRIGHT! Also, you want to keep ambient light out, but you also don't want to work with the contraption in a dark room. Turns out, the simplest thing to keep light out would be a flat face for the end with the slit; my original design put the slit in a holder set about 1/4" into the box, and that didn't seal well with the cover. Same thing for the camera end, just a sheet of cardboard with a hole cut to just accommodate the end of the lens.
  • Focus Distance: When you focus the camera to make the spectrum sharp, you're not focusing on the grating: you're focusing on the slit. So, what's needed is a distance from the camera focal plane to the slit that is at least as long as the minimum focus distance of the lens. I cut the original box to be about a foot long from slit to grating; I'm using my macro lens, which easily accommodates that distance. (ToDo: test normal lens)
  • Setup: This pertains to how the box is supported and positioned. I use a solid door for a desk, so I cleared a corner for the setup. The camera sits on a tripod facing the box on the desk, and the box is set so the lens opening is flush with the desk front and the slit-grating optical path points off the side of the desk to where the light is positioned on another tripod. The box doesn't have this now, but for the next incarnation I'll probably cut a camera interface from a sheet of Masonite, and have it extend below the base about 1/2" to sit against the side of the desk edge. I may even bolt the box to the desk with this extension. The box and camera need to maintain precise positioning for the capture of two images, the spectrum and calibration, so the respective images fall in the same exact place on the camera sensor.

One intent in designing the box was to avoid having to “rip” lumber into long dimensions. A table saw is something not many folk have in their household. So, I built the box from regular dimensional lumber with only cross-cuts required. I rooted through my lumber pile for the most square and uncurved lumber I could find; still, my “straight” base had a slight curve to the surface that made attaching the walls squarely a bit problematic. Nothing a fillet of glue couldn’t cure, however… :smiley: My base is a length of 2"x6" construction pine. I considered a more-square oak plank, but I wanted to be able to push-pin attach cardstock on either end and that wood would have been too hard with which to do that sort of thing.

Slit width plays a part in the spectrum resolution, but I haven’t played with that yet. I just cut a 1mm slit in a piece of cardstock.

Note that the optical path through the box is offset; this was to allow putting the lens hole in the center of the 30-degree slanted camera interface. The updated Open Film Tools box has this design; the original one in the slide deck does not. This did make the grating mount sit right up against the side wall; for the next box I’ll either make the walls thinner, or use a wider base.

To mount the diffraction grating, I made what is essentially a slide holder from cardstock. I cut a rectangular aperture in the card, then fabricated two slide-in brackets from cardstock, into which the slide-mounted diffraction grating is inserted. The whole assembly slides into a fixture mounted in the box I made from 1/4" wood strips, two U-shaped things between which the diffraction card is inserted. I figured this design would let me mount a different (or, ‘more expensive’) diffraction grating if that need arose. Cheap white glue was used for every assembly, wood or cardstock.

During assembly, make sure everything is square to both the optical path and “upright”, that is, doesn’t tilt. This is critical to painting an even spectrum on the camera sensor.

Controlling the light is a challenge. The Open Film Tools box has a baffle between the slit and grating whose significance I didn't comprehend until I lit up the box for the first time and looked in from the camera direction. Even with a flat black paint covering all the inside surfaces, the light from the slit created reflections on the internal sides. The baffle would hide these reflections from the grating. You'll note from the pictures that my box has two internal extensions from one wall; those I put in just to brace that wall, since that space was not occupied by the optical path. They do serve to block those reflections on that wall. Also a challenge was the cover, particularly on the slit end where the light shone on all the exposed edges. However, just setting the cover on the box seemed to control the room light enough without any additional stripping or gasketing. I wanted to keep the cover loose to facilitate access.

Now, to put the contraption to use. So, to review the workflow: 1) take an image of the spectrum from the broadband light; 2) WITHOUT MOVING ANYTHING, take an image of the calibration light; 3) extract the pixels from each image relevant to the spectrum; 4) wavelength-align and power-calibrate that data, then normalize and "intervalize" it, and you've now got SSF data for profile making. To take the two images, I recommend using whatever "hands-off" mechanism you feel comfortable with. Me, I didn't trust myself to touch the camera even to trip an on-camera timer, so I tethered the camera and used QDSLRDashboard to control it. That also helped in aligning the spectrum in the image frame, as I didn't have to stoop down to look at the back-of-camera LCD. First, here's the broadband light image, as a rawproc screenshot:

I'm showing it to you this way so you can consider both the image and the processing it'll take to extract the data. First, note the peripheral artifacts; I believe those to be the result of the wall reflections discussed earlier. Note also that the spectrum is centered in the image; this is important to avoid having to deal with vignetting and other lens-induced shenanigans. The spectrum has stratified variations, which I think are due to dust and imperfections in and around the grating. To the left, note the processing: first, a 'half' demosaic, which pulls the Bayer measurements into single RGB pixels with no processing other than averaging the two green channels. Later, we're going to align the x axis of the image with the wavelength scale, so all R, G, and B values need to be co-located at each column (x) coordinate. Second, there's a 'rotate' operator; this particular rotate is a horizontal flip, because the camera is viewing the spectrum "from the back", so to speak. The image has to be flipped to put UV on the left and IR on the right. Third, the blackwhitepoint operator is there simply to give us something to look at; it has to be deleted before saving the image as data for our endeavor. Fourth, the crop is the operator important to subsequent processing; it bounds the data for consideration. The first image is shown before the crop; here it is, cropped:

Choosing this crop involves getting enough spectrum left-to-right to cover the visible wavelengths plus some margin, as you really can’t see exactly where the lower and upper ends taper out of the camera range, and vertically to select the best part of the spectrum. The image columns will be averaged into three datasets, red, green and blue, so the stratified variations aren’t such a worry. This crop yields a 1461x93 image; getting particular dimensions isn’t important, as they’ll be averaged and “intervalized” out in the analysis.
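To make that column averaging concrete, here's a little sketch of collapsing a crop into one R, G, B sample per column. This is my own illustration of the idea, not rawproc's actual code:

```python
def channel_average(rows):
    """Collapse a cropped spectrum image into per-column channel averages.

    rows: the crop as a list of image rows, each a list of (r, g, b) pixels.
    Returns one (r, g, b) tuple per image column -- the vertical average,
    which is what washes out the stratified variations.
    """
    height, width = len(rows), len(rows[0])
    averages = []
    for x in range(width):
        r = sum(row[x][0] for row in rows) / height
        g = sum(row[x][1] for row in rows) / height
        b = sum(row[x][2] for row in rows) / height
        averages.append((r, g, b))
    return averages

# Two rows, three columns; each column is constant, so the average is exact:
crop = [[(0.1, 0.2, 0.0), (0.2, 0.3, 0.1), (0.3, 0.4, 0.1)],
        [(0.1, 0.2, 0.0), (0.2, 0.3, 0.1), (0.3, 0.4, 0.1)]]
print(channel_average(crop))
```

The real crop is 1461 columns by 93 rows, but the operation is the same: 93 values averaged per channel per column.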

Here’s the calibration image:

Yep, that's a good 'ole CFL bulb. The concentrations of energy in each of the three channels will facilitate assigning each column of the spectrum image a wavelength; for this bulb, all we have to find in the calibration image is where the max values of red, green, and blue sit. Now, to know what wavelength to assign to each of these columns, we need data from a calibrated spectrometer. Well, I haven't spent that money yet (or maybe, ever), so I went looking for CFL spectral data and found it on Wikimedia. I plotted the calibration image and found it to be a good match to the Wikimedia data:


The blue, green, and red peaks occur at 437nm, 546nm, and 611nm, respectively. Write these numbers down…

The crop of the calibration image HAS TO BE THE EXACT SAME CROP as the spectrum image. That’s how we’ll line up the wavelengths upon the spectrum image columns. With that crop applied to both images, the blackwhitepoint is deleted and each image is saved to a comma-separated value (.csv) file. To do this, I wrote a “data image” file format for rawproc, where if you specify a .csv file extension for an image save, rawproc will instead write a data file that looks like this:

width: 1461
height: 93
rawproc-0.9.1Dev DSG_4583.NEF:rawdata=crop;  demosaic:half rotate:hmirror crop:0.251773,0.434891,0.842170,0.491803 

rmax: 0.038158 rmaxcolumn: 811
gmax: 0.042257 gmaxcolumn: 578
bmax: 0.027805 bmaxcolumn: 358

This particular format option is "channelaverage", which makes a row for each channel and records the average for each column, 1461 columns in this case. Other options allow saving the entire RGB array, kinda like a .ppm file, but the channel average format is much smaller, and that averaging work has to be done somewhere. So now, we have two data files, in this case DSG_4583-spectrum.csv and DSG_4582-calibration.csv.

I've spent the last two weeks spreadsheeting this data 3 ways from Sunday. In doing so, the programmer's hubris kicked in, and I decided sooner rather than later to write some code to do all this spreadsheet work. Over the last week I fleshed this out into 'ssftool', a C++ program that'll take the .csv files just saved and process them all the way to a dcamprof-friendly JSON file. I'm going to describe the rest of the processing in ssftool terms; you can download it from here: Also included with ssftool are the data files we'll use here. The documentation included with ssftool pretty well describes building and using it, so I'll not say much about that here.

First, the data from rawproc isn’t amenable to processing by spreadsheet or ssftool, but ssftool has a couple of operators to deal with that:

$ ssftool extract DSG_4583-spectrum.csv | ssftool transpose > spectrum.csv
$ ssftool extract DSG_4582-calibration.csv | ssftool transpose > calibration.csv

This’ll extract the red:, green:, and blue: lines from the rawproc file and transpose the data into columns. Note the pipe; ssftool will either open a file or take input from stdin, so we can pipe data from ssftool to ssftool with abandon.

The two essential operations on the data are to 1) assign wavelengths to each row, and 2) adjust each measurement to reflect the power of the light at that wavelength. The first one sounds easier than it is; essentially, a slope has to be calculated from the three channel maximums of the calibration image and their associated wavelengths, and that slope is used to calculate wavelengths for each of the intervening rows, up and down from an anchor point. ssftool has an operator for this, wavelengthcalibrate:

$ ssftool wavelengthcalibrate spectrum.csv calibration.csv blue=437,green=546,red=611 > calibrated_spectrum.csv
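The essence of that operator, sketched below, is a linear column-to-wavelength map. This is my reading of the idea, not ssftool's actual code; it may well use all three peaks rather than the two endpoints I use here. The columns come from the bmaxcolumn/rmaxcolumn lines in the data file shown earlier:

```python
def wavelength_scale(peak_a, peak_b):
    """Build a linear column -> wavelength map from two (column, nm) peaks."""
    (ax, anm), (bx, bnm) = peak_a, peak_b
    slope = (bnm - anm) / (bx - ax)          # nanometres per image column
    return lambda col: anm + slope * (col - ax)

# Blue peak at column 358 is the 437nm CFL line, red peak at column 811 is
# the 611nm line; the slope anchors at blue and extends up and down:
nm = wavelength_scale((358, 437.0), (811, 611.0))
print(round(nm(358)), round(nm(811)))   # 437 611
```

The green peak (column 578, 546nm) then serves as a check on how linear the dispersion really is across the frame.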

The power calibration requires us to "intervalize" the data first. Right now, it's still oriented to the x columns from the image; we need to extract the columns that represent the part of the spectrum we intend to use, at, say, a 5nm interval. ssftool has an operator for that, aptly named intervalize, and we're going to use it in conjunction with the powercalibrate operator and a dataset for a tungsten-halogen lamp I got from the Open Film Tools site:

$ ssftool intervalize calibrated_spectrum.csv 400,725,5 | ssftool powercalibrate Dedolight_5nm.csv > power_calibrated_spectrum.csv
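Those two steps can be sketched like this. It's my own toy illustration of the idea, one channel only; ssftool's actual interpolation and file handling surely differ in detail:

```python
def interp(x, xs, ys):
    """Piecewise-linear interpolation of ys over ascending xs, clamped at the ends."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + t * (ys[i] - ys[i - 1])

def intervalize(samples, start, stop, step):
    """Resample (nm, value) pairs onto a regular wavelength grid."""
    xs = [w for w, _ in samples]
    ys = [v for _, v in samples]
    return [(w, interp(w, xs, ys)) for w in range(start, stop + 1, step)]

def powercalibrate(samples, lamp):
    """Divide each reading by the lamp's relative power at that wavelength."""
    return [(w, v / lamp[w]) for w, v in samples]

# Toy data: wavelength-calibrated columns land at odd wavelengths, while the
# lamp SPD is tabulated on the regular grid.
raw = [(398.2, 0.10), (402.9, 0.20), (407.5, 0.40)]
lamp = {400: 0.5, 405: 0.8}
cal = powercalibrate(intervalize(raw, 400, 405, 5), lamp)
print([(w, round(v, 4)) for w, v in cal])   # [(400, 0.2766), (405, 0.3641)]
```

Dividing by the lamp power is what removes the tungsten-halogen lamp's own spectral shape from the measurement, leaving just the camera's response.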

And finally, normalize the data to the range 0.0-1.0 and format it in JSON to feed dcamprof. Yes, ssftool has operators for both:

$ ssftool normalize power_calibrated_spectrum.csv | ssftool dcamprofjson "Nikon D7000" > d7000.json
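Those last two operators, roughly sketched below. Again, this is my illustration; in particular, the JSON field names are my reading of dcamprof's SSF input format, so verify them against the dcamprof documentation before trusting them:

```python
import json

def normalize(rows):
    """Scale all three channels by one global factor so the maximum is 1.0."""
    peak = max(max(row[1:]) for row in rows)
    return [(nm, r / peak, g / peak, b / peak) for nm, r, g, b in rows]

def dcamprof_ssf_json(rows, camera_name):
    """Lay the data out in the ssf_bands/red_ssf/... shape dcamprof reads.
    Field names are assumed from dcamprof's docs -- check your version."""
    return json.dumps({
        "camera_name": camera_name,
        "ssf_bands":   [row[0] for row in rows],
        "red_ssf":     [row[1] for row in rows],
        "green_ssf":   [row[2] for row in rows],
        "blue_ssf":    [row[3] for row in rows],
    }, indent=2)

# Two toy (nm, r, g, b) samples; the global peak is the 0.60 in blue:
rows = normalize([(400, 0.02, 0.01, 0.30), (405, 0.03, 0.02, 0.60)])
print(json.loads(dcamprof_ssf_json(rows, "Nikon D7000"))["blue_ssf"])   # [0.5, 1.0]
```

Note the single global scale factor: scaling each channel to its own maximum would destroy the relative sensitivities between channels.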

Yes, I said I wasn't going to get into ssftool usage, but I just can't resist showing off a single command that does all of the above:

$ ssftool extract DSG_4583-spectrum.csv | ssftool transpose | ssftool wavelengthcalibrate <(ssftool extract DSG_4582-calibration.csv | ssftool transpose) blue=437,green=546,red=611 | ssftool intervalize 400,725,5 | ssftool powercalibrate Dedolight_5nm.csv | ssftool normalize | ssftool dcamprofjson "Nikon D7000" > d7000.json

It uses bash process substitution, <(…), to feed the calibration file into wavelengthcalibrate. Oh, unix…! :smiley:

The d7000.json file is ready to be fed to dcamprof to make an ICC profile. The dcamprof commands to do that are:

$ dcamprof make-target -c d7000.json -p cc24 target.ti3
$ dcamprof make-profile -c d7000.json target.ti3 profile.json
$ dcamprof make-icc -p xyzlut profile.json d7000.icc

But, we really need to validate the result. We’re fortunate in that we have a lab-grade set of SSF measurements for the D7000, courtesy of rawtoaces. So, here’s a plot of the result of all you’ve read about to date vs. the rawtoaces data set:


“cheap spectrum” are the heavy lines, “rawtoaces monochromator” are the lighter lines. My speculation about the red and blue peak differences is the power calibration; the Dedolight data is probably close but not quite. There also may be contribution from the presence of a lens in the optical chain; most monochromator imaging is done without a lens, light is shined directly on the bare sensor.

And here’s the acid test. I copied the d7000.icc to my profile zoo, then developed my problematic blue LED image with it. First, screenshot of the image with the matrix profile, actually dcraw primaries made into an internal D65 profile:

Then, same image processing, just switched the assigned camera profile to the just-created d7000.icc:

Pretty good for Rube Goldberg engineering, n'est-ce pas?

@paperdigits is lending me an IT8 target, so my next post will be about whether profiles from such target shots provide results that make the spectral effort not so worthwhile. I’ll also be making a Z6 SSF profile, which is what I set out to do in the first place. Maybe not so much of an involved post for that…


Splendid! Thank you!

Now we just need to send you all our cameras to have them profiled as well! Great work Glenn!

Oh wow, what a nice result!! Congratulations and Good Job!

This will be very interesting. Looking forward to it!

You should do a workshop at the LGM next year.

exciting stuff! glad to see you’re pushing this so thoroughly. very curious to see the IT8 as a comparison (but ultimately i want me these curves…).

Very nice that you pursued this and came up with solution that is in reach for most (assuming one isn’t all fingers and thumbs).

I have to admit that I was rather sceptical that this was feasible within a +/- 250 Euro/Dollar budget. I’m glad to be proven wrong!

Once this COVID-19 thing is sorted out (it will be, right?) it would be nice to supply your local makerspace with the docs/specs so they can build it, or even better build one for your local photography/camera club.

I must admit I only skimmed the text, definitely need to revisit for a thorough read. Two questions/comment already came to mind:

I assume the reason to include the lens is convenience, i.e. no need to control the distance from grating to sensor - is that correct? A back-of-the-envelope estimate for your 1e-6m grating puts that distance at something shy of 0.5m if you want the relevant wavelength range to cover approximately an entire APS-C sensor. Which will probably complicate the setup a bit, but seems doable. Haven't thought about what the tradeoffs of having that glass in the path might be though, so maybe the entire thought is irrelevant - ah well… xD

You mentioned UV overlap as a problem and didn’t mention why UV filtering wasn’t an option. Are those not available for cheap in the required quality or do they introduce artifacts?


A lens is never 100% optically transparent for all visible light wavelengths - probably only vacuum would be - so your spectrum will be slightly modified. I’m not sure we’re talking about a measurable effect, but it could be an issue.


Diffusion Foil: Before I actually paid money for such a thing, I rummaged the parts boxes for a suitable substitute. Well, that was even easier than I thought, staring at my notepad one day, it dawned on me that the paper was sufficiently thin to pass a lot of light, and its white color led me to think it’d not be spectrally uneven. So, a bit of white paper it would be for the first attempt.

There should be a quite good diffusion foil in every monitor. Just disassemble an old monitor or notebook.


That means that he is calibrating for the sensor+lens system and not for the sensor alone, right? The same happens for every IT8 or other target that is shot with a lens.


I fear the discrepancies in the CMF might be linked to demosaicing aliasing, since your spectrum is spatially split, and your CFA is too. Did you try basic gaussian/LPF after half demosaicing ?

Anyway, awesome stuff.


That’s a thought. If nothing else, there should be a box and a light there… I’m expendable! :smiley:

That’s one of the considerations behind choosing ‘half’ for demosaic, the only data transform beside collecting adjacent values was to use the mean of the two green channels. I’ve got a gaussian blur operator, I’ll do some sensitivity analysis with it.

@PhotoPhysicsGuy pointed out in a PM that the power calibration is temperature sensitive; I’d just taken the DedoLight data for the various temperatures and averaged it. Probably a bad choice. So now, that’s another sensitivity analysis.

In order to resolve the spectrum to tease out the individual wavelengths, some kind of lens is needed to focus the light. Even in a monochromator, there’s a lens or mirror focusing the light on the separation slit. Those lab optical elements are ‘achromatic’, but our lenses, well, I don’t know. The Open Film Tools folk are very diligent to point out their profiles are for a certain camera/lens combination.

I have a copy of a paper that describes the mitigation alternatives, I just read enough to conclude any would be a bit more effort than I wanted to invest initially. Might get there, eventually…

An estimate around the binning, that’s going nowhere except to corroborate that the chosen binning isn’t totally off. So only read in case of academic interest (all numbers to be taken as ballpark estimates):
Given you illuminate a 1mm wide area of the grating, you will also get interference on the sensor from wavelengths that are theoretically 1mm apart on the sensor. Meaning going far below that in the binning won't gain you any real resolution. You are currently binning a wavelength range of 350nm at 5nm, and eyeballing your screenshots you are illuminating roughly half of the sensor, let's say 14mm. So that translates to 14mm * 5 / 350 = 0.2mm. So it's a reasonable bin :slight_smile:
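Restating that arithmetic as a sanity check (numbers straight from the estimate above):

```python
# Ballpark figures from the estimate above:
slit_mm = 1.0             # illuminated width at the grating; the resolution floor
sensor_used_mm = 14       # roughly half an APS-C sensor width
span_nm = 350             # wavelength range painted across those 14mm
bin_nm = 5                # chosen intervalize step

bin_mm = sensor_used_mm * bin_nm / span_nm
print(bin_mm, bin_mm < slit_mm)   # 0.2 True
```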

The following is a tangent, probably pretty irrelevant to your efforts, so feel free to ignore me :wink:

I see why you’d want to focus the light from the source onto the slit to not discard most of the emitted light, but that’s a separate issue, right?
I still don’t understand why you need to focus the light: You have a single point (well line) emitting light, radially distributed. That radial distribution is directly visible: You could remove your camera, darken your room and see the spectrum on the wall.

And regarding spectral power distortion:

Did you compare the magnitude of the peaks in your calibration image to calibration data? That could give an idea of whether there's significant distortion. However, I doubt that method is sensitive enough to show any differences in this setup, even more so given you only have max values from Wikimedia (as opposed to also FWHMs or such).

My gut feeling would be that any spectral power distortion by the “paper-diffusor” would be bigger than by the lens.

This is fantastic work Glenn. After being really unhappy with the color rendition of my a6000 (despite profiling), I was pondering whether it would be possible to find the root of its color woes using a diffraction grating. It's really exciting to see you going beyond this into creating actual profiles. Curious how far this can go. :slight_smile:

Although I am reasonably happy with my a6000, I'm also very curious about this. @ggbutcher I suppose you already compared with the baseline given by, e.g., creating a dual-illuminant DCP using the studio scenes of dpreview, which is probably the cheapest way of getting a profile for your camera, right? Otherwise that might also be an interesting experiment.

Yes. The focus I was referring to was with regard to resolving the spectrum on the sensor. When I set up things to record a spectrum and turned on the light, I had to focus the lens to get the spectrum to “lay flat” on the sensor so wavelengths could be resolved in as few image “columns” as possible.

No, I just really did an “eyeball” lineup of the calibration plot to the wikimedia reference, so I could choose the relevant wavelengths. I didn’t worry the power distribution in the calibration image past that.

I’ll add that to the “sensitivity” campaign.

My performance baseline has been matrix ICCs. I did try to produce LUT ICCs from Adobe DCPs; was successful with my D7000, but quite unsuccessful for my Z6, probably due to my small understanding of the DCP mechanics. My thinking regarding the dual-illuminant mechanism is that it just produces better matrices, so it’s not likely to help in gamut-transforming extreme colors.