Though its being a single-line sensor is not a problem if we use it as a calibrator, assuming we trust the QE curve in the spec sheet (a sketch of the arithmetic follows the steps below):
1. Place the inline sensor in the box, turn on a well-behaved source (knowledge of its SPD is not necessary), and take a picture of the spectrum from the grating with it.
2. Compensate for the inline sensor's QE = spectral photon distribution of the source signal after the grating.
3. Place the camera to be tested in the box, turn on the same source, and take a picture of the spectrum from the grating.
4. Normalize with the spectral photon distribution from 2) = CFA SSFs.
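A minimal sketch of that arithmetic, assuming both captures are dark-subtracted and linear and everything has been resampled onto a common wavelength grid; all names here are hypothetical:

```python
import numpy as np

# Hypothetical inputs, all on a common wavelength grid (e.g. 380-730 nm, 1 nm steps):
#   line_signal: dark-subtracted linear signal from the inline line sensor
#   line_qe:     the "typical QE" curve from the line sensor's spec sheet
#   cam_signal:  dark-subtracted linear camera signal, shape (n_wl, 3)

def estimate_ssfs(line_signal, line_qe, cam_signal, eps=1e-6):
    # Step 2: divide out the line sensor's QE to recover the spectral
    # photon distribution of the source as seen after the grating.
    photon_dist = line_signal / np.maximum(line_qe, eps)
    # Step 4: normalize each camera channel by that photon distribution;
    # what remains is, up to a scale factor, the CFA SSFs.
    ssfs = cam_signal / np.maximum(photon_dist, eps)[:, None]
    return ssfs / ssfs.max()  # scale so the global peak is 1
```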
There are much cheaper inline sensors of that kind out there, such as this <$5 one from Toshiba. I think they are used in scanners and the like. None of them includes an onboard ADC (nor does the Hackaday one), so an ADC-less Pi is probably less than optimal.
The key would be finding one whose "typical QE" plot is repeatable and trustworthy. Do they exist?
This one appears to be specified in spectral energy, so it would need to be converted to quanta. And it obviously includes a coverglass. I wonder why they don't show the response below 400 nm.
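If the plot really is per unit energy, converting it to a quantal (per-photon) response just means scaling by the photon energy hc/λ, which for a relative curve amounts to dividing by λ and renormalizing. A small sketch:

```python
import numpy as np

H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light, m/s

def energy_to_quantal(wavelength_nm, energy_response):
    # A photon at wavelength lambda carries E = h*c/lambda, so a response
    # per unit energy becomes a response per photon after scaling by
    # h*c/lambda; for a relative curve that is effectively a 1/lambda factor.
    wl_m = np.asarray(wavelength_nm) * 1e-9
    quantal = np.asarray(energy_response) * H * C / wl_m
    return quantal / quantal.max()  # renormalize the relative curve
```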
I think you have replaced "estimate spectral photon flux from a similar enough light source" with "estimate spectral photon flux from a similar enough sensor".
Nothing wrong with that; it takes out the optical system between the actual sensor and the light source. But it also relies on the published QE of the line sensor, with some assumptions. Specifically, any angular dependence of the line-sensor QE or the actual-sensor QE will lead to minor problems, as will unit-to-unit variance of the line-sensor QE. Spatial and ADC gain variances of both sensors are not even touched on.
At that point, a calibrated photodiode that one scans through the image plane, connected to a good enough voltmeter, might make more sense.
The question is, which way forward is practical and might improve on the existing results?
Ha, buy an i1Studio! I know when I'm out of my league…
I had the money set aside, then decided to upgrade my computer instead. Now I don't have to take vacation days to run nlmeans denoise (another technical bit where I'm out of my league), but I have to start socking away my allowance again for the X-Rite spectrometer. Actually though, the 1-line arrays present a decent electronics project, something where I am in my league…
Seriously, I do need to pick away at these smaller influences. Still, when I went to compare results for my D7000 with a larger training set (the Munsell spectra set in dcamprof), the max DE difference was even smaller: 4.61 for a LUT profile based on my measurement vs. 4.33 for the rawtoaces monochromator-measured equivalent. I'm using my Z 6 profile from my Rube Goldberg spectroscope data to good effect now. What's that old saw about perfect being the enemy of good enough?
Yes indeed, hence the skepticism at the end. My feeling is that variance in the latter is substantially less than in the former, especially because of the difficulty in controlling DIY temperatures and drive currents inexpensively.
A chunky slice of the budget of the Spectron II went to stabilizing the source and the integrator. On the other hand I suspect that the QE curve of the inline sensor should be repeatable with decent accuracy, at least from similar batches.
4.33 for a monochromator-measured profile… I wonder if other monochromator-based setups are better? That max DE of 4.61 is very, very respectable in that regard. Kudos.
I agree insofar as there "should" be something gained from taking the optical system and the drive voltages of uncalibrated filaments out of the equation. But when a monochromator setup does not really improve on the results, it's somehow hard to justify a line sensor to calibrate against.
My best guess is that the mentioned monochromator setup still has leeway to be better but was not optimized to deliver better max DE values? A lot of people can see a max DE of over 4! I would have expected a better result from a proper monochromator setup, and yet I don't have any reference.
Now that we're talking about it, I'll probably take a few of these datasets and generate some Munsell-based LUT profiles with them to see what DE they produce. Of note is that dcamprof produces a boatload of reports and images if you tell it to, and some of those are linear TIFFs that visually depict the difference for the various DE assertions.
I did a bit more digging behind the IDT_report_v4.pdf and found out it was a separate endeavor from the rawtoaces collection. Scott Dyer did four cameras: Canon 5D Mk II and Mk III, Nikon D810, and Sony A7. The thread I posted contains a post further down where he links a Dropbox with 12 GB of files supporting this measurement; I downloaded the tab-separated data files for each camera and added them to my collection.
A consideration in all of these endeavors is that the measurements were taken through a lens. Dyer's data includes transmission data for the lenses; I plotted the Nikon D810's 24-70 f/2.8 lens and it looks like this:
Rather significant, on the order of the diffraction grating, which I'm still not including…
The "gold standard" measurement technique appears to be using the output of a calibrated monochromator to shine on the bare sensor, through an integrating sphere. Far past my budget, and I don't even have example data from such a source for comparison…
This one is a nice complement to that one. For our purposes we only need the working temperature to estimate the SPD. They use the same resistance method as in the other paper to estimate temperature, and even suggest that CCT could be used, if known.
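For reference, a sketch of that SPD estimate under the usual graybody assumption: once the working temperature is known (from the resistance method, or taken from the CCT), Planck's law gives the relative SPD:

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def blackbody_spd(wavelength_nm, temp_k):
    # Planck's law: relative spectral radiance of a blackbody at temp_k.
    # A real tungsten filament also needs an emissivity correction, but
    # as a relative SPD estimate this is the usual starting point.
    wl = np.asarray(wavelength_nm) * 1e-9
    radiance = (2 * H * C**2 / wl**5) / np.expm1(H * C / (wl * KB * temp_k))
    return radiance / radiance.max()

# e.g. a halogen bulb run near 3200 K:
spd = blackbody_spd(np.arange(380, 731), 3200.0)
```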
I like the Tokyo dataset more for its 4 nm bandwidth resolution.
Well, with such significant contributions from a lens… I'd say choose a lens that best approximates all lenses from one manufacturer (haha!) and measure with the lens. Otherwise you have a profile for a use case that no one uses. You always shoot with a lens. We had this discussion before, right?
Calibrating for every lens would be something… With these lens transmission curves one could do it in software; it would require a no-lens profile. Then for every lens attached, you multiply (divide? well, you compensate for what the lens does…) the attenuation on top of the SSF…
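Something like this, I imagine, which also settles the multiply-or-divide question: multiply to put the lens attenuation on top of a no-lens SSF, divide to take it back out. Hypothetical names; the SSFs are (wavelengths × 3) arrays on the same grid as the transmission curve:

```python
import numpy as np

def apply_lens(ssf_bare, lens_transmission):
    # No-lens profile -> with-lens profile: the lens attenuates the light
    # before it reaches the sensor, so the effective SSF is the product
    # of the bare-sensor SSF (n_wl, 3) and the transmission curve (n_wl,).
    return ssf_bare * lens_transmission[:, None]

def remove_lens(ssf_with_lens, lens_transmission, eps=1e-6):
    # The reverse: dividing a with-lens measurement by the transmission
    # curve recovers (approximately) the bare-sensor SSF.
    return ssf_with_lens / np.maximum(lens_transmission, eps)[:, None]
```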
The spectral-lens-transmission data would be a nice addition to the lensfun-db!!
Makes using SSF data a bit cumbersome, I think. You'd have to start with the sensor-only data, then apply the lens compensation before building the profile to use for a given camera-lens combination. And then there's the white point…
Cripes, now I'm wondering how to test this. What it would take is 1) a monochromator measurement of a camera sans lens, and 2) at least one measurement with ideally the same monochromator setup and camera, but with a lens in the optical chain. I'm not about to buy a monochromator at >$1000; I looked at the used eBay offerings with some skepticism. Occasionally I've considered making one; here's a nice gold standard of such: THE PULSAR Engineering
The university on the other side of the hill has an optical lab; guess I'll have to take a walk over there…
Couldn't you just use your contraption twice, first body-only and then with the lens? After a little massaging, the lens's spectral attenuation would then just be the ratio of the results. You could assemble an approximate single curve by using each channel in its active portion.
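A sketch of that massaging, assuming both captures are dark-subtracted, linear, and on the same wavelength grid (hypothetical names): take the per-channel ratio, then stitch a single curve by weighting each channel by its bare-sensor signal, so it only contributes where it is active:

```python
import numpy as np

def lens_transmission(resp_bare, resp_with_lens, eps=1e-6):
    # resp_*: dark-subtracted linear responses, shape (n_wl, 3).
    # The per-channel ratio approximates the lens transmission wherever
    # that channel has usable signal.
    ratio = resp_with_lens / np.maximum(resp_bare, eps)
    # Assemble one curve by weighting each channel by its bare-sensor
    # signal, so each channel only counts in its active portion.
    w = np.maximum(resp_bare, 0.0)
    w = w / np.maximum(w.sum(axis=1, keepdims=True), eps)
    return (ratio * w).sum(axis=1)
```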
I don't think that'll work for the lens-off capture; my workflow relies on capturing a focused spectrum where each 1 nm of wavelength spans only a couple of pixels in width.
The folks at Max-max, the camera conversion company, did a series of captures using a monochromator output transmitted to the camera via fiber, with the end of the fiber directly illuminating the sensor. They got a series of images of a blurry circle, each image capturing light from a close-to-single wavelength. That's the sort of thing I'd like to do; 'course, now there's fiber to consider…
Oh, I 100% agree! I thought the laughing smiley at the end of the paragraph was enough to signify my sarcasm/amusement about this idea!
Well, the rainbow that the grating produces just has to be as wide as your sensor… Making the slit width smaller should reduce spectral overlap… Am I missing something? The wavelength dependence, and thus the angle, of the diffraction doesn't need an optical system to be imaged… does it? No, I think it doesn't. It may help, but I think it's not necessary (if I am not overlooking something super simple here).
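The geometry here is just the grating equation; a toy sketch with made-up numbers (normal incidence, first order, flat sensor parallel to the grating) shows how distance alone spreads the spectrum, no imaging optics required:

```python
import numpy as np

def spectrum_position_mm(wavelength_nm, lines_per_mm=1000,
                         distance_mm=100, order=1):
    # Grating equation at normal incidence: d * sin(theta) = m * lambda.
    d_nm = 1e6 / lines_per_mm                      # groove spacing in nm
    theta = np.arcsin(order * np.asarray(wavelength_nm) / d_nm)
    # On a flat sensor parallel to the grating at distance_mm, each
    # wavelength lands at x = D * tan(theta): geometry alone spreads
    # the spectrum.
    return distance_mm * np.tan(theta)

print(np.round(spectrum_position_mm(np.arange(400, 701, 50)), 1))
# -> roughly 44 to 98 mm of offset across 400-700 nm with these numbers
```

Of course each point of an extended source casts its own spectrum, which is why the slit width matters for overlap.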
Multimode fibers to the rescue! https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=3255
Not too expensive. But of course coupling into the fiber is the big hassle because of the narrow acceptance angle. The couplers cost a lot, 180 EUR here…
I would at least try to just park the camera at a distance from the grating and see if you can make do without a lens.