I should have said: in my script, “childSpoon.jpeg” is the image posted in the OP.
I might add: the method is perfect, in the sense that the resulting histograms match. But for perceptual perfection, we want isolated elements to match: the background in the two images should match, the visible part of the child’s face in the two images should match, the jug in the two images should match, and so on. Unless all the elements occupy the same area in both inputs, this won’t happen.
The match-histogram method uses statistics from entire images to calculate the CLUT. But that’s not really what we want.
Another, more complex, approach is to crop small matching areas from both inputs: the wall, the jug, the face, and so on, and collect statistics from those small areas. Then match the histograms based on each pair of areas. This gives as many remapped images as there are areas, so we blend these using some 2D method: Shepard's interpolation, or triangulation and barycentric coordinates, or whatever.
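To make the idea concrete, here is a rough numpy sketch of that approach, under my own assumptions about the details: each region pair yields a per-channel CDF (histogram) match computed from the patches only but applied to the whole source image, and the per-region results are blended with inverse-distance (Shepard) weights measured from each region's centre. The function and parameter names are mine, not from any particular tool.

```python
import numpy as np

def region_match(src, src_patch, tgt_patch):
    """Remap the whole src image using a per-channel histogram (CDF)
    match computed only from one pair of matching patches."""
    out = np.empty(src.shape, dtype=np.float64)
    for c in range(src.shape[-1]):
        s_vals, s_cnt = np.unique(src_patch[..., c], return_counts=True)
        t_vals, t_cnt = np.unique(tgt_patch[..., c], return_counts=True)
        s_cdf = np.cumsum(s_cnt) / s_cnt.sum()
        t_cdf = np.cumsum(t_cnt) / t_cnt.sum()
        # value each source level should become to equalise the CDFs
        remap = np.interp(s_cdf, t_cdf, t_vals)
        out[..., c] = np.interp(src[..., c], s_vals, remap)
    return out

def shepard_blend(src, tgt, regions, power=2.0):
    """regions: (y0, y1, x0, x1) boxes showing the same element in both
    inputs. One remapped image per region, blended with inverse-distance
    (Shepard) weights from each region's centre."""
    h, w = src.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    acc = np.zeros((h, w, src.shape[-1]))
    wsum = np.zeros((h, w, 1))
    for (y0, y1, x0, x1) in regions:
        mapped = region_match(src, src[y0:y1, x0:x1], tgt[y0:y1, x0:x1])
        d = np.hypot(yy - (y0 + y1) / 2, xx - (x0 + x1) / 2)
        wgt = (1.0 / (d ** power + 1e-6))[..., None]
        acc += mapped * wgt
        wsum += wgt
    return acc / wsum
```

With a single region this degenerates to a plain whole-image remap; the blending only starts to matter once two or more regions pull the CLUT in different directions.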
darktable has a color mapping module with histogram mapping and LUT, and also a deflickering option to match exposure from photo to photo (intended to be used for timelapses, but it should do the trick).
@snibgo Thanks a lot for your detailed instructions. I understand the principle of your approach. But it will take a while till I understand exactly what the script is doing.
This tweaks the RGB channels, giving each a multiplier and an addition so that the mean and standard deviation match the target's. This is simpler than the match-histogram method, and the result is fairly good. When the range of input colours is small, it is more reliable than match-histogram, so it might be better for a method that chops one image into tiles, finds the tiles in the other input, matches pairwise, then blends the results.
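The multiplier-and-addition idea can be sketched in a few lines of numpy; this is my own minimal version of the technique described above, not the actual script:

```python
import numpy as np

def match_mean_std(src, tgt):
    """Per channel: out = gain * src + offset, with gain and offset
    chosen so the output's mean and standard deviation equal the
    target's (gain = sd_tgt / sd_src, offset folds in the means)."""
    out = np.empty(src.shape, dtype=np.float64)
    for c in range(src.shape[-1]):
        mu_s, sd_s = src[..., c].mean(), src[..., c].std()
        mu_t, sd_t = tgt[..., c].mean(), tgt[..., c].std()
        gain = sd_t / sd_s if sd_s > 0 else 1.0
        out[..., c] = gain * (src[..., c] - mu_s) + mu_t
    return out
```

Because the transfer is a straight line per channel, it cannot overshoot the way a sparse histogram match can when the input colour range is narrow, which is presumably why it behaves better on small tiles.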
I do my raw inspection in Mathematica, because so far I have never really bothered to learn Python. If you know Python, you could use rawpy (on PyPI), which uses libraw to import your files, and try to find what you’re looking for.
I was curious to analyze my RAW histograms in more detail: viewing each sensor channel (R, G1, G2, B) separately… I know I can get this in RT with no demosaicing and the inspector… but
I’m amazed that in FOSS there isn’t software like RawDigger.
I’ve taken up the endeavor a couple of times, but put it aside as more pressing programming arose (sounds more ‘official’ than it is; really it’s about what I felt like doing day-to-day… ). I did put a bit of work into rawproc’s histogram, and it meets my needs rather well…
I have put some time into a command-line program that reads a raw file courtesy of libraw, then walks the image array, collects and sums the channel data per value, and pukes it out as comma-separated text suitable for opening in your favorite spreadsheet program. LibreOffice makes a nice histogram of the data with its column chart. It works okay on my test raw, but it’s not easily compiled by non-programmers in its present state.
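For anyone who would rather not compile anything, the same walk can be sketched in Python. This is my own rough equivalent, not that program: I assume rawpy's `raw_image` / `raw_colors` attributes for getting at the mosaic (with channel indices in libraw's usual R, G1, B, G2 order), and demonstrate the counting on a plain numpy array.

```python
import csv
import numpy as np

# In practice the mosaic would come from libraw via rawpy, e.g.
#   raw = rawpy.imread("photo.nef")
#   mosaic, colors = raw.raw_image, raw.raw_colors
# where colors holds a channel index per photosite
# (assumed order: 0=R, 1=G1, 2=B, 3=G2).
def channel_counts(mosaic, colors, max_val):
    """Count how often each raw value occurs in each of the four
    channels; returns a (max_val + 1, 4) table of counts."""
    table = np.zeros((max_val + 1, 4), dtype=np.int64)
    for ch in range(4):
        vals = mosaic[colors == ch]
        table[:, ch] = np.bincount(vals, minlength=max_val + 1)
    return table

def write_csv(table, path):
    """Dump the table as CSV, one row per raw value, one column
    per channel, ready for a spreadsheet's column chart."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["value", "R", "G1", "B", "G2"])
        for v, row in enumerate(table):
            w.writerow([v, *row])
```

`max_val` would be the sensor's white level (e.g. 16383 for a 14-bit raw); the resulting file opens directly in LibreOffice.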
Really, RawDigger isn’t such a commercial abomination; it helps fund libraw, the open-source core library that some of us use in our raw processing programs.
PS. Idea… what if RT (which already computes a RAW histogram, but scaled to the 0–255 range for GUI and speed reasons) could export a RAW histogram at 1:1 scale?