How do I use dcamprof to color match to the color target values?

If you’re using the current reference file workflow, these settings are ignored for the reference file.

I’ve got an open pull request for an alternate workflow that doesn’t use that code path and instead sets up an appropriate processing profile. It’s useful if you want to open the reference image in other software (GIMP to pull fiducial coordinates, Hugin to defish a fixed-lens fisheye camera).

Note that if you want to match exposure settings, that’s getting into land I don’t mess with, at least not with dcamprof. All of the standard RawTherapee profiles are designed not to depend on exposure.

If you want to match the JPEG tone curve, you’re in a completely different land of response function reverse engineering.

It’s not really matching exposure but reconstructing a linear curve using the color target. 3D LUT Creator did this after the profiling stage; I just want to see if I can do it during the profiling stage.

What camera do you have that has a nonlinear sensor response needing reconstruction? I remember seeing an article saying a couple of models (I can’t remember which ones) have their ADC saturation set to a point where the photosites saturate first, and those had a noticeable nonlinear shoulder before clipping. But the vast majority of cameras are highly linear up to the point of hard clipping, and any nonlinearities are going to be very minor and require a lot more than a single ColorChecker shot to identify and characterize.

I think I see where this is coming from…

See this video…

That only applies to video (and in practice, JPEGs too) since those routinely have nonstandard transfer functions applied, and is of no relevance to raw profiling via the procedure OP linked in their first post.

Using a ColorChecker will only roughly approximate the transfer function. If you want a more exact transfer function, you need to analyze both the RAW and JPEG shots from a camera with no distortion correction applied (RawTherapee’s AMTC is one such approach), with a decent gradient in the image so you get data at all luminance points. Alternatively, use a transfer-function recovery algorithm such as Robertson’s algorithm (see OpenCV: High Dynamic Range (HDR) for some info on Robertson’s algorithm if you don’t want to read his paper; OpenCV, LuminanceHDR, and my own GitHub - Entropy512/camResponseTools: Miscellaneous tools for reverse engineering camera response curves are implementations of it). Note that both Robertson’s and Debevec’s algorithms tend to be inaccurate at estimating the response function near the clip point, because they’re intended to support HDR reconstruction and the clip point has zero weight in that phase.
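To make the alternating structure of Robertson’s algorithm concrete, here is a minimal NumPy sketch run on synthetic 8-bit data. This is *not* the OpenCV, LuminanceHDR, or camResponseTools implementation — the synthetic camera, exposure times, and weighting are all made up for illustration, and the real algorithm adds details (weighted g-step, convergence criteria) omitted here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "camera": code value z = 255 * lin**(1/2.2), hard-clipped.
n_pix = 2000
times = np.array([1/60, 1/15, 1/4])            # three bracketed exposures (made up)
irradiance = rng.uniform(0.05, 0.9, n_pix)     # per-pixel scene irradiance

def expose(E, t):
    lin = np.clip(E * t / times[-1], 0.0, 1.0)
    return np.clip(np.round(255 * lin ** (1 / 2.2)), 0, 255).astype(int)

z = np.stack([expose(irradiance, t) for t in times], axis=1)  # (n_pix, 3)

# Triangular weight: trust mid-tones, zero weight at the clip point --
# which is exactly why the response is poorly estimated near clipping.
w = np.minimum(np.arange(256), 255 - np.arange(256)).astype(float)

g = np.linspace(1e-3, 1.0, 256)                # inverse response, linear initial guess
for _ in range(100):
    # Step 1: estimate per-pixel irradiance from the current response.
    E = (w[z] * g[z] * times).sum(axis=1) / np.maximum((w[z] * times**2).sum(axis=1), 1e-12)
    # Step 2: re-estimate g(m) as the mean of E*t over samples with code m.
    num = np.zeros(256)
    cnt = np.zeros(256)
    np.add.at(num, z.ravel(), (E[:, None] * times).ravel())
    np.add.at(cnt, z.ravel(), 1)
    seen = cnt > 0
    g[seen] = num[seen] / cnt[seen]
    g /= g[128]                                # fix the arbitrary scale at mid-gray

# The recovered inverse response should follow the true 2.2 power law in the
# well-sampled mid-range, i.e. g(200)/g(100) should be near 2**2.2.
ratio = g[200] / g[100]
print(round(ratio, 2))
```

The mid-tone-weighted averaging is also where the clip-point inaccuracy mentioned above comes from: codes near 255 get near-zero weight, so the estimate there is essentially unconstrained.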

Or just shoot RAW, which is linear to begin with for the vast majority of cameras on the market.

I just pulled up my raw calibration shot that was used to profile my A7M4 in daylight. If I:

  1. Choose the Neutral profile
  2. Set white balance on the brightest white patch
  3. Adjust exposure compensation so that the brightest white patch has an L value of 96.3 when I hover over it, per https://xritephoto.com/documents/literature/en/ColorData-1p_EN.pdf
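For reference, the L value here is CIE 1976 L*, which is a fixed function of relative luminance Y. A quick sanity check (the ~91% white-patch reflectance below is my own assumed round number, not a value from the datasheet) lands right around the target:

```python
def cie_Lstar(Y, Yn=1.0):
    """CIE 1976 L* from relative luminance Y (Yn = reference white)."""
    t = Y / Yn
    # Cube root above the linear-segment threshold, linear ramp below it.
    f = t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    return 116 * f - 16

# ColorChecker white patch: roughly 91% reflectance (assumed for illustration)
print(round(cie_Lstar(0.913), 1))  # 96.5
```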

Edit: BTW, this is after generating the profile, so the DCP profile generated using the “standard RawTherapee method” is used. This will potentially adjust luminance based on hue/saturation, but as a purely linear function of input luminance, i.e. a “2.5D” LUT. The shot is, by definition, captured at the exact illuminant conditions that were used to generate the DCP, since it IS the shot I generated the Sunlight part of the DCP from.
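To spell out what “2.5D” means here: the gain is looked up from hue and saturation only, then applied as a plain multiplier to luminance, so the output stays linear in input luminance. A hypothetical sketch (the gain surface below is made up, not taken from any real DCP):

```python
def apply_25d(lum, hue, sat, gain):
    """Scale luminance by a factor that depends only on (hue, sat).
    The output is a purely linear function of input luminance."""
    return gain(hue, sat) * lum

# Made-up gain surface for illustration only.
gain = lambda h, s: 1.0 + 0.1 * s

# Doubling input luminance doubles output luminance -- linearity in L.
out1 = apply_25d(0.4, 30.0, 0.2, gain)
out2 = apply_25d(0.8, 30.0, 0.2, gain)
print(round(out2 / out1, 6))  # 2.0
```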

Then I get the following L values for the patches, with the target values from that datasheet listed for comparison:

| Patch | Measured | Reference |
|-------|----------|-----------|
| 19    | 96.5     | 96.539    |
| 20    | 81.7     | 81.257    |
| 21    | 67.7     | 66.766    |
| 22    | 50.7     | 50.867    |
| 23    | 36.5     | 35.656    |
| 24    | 20.5     | 20.461    |

Basically within measurement error.
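Quantifying that claim with the numbers from the table above: the largest deviation across the six neutral patches is under one L* unit.

```python
# Measured vs. reference L* values for the ColorChecker neutral patches
measured  = {19: 96.5, 20: 81.7, 21: 67.7, 22: 50.7, 23: 36.5, 24: 20.5}
reference = {19: 96.539, 20: 81.257, 21: 66.766, 22: 50.867, 23: 35.656, 24: 20.461}

worst = max(abs(measured[p] - reference[p]) for p in measured)
print(round(worst, 3))  # 0.934, on patch 21
```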


I agree. I just think (and I could be wrong) that the OP drifted a couple of times into talking about 3D LUT Creator, and in that second video the presenter uses words like “reconstruct the linear response” or something similar, so I figured that was the likely source of those comments.

How does the color compare to the target after adjusting the exposure?

Obviously if you adjust exposure away from what lands patch 19 at an L value of 96.5, none of the L values will match any more.