TIFF linearization

Which parameters should I use when converting my TIFFs (with RawTherapee or dcraw) to ensure my HDRs are properly linearized and the color space (the RAWs are in AdobeRGB) is handled correctly? The HDRs will be stitched into a spherical panorama used for CG lighting purposes, hence I am worried about accurately reconstructing the scene illuminance.
Preferably the .tiffs would have a gamma curve applied, to make better use of the available bits for underexposed images and to improve denoising. What exponent/slope does LuminanceHDR expect for the 'gamma' response curve mode?
Also, is there a way to write compressed .exr files from the command line?

thanks for your help!
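Edit: regarding the compressed .exr question, one route that should work is OpenCV's Python bindings, which can pass an EXR compression flag to imwrite, so a tiny script can serve as a command-line tool. A sketch, assuming a recent OpenCV (4.3+) built with OpenEXR support:

```python
# Minimal CLI re-writer: read an image, write it back as a ZIP-compressed
# EXR. Recent OpenCV builds want this env var set before cv2 is imported.
import os
os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"

import sys
import cv2

src, dst = sys.argv[1], sys.argv[2]
img = cv2.imread(src, cv2.IMREAD_UNCHANGED)  # float32 pixels for EXR input
cv2.imwrite(dst, img.astype("float32"),
            [cv2.IMWRITE_EXR_COMPRESSION, cv2.IMWRITE_EXR_COMPRESSION_ZIP])
```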

I don’t use LHDR. Try reading: Command Line HDR Creation — Luminance HDR Manual. E.g., see

[screenshot from the Luminance HDR manual]

It is kind of confusing because I can't tell whether "linear" refers to the input files or to the tone mapping. Based on the flow of the text, I am leaning toward the input. :man_shrugging::woman_shrugging:

Yeah, I already use the hdrResponseCurve argument, but I still can't get convincing results.

By that I mean I still fail to create an HDR image from which I can reproduce the unclipped exposures (for validation). With Photomatix this works perfectly out of the box; it seems to read the response curve from the embedded .icc profile. My result with LuminanceHDR is off in terms of both tone curve and color.

The color space handling in general is a mystery to me: do I have to stick with sRGB, or is there a way to use e.g. Adobe1998 as the input/output color space?

I did dive into the code a bit and found the curve definitions. 'Gamma' seems to use slope 4.0 / exponent 1.7; the others are self-explanatory. So maybe I'll finally get at least the tonal range right.
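For reference, if the 'gamma' mode follows the common piecewise convention (a linear toe with the given slope that crosses over into a pure power segment), the encode would look roughly like the sketch below. This is my guess at the convention, not LHDR's actual code; it could just as well use an sRGB-style form with an offset.

```python
def gamma_slope_encode(L, exponent=1.7, slope=4.0):
    """Encode a linear value L in [0, 1]: linear toe with the given slope,
    switching to a pure power segment where the two curves cross."""
    t = slope ** (-exponent / (exponent - 1.0))  # crossover, ~0.035 here
    return slope * L if L <= t else L ** (1.0 / exponent)
```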


In the image I only show the ‘linear’ response from LHDR, but I tried the other responses too.

Better have @fcomida answer your questions.

The raws are not in AdobeRGB; they are in the camera's space, as represented by the input profile.

Can you explain the problem more clearly?

Thanks for your response - I will clarify:

  1. I want to make sure my (automated) HDR pipeline centered around LHDR and RT outputs values as close to the scene's absolute values as possible. That includes color rendition, but I made the mistake of mixing up gamut/primaries and tone curve in my explanation. I just meant that, since I am capturing in AdobeRGB, I would like to retain that gamut.

  2. I thought converting my RAW files to linear TIFFs would allow me to assume a linear response curve in Luminance HDR (Debevec model). I tried the same with sRGB (2.4/12.92) and gamma (1.7/4.0) response curves, but it seems that the camera's sensor itself returns nonlinear values. I think so because with an automatically recovered response (Robertson) I get far more plausible values (which are also very close to Photomatix's result).

  3. Now I am unsure how to proceed. As consistency of the results is important, I don't want to stick with automatically recovering a response for each individual bracketed set. I can either pre-calculate my camera's response based on a few test sets, average them together, and use that for all my HDR merges; since I don't know what the sensor does exactly, this might fail for different ISO settings. Alternatively, I can find TIFF conversion / response curve combinations that yield good results by trial and error.

  4. For validation I captured a scene with a large dynamic range and measured the nits of a few points of interest with a (Gossen Starlite) spot meter. I uploaded it HERE. Since the whole process is full of possible error sources and my knowledge of this matter is limited, I would welcome any proposed workflows to achieve the defined goal (1). A sketch of how I'd check the merged values against the meter readings follows this list.
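Here is the kind of check I have in mind for (4). HDR merges are only defined up to a global scale, so I'd fit that scale on one measured patch and see how well the remaining spot meter readings are predicted. The patch names and values below are placeholders, not my actual measurements:

```python
import numpy as np

# Hypothetical patches: 'measured' from the spot meter (cd/m^2), 'hdr'
# sampled as the mean luminance (Y) of the same patches in the merge.
measured = {"wall": 120.0, "window": 8500.0, "shadow": 4.2}
hdr      = {"wall": 0.031, "window": 2.21,   "shadow": 0.0011}

scale = measured["wall"] / hdr["wall"]   # anchor the global scale once
for name in measured:
    pred = scale * hdr[name]
    print(f"{name}: measured {measured[name]:.1f} cd/m2, "
          f"predicted {pred:.1f} ({pred / measured[name]:.2f}x)")
```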

I hoped to answer some of your questions by verifying a few things empirically, but the only thing I managed to verify is that there are bugs afoot.

I believe the response of most cameras is linear. Some raw files contain a “raw curve”, which I suppose might result in the raw files being non-linear, though I haven’t verified whether this affects what you’re doing. I’d guess it does.

You’re capturing in whatever gamut your camera has. The colorspace setting in the camera affects only the embedded JPEG preview, not the raw file.

It should. There are two ways of doing this in RawTherapee 5.7:

  • The old way: use the "Save Reference Image" button in the Color Management tool.
  • The new and better way: use RawTherapee's ICC Profile Creator to create a custom ICC profile with a linear gamma and whatever color space you need, e.g. ProPhoto, then save the resulting ICC file in the folder specified as the "Directory containing color profiles" in RawTherapee. Restart RawTherapee and the new ICC profile will become available as an output profile.

The resulting TIFF images appear correct when viewed in Geeqie without using a color profile: a TIFF saved using a ProPhoto gamma=1.0 output profile looks different, as expected, from one saved using a ProPhoto gamma=2.2 output profile. The actual pixel values are different and the profiles describe that, so when the images are viewed using the embedded profiles they look identical. So far so good.
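That comparison can be scripted as well. A sketch with hypothetical filenames, assuming the gamma=2.2 profile is a pure power curve with no linear toe:

```python
import numpy as np
import tifffile  # pip install tifffile

lin = tifffile.imread("set_g10.tif").astype(np.float64) / 65535.0
g22 = tifffile.imread("set_g22.tif").astype(np.float64) / 65535.0

# Decoding the gamma-2.2 encode should recover the linear pixel values.
print("max abs difference:", np.abs(g22 ** 2.2 - lin).max())
```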

Here's where things get odd. I saved one bracketed set using an ICC v2 ProPhoto gamma=1.0 profile and one set using ICC v4 ProPhoto gamma=2.2, both using the ProPhoto working profile with TRC=none. I saved another set using ICC v4 ProPhoto gamma=2.2 and the ProPhoto working profile with TRC gamma=2.2, slope=0. I created three HDRs, one from each set, using Luminance HDR v2.6.0-249-g77752acd, and created camera response curves for each.

The resulting camera response curve files differed from each other in the "log10(response Ir)" column, but I haven't succeeded in plotting a meaningful chart using LibreOffice. All three charts appear linear; here's one:

[chart: one of the recovered response curves]

tiff16ic_rtv4l_22.m.txt (412.2 KB) tiff16ic_lin.m.txt (412.2 KB)
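If someone wants to retry the plotting in Python rather than LibreOffice, here is a sketch. It assumes the attached .m files are whitespace-separated text columns (a guess based on the column header quoted above), so skiprows/usecols will need adjusting to the real layout:

```python
import numpy as np
import matplotlib.pyplot as plt

data = np.loadtxt("tiff16ic_lin.m.txt", comments="#", skiprows=1)
plt.plot(data[:, 0], data[:, 1])   # column 1: "log10(response Ir)"
plt.xlabel("pixel value")
plt.ylabel("log10(response Ir)")
plt.show()
```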

I don't know whether Luminance HDR is calculating the response curves correctly, and/or whether I'm charting them correctly. It would be good if you benchmarked against some third program, one which is known to calculate response curves correctly.

If anyone knows how to calculate camera response curves in a program which is known to work correctly in Linux, do tell.
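One candidate, though I haven't benchmarked it against LHDR: OpenCV ships the Debevec & Malik response recovery (and a Robertson variant) and runs fine on Linux. A minimal sketch; the filenames and exposure times are placeholders, and it expects 8-bit input images:

```python
import cv2
import numpy as np

files = ["e1.tif", "e2.tif", "e3.tif"]                    # placeholders
times = np.array([1/30, 1/125, 1/500], dtype=np.float32)  # seconds

images = [cv2.imread(f) for f in files]   # imread converts to 8-bit BGR
calib = cv2.createCalibrateDebevec()      # or cv2.createCalibrateRobertson()
response = calib.process(images, times)   # 256 x 1 x 3 response curve
np.savetxt("resp_set1.csv", response.reshape(256, 3), delimiter=",")
```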

16-bit integer TIFF, output profile ProPhoto, gamma=2.2: https://filebin.net/odkqnzb2zdialclk
16-bit integer TIFF, output profile ProPhoto, gamma=1.0: https://filebin.net/w44f6j4jag2tdjgr

@Elle is good at this stuff but she doesn't use this forum anymore. Her email is below.

If the tiffs have a gamma curve not equal to 1.0, they wouldn’t be linear.
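A quick way to test that on the actual files: two frames 1 EV apart should differ by a factor of ~2 in unclipped areas if the encoding really is linear. A sketch with hypothetical filenames:

```python
import numpy as np
import tifffile  # pip install tifffile

a = tifffile.imread("ev_0.tif").astype(np.float64)       # brighter frame
b = tifffile.imread("ev_minus1.tif").astype(np.float64)  # one stop darker

top = np.iinfo(np.uint16).max
mask = (b > 0.02 * top) & (a < 0.9 * top)  # avoid noise floor and clipping
print("median ratio (expect ~2.0 if linear):", np.median(a[mask] / b[mask]))
```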

An estimated response should show at least some noise, though.
Anyway, I seem to get very similar response curve estimations regardless of the tone curve I apply to my TIFFs (linear or whatever gamma/slope). This might indicate that the sensor of my Sony A99V responds nonlinearly.
I still need to average the responses of different sets together to reduce error noise and analyze the impact of ISO on the curve. And maybe test it with another camera.
I capture 55 images at 0.3 EV apart.

linear:
[plot: resp]

with gamma curve (exp/slope = 1.7/4.0):
[plot: resp_gamma]
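The averaging step could look like the sketch below, assuming the per-set responses were exported as CSV (e.g. from the OpenCV sketch earlier in the thread); since estimated responses are only defined up to a global scale, each set is normalized before averaging:

```python
import numpy as np

files = ["resp_set1.csv", "resp_set2.csv", "resp_set3.csv"]  # placeholders
curves = [np.loadtxt(f, delimiter=",") for f in files]       # 256 x 3 each

norm = [c / c[128] for c in curves]  # anchor each set at mid-scale
avg = np.mean(norm, axis=0)
np.savetxt("resp_avg.csv", avg, delimiter=",")
```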