Scene-referred editing with OCIO-enabled software

How difficult is it for a normal user to set up a custom config for their specific calibrated display?

For the archetype of the person who cares about refined profiling, relatively easy. OCIO V2 will make it far easier for those accustomed to ICC profiles.

Out of curiosity, what is the difference between refined profiling and profiling? Is “refined profiling” a technical term in OCIO terminology?

OK, let me provide a very specific scenario:

A Krita user reads the Krita documentation about “Scene linear painting” and says “Wow, that sounds neat. I’d like to try that out”.

In the OCIO config file provided by @anon11264400 (Filmic Blender | filmic-blender) for use with the Krita scene-linear painting tutorial, there are two 3D LUTs and nine 1D LUTs.

Here’s the list of 3D LUTs:

  • desat65cube.spi3d
  • Filmic_False_Colour.spi3d

Here’s the list of 1D LUTs:

  • Filmic_to_0-60_1-04.spi1d
  • Filmic_to_0.99_1-0075.spi1d
  • Filmic_to_0-35_1-30.spi1d
  • Filmic_to_0-70_1-03.spi1d
  • Filmic_to_1.20_1-00.spi1d
  • Filmic_to_0-48_1-09.spi1d
  • Filmic_to_0-85_1-011.spi1d
  • sRGB_OETF_to_Linear.spi1d
  • BT.1886_to_Linear.spi1d

Let’s say a given Krita user:

  • Wants to edit in linear ACEScg or even in a custom linear RGB color space.
  • Has calibrated and profiled their monitor using ArgyllCMS.

Which of the LUTs included in the “filmic-blender” file that goes along with the Krita tutorial will this particular user need to modify?

Read my reply above again; I already answered that.
We are using Filmic Blender and sRGB for this example for the sake of simplicity.
I think it would be much more constructive if you tried to understand the model and workflow using this simpler, existing config, and when you’re ready, then and only then, learn how to produce your custom config for whatever reference and output you want.
You’re still trying to figure out how to adjust exposure in the view; speaking about ACES at this point seems quite a stretch.

My apologies, but you gave a general answer that seemed to sidestep the question.

So I rewrote my question as a specific and very plausible scenario.

I’m asking a simple and very specific question. You seem to be well experienced with using OCIO, so I’m confident that you can actually give a specific answer rather than merely a general one.

If you would answer my specific question with a specific answer instead of a generalized “non-answer”, maybe I’ll learn something. Maybe other people might learn something.

If you refuse to answer my specific question (which LUTs will the user need to modify, or perhaps find already suitably modified elsewhere, and if so, where?), then I haven’t learned anything at all. And neither has anyone else.

OCIO is open source and there are several configurations you can find and tweak if you don’t know how to produce yours. Check SPI’s configs, mentioned in the Nuke video.
LUTs and configs could be collected by this community, and a suitable config could be produced by the community for the main photography apps, so less experienced users don’t have to deal with custom configs.
There’s a wide range of possibilities, but before jumping on that, it’s imperative that you understand the model.
Telling you now which LUTs you need would just invite even more confusion, as you still don’t seem to understand that those LUTs depend on the reference you pick.

Which LUTs in the filmic-blender set provided for Krita can be used “as is” in the specific scenario I described, without modification or replacement?

This question calls not for generalities, but for a list of which LUTs can be used “as is” from the filmic-blender set, given the user’s specific set of involved color spaces.

None. What part of “Blender’s OCIO config was designed for an sRGB output and a linear Rec.709 reference” don’t you understand?
And what is your point again? That you can’t do what you want with OCIO?
Because it’s actually that you can’t do what you want with that config, hence you need a different one.

Now, if that was an unorthodox way of asking for help with producing your own OCIO config, then apologies. I can help you with that when the time comes, but you really need to wrap your head around the model first.

Please, let’s keep the discourse civil. There is no need to be rude.

Well, for starters, I wasn’t exactly expecting “none” as the answer. I sort of thought that maybe some of the LUTs actually were usable in situations where the reference isn’t Rec.709 and the output isn’t sRGB. So now I’ve learned something. Thanks!

Well, I rather suspect that a lot of what I want to do when editing my images can just as well be done using OCIO as ICC, given the right config files and LUTs. And now I know that “the right config files and LUTs” isn’t just a matter of taking care of the monitor profile.

Again, I’m not hostile to OCIO, I think the ACES/OCIO/etc way of handling images is fascinating. I get an odd little glimpse into that world every now and again because I subscribed to the openexr dev list quite a while back, to garner info on how GIMP should handle exr files, and never unsubscribed. And periodically I peruse the internet to learn a little more.

I am rather hostile to the degree to which @anon11264400 and @gez have repeatedly bashed ICC profile color management. And I wish to never hear the word garbage or leprechaun in this discussion again, though I suspect that’s a hope in vain. But these specific hostilities are entirely aside and apart from an educational discussion of what OCIO is and how to use OCIO to accomplish various editing tasks.

It seems the entire pipeline of LUTs, from start to finish, does require tailoring to the color space of the image (the “reference” color space, yes?) and to the color space of the monitor, which in the case of the filmic LUTs shares the output color space’s primaries if not its TRC. And in the case of an output color space that doesn’t share primaries with the monitor color space, it seems one or more additional LUTs are required, yes?

It might help people better understand OCIO workflows if you could provide a short, one-line description of what the various LUTs in the filmic_blender zip file actually do, in terms of “transforms from something to something, for some purpose”.

On this forum we have already had quite a few discussions about display calibration/profiling, something which already confuses a lot of users.

The good thing is that once the calibration/profiling procedure is completed correctly, the user is left with an ICC profile that can be used directly with our FLOSS image editors. All that is needed is to set that ICC profile as the “display profile” and load the associated LUT into the video card (something which is well documented).

If we are now going to add another complex configuration layer on top of this, I am afraid we will discourage a lot of users from following this path. Bear in mind that this is not criticism, nor an attempt to discredit the OCIO workflow… this is just common sense.

We need a plan.

Worry about those who can see the problems of shoehorning scene-referred data into a display-referred chain. Either they will see that or they won’t.

In many instances, Rec.709 references are all that are required, given that many folks can’t see or understand the needs/limitations/complexities of an alternate reference space.

When designing Filmic, the goal was to be as straightforward as possible for experienced pixel pushers. The video Andrew Price did has nearly a million views. Of the many emails I have received, mostly from experienced imagers with large numbers of paid professional hours racked up, very few were asking about display profiles or alternate reference spaces. That is, those are very important aspects, yet a non-issue for most people coming to terms with a scene-referred reference space model.

No, not at all. I’m not ready to make another OCIO config file just yet. I’m still trying to figure out simple stuff such as “when @gez says do this or that in Blender, what’s he talking about?” and “why exactly do I need to put a gray card in my scene?”

Obviously I’m also working on figuring out things such as “how specific are the various LUTs to a given working/display/output space?”, to which I now know the answer is “very”.

By comparison, in PhotoFlow (using ICC profile color management), the filmic curves operation only requires linear RGB, isn’t specific to any pre-specified primaries, and can be fine-tuned “on the fly” using the sliders, with defaults very close to what the filmic_blender LUTs do. But this isn’t the way OCIO works, which is not something that was obvious to me. Well, I was pretty sure there was no “fine tuning on the fly”, but the “filmic LUT is specific to input/output color spaces” part did surprise me.

So yes, this OCIO stuff is rather different from what I’m used to. I did make a custom OCIO config a long time ago, that went from Rec.2020 to my custom monitor profile. But it’s not good any more because I’ve long since updated my monitor profile. In ICC profile color management updating one’s monitor profile doesn’t require updating other stuff, because of the XYZ/LAB PCS.

Right now, in rawproc, I’m probably going to build an OCIO-transform tool that can be inserted like any other tool in the processing pipeline. True to rawproc philosophy, it’ll be there to use or misuse as one wishes…

With that, I’m going to keep my ICC code intact, ostensibly for the following purposes:

  • colorspace-convert my camera-gamut image to Rec.709, or Rec.2020 when I learn to make OCIO configs.
  • post-OCIO-transform, bring the image to the calibrated display gamut. My home displays are pretty much sRGB, but I have three horrid displays at work; I intend to calibrate them when I can come in off-hours with my colorimeter.

I can probably get away with loading raws using sRGB/linear, instead of raw/linear and then colorspace-converting to Rec.709.

Since OCIO modifies the image array in-place, I don’t think I’ll have to make changes to the image library. Just get a pointer to the image array, and call processor->RGBapply(imgptr);
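
For reference, the OCIO v1 C++ API (the version being compiled in this thread) spells that apply step as Processor::apply on an ImageDesc wrapping the buffer. A minimal sketch, with the config lookup, the scene_linear/default role pair, and the packed-RGB layout assumed for illustration:

    #include <OpenColorIO/OpenColorIO.h>
    namespace OCIO = OCIO_NAMESPACE;

    // Apply an OCIO transform in place on a packed width*height*3 float buffer,
    // which is roughly what rawproc would hand over via a pointer.
    void ocio_transform_inplace(float *pixels, long width, long height)
    {
        // Config named by $OCIO; OCIO::Config::CreateFromFile("config.ocio")
        // works just as well for an explicit path.
        OCIO::ConstConfigRcPtr config = OCIO::GetCurrentConfig();

        // Processor from the scene-linear role to the default role
        // (the same pair the ociotransform tool uses further down-thread).
        OCIO::ConstProcessorRcPtr processor =
            config->getProcessor(OCIO::ROLE_SCENE_LINEAR, OCIO::ROLE_DEFAULT);

        // Wrap the existing buffer; apply() modifies it in place.
        OCIO::PackedImageDesc img(pixels, width, height, 3);
        processor->apply(img);
    }

OCIO calls can throw OCIO::Exception, so a real tool would wrap this in a try/catch.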

I think this setup will be fairly easy to insert, and will give me sufficient capability to learn and compare with the olden ways. I did @gez’s Blender exercise and it was instructive regarding how to integrate. I compiled OpenColorIO last night, easy-peasy compared to LensFun, so I’m shelving lens correction for a bit in exchange for some immediate gratification.

With DisplayCAL you can do that easily from your existing device ICC profile (the 3D LUT Maker).

Well, if your software already has display correction via ICC, I guess you could hook the output from the OCIO device and correct it.
It wouldn’t be too different from what other applications do when converting from the “working space” to the screen space.
So let’s say your OCIO device is sRGB (as an example; it could be anything you want). You could take the display-referred output through your ICC screen correction (from sRGB to your display) and that’s it.
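
A minimal sketch of that ICC leg, assuming lcms2 for the screen correction and a hypothetical monitor.icc path; the input buffer is assumed to already hold the display-referred sRGB floats coming out of the OCIO view:

    #include <lcms2.h>

    // Second-stage ICC correction: take the display-referred sRGB floats that
    // come out of the OCIO view and map them to the profiled display.
    void srgb_to_display(const float *in, float *out, int npixels,
                         const char *monitor_icc)   // hypothetical "monitor.icc"
    {
        cmsHPROFILE srgb    = cmsCreate_sRGBProfile();
        cmsHPROFILE monitor = cmsOpenProfileFromFile(monitor_icc, "r");

        cmsHTRANSFORM xform = cmsCreateTransform(
            srgb,    TYPE_RGB_FLT,              // source: display-referred sRGB
            monitor, TYPE_RGB_FLT,              // destination: the profiled display
            INTENT_RELATIVE_COLORIMETRIC, 0);

        cmsDoTransform(xform, in, out, npixels); // npixels = pixel count, not bytes

        cmsDeleteTransform(xform);
        cmsCloseProfile(monitor);
        cmsCloseProfile(srgb);
    }

Error handling (cmsOpenProfileFromFile returns NULL on a missing profile) is left out of the sketch.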

That.

The other option is to produce a 3D LUT and chain it to the OCIO view in the config so the output is already corrected for your specific device.
It may seem a bit more complicated than just having your screen profile installed, but I’m going to go out on a limb and say that if you managed to produce your own screen profile it shouldn’t be particularly difficult to create the LUT you need for your OCIO config.

That being said, though, I admit that fiddling with text files is scary for some people and the current implementations could be more friendly in terms of UI.
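
For what it’s worth, here is a hedged sketch of what that chaining amounts to at apply time with the OCIO v1 C++ API; conceptually it mirrors appending a FileTransform to the view’s colorspace in the config file. The device LUT filename is purely hypothetical and would have to be findable via the config’s search_path:

    #include <OpenColorIO/OpenColorIO.h>
    namespace OCIO = OCIO_NAMESPACE;

    // Build a processor that runs the config's default display/view transform
    // and then a device-correction 3D LUT (e.g. one baked with DisplayCAL).
    OCIO::ConstProcessorRcPtr view_plus_device(OCIO::ConstConfigRcPtr config)
    {
        const char *display = config->getDefaultDisplay();

        // The normal view transform: scene-linear reference -> display output.
        OCIO::DisplayTransformRcPtr view = OCIO::DisplayTransform::Create();
        view->setInputColorSpaceName(OCIO::ROLE_SCENE_LINEAR);
        view->setDisplay(display);
        view->setView(config->getDefaultView(display));

        // The device correction, tacked on after the view.
        // "mydisplay.spi3d" is a hypothetical filename.
        OCIO::FileTransformRcPtr devlut = OCIO::FileTransform::Create();
        devlut->setSrc("mydisplay.spi3d");
        devlut->setInterpolation(OCIO::INTERP_TETRAHEDRAL);

        // Chain the two, just as a config would chain them inside a view's
        // colorspace definition.
        OCIO::GroupTransformRcPtr chain = OCIO::GroupTransform::Create();
        chain->push_back(view);
        chain->push_back(devlut);

        return config->getProcessor(chain);
    }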

So, there are configuration-file entries to specify primaries and white point for a transform; or do you generate a LUT from them using one of the apps?

Have you ever seen a BIND configuration? The DNS lookup server from the olden days. I’ve done those; nothing in a text file scares me… :skull:

Depends on what you mean here.

If you are talking about taking something to_reference or from_reference, it is relatively trivial via a couple of matrix transforms. The matrices are encoded as four sets of four values, row by row. Assuming a matrix or 1D LUT is invertible, OCIO will create the inverted direction automatically. In the case of a view transform, it is prudent to make sure your from_reference stanza is filled, as it is used most frequently in view output transforms and will be far more performant.
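
To make the “four sets of four, row by row” layout concrete, here is a small worked example around the standard Rec.709-to-XYZ (D65) matrix, padded from 3×3 to the 4×4 row-major form a config’s matrix transform expects; the apply function is plain illustration code, not an OCIO call:

    // Rec.709 RGB -> CIE XYZ (D65), padded out to the 4x4, row-major layout an
    // OCIO matrix transform expects: "four sets of four values, row by row".
    const float rec709_to_xyz[16] = {
        0.4124f, 0.3576f, 0.1805f, 0.0f,
        0.2126f, 0.7152f, 0.0722f, 0.0f,
        0.0193f, 0.1192f, 0.9505f, 0.0f,
        0.0f,    0.0f,    0.0f,    1.0f
    };

    // Apply a 4x4 matrix to one RGBA pixel (alpha rides through untouched).
    void apply_m44(const float m[16], const float in[4], float out[4])
    {
        for (int r = 0; r < 4; ++r)
            out[r] = m[r * 4 + 0] * in[0] + m[r * 4 + 1] * in[1]
                   + m[r * 4 + 2] * in[2] + m[r * 4 + 3] * in[3];
    }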

This is a handy dandy little online application compliments of the awesome minds behind Colour.

If you know what to do with matrices, then it is probably as easy as digging up a matrix transform example and substituting your own values. Beware: wider reference spaces bring their own nightmares, including posterization along the saturation axis, and that brings with it the need to desaturate high emission values, along with other potential sweeteners such as gamut mapping.

If on the other hand you are talking about adding in a display profile, there are more than a few examples out there of how to tack on a custom view via a 3D LUT or via the aforementioned matrix approach, whichever best suits your needs.

Start simple. Build up. Be warned… Not all is as simple and trivial as it may have seemed with limited dynamic ranges…

After tearing my hair out inserting the OpenColorIO library into rawproc’s build system (Only Likes Shared Libraries, gah!!), I’ve finally got a crude ociotransform tool to work. Right now, it only does OCIO::ROLE_SCENE_LINEAR → OCIO::ROLE_DEFAULT, which with @anon11264400’s Blender config.ocio does a “Linear sRGB/709” to “sRGB” transform, and it looks right. Since the chromaticities are the same, the only real change is the gamma.

I’m working on a Surface tablet using msys2, and my success came when I finally did the following: 1) installed the msys2 OCIO package, and 2) turned off -static in the rawproc link line. All the other stuff I’m linking to is just .a files, so it stays static. I also started to unpack the OCIO public configs, but after about 1.5 GB of ACES stuff I aborted that and got @anon11264400’s Blender config, which is much, much smaller. I’m hoping Linux goes more smoothly, except I’ll have problems cross-building for Windows unless I pack the .dlls into the installer. Not my preference.

Next step is to figure out what parameters to expose in the tool pane. For sure, input and output colorspaces, I think, but I need to determine how transforms are built in the config files, because I don’t think I need to build them in the tool.
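
For populating the tool pane, the config itself can be asked what it offers; a small sketch against the v1 API, with the actual widget code left out:

    #include <OpenColorIO/OpenColorIO.h>
    #include <cstdio>
    namespace OCIO = OCIO_NAMESPACE;

    // Enumerate the colorspaces and display/view pairs a config offers; these
    // are the natural candidates for the tool pane's input/output choosers.
    void list_config_options(OCIO::ConstConfigRcPtr config)
    {
        for (int i = 0; i < config->getNumColorSpaces(); ++i)
            std::printf("colorspace: %s\n", config->getColorSpaceNameByIndex(i));

        for (int d = 0; d < config->getNumDisplays(); ++d) {
            const char *display = config->getDisplay(d);
            for (int v = 0; v < config->getNumViews(display); ++v)
                std::printf("display/view: %s / %s\n",
                            display, config->getView(display, v));
        }
    }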

So, I turn off CMS (input.cms=0), open the raw with:

input.raw.libraw.colorspace=raw
input.raw.libraw.gamma=linear
input.raw.libraw.cameraprofile=Nikon_D7000_Sunlight.icc

and insert as the first tool a colorspace convert to sRGB-elle-V4-g10.icc. Then, insert the ociotransform tool, and the gamma gets its due. Here’s a screenshot; ignore the ociotransform panel, as I copied the colorspace panel code and haven’t deleted the old widgets yet:

OCIO on the Surface is a pig: ~6 sec per transform for my 16 MP D7000 images, so I multithreaded it and the time decreased to ~1.5 sec, much better. Same general approach as multithreading cmsTransform in LittleCMS.
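
A sketch of that band-splitting approach, assuming (as the timing above suggests) that the shared v1 processor can safely be applied from several threads as long as each works on its own row band:

    #include <OpenColorIO/OpenColorIO.h>
    #include <algorithm>
    #include <thread>
    #include <vector>
    namespace OCIO = OCIO_NAMESPACE;

    // Split a packed RGB float image into horizontal bands and run the same
    // processor over each band in its own thread.
    void apply_threaded(OCIO::ConstProcessorRcPtr processor, float *pixels,
                        long width, long height, long nthreads)
    {
        std::vector<std::thread> workers;
        long band_rows = (height + nthreads - 1) / nthreads;

        for (long t = 0; t < nthreads; ++t) {
            long first = t * band_rows;
            if (first >= height) break;
            long rows = std::min(band_rows, height - first);
            float *band = pixels + first * width * 3;

            workers.emplace_back([=]() {
                // Each band is presented to OCIO as its own small image.
                OCIO::PackedImageDesc desc(band, width, rows, 3);
                processor->apply(desc);
            });
        }
        for (auto &w : workers) w.join();
    }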

I started to figure out how to apply one of the LUTs to get a scaled display image, but I’m out of time; I need to sleep now so I don’t sleep through my morning meeting.

The build thing is a bit of a roadblock, but I’ll definitely keep the code at least as a conditional compile.

(yes, I know, a lot of programming “language” above, but I think the coders will appreciate the story)
