Creating 3D .cube LUTs for camera OOC styles

Thank you for your reply.
I tried installing Linux in UTM, since you said it is difficult to run on a Mac.
I then installed the tool and tried to run it as per the instructions, but it only showed “command not found”.
I do see darktable-lut-generator in the list of installed packages, but…
I’ve done a lot of research, but I’ve never worked with Linux before, so I’m at a loss.
I would appreciate it if you could help me.

Thank you in advance.


What if you run it directly from the installation folder, ~/.local/bin?
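To illustrate: pip’s per-user installs usually put entry points in ~/.local/bin, which is not always on PATH. A quick sketch (assuming a default pip --user install; adjust the path if yours differs):

```shell
# Check whether the entry point actually landed in ~/.local/bin:
ls ~/.local/bin/darktable_lut_generator

# If so, either call it with its full path...
~/.local/bin/darktable_lut_generator --help

# ...or put the directory on PATH so the shell can find it:
export PATH="$HOME/.local/bin:$PATH"
darktable_lut_generator --help
```

Adding the export line to ~/.bashrc (or your shell’s equivalent) makes the fix permanent.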

Meanwhile, if you like, we can run the raw and JPEG files for you if you upload them.

Konnichiwa, @kusomisotechnique,

I do not know if it is significant here, but you seem to have installed

darktable-lut-generator

but you seem to be trying to execute it as

darktable_lut_generator

???

Have fun!
Claes in Lund, Sweden

Thank you very much!

I also suspected the difference between “-” and “_” and tried running both, but to no avail.
I thought maybe the Linux distribution I had installed was broken, so I tried installing another one, with the same result…

Hmm, calling darktable_lut_generator is correct; darktable-lut-generator is the name of the PyPI package, following PyPI naming conventions.

I don’t know why it does not work in your Linux VM. Seems you are in the correct virtual environment, as pip lists the package.

I cannot say whether running the script on Mac is difficult because, honestly, I’ve never used a Mac and hence have no idea how it works…
I’ve taken a quick look at the darktable .dmg package for Mac. It seems darktable-cli is in fact included. I don’t know how or where software is installed on a Mac, but if you do, maybe just look into the darktable installation directory, search for the darktable-cli executable, and give darktable_lut_generator the path to it with the --path_dt_cli argument. That should work without you having to deal with a Linux VM (hopefully).
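As a sketch of that search (the bundle path is a guess based on the usual macOS app layout, not something I have verified; --path_dt_cli is the generator’s argument):

```shell
# Search the app bundle for the darktable-cli executable:
find /Applications/darktable.app -name darktable-cli 2>/dev/null

# Then hand whatever path that prints to the generator, e.g.:
darktable_lut_generator \
    --path_dt_cli /Applications/darktable.app/Contents/MacOS/darktable-cli
```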

Maybe it is even worth asking the darktable Mac maintainer(s) about darktable-cli, as I guess you’re not the only Mac user having problems calling darktable-cli, and one cannot find anything in this regard via Google or the bug tracker…

@Peter With the new multi-pass alignment, your images with the spectrum on the wooden surface also seem to work fine! :slight_smile:

Thank you!
Very helpful as I was still struggling with this!
Now, I will take your word for it and attach a picture.
Best regards.
LUT.zip (76.2 MB)

darktable can’t open the raw file. Is this a compressed ARW file?

Thank you for your reply.
I was able to find darktable-cli.
I ran it with the additional arguments you gave me, and it successfully created the cube file!
It took me a whole day to finally do it. I feel very accomplished.
Thank you all!


Thank you for your prompt attention.
I think the attached ARW file is the one that couldn’t be opened, because I saved it with lossless compression.
I was able to run it after converting it to DNG.
Thank you for your help.

Maybe 15 photos of flowers would be good. :slightly_smiling_face:

No need for a printed picture of a Spydercheckr this time. Just some random colourful pictures.

Thanks for the advice.
I’ll give it a shot with some colorful pics!


I haven’t tried this out or anything, but I just thought I’d mention that it seems very likely that the in-camera processing varies with the camera settings and scene conditions used for each image. At the very least, the camera will be changing the matrix/LUT according to white balance and possibly also with other variables such as absolute scene luminance, local image details, subject recognition, sensor temperature, etc. So trying to come up with a LUT that works across a wide variety of images might leave you chasing your tail.

That’s a good point! It depends on the camera, however.
Usually, dedicated photo cameras have a pipeline similar to darktable’s: first, white balance, the color matrix, etc. are applied to transform the sensor data into a scene-referred color space, providing a common color representation that is independent of the scene illuminant.
Second, fancy processing is performed to make the image look nice with some intended style and transform it to a display-referred color space.

In darktable, the first step is performed on the raw data, so that we have the same basis for applying a fancy image style as the camera’s pipeline (provided we use the correct white-balance coefficients and color matrix).
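To make that first step concrete, here is a minimal sketch in NumPy. The white-balance multipliers and the 3×3 matrix below are made-up illustration values, not real camera data; the point is only the shape of the computation:

```python
import numpy as np

# Step 1 of the pipeline: per-channel white balance, then a 3x3 camera
# matrix maps raw sensor RGB into a common scene-referred space.
# All numbers are invented for illustration.
wb = np.array([2.0, 1.0, 1.6])            # WB multipliers for R, G, B
cam_to_scene = np.array([[ 1.6, -0.4, -0.2],   # rows sum to 1.0 so a
                         [-0.3,  1.5, -0.2],   # white-balanced neutral
                         [ 0.1, -0.6,  1.5]])  # pixel stays neutral

raw = np.array([0.2, 0.3, 0.1])           # one demosaiced raw pixel
scene_rgb = cam_to_scene @ (wb * raw)     # scene-referred color
```

After this step, two shots of the same object under different illuminants land on (roughly) the same scene-referred values, which is what lets the LUT estimation ignore the illuminant.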

Hence, estimating the LUT means that we only need to estimate the processing of the second step, which is independent of the properties of the photographed scene.
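For reference, the estimated style ends up as a plain-text .cube file (Adobe’s Cube LUT format): a header giving the grid size, then one RGB triple per grid point, with red varying fastest. A minimal 2×2×2 identity LUT looks like this:

```text
LUT_3D_SIZE 2
0.0 0.0 0.0
1.0 0.0 0.0
0.0 1.0 0.0
1.0 1.0 0.0
0.0 0.0 1.0
1.0 0.0 1.0
0.0 1.0 1.0
1.0 1.0 1.1
```

(The last value should of course be 1.0 for a true identity; real generated LUTs use a much finer grid.)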

For this to work, it must be assumed that the in-camera processing in the second step is the same for every scene and that the filtering is non-spatial (so that a pixel’s output color is only a function of its input color).
Fortunately, this seems to hold for all dedicated photo cameras I know. For example, with my X-T3, the estimated LUTs for the film simulations yield visually identical colors to the OOC JPEGs in any lighting situation (luckily!).
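The non-spatial assumption can be sketched in a few lines of NumPy (this is not the generator’s actual code): applying a 3D LUT is a pure per-pixel lookup, so a pixel’s output never depends on its neighbours. Real tools interpolate trilinearly or tetrahedrally; nearest-neighbour lookup keeps the sketch short:

```python
import numpy as np

def apply_lut_nearest(image, lut):
    """image: float RGB in [0, 1], shape (H, W, 3).
    lut: shape (N, N, N, 3), indexed as lut[r_idx, g_idx, b_idx]."""
    n = lut.shape[0]
    # Snap each channel to the nearest LUT grid index...
    idx = np.clip(np.round(image * (n - 1)).astype(int), 0, n - 1)
    # ...and look up the output color: a pure function of input color.
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

# An identity LUT maps every grid point to its own coordinates, so an
# image should come back (almost) unchanged, up to grid quantization.
n = 17
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity_lut = np.stack([r, g, b], axis=-1)

img = np.random.default_rng(0).random((4, 4, 3))
out = apply_lut_nearest(img, identity_lut)
```

Local tone mapping, by contrast, would make the output depend on surrounding pixels, which no lookup of this form can reproduce.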

Smartphones are of course a different story. Most camera apps do some local tone mapping and further spatial filtering, and even bake in lens corrections, so the approach of reconstructing OOC looks with a LUT is basically useless…
And I’d guess that the situation will also change for dedicated photo cameras in the next years…

For more info, this might be a nice read:

And of course Aurelien Pierre’s blog.
