I’ve been developing tools to analyse and generate HaldCLUTs to mimic in-camera colours. I think it’s nearing a point where it would benefit others as well. Here’s a LUT generated from X-H1 data to replicate Astia film simulation in Darktable. I’m not sure how well it works with other Fuji cameras but at least X-H1 users might like it.
Here’s what to expect from the LUT:
And here’s the LUT in question:
1. Download the LUT and put it in your LUT directory.
2. Open a raw photo.
3. Disable the base curve module.
4. Enable the LUT 3D module and load the LUT above.
5. Adjust exposure (I think tone equalizer does a fantastic job =)
I hope this helps others as well and makes editing photos more fun =)
I had already made dtStyles for X-Trans cameras with Andy Costanza, based on the presets FujiFilm proposes in their own software: Fuji Film Simulation Profiles – Stuart Sowerby
- just load a haldclut-identity12.png into their software;
- apply the proposed preset;
- then save as .png.
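If you don't have an identity image handy, you can also generate one instead of downloading it. Here's a minimal sketch (assuming Python with NumPy and Pillow; the output filename is just an example) that produces a level-12 identity HaldCLUT, i.e. a 1728×1728 PNG encoding a 144×144×144 colour cube:

```python
import numpy as np
from PIL import Image

level = 12            # common level for film-simulation HaldCLUTs
cube = level * level  # 144 samples per RGB axis
side = level ** 3     # 1728 px: the image is side x side

# Enumerate every lattice point of the cube in HaldCLUT pixel order:
# red varies fastest, then green, then blue.
idx = np.arange(cube ** 3)
r = idx % cube
g = (idx // cube) % cube
b = idx // (cube * cube)

# Scale lattice indices 0..cube-1 to 8-bit pixel values 0..255.
rgb = np.stack([r, g, b], axis=-1).astype(np.float64)
rgb = np.round(rgb / (cube - 1) * 255).astype(np.uint8)

Image.fromarray(rgb.reshape(side, side, 3), mode="RGB").save("haldclut-identity12.png")
```

Applying this image through any preset and saving the result yields the preset's HaldCLUT, since each pixel records where the preset moved that colour.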
I tested this LUT against various photos from different cameras and got very nice improvements with it. It could be a beginner’s tool to get a decent result with one click and ease the learning curve.
This particular image was taken with a Canon Powershot G7 X
I don’t know Olympus’ software, but if it offers presets for editing photos and lets you work in .png or TIFF (which you would then have to convert to .png), the same approach should work.
I made HaldCLUTs with G’MIC’s film simulations and also with DxO’s FilmPack (produced as TIFF and then converted); the method is always the same:
- load a haldclut-identity;
- apply the preset.
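For the curious, using the resulting HaldCLUT on a photo is essentially one big 3D table lookup. A rough sketch of the idea (Python with NumPy and Pillow assumed; nearest-neighbour only, whereas real implementations such as G'MIC interpolate between lattice points):

```python
import numpy as np
from PIL import Image

def apply_haldclut(image, clut):
    """Look up each pixel of an RGB image in a HaldCLUT (nearest-neighbour)."""
    clut = np.asarray(clut, dtype=np.uint8).reshape(-1, 3)  # cube^3 RGB entries
    cube = round(clut.shape[0] ** (1 / 3))                  # samples per axis
    img = np.asarray(image, dtype=np.float64)
    # Map 0..255 pixel values onto 0..cube-1 lattice indices.
    idx = np.clip(np.round(img / 255 * (cube - 1)), 0, cube - 1).astype(int)
    # HaldCLUT pixel order: red fastest, then green, then blue.
    flat = idx[..., 0] + idx[..., 1] * cube + idx[..., 2] * cube * cube
    return Image.fromarray(clut[flat], mode="RGB")
```

With an identity CLUT this returns the image essentially unchanged (up to quantisation); with a film-simulation CLUT it applies the look.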
The advantage of FujiFilm’s software is that it applies to the raw file the very presets that are built into the camera body.
Ah, I see. This HaldCLUT was made with a program I’ve been developing. It takes a lot of image pairs (one rendered with darktable, the other straight from the camera) and analyses them to create a LUT that mimics the latter. If Olympus doesn’t have good LUTs yet, I could try making one.
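I don't know the tool's actual implementation, but the general idea of fitting a LUT from image pairs can be sketched like this (Python with NumPy assumed; the function name and the bin-averaging scheme are my own illustration, not the author's code): bin each darktable-rendered pixel colour into a coarse RGB grid and average the corresponding in-camera colours per bin.

```python
import numpy as np

def fit_lut_from_pairs(src_imgs, tgt_imgs, cube=16):
    """Estimate a cube^3 RGB LUT from (darktable render, in-camera JPEG) pairs
    by averaging the camera colour observed in each source-colour bin."""
    sums = np.zeros((cube, cube, cube, 3))
    counts = np.zeros((cube, cube, cube))
    for src, tgt in zip(src_imgs, tgt_imgs):
        src = np.asarray(src, np.float64).reshape(-1, 3) / 255
        tgt = np.asarray(tgt, np.float64).reshape(-1, 3) / 255
        # Assign every source pixel to its nearest lattice point.
        idx = np.clip((src * (cube - 1)).round().astype(int), 0, cube - 1)
        np.add.at(sums, (idx[:, 0], idx[:, 1], idx[:, 2]), tgt)
        np.add.at(counts, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
    # Average per bin; NaN marks bins that received no samples.
    lut = np.where(counts[..., None] > 0,
                   sums / np.maximum(counts, 1)[..., None], np.nan)
    return lut, counts
```

The empty (NaN) bins would then be filled by interpolating from neighbouring bins, and the grid resampled to the final HaldCLUT resolution.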
Are you still working on this? Is it something you plan to share or market? It’s a variation on the approach darktable-chart takes, but that only analyses one pair; you are using several, so of course it should be more broadly applicable and accurate…
a LUT generated from X-H1 data to replicate Astia film simulation in Darktable. I’m not sure how well it works with other Fuji cameras but at least X-H1 users might like it
It is my understanding that after applying the input colour profile + white balance of some sort (and noise reduction, lens correction, sharpening etc, and using exposure to even out ISO differences), photos from similarly capable cameras should look rather similar (subject to sensor dynamic range, of course). What I mean is that as long as the input profile is accurate and the sensors are similarly capable, the ‘flat-processed’ output of a Nikon should be almost identical to a similarly spec’d Canon or Fuji, and, in that sense, LUTs are universal. Is that not so?
My second question is the expected input of the LUT. What I mean is that, unlike filmic, the LUT won’t be able to cover arbitrary values: it’s created to map a certain input range to an output. If we apply non-linear operations (be it tone EQ or a filmic mapping), we won’t be processing the (linear) input in the ‘same way’ as a real Fuji camera / a brand of film etc. would. How do you usually use them? Use (non-linear, non-global) tone EQ to tame the dynamic range?
This does not mean I don’t like or use LUTs: I have unscientific fun with them. In fact, @Jean-Paul_GAUCHE’s profiles are among my favourites, and I’ll definitely try @sacredbirdman’s new profile, too. I usually use a ‘flat’ (logarithmic) filmic ‘curve’ for tone mapping (well, it’s a straight line when plotted, but the scale is not linear, and ‘straight curve’ sounds silly), and apply the LUT on top of that.
It’s a bit less general than that with LUTs: there are different types that can be applied depending on the desired goal, and there are limitations as well. A key one is this: "First, it is important to understand that all LUTs are calibrated for properly exposed images and will yield predictable results only if used on such images. Overexposed or underexposed images, images with blown-out highlights and completely black shadows, will result in less than optimal color grades. LUTs are calibrated for properly exposed images. In some cases, there may be a few LUTs calibrated to work properly on under- or overexposed images, but it will always be one or the other. A LUT cannot produce a satisfactory result for both under- and overexposed images." A good review is this one: Lutify.me | Using LUTs? Here is What You Need to Know … They are certainly not one-size-fits-all, and there are caveats to using them…
I’ve loaded a grey scale (gradual transition + steps) into darktable, adjusted exposure to make sure the darkest and brightest steps were reported as pure black and pure white, respectively, then applied @Jean-Paul_GAUCHE’s and @sacredbirdman’s LUTs. Interestingly, the first produced a definite shift to pink; the second remained neutral:
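(For anyone wanting to reproduce this test, a similar target is easy to generate. A quick sketch, assuming Python with NumPy and Pillow; the exact ramp/step layout and filename are just an illustration:)

```python
import numpy as np
from PIL import Image

width, height = 512, 64
ramp = np.linspace(0, 255, width).round().astype(np.uint8)  # smooth gradient
steps = ramp // 16 * 17                                     # quantised to 16 grey steps
rows = np.vstack([np.tile(ramp, (height, 1)),               # top half: gradual transition
                  np.tile(steps, (height, 1))])             # bottom half: steps
# Replicate the single grey channel into R, G and B.
Image.fromarray(np.repeat(rows[:, :, None], 3, axis=2), mode="RGB").save("grey_test_target.png")
```

Any colour cast a LUT introduces on this target shows up immediately, since every input pixel starts out with R = G = B.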
I think you have probably discovered what I was attempting to say just before you posted… I suspect the conditions under which the LUTs were created vary, and thus so will your results… and if results vary under conditions as controlled as your example, they might be hard to predict on a variety of images…
That was my problem with many LUTs as well (color shifts toward the extremities). So when I made my program, I decided to make it interpolate the correction vectors toward zero at both extremes (very bright and very dark). It essentially fades out the color correction at both ends to avoid wild color shifts. It also rejects the samples in color-space segments that receive too few samples and just interpolates over them, to avoid outliers spoiling the LUT. It seems to work reasonably well.
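A rough sketch of that idea (not the actual code; Python with NumPy assumed, and the luminance weighting function, threshold values, and function name are all made up for illustration):

```python
import numpy as np

def temper_lut(lut, counts, identity, min_samples=20):
    """Fade colour corrections to zero at the extremes and drop sparse bins.
    `lut` and `identity` are (cube, cube, cube, 3) arrays in 0..1;
    `counts` holds the number of samples seen per bin."""
    delta = lut - identity                # correction vector per bin
    delta[counts < min_samples] = np.nan  # reject under-sampled bins
    # Luminance-based weight: 0 at pure black/white, ramping to 1 in the midtones.
    luma = identity.mean(axis=-1)
    weight = np.clip(np.minimum(luma, 1 - luma) / 0.2, 0, 1)
    return identity + np.nan_to_num(delta) * weight[..., None]
```

In this simplified version the rejected bins just fall back to the identity; per the description above, the real tool interpolates over them from neighbouring bins instead.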
Are you still working on this? Is it something you plan to share or market?
The tool is very rough around the edges (for instance, a lot of the parameters are only changeable by modifying the code), but… I could brush up the code a bit, write usage instructions for the brave, and put it on GitLab or something.
Thanks for responding… I was just curious to see how you processed the images to compile the LUT… It would be nice to play around with; my coding is pretty weak, but it might be worth sharing if you are getting good results… someone might even pitch in and run with it?