Just for fun, I’ve cobbled together a rough sketch of Blender’s AgX mapper for darktable. Not that we need yet another tone mapper. GitHub - kofa73/darktable at blender-agx-poc, if you are bored. No comments on code quality, please, it was powered by copy-paste.
There are (edit: 3) 4 controls:
a slope, that lets you manage highlights (yes, it’s possible to hit 100%)
a power, for midtone contrast
an offset, to control shadows (shifts the whole histogram up/down)
It looks good. IIRC Aurelien was initially inspired by Blender’s scene-referred workflow when he started developing filmic. Is filmic related in any way to the AgX tone mapper, or is it a whole different paradigm?
The whole transformation starts with applying a matrix to ‘inset’ and rotate the primaries, if I understood that correctly, but that matrix is fixed in the implementation, you don’t get sigmoid’s detailed controls. At the end, there’s another matrix to ‘outset’ the result.
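Just to illustrate the matrix step, something like this (the coefficients below are made up for the example; the real inset/outset values are hard-coded in the implementation linked above):

```python
def mat_mul(m, v):
    """Apply a 3x3 matrix to an RGB triple."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

# Hypothetical inset-like matrix: each channel is pulled slightly towards
# the other two. Rows sum to 1, so pure white maps to pure white.
INSET_EXAMPLE = [
    [0.86, 0.10, 0.04],
    [0.14, 0.76, 0.10],
    [0.11, 0.08, 0.81],
]
```

As far as I understand, desaturating the primaries like this before the curve is what keeps very saturated colours from skewing badly as they roll off.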
After insetting, it converts the image into log space, which is common to filmic, sigmoid and AgX, as far as I understand. AgX uses a fixed dynamic range (about −12.5 to +4 EV, i.e. roughly 10 EV below and 6.5 EV above 18% middle grey), so you’d have to use darktable’s exposure and tone EQ to fit the data into that range.
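In code, the log encoding could be sketched like this (the bounds are the approximate ones quoted above; the exact constants are in the reference implementation):

```python
import math

# Approximate AgX range in absolute EV; roughly 10 EV below and 6.5 EV
# above 18% middle grey. The exact values are in the reference code.
AGX_MIN_EV = -12.5
AGX_MAX_EV = 4.0

def linear_to_agx_log(x, eps=1e-10):
    """Encode a linear value into [0, 1] over the fixed AgX dynamic range."""
    ev = math.log2(max(x, eps))
    t = (ev - AGX_MIN_EV) / (AGX_MAX_EV - AGX_MIN_EV)
    return min(max(t, 0.0), 1.0)  # clip anything outside the range
```

Middle grey (0.18) lands at about 0.61, i.e. a bit above the midpoint of the encoded range.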
After the log, a sigmoidal tone mapping function is applied, which is approximated here using a polynomial (none of this is my work).
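I won’t reproduce the polynomial’s coefficients here; purely as an illustration of the shape, a smoothstep can stand in for the fitted curve:

```python
def s_curve_stand_in(t):
    """Placeholder S-curve on [0, 1]; NOT the real AgX polynomial, whose
    fitted coefficients live in the code linked above."""
    t = min(max(t, 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)  # smoothstep: flat toe/shoulder, steeper middle
```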
That is followed by applying the ‘look’ params (slope, power, saturation). In the original AgX, those were selectable per channel, allowing altering colour tones (e.g. there’s a ‘golden’ look), but I think darktable provides enough colour grading tools already, and it would have made the UI more cluttered, so I just set them all to the same values.
The slope is a multiplier (but we’re in log space); the power is used as an exponent: out = (in * slope) ^ power
In the original AgX, there were also per-channel offset params. Maybe it would make sense to re-add the offset (as a single value); that would act on the shadows/blacks.
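Per channel, the slope/power step (with the offset kept as an optional parameter) amounts to an ASC-CDL-style transform; a sketch:

```python
def apply_look(v, slope=1.0, power=1.0, offset=0.0):
    """CDL-style look, applied per channel to the log-encoded value:
    out = (v * slope + offset) ** power. Negative intermediate values
    are clamped so the power stays well-defined."""
    return max(v * slope + offset, 0.0) ** power
```

With slope = power = 1 and offset = 0 this is a no-op; raising the power pulls the midtones down (we’re operating on [0, 1] values).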
As a final step of the ‘look’, the saturation is adjusted.
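As far as I can tell, the saturation step is a plain lerp against luma; a sketch (the Rec 2020 luma weights are my assumption, the original code may use different ones):

```python
REC2020_LUMA = (0.2627, 0.6780, 0.0593)  # assumed weights, summing to 1

def adjust_saturation(rgb, sat):
    """Move each channel away from (sat > 1) or towards (sat < 1) luma;
    sat = 1 is a no-op, sat = 0 produces grey."""
    luma = sum(w * c for w, c in zip(REC2020_LUMA, rgb))
    return [luma + sat * (c - luma) for c in rgb]
```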
After that, we have the ‘outset’ matrix (which was strange to me, because the inset matrix was applied to linear data).
After the outset matrix, a fixed gamma of 2.2 is applied ‘to linearise’ the data.
I feel like somewhere I saw it referred to as “Filmic 2.0”, but that might just have been suggesting it was the next thing to follow filmic, rather than implying a direct code evolution…
there’s no guarantee this will go anywhere (depends on the feedback, I guess)
the code this is based on expects Rec 2020 as input, and provides Rec 2020 output. I do not check if the working space is Rec 2020, and no conversions are done. Should be OK for proof of concept, given darktable’s default working space is Rec 2020.
I’m not familiar with Blender, but if they say they use Rec 2020, I think they probably use a D65 illuminant. However, darktable uses D50 (if I understand correctly), which means we’d need to adapt the white point / alter the in/out matrices, but I haven’t done that. That means what you get from this module, in its current state, is not the ‘real AgX’, as it is used in Blender. At least that’s what I think, given my rather limited understanding.
Edit: I’ll try to add a simple Bradford D65 ↔ D50 tomorrow, just as an experiment.
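For the record, the Bradford adaptation between two white points is built like this (textbook matrix and white point values; a standalone sketch, not taken from either codebase):

```python
# Bradford chromatic adaptation: builds a 3x3 matrix mapping XYZ under a
# source illuminant to XYZ under a destination illuminant.

BRADFORD = [
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
]

# White points as XYZ with Y = 1 (standard 2-degree observer values)
D65 = [0.95047, 1.00000, 1.08883]
D50 = [0.96422, 1.00000, 0.82521]

def mat3_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def mat3_mul(a, b):
    return [[sum(a[r][k] * b[k][c] for k in range(3)) for c in range(3)]
            for r in range(3)]

def mat3_inv(m):
    # cofactor/adjugate inverse of a 3x3 matrix
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return [
        [(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
        [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
        [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det],
    ]

def bradford_adaptation(src_white, dst_white):
    s = mat3_vec(BRADFORD, src_white)  # source cone response
    d = mat3_vec(BRADFORD, dst_white)  # destination cone response
    scale = [[d[0] / s[0], 0.0, 0.0],
             [0.0, d[1] / s[1], 0.0],
             [0.0, 0.0, d[2] / s[2]]]
    return mat3_mul(mat3_inv(BRADFORD), mat3_mul(scale, BRADFORD))
```

The experiment would be to sandwich the in/out matrices with this adaptation and its inverse.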
I gave it a try and it’s pretty good. Some pictures can almost be fully developed (simple edit) using only the tone mapper, which is pretty cool and can lead to very quick edits.
Some quick tests: the colosseum and the street shot only have the tone mapper, besides the usual sharpening and denoising modules.
As far as I know, Blender’s AgX is Troy Sobotka’s AgX.
Re-reading EaryChow’s Python script, I see that he has changed the outset matrix from Troy Sobotka’s: this Rec 2020 version has no rotation, and introduces a hue mix instead, which I do not have.
So maybe it was wrong to call this topic Blender AGX in darktable. I just wanted to make clear it’s not arctic’s much more ambitious emulsion emulation project.
Now that I spotted that hue mix, I’ll see if I can add it and if it makes a difference.
That hue mix looks broken to me. How can you mix angles like that? The Python library uses [0, 1) for hue representation. If the mix factor (40% in that script) is applied to two reddish colours (say, 0.01 and 0.99) using out = a + x * (b - a), we end up with about 0.4, which is anything but red. Other colours, where we don’t wrap around the boundary, would be mixed properly, e.g. 0.19 and 0.21 gives a value of about 0.2.
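For comparison, a wrap-aware mix would take the shortest arc around the hue circle; the difference in a sketch:

```python
def naive_mix(a, b, x):
    """Plain lerp; breaks when the two hues straddle the 0/1 boundary."""
    return a + x * (b - a)

def wrapped_mix(a, b, x):
    """Lerp along the shortest arc of the hue circle, hues in [0, 1)."""
    delta = ((b - a + 0.5) % 1.0) - 0.5  # signed shortest distance
    return (a + x * delta) % 1.0
```

With a = 0.01, b = 0.99 and x = 0.4, the naive version gives about 0.4 (green-ish), while the wrapped one gives about 0.002, i.e. still red.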
Am I missing something? Maybe they don’t hit it, because they only calculate a 37 x 37 x 37 3D LUT.
@flannelhead was part of the looong discussion at Feedback / Development: Filmic, Baby Step to a V2? - Blender Development Discussion - Blender Artists Community, so I guess the primaries module and the corresponding feature in sigmoid already provide what darktable’s developers envisioned. I will continue to play around for my own amusement / to learn, but probably nothing user-visible will ever surface from this (or maybe it’d be simpler to add the scale / power / offset controls in sigmoid).