A lot of great modules in Darktable (tone equalizer, contrast equalizer, color equalizer, did I miss something?) are parametrized by a continuous (possibly periodic) function, specified by a few control points, which are then interpolated.
The problem is that the interpolation does not preserve monotonicity. This is familiar to all users, but in case someone has not encountered it, here is an example:
I wonder if it would make sense to replace the interpolation algorithm with one that does preserve monotonicity (and ideally convexity, but that is not essential). Several algorithms of this kind exist, and AFAIK the color zones module has one, see
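For illustration, here is the difference such a scheme makes on a monotone step. This uses SciPy's `PchipInterpolator` as a stand-in for a shape-preserving method (darktable's actual spline code is different; this is just a demonstration of the overshoot):

```python
import numpy as np
from scipy.interpolate import CubicSpline, PchipInterpolator

# Monotone (non-decreasing) control points, like a flat-step tone curve.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 0.0, 1.0, 1.0])

xs = np.linspace(0.0, 3.0, 301)
plain = CubicSpline(x, y)(xs)        # ordinary cubic spline: wiggles
mono = PchipInterpolator(x, y)(xs)   # shape-preserving: no overshoot

print(f"plain spline range: [{plain.min():.3f}, {plain.max():.3f}]")
print(f"pchip range:        [{mono.min():.3f}, {mono.max():.3f}]")
```

The plain spline dips below 0 and overshoots above 1 between the flat segments, while the shape-preserving interpolant stays within the data range.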
If the interpolation code is shared between modules, this would just require a change in one place (but note that the hue is periodic, so the algorithm has to allow for that).
(I wanted to start a discussion before opening an issue; maybe someone will tell me that this is not feasible or desirable.)
The example you show is not monotonic!
And in the modules you mention, monotonicity on the parameter curve is definitely not desired. (Modules where monotonicity is desired are e.g. filmic and sigmoid).
What would be nice is avoiding the overshoot you can see (the "wiggles"). But I'm not sure those are a problem in practice.
If they are considered a problem, perhaps Akima splines might be an option. Then again, devs haven't used them, and the method dates from 1970…
For me the behavior of the splines is a mystery. In the tone equalizer and color equalizer especially, the oscillations are usually unwanted and need to be compensated for by adjusting the surrounding nodes.
I wonder if this is only a mathematical problem (no alternative solution found, no development time to implement it) or if it's sometimes intended? Like in color equalizer: why would an increase in green saturation need to automatically be compensated by a decrease in cyan and yellow saturations? Same thing in filmic, where too much contrast breaks the monotonicity, which sounds like nonsense for a tone curve.
So my understanding is that it is probably a mathematical limitation which, at the moment, needs to be corrected manually by the users.
Yes, that's my point. That's the current behavior, which is not monotonic.
Just to clarify: I don't want the curve to be monotonic, I want the interpolation to preserve monotonicity. That is, if I am interpolating between points (x_1, y_1) and (x_2, y_2) with some curve f(x) that has f(x_1) = y_1 and f(x_2) = y_2, then f should be monotone on [x_1, x_2] whenever the data are (e.g. non-decreasing there if y_1 ≤ y_2).
I think the cause is that shape-preserving splines are a bit more involved to implement than plain vanilla spline bases. The algorithms are available, e.g. Schumacher (1983). It is quite simple compared to the very complicated algorithms already in Darktable.
I am just testing the waters here: e.g. would this be considered breaking compatibility for the related modules, etc.
Well, as you say, the algorithms are known. So why are they not used at the moment?
There may be issues with those algorithms which make them less suitable for this particular use case. E.g. that Schumacher article works with quadratic splines, so you cannot have inflection points within a segment.
I think these are things that should be looked at before posting such a suggestion, otherwise it looks like all the work is put on others rather than on the original proposer.
I don't know how familiar you are with mathematics, but… Darktable uses piece-wise cubic curves (a_0 + a_1 x + a_2 x^2 + a_3 x^3, where the coefficients a_0...a_3 are recalculated for each segment between nodes). With those, you have a problem when you want to fit a curve for the segment between the 3rd and 4th node in @Tamas_Papp's figure:
you'll be using the 2nd to 5th node, where you see there's a maximum at node 4. That means you know the tangent there should be horizontal. But you cannot decide (from those 4 nodes) that the tangent should also be horizontal at node 3.
Something similar goes for other segments: you cannot decide from the 4 points you are looking at that the oscillations should not be there.
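A tiny sketch of that local-tangent argument, assuming for illustration a Catmull-Rom-style tangent estimate m_i = (y_{i+1} - y_{i-1})/2 on uniformly spaced nodes (darktable's exact formula may differ):

```python
# Nodes: flat, then a step up, then flat again (uniform spacing).
y = [0.0, 0.0, 1.0, 1.0]

# Centered-difference tangent at the node where the step begins:
m1 = (y[2] - y[0]) / 2.0  # 0.5 -- not horizontal, even though the
                          # data to the left of this node are flat

# Cubic Hermite segment between the two flat nodes 0 and 1,
# with tangent 0 at node 0 and the estimated m1 at node 1:
def hermite(t, y0, y1, m0, m1):
    h00 = 2*t**3 - 3*t**2 + 1
    h10 = t**3 - 2*t**2 + t
    h01 = -2*t**3 + 3*t**2
    h11 = t**3 - t**2
    return h00*y0 + h10*m0 + h01*y1 + h11*m1

vals = [hermite(i / 100.0, y[0], y[1], 0.0, m1) for i in range(101)]
print(min(vals))  # dips below 0: the "wiggle" on a flat segment
```

The locally-estimated tangent is nonzero at a node where the neighbouring data are flat, so the segment dips below the data. Fixing this requires looking at the data and adjusting the tangent, which is exactly the extra work shape-preserving schemes do.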
Correcting that requires extra calculations, which might make those more advanced algorithms unsuitable for interactive use. And as the calculations take place on relatively few data points, parallel processing (let alone GPU) isn't going to help you; the overhead is too costly.
TL;DR: those wiggles are a direct result of the way the curves are fitted to the nodes, and they are an artifact.
In filmic the overshoot with too much contrast is an issue, which is why you see a warning on the curve. If you still want to keep the contrast, you can "solve" the issue with the "contrast …" settings under the "options" tab ("safe" eliminates the overshoot in all cases).
Schemes exist for higher-order splines and other similar approximations. See Rasch and Williamson (1990) for a review.
I am happy to review the relevant algorithms, but I don't know what the requirements are. Note that at the moment I am not proposing any concrete replacement, just gathering information in good faith. Just don't participate if you feel that talking about this issue is too much of a burden for you.
Specifically, it is not clear to me that a quadratic scheme (e.g. the original Schumacher one linked above) would not be sufficient for the purposes of Darktable: it has continuous first derivatives, looks quite smooth, and has no "wiggle" artifacts. Why insist on cubic here?
But if it has to be cubic, there are several schemes for that too. Some are iterative and more computationally intensive, but have properties like mass preservation (important for solving some PDEs, pretty much irrelevant for our purposes).
Ensuring piecewise monotonicity for a cubic spline is not difficult. It can be done with a single pass through the coefficients (modifying the computed derivative values). I've attached some of my class notes on splines; you can ignore 90% of it, with the interesting part being Section 2.3.6, which discusses monotonic cubic splines.
There is some debate around what set P should be, but it is not something I would lose sleep over.
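A minimal sketch of that "single pass over the derivatives" idea, in the style of Fritsch-Carlson limiting (the attached notes may use a slightly different rule; this is just to show the shape of the algorithm). Tangents are first estimated, then clamped so each cubic Hermite segment stays monotone:

```python
def monotone_tangents(x, y):
    n = len(x)
    # Secant slope of each segment.
    d = [(y[i+1] - y[i]) / (x[i+1] - x[i]) for i in range(n - 1)]
    # Initial tangents: average of neighbouring secants (one-sided at the ends).
    m = [d[0]] + [(d[i-1] + d[i]) / 2.0 for i in range(1, n - 1)] + [d[-1]]
    for i in range(n - 1):                  # the single limiting pass
        if d[i] == 0.0:                     # flat segment: force it flat
            m[i] = m[i+1] = 0.0
        else:
            a, b = m[i] / d[i], m[i+1] / d[i]
            if a < 0.0: m[i] = 0.0; a = 0.0    # tangent against the secant
            if b < 0.0: m[i+1] = 0.0; b = 0.0
            s = a*a + b*b
            if s > 9.0:                     # clamp into the monotone region
                t = 3.0 / s**0.5
                m[i], m[i+1] = t*a*d[i], t*b*d[i]
    return m

print(monotone_tangents([0.0, 1.0, 2.0, 3.0], [0.0, 0.0, 1.0, 1.0]))
```

Shrinking a tangent never violates an already-checked segment, which is why a single forward pass suffices; for the flat-step data above every tangent is forced to zero, so no segment can overshoot.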
I'm not fighting the suggestion that other fitting algorithms are possible and may be useful.
I do not agree that such algorithms are necessarily better for darktable's use.
And the more I think about it, the less I'm convinced that forcing all segments to be monotonic is always desirable.
I think the original proposal for such changes should at least show a real issue(*), and an alternative suitable for darktable's use.
As for the Schumacher proposal: that scheme can add extra nodes to the data, which allows it to simulate inflection points with quadratic curves. But those nodes will have to be kept track of, and recalculated at each change in the fixed nodes.
(*: for me, a graph which is not quite what you expect isn't a real issue; visible artifacts caused by that behaviour would be an issue).
The question is not so much whether the algorithms are known, but rather whether they were known to the developer at the time they coded the cubic spline routines. Monotonic cubic splines, while a relatively simple extension to ordinary cubic splines, are seldom covered in numerical analysis courses, even at the graduate level. Indeed, even the canonical reference on splines, "A Practical Guide to Splines" by de Boor, does not cover them at all.
I'm not sure I agree here. What is also extremely important is the expectation and reaction of the user. Users enable modules and parameterise these modules to help achieve a specific look for their photos, for example boosting the brightness of a certain colour. I strongly suspect most users do not expect that using the UI to strongly boost one colour will lead to the two neighbouring colours being slightly muted. Whether this muting is visible on the image doesn't really matter; what matters is how the user responds to the apparent muting indicated on the graph. If their response is to tweak the neighbouring nodes to eliminate this artefact, then we have a problem: users are wasting time correcting perceived issues due to a module not aligning with their prior expectations.
It would therefore be interesting to know if people are actually doing this. Looking through my history I've found a couple of photos where I was trying to colourise a blown-out sky and appear to have fiddled with the neighbouring nodes. I do not know if I am a typical user or if this is common, however.
Yes, they are. See this recent video from @s7habo, who I would consider an expert user. Almost every time he edits a node, he aligns the neighboring nodes to (I assume) avoid the wiggle.
Yes, I can second this. I work a lot with various numerical methods, have read a lot of books on splines, and haven't seen the cubic version you kindly linked (I was aware of the simple Schumacher scheme only).
Like I said earlier, I'm not sure segments should always be forced to be monotonic. Example with the color equaliser shown in the first post: lift the 4th node 2.5 units and the 7th 2 units? Nasty, we get a darkening between nodes 5 and 6. Lift both by .5 units… Do you now expect a monotonic segment between 5 and 6?
Further, those interpolation routines are used by several modules (not just the recent color equaliser).
Some of those other modules allow some control over the smoothing (in the contrast equaliser, the "zone of effect" of the mouse is variable and modifies the effect on neighbouring nodes; the tone equaliser has a smoothing slider). Perhaps one of those options is better than forcing piece-wise monotonicity (and it might make for a more homogeneous user interface).
Strongly agree that oscillating splines should be fixed. Many years ago I created a program to help model and analyze geophysical data. One of the tasks was to fit a smooth curve to a given set of 2D points. My first implementation used simple splines that oscillated in some situations. So I found so-called (if I remember the Russian terminology correctly) parametric cubic splines that completely eliminated the problem. And implementing them was not that difficult compared to the original splines.
Imagine I want to desaturate the orange leaves: I lower the orange in the color equalizer, but this automatically increases both the yellow-green and the magenta-red ranges as well. This is noticeable in the foreground leaves and in the blurred ones on the ground in the middle left of the image:
We can also see this with color pickers on the foreground leaves with 1) basic edit 2) uncorrected color eq. 3) corrected color eq.
Now, if I go back to the uncorrected color equalizer and also want to saturate the magenta flower, then it will also desaturate the blue flowers:
If I remember correctly, the behavior of the corresponding module in Lightroom is the opposite: if you desaturate a color a lot, it will start to desaturate the adjacent hues as well. This seems more logical and less error-prone because it is the user's intended output.
In fact, I don't think this spline behavior is a major/urgent issue because, in practice, I guess most of us correct for it, so it is clearly possible to obtain an artifact-free edit. But then the question is more like: do we want to mind this every time we use a module with splines? (And possibly any future module using splines will have this behavior.)
But that being said, I'm totally aware of the actual constraints of dt development as free software and the limited time investment available to the devs. So, as said above, I see this thread more as an open discussion from a user perspective.
we need to interpolate a smooth curve between a given number of fixed points (most of the time, there are 9 points, but that's not critical);
we do not want oscillations when we move an isolated point (if you move several points around, the situation gets more complicated wrt those "oscillations");
we do want a fast interpolation routine (as it's used to update an interactive display!)
Within darktable, there seem to be at least two solutions to mitigate the oscillations:
the contrast equaliser allows a "zone of influence" around the cursor, which can influence several nodes at once (the display indicates the oscillations when that zone becomes narrow);
the tone equaliser has a smoothing slider, which does not prevent the oscillations in all cases.
A third solution could be an interpolation which forces monotonicity per segment.
Two methods have been mentioned, with different constraints.
Personally, I prefer the solution from the contrast equaliser GUI:
we know it works for the contrast equaliser (i.e. it does what it's supposed to do, fast enough for interactive use);
afaik, it only affects the GUI, not the "working code" used to modify the image (which means it touches fewer code paths, as the working part of the module often has both CPU and GPU implementations). E.g. the "Schumacher" method can add nodes, which would require changes in the code using the interpolated curve (an extra variable for the final number of nodes).
I do not know whether that solution is compatible with all the modules using a similar interface, or with the current use of the different mouse functions and shortcuts.
If it stays only that, we are wasting our time, as any change will have to involve the devs (or one of us becomes a dev…)
What we can do here is figure out what the exact problem is and what the possible solutions are, with their advantages and disadvantages. Keeping in mind that the issue touches several modules, and a common solution would help make the GUI more consistent.
Note that the cost of the interpolation algorithms mentioned here is negligible compared to the pixelpipe, or even a single module. We are talking about 10^2–10^3 FLOPs for a 9-node spline.
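A back-of-envelope check of that estimate (my own rough operation counts, not measured from darktable's code):

```python
nodes = 9
# Solving for the tangents/coefficients is an O(n) tridiagonal solve
# with a small constant; 20 FLOPs per node is a generous round figure.
setup_flops = 20 * nodes
# Evaluating one cubic a0 + a1*x + a2*x^2 + a3*x^3 via Horner's rule
# costs 3 multiplies + 3 adds per sample.
eval_flops_per_sample = 6
samples = 100  # e.g. redrawing the GUI curve or filling a small LUT

total = setup_flops + eval_flops_per_sample * samples
print(total)  # 780 -- comfortably in the 10^2..10^3 range
```

Even with the counts off by a factor of a few, this stays orders of magnitude below the cost of touching every pixel in the image.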
This, again, is a trivial change. The internals of Darktable are written well enough to be able to accommodate a change like this.
Not at all. This helps those who care about this to make a concrete proposal when we open an issue, which is much better for the devs than a fuzzy argument. @fdw linked a simple and fast algorithm, and @gigaturbo provided a nice concrete example, neither of which was in my original question. So we progress towards a better solution. Let's wait for more users to chime in and then we can open an issue.