lens correction preset with dynamic focal length


I’d like to have lens correction enabled by default for all my pictures when I import them into darktable.

For this, I created a preset and enabled “auto apply this preset when criteria is matched”. For the criteria I chose the specific vendor/lens, where I know the lens correction exists and works fine.

This works fine, and for every picture the correct focal length is set accordingly (even if it is a zoom lens).

The problem arises when I want to change the preset in some way. For example, for certain lenses I do not want all corrections enabled, but only distortion and TCA (so no vignetting). As soon as I change this parameter and then update the preset, the focal length setting also seems to be hard-locked (although I didn’t even touch it). The next time the preset is auto-applied, it also applies the hard-locked focal length, which is very likely wrong if the lens is a zoom lens! In that case the distortion correction is applied at the wrong focal length, and pictures end up looking even more distorted.

Is it possible to change a parameter in the lens-correction preset without also hard-locking the focal length?


You need to open a feature request at https://github.com/darktable-org/darktable/issues/new?assignees=&labels=&template=feature_request.md&title=
Saving a preset stores the current values for all parameters.
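To make the “stores all parameters” point concrete: darktable keeps presets in its data.db SQLite database, where the module parameters live as a single opaque blob. The sketch below builds an in-memory mock of such a table — the table and column names (`presets`, `operation`, `op_params`, `autoapply`, the focal-length range columns) reflect my understanding of the schema and should be checked against your own `~/.config/darktable/data.db` before relying on them; the blob content is fabricated.

```python
import sqlite3

# Mock of (my understanding of) darktable's presets table in data.db.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE presets (
        name TEXT, operation TEXT, op_params BLOB, autoapply INTEGER,
        maker TEXT, model TEXT, lens TEXT,
        focal_length_min REAL, focal_length_max REAL
    )
""")
# A fabricated auto-applied lens-correction preset. op_params is a single
# opaque binary blob holding ALL module parameters at once -- including the
# focal length -- which is exactly why updating one setting snapshots the rest.
con.execute(
    "INSERT INTO presets VALUES (?,?,?,?,?,?,?,?,?)",
    ("my lens fix", "lens", b"\x01\x00\x00\x00\x00\x00\x80\x3f", 1,
     "SONY", "%", "FE 24-105mm F4 G OSS", 0.0, 1000.0),
)
for name, op, params in con.execute(
        "SELECT name, operation, op_params FROM presets WHERE autoapply = 1"):
    print(name, op, len(params), "bytes of opaque blob")
```

The point of the sketch: the criteria columns (maker/model/lens, focal-length range) are individually queryable, but the parameters themselves are one indivisible blob, so there is no per-parameter “don’t store this” flag.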


So it’s not possible. OK. Thanks for clarification.

Interestingly, it does not store all current values if I don’t touch any of the parameters. If I create the preset in the first place from some random picture, where the focal length is set to some random value, that random value is not saved in the preset.


There is already a bug report open at github:

I also suffer from the problem of darktable not correctly autodetecting my lens, and thus have to rely on something like dynamic presets.

This, and a ton of other missing features regarding presets and styles, would be a few keystrokes away if the settings were not packed into that dreadful data blob inside the XML.

And please, no one tell me it is faster … the XML and the settings are a fraction of the size of the actual image data that gets rendered over and over again.

We await your patches!

No, you don’t… :wink:

The internal XML structure is not likely to change, sorry.

yay for tech debt.

btw … if I win the lottery this is really on my list to throw money around.
ah, well, one can dream.

Two years later and the decision to put un-editable binary blobs into the sidecar files is still one of my favourite software-design failures. Really a shame for an app that tries to be technical.

But well, I also have not won the lottery. :woman_shrugging:

FOSS isn’t about demanding but about contributing …


What blobs are you talking about?


The darktable:params and darktable:blendop_params blobs, presumably
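For readers wondering what those blobs look like: in an XMP sidecar the `darktable:params` value is, as far as I understand it, the module’s C parameter struct serialized to a hex string (sometimes also compressed). Without the module’s introspection data the field layout is guesswork, so the layout below (two floats, with made-up names) is purely hypothetical — it only illustrates why the value is unreadable by eye.

```python
import struct

# Hypothetical decoding of a darktable:params-style hex blob.
# The real field layout depends on the module and its version; the
# "(scale, strength)" interpretation here is invented for illustration.
hex_blob = "0000803f0000003f"          # 8 bytes: two little-endian floats
raw = bytes.fromhex(hex_blob)
scale, strength = struct.unpack("<2f", raw)
print(scale, strength)                 # -> 1.0 0.5
```

This also shows the fragility being discussed: change one hex digit in the wrong place and the struct still unpacks, just to silently wrong values.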

Y’know, I just had to implement a “blob” format for G’MIC scripts embedded in the rawproc toolchain, so they could get past command line shells. Didn’t really want to, but sometimes the mix of tools and environments compel such…


Pretty sure it’s a hefty amount of work to move all the params to human-readable, and I assume no devs consider it worth the effort for the small reward. If it were me, money would not change that (unless someone wants to split their lottery jackpot prize with me).

Martin, we might have a misunderstanding.
I was talking about a favourite thing of mine. :sunglasses:

Well said.

It is definitely not one of those changes one can make on a whim, because it affects every. single. one. of those million-billion-trillion sidecar files out there.

For most of the use cases of hand-editing styles I have found workarounds, but when you search and find your own answer that is not an answer, you just have to answer yourself again: lens correction is the one tool where I still haven’t found what I’m looking for¹.

Anyway, sorry for resurrecting a zombie-thread.


¹) Time for some music from the past

We’d still need to support the blobs as well, which I guess would be its own nightmare.

Might as well summarize the issue here again, since it keeps coming around.

Introspection information describing the data-format fields is available internally (for the current/last version of each module), so writing human-readable parameters to XML and reading/verifying them (for the current version!) isn’t a hard task. @houz famously got there 90% of the way, without showing his workings, thereby discouraging anyone else from trying and then getting rejected for not reading his mind on how to do it “correctly”.

The “problem” is supporting previous versions. Those are not introspected, so we just blindly trust that the blobs we receive are in the right format (and of the right size!). Human-readable/editable means easier to mess up, so we’d get crashes. Or we’d have to strengthen all the legacy version upgrade paths, which we don’t want to touch because we have no test infrastructure to catch the inevitable breakage. The issue “Going forward, rework the legacy support to ease maintenance” (darktable-org/darktable#14608 on GitHub) wants to make this more robust (no idea why it is marked “good first contribution”; you really want to understand dt in depth before touching that).

So switching to human-readable-only might be a no-go. But exporting both, and using the human-readable one if the blob has been deleted/is absent (or some other indicator) and if it is the current version, should, again, not be too hard to implement. And it might be sufficient for automation purposes, as long as scripts are updated as soon as a new module version is introduced.
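The dual-export fallback rule proposed above can be sketched in a few lines. This is only an illustration of the decision logic, not darktable code; the entry fields (`operation`, `version`, `blob`, `readable`) and the version table are invented for the example.

```python
# Sketch of the proposed fallback: prefer the binary blob when present,
# and fall back to the human-readable params only when the blob is absent
# AND the recorded module version matches the current one.
CURRENT_VERSIONS = {"lens": 5}  # hypothetical module -> current version map

def choose_params(entry):
    """entry: dict with 'operation', 'version', optional 'blob'/'readable'."""
    if entry.get("blob") is not None:
        return ("blob", entry["blob"])                # blob always wins
    if entry.get("readable") is not None \
            and entry["version"] == CURRENT_VERSIONS.get(entry["operation"]):
        return ("readable", entry["readable"])        # current version only
    raise ValueError("no usable params: blob missing and version is stale")

# Blob deleted by a script, version is current -> readable params are used.
print(choose_params({"operation": "lens", "version": 5,
                     "blob": None, "readable": {"tca": True}}))
```

The `raise` branch is the safety valve: a hand-edited file for an outdated module version is rejected instead of being fed through untested legacy upgrade paths.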


Thank you very much for that concise insight into the situation.

Do you know if the introspection can deal with partial settings?

If yes, one might do it through a separate command-line util like darktable-generate-cache. Editing files en masse is always a dangerous thing, even when possible. But taking a style, outputting it to an unblobbed format, editing it, and repacking it into a valid blob seems like a viable way.

This way pretty much all legacy treatment can be ignored. Maybe the blob reader would need a little hardening against incomplete settings, but even that could be limited to the “apply style” path.
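The unblob → edit → repack round trip suggested here can be sketched with a toy parameter layout. Everything below is hypothetical — one int of correction flags plus one float, standing in for a real module struct — since an actual tool would drive the packing from darktable’s introspection data rather than a hard-coded format string.

```python
import struct

# Toy stand-in for a module's parameter struct: a real tool would derive
# LAYOUT and FIELDS from introspection, not hard-code them like this.
LAYOUT = "<if"                        # hypothetical: (corrections_flags, scale)
FIELDS = ("corrections_flags", "scale")

def unblob(blob):
    """Binary blob -> editable dict of named fields."""
    return dict(zip(FIELDS, struct.unpack(LAYOUT, blob)))

def repack(params):
    """Edited dict -> valid binary blob of the exact expected size."""
    return struct.pack(LAYOUT, *(params[f] for f in FIELDS))

blob = struct.pack(LAYOUT, 0b111, 1.0)    # pretend: all corrections enabled
params = unblob(blob)
params["corrections_flags"] = 0b011       # e.g. keep distortion+TCA, drop one
new_blob = repack(params)
print(unblob(new_blob))
```

Because `repack` always emits a blob of the exact size the layout dictates, the dangerous hand-editing happens only in the human-readable dict, which matches the idea that the blob reader itself needs little extra hardening.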