Getting darktable to respond correctly to updates of the base image

Got into darktable recently and am noticing a few things which may be bugs or may just be my own ignorance. I wanted to ask here before actually filing any bug report…

To outline my basic use case – I’m using darktable for art (composition / toning / color tests) rather than RAW development. Input files are usually lossless PNG / WEBP images resulting from scanning drawings. I edit these in GIMP, usually many times, in parallel with doing composition/toning in DT.

Firstly, when I crop an image I sometimes spot things that need to be edited out (UPDATE: yes, I do mean ‘editing the base file in an external editor’, not ‘edited via DT modules and exported’). After I make those edits, it would be good if darktable could check the on-disk image and show these updates in the darkroom and lighttable. So far, I have found that I can make the darkroom update by restarting darktable and then re-entering the darkroom view for that image. Otherwise, the image used in the darkroom is an outdated, presumably cached, version.

However, even after doing this, the thumbnails in the lighttable do not update – they are rendered from the outdated version. This appears to be because no changes to the history stack have occurred. I would prefer to avoid introducing meaningless edits just to provoke DT into updating the thumbnail.

Is there any way to force darktable to update a specific thumbnail?
My only idea so far is to enable the ‘look for updated XMP files on startup’ preference, and then update the timestamp of the XMP file whenever the base file is rewritten.

One option that you might think of, which I’ve tried and verified doesn’t work, is re-importing the image. No change.

(obviously I’d prefer that all this Just Works™, and especially, to avoid the time spent restarting darktable repeatedly, but I can understand why DT would not in general expect the base image to be updated.)

(PS. I am logging in via GitHub because the email login and ‘lost password’ email options here do not seem to be sending any emails… Probably a known issue but I thought I should mention it just in case.)

I did find a way to provoke an update by duplicating the item and then deleting the item I duplicated it from.
The catch to that approach is that it doesn’t cause any updates to the other variants of the base image – of which there are usually several – so duplicating the entire set is required. The other caveat is that it of course changes the image duplicate-ids (e.g. the first image is no longer #0, etc.).
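
In principle that workaround looks scriptable, at least. A rough, untested Lua sketch of what I mean – assuming the documented `database.duplicate` / `database.delete` calls do what their names suggest, and bound to a shortcut:

```lua
-- untested sketch: automate the duplicate-then-delete workaround
-- assumes darktable's Lua calls database.duplicate / database.delete
local dt = require "darktable"

dt.register_event("refresh_by_duplicate", "shortcut",
  function(event, shortcut)
    for _, img in ipairs(dt.gui.action_images) do
      dt.database.duplicate(img) -- new copy should pick up the file on disk
      dt.database.delete(img)    -- drop the stale entry (file stays on disk)
    end
  end,
  "refresh selected images via duplicate/delete")
```

It would still have the duplicate-id problem, of course.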

Updated source image file is not reloaded for image previews · Issue #18030 · darktable-org/darktable · GitHub suggests that Lua scripting can be used to address this problem by directly invalidating the cache entry (or entries). That seems possible, but it looks like it will take some research to find the correct point to hook into (my guess is that entering the darkroom view is that point).
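
If that hook point pans out, the registration itself would be small. A sketch of what I have in mind (untested; I’m assuming the `view-changed` event passes the new view as its third argument, and that `dt.gui.action_images` resolves to the image being opened):

```lua
-- untested sketch: drop the cached copy whenever the darkroom is entered
local dt = require "darktable"

dt.register_event("drop_cache_on_darkroom", "view-changed",
  function(event, old_view, new_view)
    if new_view.id == "darkroom" then
      -- assumption: action_images resolves to the image being opened
      for _, img in ipairs(dt.gui.action_images) do
        img:drop_cache()
      end
    end
  end)
```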

Someone might also suggest contrib/ext_editor.lua in the standard scripts package. This is a module that 1) opens an external editor, 2) waits for it to exit, and then 3) effectively performs the ‘duplicate image to invalidate cache’ procedure I outlined above. I would prefer to avoid it for two reasons:

a) I often make a series of edits; it’s impractical to repeatedly restart e.g. GIMP. Unfortunately the script relies on the external program exiting (this could maybe be changed – e.g. use inotifywait to wait until the file is updated; see the sketch after this list)

b) keeping the duplicate tracking clean – the same reason I hope to get a better solution than ‘just duplicate the item and delete the previous item’.
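
To illustrate point a): the ‘wait for the editor to exit’ step could perhaps be swapped for ‘wait for the file to be rewritten’. An untested sketch, assuming inotifywait (from inotify-tools) is installed and that `dt.control.execute` blocks only the calling Lua job, not the UI:

```lua
-- untested sketch: wait for the base file to be rewritten instead of
-- waiting for the external editor to exit
local dt = require "darktable"

local function watch_and_refresh(img)
  local path = img.path .. "/" .. img.filename
  -- blocks this Lua job (not, I hope, the UI) until the file is
  -- closed after a write; requires inotify-tools
  dt.control.execute("inotifywait -e close_write '" .. path .. "'")
  img:drop_cache()
  dt.print("reloaded " .. img.filename)
end
```

This still leaves the question of whether drop_cache() actually reloads the full-size image, though.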

Things should synchronize, so it seems you have something odd going on…

When I am using DT and just cropping an image, it updates the thumbnail in the filmstrip in the darkroom and the thumbnail in the lighttable view.

Your second post confuses me because it suggests you may be using a Lua script or an external editor, neither of which is mentioned in your original post.

People trying to help will likely need to know your OS and DT version. I hope you find an answer to your problem.

Yes, and clarification on the workflow would help, because it almost sounds like the OP modifies the “base” image outside DT and then expects it to rescan and refresh the directory – but even that is not 100% clear to me…

Terry:

Yes, if I “crop an image” in DT using the crop tool, then the thumbnail etc. are updated. Note how this makes no changes to the base file – ‘cropping an image’ in DT just adjusts some parameters in a module and, maybe, writes an XMP file to disk; the same goes for almost everything you can do in DT.

Things do ‘synchronize’, as Todd puts it, up to that point, but not further: changes made in DT cause updates to previews and thumbnails as you would expect.

Reading changes to the base file, as opposed to ignoring them, is the issue.

No, I am not using a script. (I just opted to install the ‘external scripts’ package – DT makes this easy by default: there is just a button to click to install the scripts, somewhere on the left side of the lighttable view; I forget which module. But, AFAIK, this only writes some files; you have to require them in your luarc to activate them, which I have not done.)

The talk about scripts is just exploring possible ways to address the problem – if it is possible to hook into different parts of DT. I’m currently unsure whether that is the case.

Yes, I am using an external editor (usually GIMP) on the base file. That’s the essence of the problem I’m trying to solve. I don’t mind having to manually tell darktable ‘hey, update your cache of this image’… but so far I haven’t figured out whether there even exists a way to tell it to update its cache.

OS and DT version are ‘Arch Linux x86_64’ and ‘5.0.0’ respectively.

I hope that clarifies a bit.

Yes, I do edit the base image repeatedly, that is exactly the question: can I make DT cooperate better with this?

Editing the base image is definitely necessary. I edit the base image in terms of drawing/painting – changing the actual structure and content of the image – and want this to work in cooperation with DT for toning/color/composition. Hopefully that illustrates my workflow better.

I think I understand your problem. It seems that you expect darktable to update your original file (raw or jpg or png or whatever). But darktable will not update the files. The modules are instructions used to generate the preview until you export the edits into a new file. By design, darktable will not touch the original file.


DT does not work that way. It is non-destructive, so if you modify an image outside darktable, it is not going to update from where it was when you imported it… in fact that could cause issues. DT knows about the image when you import it, and the edits are fully non-destructive. I know you are not working with raw files, and likely this is why the normal flow is not working for you, as raw files can’t be handled this way… but DT is primarily a raw developer, not an image editor.

No, I don’t want DT to UPDATE my base file –

It must not do that, in fact! Nondestructive work flow is central in darktable.

What I want is for it to notice that my base file has been updated. Not by DT – by GIMP, most commonly.

It won’t do that either. And that’s just dangerous. If you change the canvas size (region of interest) in GIMP, what is dt going to do with that? You will need to start processing again.

If you did a change in GIMP, then you know you did it. Why try to have dt (or any other software) detect a change that you already know you did?

Maybe look at how to use the GIMP-to-DT API…

Yes, I grasp that DT knows about the image at the point in time when you imported it. This is one reason why I tried re-importing as a way to solve this problem. It is less clear to me why re-importing does not in fact cause changes to the base file to register in DT, as this seems to me to be one of only two reasons – the other being ‘metadata updates’ – why one would want to re-import a file.

A canvas size change would cause problems, but that is definitely a corner case: in the typical case I can and would do that via DT (enlarge canvas + crop). Mostly, my edits repaint pixels. Sometimes I move chunks of things around or reshape them. Some masks might need to be adjusted in the process.

To compare, imagine importing a raster image into Inkscape (link mode, so it references the original image directly) and then adding some vector lines/shapes on top. Then you move some things around in the raster using e.g. GIMP. The vectors are now a bit misaligned.

Well, as far as I can see, in that scenario, that just means ‘time to adjust the vectors’. What is dangerous here, or dangerous in the analogous situation with DT?

I do recognize that my workflow is definitely not what DT expects. As I tried to indicate earlier, I am interested in learning the DT API in case I can do something with that.

But I am also a bit surprised that no one has yet said, for example, ‘you can force DT to invalidate an individual thumbnail with (X obscure GUI element)’.

If you are editing in GIMP, you have to export the image with a new file name to have your edits applied. I do something similar: I have Microsoft Image Composite Editor merge numerous images I have edited in DT into a panorama, then export from the panorama-stitching program back into the folder of origin using a unique file name. I return to DT and ask the import module to add to the library, and it finds the newly stitched panorama in the folder. I can then do further tweaks in DT. This is the type of workflow you are going to have to use to achieve what you have suggested in your posts. It is unrealistic to expect any better of DT or another program. I hope this suggestion helps. Good luck with your endeavours.

Since darktable is primarily a raw editor, it does not expect the base image to change.

You can invalidate the cache from the preferences, but that’s not per image: you delete them all.

I have a use case from photography that I have experienced a couple of times over the years. I create panoramas in Hugin and export them as TIFF for final retouching in darktable. It can happen that during the retouching I notice that I need to tweak something in Hugin to get rid of ghosting or other artefacts. After re-exporting the image from Hugin, I have noticed that in DT I continue to edit the original TIFF, not the fixed one.

Hmm, I don’t think it’s especially unrealistic to expect from software in general. Image viewers often have this ‘notice file changes’ functionality built in (e.g. sxiv: when I update the image, sxiv’s display refreshes to show the changed image). On Linux, the inotify API makes this quite efficient. I also rely on this mechanic for text editing (kakoune).

Thanks for describing your workflow – it’s basically making destructive checkpoints in between non-destructive editing, right? Since I’m mostly not trying to use DT as a ‘renderer of final images’ but as an experimental effects workbench + DAM, I’ll have to consider whether that kind of thing could work for me.

@kofa: Yes, I noticed the option to invalidate them all, but that would be quite painful even at only ~200 entries in the database (and with the need to do it repeatedly…).

If you are a dev (wrong assumption on my part?), maybe you are willing to answer: dt_lua_image_t:drop_cache() – if I call this correctly, do you know whether I can reasonably expect it to do what I want, or will it instead keep the old full-size image and (only) regenerate the thumbnail?

(I’m not entirely clear on the distinctions made in DT’s caching system. The docs seem only to clearly state that the thumbnail will definitely be invalidated.)
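
(If it does do what I want, the test seems easy to wire up. An untested sketch, assuming the documented shortcut event and that action_images gives the selection:)

```lua
-- untested sketch: bind drop_cache to a shortcut to test what it reloads
local dt = require "darktable"

dt.register_event("drop_cache_shortcut", "shortcut",
  function(event, shortcut)
    for _, img in ipairs(dt.gui.action_images) do
      img:drop_cache()
      dt.print("dropped cache for " .. img.filename)
    end
  end,
  "drop cache for selected images")
```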

@Juha_Lintula: that sounds like a use case where it may actually be practical to restart DT to get the updated version loaded properly.

If you were asking me: I’m not a developer (of darktable). @wpferguson is the Lua expert here (and an actual darktable dev).

Heh, I see I was a little slow with my edit. Thanks for the correction.

You seem to use dt as a DAM with editing options, whereas it’s a (non-destructive) editor with DAM options… So perhaps darktable just isn’t suited as a DAM for your workflow?


As for using the inotify API: keep in mind that darktable has to work efficiently even with large image collections (30k+ images, in my case). Also, the images can be stored all over the HD; there’s no requirement to have them all in one directory tree. It’s not even guaranteed that a film roll covers a single directory…
That could mean all files have to be watched, and that is a lot of “watch points”. Most image viewers I know of work on one directory at a time and don’t change the images, so just watching the current directory is enough.