How to load an exported sidecar file

Yes! Having given it a little more thought, it is pretty clear that my thinking is tainted by my experience with other software that happens to behave quite similarly in this respect.

As previously mentioned, there is merit to having the processing parameters embedded in the file. Also, because I’m using several different programs, I do need to distinguish which files have been processed by which program. The sidecar files do present a bit of a burden for organization and maintenance purposes.

Also, and possibly most significant, the image files I create from the programs used to develop raw files are usually 16-bit uncompressed TIFFs. These could be called my master image files. As you might expect, these are NOT the final work product and NOT what I share or, for that matter, print. Further post-processing is involved to create image files ready (sized) to print, maybe with borders and text added, even a signature for the ones I really like. This further post-processing is typically done using GIMP, which has some (now much improved) capability for managing metadata. There are also some metadata edits that require ExifTool. Between GIMP and ExifTool I should be able to remove the XMP metadata added by darktable, which, I might point out, has become pretty irrelevant at this point in my pipeline.
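For what it’s worth, a rough sketch of what that ExifTool cleanup step could look like, assuming exiftool is installed and on the PATH and that the darktable history sits in ExifTool’s XMP-darktable group (the file name is just a placeholder):

```python
# Hypothetical cleanup step: strip darktable's XMP development history from an
# exported master TIFF with ExifTool. Assumes exiftool is on the PATH and that
# the history lives in ExifTool's XMP-darktable group.
import subprocess

def strip_darktable_xmp(path: str) -> None:
    """Remove all XMP-darktable tags from the given image file in place."""
    subprocess.run(
        ["exiftool", "-overwrite_original", "-XMP-darktable:all=", path],
        check=True,
    )

if __name__ == "__main__":
    strip_darktable_xmp("master_image.tif")  # placeholder file name
```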

This topic turned out to be a bit more complicated and generated much more discussion than I originally expected. However, it was very helpful to me, and I want to thank all the participants for taking the time to contribute; I am most grateful.

THANKS!


A bit late to the party, but two words of caution:

  • make sure the appropriate option to save the development info in the exported file is enabled (darktable 3.8 user manual - export)
  • in some rare instances the resulting metadata, including the development history, is larger than the file format allows (not sure whether this applies to TIFF, but it certainly does to JPG). In those cases it will be truncated and the rest lost (*). Make sure to enable the option to compress that metadata to minimize the risk (darktable 3.8 user manual - storage)

(*) there is actually a warning in the manual about not relying on it blindly for XMP backup because of that

Well, now that is good to know. I did notice the option about compression and was a little puzzled about why that might be desired.

What I think is an obvious question is: how do you determine that adding the processing-parameter metadata failed? It sounds like the phrase “the rest is lost” means that you end up with a bunch of pretty useless metadata in the image file. If so, why does that make sense? Wouldn’t a good solution to this problem be to do what other software does and write a sidecar file, which certainly does NOT have any size limits? And since that solves the problem, why NOT write the processing parameters used to create an image file into a comparable read-only sidecar file with a name that associates it with the resulting image file? If some users would rather NOT do that, make it optional.

Also, if I’m understanding correctly, does this mean that darktable cannot be used to create an image file that is free of this defective metadata? Fortunately, if such an image file is otherwise correct, one can use other software such as GIMP to produce a comparable file without the defective metadata.

Personally, I think this would only ever happen if someone had a massive number of masks… someone should torture-test it and see. There are at least two scripts, I believe, that you can run to strip the metadata if that is what you want. Search for darktable lua scripts and I think you will find that someone has one that runs as part of an export process, i.e. running the script strips the data in the export step… at least I believe I have seen that.

Think this is the one I recall…

Seems like option 4 allows for a custom exiftool action… perhaps this would allow you to create your output or strip data as you see fit.

From this topic


You might also want to make use of this…

Probably there is an error in the console if you run darktable from the command line. In any case, according to this, the limit for JPG is 64 kB and for TIFF is 4 GB (effectively infinite). If you’re exporting to JPG you could check the XMP sidecar size beforehand. As @priort said, you’ll need quite a complicated development with many masks to get close to the limit (especially when the metadata is compressed), so it’s a very rare occurrence.
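A minimal sketch of such a pre-flight check, assuming darktable’s usual <rawfile>.xmp sidecar naming and using the 64 kB JPG figure quoted above (the embedded copy can be smaller once compressed, so this errs on the conservative side):

```python
# Rough pre-flight check: compare the .xmp sidecar size against the ~64 kB
# JPG metadata limit mentioned above. The embedded copy may shrink once
# darktable compresses it, so this is a conservative test.
import os

JPEG_XMP_LIMIT = 64 * 1024  # bytes, per the limit quoted above

def sidecar_might_be_truncated(raw_path: str) -> bool:
    """Return True if the raw file's .xmp sidecar is near or over the JPG limit."""
    sidecar = raw_path + ".xmp"  # darktable names sidecars <rawfile>.xmp
    if not os.path.exists(sidecar):
        return False
    return os.path.getsize(sidecar) >= JPEG_XMP_LIMIT

if __name__ == "__main__":
    print(sidecar_might_be_truncated("IMG_0001.CR2"))  # placeholder file name
```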

You could file a feature request (Issues · darktable-org/darktable · GitHub). Maybe a Lua script is enough.

Yes, just disable the option to save the development info in the metadata (it’s the only part big enough to even remotely hit the size limit).

Would the criterion for determining that the amount of metadata is excessive have something to do with its proportion of the resulting file size? If so, a low-quality (highly compressed) JPEG would seem like the most likely candidate. My 16-bit uncompressed TIFF files are pretty big: typically more than 100 MB, even from my old low-resolution cameras. My thinking here is that it would take an awful lot of metadata to significantly increase the relative size of such files.

On the other hand, if the idea is that there is an absolute limit to the size of the metadata, irrespective of the file size, it might not be so hard to hit.

@guille2306 made some good points, I think. I really don’t know that much about it, although it would be easy to find out with some research… It might be useful if there were a way to dynamically flag, during export, any files that would exceed the physical limits on that embedded data. Then you could choose how to proceed, knowing the file would not contain a complete record of all the required metadata. Again, I am not sure how to go about that or how hard it would be… and since it may be rare, I wonder how often it would actually happen, but I am sure there is someone out there capable of those calculations.
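I’m not sure how this could be hooked into the export step itself, but as a standalone sketch (the folder path and warning threshold are purely illustrative), a scan that flags sidecars approaching the 64 kB JPG limit might look something like this:

```python
# Standalone sketch: walk a folder and flag any darktable .xmp sidecars whose
# size approaches the 64 kB JPG metadata limit discussed above. The warning
# threshold and folder path are illustrative only; TIFF's 4 GB limit is
# effectively never an issue.
from pathlib import Path

JPEG_XMP_LIMIT = 64 * 1024   # bytes
WARN_FRACTION = 0.8          # warn at 80% of the limit (arbitrary choice)

def flag_large_sidecars(folder: str) -> list[Path]:
    """Return sidecars under `folder` whose size is near or over the JPG limit."""
    flagged = []
    for xmp in Path(folder).expanduser().rglob("*.xmp"):
        if xmp.stat().st_size >= WARN_FRACTION * JPEG_XMP_LIMIT:
            flagged.append(xmp)
    return flagged

if __name__ == "__main__":
    for path in flag_large_sidecars("~/Pictures/raw"):  # placeholder folder
        print(f"may be truncated if exported to JPG: {path}")
```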