I import film rolls but only edit a few of the photos in each roll, so in preferences I set darktable to create the XMP sidecars only when a photo is edited. The thought was: why create a sidecar for every photo when there are no plans to edit them all?
I have set some modules to auto-apply to raw files. So I open a raw file in darkroom and see the history stack populated with hot pixels, demosaic, and a number of other modules. I then return to the lighttable. But when I examine the file folder, no sidecar file is there. If I make a manual change in the exposure module before going back to the lighttable, the sidecar file is created.
I thought the pixel pipeline alone would automatically create a sidecar when the raw file was opened in darkroom.
Does populating the history stack count as applying changes, so that a sidecar file should be created? Or is the sidecar only created when a manual change is made on top of that? Or am I missing something?
“After edit” is the most misunderstood term in darktable, and I’m not sure that anybody has a good definition other than a frustrated user who said “this doesn’t do what I thought it would”.
If you open an image in darkroom, some modules are auto-applied to make the image viewable. If you don’t change one of the existing modules or apply another module, this is not considered an “edit” for the purposes of “after edit”. I believe additional auto-applied preset modules fall under the same rule.
There are also other actions, such as applying a tag, that are considered “edits”.
The only good advice I can give you is test and see what happens, especially with things like ratings, color labels, tags, etc.
Thanks. That is the concept I seem to have arrived at too from some testing. It is further confusing in that the image shown in the lighttable is the embedded JPEG, and if you export the image from the lighttable without making any significant changes (which saves a sidecar file), it must still go through the darkroom pixel pipeline prior to export.
And what is really confusing is that you are able to select a different pixel pipeline while in darkroom. If you then return to the lighttable and export an image, it must use whatever pixel pipeline was last selected in darkroom, not the original pipeline you may have used to view that image, since that pixel pipeline can’t be stored in the sidecar file because the sidecar doesn’t exist.
There are a few misunderstandings here:
- sidecars are an “extra”; for normal operation darktable uses its internal database. So even if you don’t use sidecars at all, any edits are stored in the database.
- “pixel pipes” are darktable’s “processing engines”; there are a few, for different purposes. Those pixel pipes are fed the editing instructions to create the final image.
- if all darktable has to do is apply the standard modules (instructions), there is no real reason to store anything… (this is a bit oversimplified; the actual situation is more complex).
So in short, the sidecar is normally not read or used by darktable, only written to.
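To make the “database first, sidecar second” point concrete, here is a minimal Python sketch using a toy SQLite schema. The table and column names are assumptions loosely modeled on darktable’s library.db, not the real schema:

```python
import sqlite3

# Toy model of "edits live in the database". The schema below is a
# simplification inspired by darktable's library.db; the real table and
# column layout may differ.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE history ("
    " imgid INTEGER, num INTEGER, operation TEXT,"
    " op_params BLOB, enabled INTEGER)"
)

# One "edit": a history item for image 1 from a (hypothetical) exposure module.
db.execute(
    "INSERT INTO history VALUES (?, ?, ?, ?, ?)",
    (1, 0, "exposure", b"\x00\x00\x80\x3f", 1),
)

# Normal operation only ever reads this table; writing a sidecar is an
# extra one-way export step, not the primary store.
rows = db.execute(
    "SELECT operation, enabled FROM history WHERE imgid = ?", (1,)
).fetchall()
print(rows)  # -> [('exposure', 1)]
```

The point of the sketch: everyday operations touch only the database, and the sidecar is a serialization written on top of it.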
The sidecars function more as an extra backup in case the database gets corrupted. They also allow exchange with other programs of things like keywords, titles, and descriptions. What they do not allow is exporting the editing steps to other programs or reading editing steps from other programs.
I would like to add a small but handy detail: If you work with images on external volumes and multiple computers, the sidecar’s role is crucial as your edits will be detected and synchronized on import and (with the appropriate setting) when starting darktable.
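The synchronization idea above can be sketched as a timestamp comparison. This is a hypothetical simplification of what darktable does at startup, with invented function names:

```python
import os
import tempfile

def sidecar_is_newer(xmp_path: str, db_timestamp: float) -> bool:
    """Toy version of the startup check: treat the sidecar as changed if its
    modification time is later than what the database last recorded.
    (darktable's real check is more involved; this only shows the idea.)"""
    return os.path.getmtime(xmp_path) > db_timestamp

# Demo with a throwaway file standing in for a raw's sidecar.
with tempfile.NamedTemporaryFile(suffix=".xmp", delete=False) as f:
    f.write(b"<x:xmpmeta/>")
    xmp = f.name

recorded = os.path.getmtime(xmp)            # what the database "knows"
unchanged = sidecar_is_newer(xmp, recorded)
print(unchanged)                            # -> False, nothing to sync

# Pretend another computer edited the sidecar a minute later.
os.utime(xmp, (recorded + 60, recorded + 60))
changed = sidecar_is_newer(xmp, recorded)
print(changed)                              # -> True, edits should be re-read
os.remove(xmp)
```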
I thank everyone for their responses. I was attempting to understand how a long-term workflow could be implemented. For example, you import a film roll, extensively edit an image, and generate a TIFF file. A year later, you wish to enhance the same image even more, but changes that may have occurred throughout the year (presets changed, modules added or removed) may not let you get back the same edited starting image from a year ago unless all the original edits are stored somewhere.
So if I understand correctly, the edits are stored in library.db (or other databases), not in the sidecar file. And it seems some changes could possibly still affect the retrieval of the original edited image from a year ago.
From the responses, I assume if the film roll is removed from darktable, all original edits are lost, even though the sidecar usually remains and is not deleted when the roll is removed.
darktable updates preserve existing edits. The module parameters are versioned specifically for this purpose.
You may not be able to recreate that edit with a newer version of darktable, since the module would be using the newer version of the parameters.
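What parameter versioning means in practice can be sketched like this. The version numbers, parameter names, and upgrade step are invented for illustration and are not darktable’s actual exposure parameters:

```python
# Sketch of versioned module parameters: an old edit records the version it
# was made with, and upgrade functions translate old params forward so the
# rendered result stays the same. All names here are hypothetical.

def upgrade_v1_to_v2(params: dict) -> dict:
    # Imagine v2 split a single "gain" value into "exposure" plus a separate
    # "black" level that defaults to 0.0 to preserve old results exactly.
    return {"version": 2, "exposure": params["gain"], "black": 0.0}

UPGRADES = {1: upgrade_v1_to_v2}

def load_params(stored: dict, current_version: int = 2) -> dict:
    """Replay an old edit: step its params up to the current version."""
    while stored["version"] < current_version:
        stored = UPGRADES[stored["version"]](stored)
    return stored

old_edit = {"version": 1, "gain": 1.5}  # written by an older release
print(load_params(old_edit))  # -> {'version': 2, 'exposure': 1.5, 'black': 0.0}
```

This is why an old XMP still renders identically, while an edit made from scratch today would start from the newer parameter layout.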
@garry611 , that’s prudent to think about the long-term position, and you set me thinking, however I don’t think you should be concerned. Hopefully someone will correct me if I have any of this wrong.
You’re always going to make some change to any image you care about, and when you do, the actual edit values for all the modules switched on are stored in the XMP, including modules where you’ve used a preset. So if a preset is changed for whatever reason, your edit is not altered. Example: if you select the Canon daylight preset in the white balance module, no “canon daylight” indicator is stored; rather, the three RGB coefficients for Canon daylight are stored in the XMP.
@rvietor says sidecars are an extra, but I would not think of them as inferior or “second class”. I rely on them for long-term storage and imagine lots of others do too. I regard the database as temporary working storage and don’t back it up! However, for maximum security, after I’ve output a jpeg/tiff/etc. and named it appropriately, I copy the XMP and name the copy the same as the jpeg/etc., and both get stored together (and the raw also).

Darktable never touches XMPs that aren’t named the default way, unless you explicitly tell it to. So, for example, you can use DT to delete raws from their folder on your disk. If you have 123.raw, 123.raw.xmp, 123-my-cat.jpg and 123-my-cat.xmp and you delete the raw using the lighttable, then only the first two are deleted.
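The naming rule described above can be sketched as follows, assuming the simple “file name plus .xmp” convention for default sidecars:

```python
# Sketch of the naming convention: the default sidecar for a file is the
# file name plus ".xmp"; anything named another way (like 123-my-cat.xmp)
# is not treated as that raw's sidecar and is left alone.

def default_sidecar(image_name: str) -> str:
    return image_name + ".xmp"

def files_removed_with(raw: str, folder: list[str]) -> list[str]:
    """Which files go away when the raw is deleted from the lighttable:
    the raw itself and its default sidecar, nothing else."""
    doomed = {raw, default_sidecar(raw)}
    return [f for f in folder if f in doomed]

folder = ["123.raw", "123.raw.xmp", "123-my-cat.jpg", "123-my-cat.xmp"]
print(files_removed_with("123.raw", folder))  # -> ['123.raw', '123.raw.xmp']
```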
Yes, I think so.
I think this is saying you might not be able to recreate your edit starting with a clean sheet, i.e. no xmp and no details in the library, because you’re now getting a different version of the module in question. However if you have your original xmp, or it’s still in the database, then you haven’t lost anything - you still have your edit. DT keeps the processing logic for old versions of a module precisely so that old edits are preserved. If a module is made obsolete and removed from the GUI, it is still present internally for old edits.
A few remarks:
- a jpeg or tiff should have all relevant metadata stored inside the file;
- the metadata for a raw file and a jpeg are not quite the same, so there will be a conflict (at least the “derived from” tag will be wrong in the copied xmp);
- I have no idea what will happen when you try to import such a jpeg/xmp combination.
It’s a choice, so “can” rather than “should”?
Point 2 sounds like a minor consideration. Is that tag documented anywhere?
Point 3 - yes it might look weird but how often do we edit a jpeg in DT that we’ve produced in DT? And it’s only a couple of clicks to Discard History and off you go.
Somewhere in the documentation is buried a suggestion to always back up the library and other databases in addition to the XMP files. I tested a situation where raw files were imported with the automatic creation of the XMP file changed: the option to create it only after an edit was chosen. The raw files were imported, opened in darkroom, back to the lighttable, back to darkroom, and then back to the lighttable again. No XMP files were created. However, as soon as an exposure module change was made, to lighten the image for example, an XMP file was created.

Of course, darktable must process the raw file to create the image. I assume the raw processing engine uses defaults and must not consider these as changes. A raw image file copied from another raw image file would produce an identical image, so no XMP needs to be created. Now, if you make an exposure change to one of the images, that exposure change is stored in the XMP file for that image.
So after a year, if the two files are re-opened, they will look the same relative to each other as the two of a year ago, one having the exposure adjustment. But they may not match the exact images that were displayed a year ago.
My only point is, whether there is an XMP file or not, if you print an image again a year later, it is uncertain it would produce the exact same print as the original. Again, it’s probably only important if someone requests a reprint a year later and expects it to match the original print.
darktable is built to ensure that the image will be the same. Changes to modules are versioned, so that an old version of the parameters will give the same result as before.
@garry611 , in line with what two of us have said, you will get the same image provided you keep the details of your edit, i.e. you have the XMP or the edit is still in the DT database.
You mention defaults and possibly not having an XMP. Are you wanting a workflow where you print your raws without any editing and want reprints to be the same, a wedding or event photographer maybe? In this case, changes to DT such as revised modules or changed default values would give a different reprint.
Of course, because darktable is only one element of the chain, and it’s not at all certain you’ll be using the same printer with the same inks, the same paper, and the same settings. darktable will send the same data to the printer driver, though. (Note that you have the same problem if you want to print an older TIFF, PNG or JPEG file…)
And the XMP file is a bit of a red herring here, as it’s the data in the database that will be used. Darktable writes the XMP, but won’t read it, unless you re-import the file, or ask dt to read the sidecar.
You can tell dt to scan for changes in sidecar files on startup though.