I processed many thousands of raw files (ORFs from Olympus) in Darktable a couple of years ago. When I open the files in the lighttable module in later versions of Darktable, the images are displayed as they looked when I finished processing them. If I go into the darkroom module, the rendering (colors, exposure etc.) of the images changes completely, and it remains changed when I switch back to the lighttable module, even if I have not applied any modifications.
I have tried to make Darktable “respect” the settings in the XMP files bound to each ORF, without success. One thing that has mitigated the problem is to copy the settings from a file that I have not yet opened in the darkroom module to an image that has been opened. This makes the tones better but does not fully restore the original.
How can I make Darktable render my images as before?
The darktable team runs regression tests to avoid exactly this when updating darktable, so it should not happen. Anyway, without a RAW and its XMP to check, it will be hard to say more about your questions. So: which color profiles do you use in darktable? Did you change them? Also click on the soft-proofing icon.
If you have an image you don’t mind sharing, it would be very useful to upload:
- an ORF file
- the .xmp (edit: ideally one that you haven’t opened with the new dt, so it’s original)
- a JPEG exported from your old install as a reference, so we know how it should look. If you don’t have one, then a screenshot of the image in lighttable looking the way it should would hopefully do.
I’ve no idea at this point why you have the problem, but hopefully the files might provide a clue! 
Thank you for being interested in helping out. I have attached files below. I have to admit this is unfortunately far from the most obvious example, but the tone and exposure in the yellow and in his face are different. In some other images, there is much more of a difference.
JPEG from old Darktable:
ORF last opened in old Darktable:
P5020039.ORF (13.6 MB)
XMP last opened in old Darktable:
P5020039.ORF.xmp (6.8 KB)
JPEG from new Darktable:
ORF last opened in new Darktable:
P5020039.ORF (13.6 MB)
XMP last opened in new Darktable:
P5020039.ORF.xmp (10.5 KB)
When you say you processed these “a couple of years ago”, what darktable version was that? Do you use “modern” chromatic adaptation (color calibration module)?
If you look at your old XMP file you can see that it does not include the “temperature” (white balance) module, but the new one does, and I believe this is the cause of the issue.
Prior to (I think) darktable 3.0 any modules that were automatically added and not subsequently changed by the user were not included in the XMP file. This meant that whenever you re-opened the image, darktable would re-add the modules with the new defaults and might change processing. In order to resolve this (and some other things), darktable 3.0 changed so that all modules were automatically added to the XMP.
This leads to the issue you have seen. If you open these old (pre-darktable-3.0) XMP files in a new version, darktable sees that there is no white balance module and adds it automatically using the current default values. Unfortunately if you use the modern processing it will add the white balance module using the “camera reference” setting (D65) and this is not appropriate for those old images and changes their appearance.
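As a sanity check, the presence or absence of the white balance entry is easy to spot in the sidecar itself, since darktable records each history step with a `darktable:operation` attribute. A minimal Python sketch (the `has_operation` helper, the plain textual scan instead of a full RDF parse, and the current-directory glob are all my own assumptions):

```python
import re
from pathlib import Path

def has_operation(xmp_path, operation):
    """Check whether a darktable XMP sidecar records a given module in
    its edit history. History entries carry the module name in a
    darktable:operation attribute; the exact RDF layout varies between
    darktable versions, so we match the attribute textually."""
    text = Path(xmp_path).read_text(encoding="utf-8", errors="replace")
    return re.search(
        r'darktable:operation\s*=\s*"{}"'.format(re.escape(operation)),
        text) is not None

# Flag sidecars that may predate darktable 3.0's "write all modules" change:
for xmp in sorted(Path(".").glob("*.xmp")):
    if not has_operation(xmp, "temperature"):
        print(f"{xmp}: no white balance entry - written by an old darktable?")
```

Run in a folder of sidecars, this lists the ones a new darktable would complete with current white balance defaults.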
You have two options. Either enable the color calibration module on your images (which will convert them to “modern” chromatic adaptation – this will probably still change your processing but in a better way), or amend the white balance module to the “as shot” setting (which should get you back to your previous state). If you choose the former approach you can probably apply to all your images in bulk. The latter will need to be done on a per-image basis (since you cannot apply “as shot” to all images at once).
N.B. This will affect other modules as well (demosaic, highlight reconstruction, possibly others) where your image will change to use the new defaults (which have changed since 3.0) but hopefully this will mostly have a positive effect on your images. I guess it might affect module order as well (since module order wasn’t stored to the XMP file before 3.0).
Thanks! I’ll have a look at what happens on my system with the files.
When I apply the older .xmp it comes up as a black and white edit… if I switch off the color zones module it comes up looking like the “new DT” image. I’m a little uncertain due to the B&W edit, but it seems I might be reproducing the issue.
BTW, there’s no need to attach raw files multiple times, as darktable never modifies those.
The very last thing I did before I closed the ORF many years ago was to export one black-and-white JPEG. I had exported a color JPEG just before that. So if there is a line in the old XMP that says it was B&W, it can be ignored in this example, IMO.
Ignoring this, the colors and exposure are still different.
I opened the image with dt in legacy wb mode and manually applied the older xmp, and it comes up set ‘as shot’ but the image still looks like the “new dt” version in the OP.
I tried loading the “old dt” jpg as the sidecar, and filmic rgb comes on alongside base curve, but after disabling filmic rgb it comes up with a result much closer to the “old dt” image, although I haven’t worked out where the difference is yet.
The main difference between the image I get by loading the jpg as sidecar and the “new dt” sidecar (which looks the same as the “old dt” sidecar when I load it) is that the ‘tone curve’ module is different, the exposure is different, and the module order is slightly different too.
TBH I’m not sure what to make of all this, and I may be on the wrong track by using the jpg as sidecar. I think normally this should give the same result as the .xmp.
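One way to narrow down where two sidecars disagree is to compare the module lists each records. A rough sketch along the same lines as above, assuming darktable's `darktable:operation` attributes (the `history_ops` helper and the `old.xmp`/`new.xmp` file names are placeholders, not real paths from this thread):

```python
import re
from pathlib import Path

def history_ops(xmp_path):
    """List module names in the order they appear in a darktable XMP
    sidecar's history (a plain textual scan, not a full RDF parse)."""
    text = Path(xmp_path).read_text(encoding="utf-8", errors="replace")
    return re.findall(r'darktable:operation\s*=\s*"([^"]+)"', text)

# Placeholder file names - substitute the actual old and new sidecars.
old_xmp, new_xmp = Path("old.xmp"), Path("new.xmp")
if old_xmp.exists() and new_xmp.exists():
    old_ops, new_ops = history_ops(old_xmp), history_ops(new_xmp)
    print("only in old:", sorted(set(old_ops) - set(new_ops)))
    print("only in new:", sorted(set(new_ops) - set(old_ops)))
    print("order old:", old_ops)
    print("order new:", new_ops)
```

This won't show parameter differences (those are encoded in `darktable:params`), but it quickly reveals added modules and ordering changes.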
I haven’t loaded the XMPs and tested, just manually looked at the files, since I recall having this issue myself and I knew the cause (in my case). It might also be worth checking the module order (I guess it should be set to “legacy”).
You may have to set the modules order to legacy, especially if you normally use the scene-referred workflow for new images.
Done that now. It doesn’t make a big difference anyway.
@VictorB I still don’t know how this situation came about (I’m no expert anyway!). I got very close to replicating the original jpg export by duplicating the image, applying either of the .xmp’s you provided to the first copy, applying the original .jpg as a sidecar (instead of an xmp) to the second, then copying and pasting the tone curve module to the first image and manually adjusting the exposure. Not a straightforward process.
I don’t understand why the jpg loads a different edit to the xmp - normally they should be the same. Aside from that, I wonder if the darktable input profile could have changed from the early versions?
This is the jpg I got from the above process.
I think I may have found the cause of the problem, and that is the base curve module. In the new Darktable, the files opened with a straight diagonal base curve line until I managed to save a preset of the old base curve; now all files open with that preset, which is curved.
Anyone who understands the XMP may get a clue from this image.
ORF:
P7070187.ORF (16.5 MB)
JPEG from old Darktable:
XMP last opened in old Darktable:
P7070187.ORF.xmp (3.6 KB)
JPEG from new Darktable:
XMP last opened in new Darktable:
P7070187.ORF.xmp (3.6 KB)
Right, well done. Oddly, on the image you posted to start with, when I open it (edit: with the old xmp), the base curve looks alright!
I was about to post the below, when you made your post above - I’ll post it anyway I guess.
Update: it seems to me that there are really only two modules causing the issue: tone curve primarily, and to a lesser extent the input color profile – setting it to ‘enhanced color matrix’ brings the colors even closer.
Anyone know why something’s amiss here? Oh, and the fact that the tone curve seems to be pretty much a match in the dt data that’s in the old jpg makes me wonder if somehow the xmp files could have been changed before… but I don’t know how!
Unfortunately I don’t have any really good ideas on how to fix a large set of images. If that tone curve is the same on all the images then one could just make a style and apply it to the whole lot in ‘append’ mode so it only changes that, but I’m guessing the tone curve was adjusted on a per-image basis. Hopefully someone with more knowledge than me will find something!
I think you are right about the enhanced color matrix.
When it comes to the base curve, I cannot remember changing it unless the image was very over- or underexposed.
The tone curve module, however, is one I have used a lot.
If I apply the correct Base curve setting and enhanced color matrix in Input color profile to all files, can I assume most of the images will look as before, you think?
In my case, I think the settings for the modules “input color profile” and “base curve” were applied by default in the old versions and are the module settings I must manually re-apply to the images. It shouldn’t be a big issue because I used the same settings for those modules for most images.
So I downloaded your image and the two sidecar files, named them old and new, went to lighttable view and made duplicate images. As a precautionary first step I discarded the sidecar history, then loaded the sidecars. I made sure both were set to legacy. I removed the color zones from the old one and the small exposure from the new one. In any case they looked very close going back and forth, so I placed about a dozen pickers around the image and they were the same. Any chance that you exported that jpg with some extra settings, or maybe as AdobeRGB or something? I will go back and check, and also import from the jpg to see how it gets handled.
Module order does make a small difference, i.e. legacy vs. default…
Did you export the old ones at 100% quality? They are bigger than the raw file, which suggests so, and the newer one is much smaller. I believe I read once that you should never save a JPG at 100% as it can actually introduce more artifacts (don’t quote me on this). Still, I think there is also a bunch of extra metadata, so I just wondered if you are comparing apples to apples.
This is interesting, as it means that the processing on import is not the same as when manually loading the sidecar: if you load it manually, the base curve seems intact…