Maybe I don’t understand the real problem. You currently have the raw, and 2 jpgs from the image (color + b/w). Yes, there were changes in darktable since your old edits, so they will not show exactly the same. The intent is not to break the old edits, but stuff happens.
But do you need to re-export the image as a jpg? All of them? I’ve been going back through my library, deliberately discarding the history and reprocessing using 4.2 (current master). I’m not doing every image, just the 4-star ones. I then export again and compare to the previous jpg. So far, I always delete the old jpg. It’s not only 4.2; it’s also that I now know more about editing. With presets, I’m spending less than 1 min per image unless there is a need to retouch.
So far, I’m around 2012, and I can tell this is when I purchased a colorimeter for the monitor. Before that, the jpg colors/blacks are bad. So, what about the idea of just reprocessing the important ones?
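If you want to reprocess only the important ones, a small script can list which raws qualify. This is just a sketch: it assumes darktable stores the star rating as an `xmp:Rating` attribute in the sidecar (true for the sidecars I’ve looked at, but check one of yours first), and it simply greps for that attribute rather than doing full XMP parsing.

```python
# Sketch: list raw files whose darktable .xmp sidecar carries a 4-star
# rating, so only those get reprocessed. Assumes the rating appears as an
# xmp:Rating="N" attribute in the sidecar text -- verify against your
# own files before trusting it.
import re
from pathlib import Path

def sidecar_rating(xmp_path: Path) -> int:
    """Return the xmp:Rating stored in a sidecar, or 0 if none is found."""
    text = xmp_path.read_text(encoding="utf-8", errors="replace")
    m = re.search(r'xmp:Rating="(-?\d+)"', text)
    return int(m.group(1)) if m else 0

def four_star_raws(library: Path, min_stars: int = 4) -> list[Path]:
    """List raw files under `library` whose sidecar is rated >= min_stars."""
    keepers = []
    for xmp in library.rglob("*.xmp"):
        if sidecar_rating(xmp) >= min_stars:
            # strip the trailing .xmp to get the raw file it belongs to
            keepers.append(xmp.with_suffix(""))
    return sorted(keepers)
```

You could feed the resulting list to whatever export step you use, and leave the rest of the library untouched.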
I didn’t notice a difference in them. As I said, I only had to remove the cz module and make sure they either had no exposure adjustment or the same one… A dozen color pickers in the blues and other areas did waver as I toggled back and forth, and the export reflected the screen image… We can’t know about the original rendering, but the files are very different in size and there is a fair bit more metadata…
I hope so, but I can’t be sure. Sorry not to be more helpful! I’d suggest going through a selection of images manually and seeing if it works before applying it to the whole set.
Edit, to avoid making yet another post… I’ve just downloaded a Windows build of darktable 2.4 (!) from back in 2017. Haven’t tried it yet, so I hope it will run on my newer Windows. When I have time I’ll try loading those images and .xmps to see if it sheds any more light on this.
I’ve had a look at those files with dt 2.4, and also tried the second image you posted in my current dt 4.2.
I think this thread will need very careful reading for anyone trying to follow it!
First point: in the first image, the one with a royal guard (or someone like that), there is still a discrepancy in the tone curve module when using dt 2.4 (from 2017), compared to the image I get from the jpg. I think you may have adjusted it when you did the monochrome conversion; that put me on the wrong track, I think. But as you’d expect, loading the old jpg as a sidecar instead of the xmp gives a perfect match in the “vintage” darktable.
Next point: moving on to the new image you posted of someone sleepy, when I load this in the new dt 4.2, it gives a perfect match right off. The base curve is correct, and a few newer defaults might come into play I think, but the image looks perfect, the same as you found when you applied the base curve preset.
When I was in dt 2.4 (old) I noticed that the base curve is set to ‘olympus like alternate’. This preset still exists, and is correct, but in the new dt you must set ‘preserve colors’ to ‘none’ after applying the preset.
So disregarding a lot of what I wrote earlier in the thread, relating to the first image, going by this second image, it seems the only real issue is that for some reason your dt install is not applying the base curve settings from the .xmp. I don’t know why that could be. But…
I think you can do that and things should be fine! …(I hope)
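Since the puzzle was whether the base curve settings are actually in the .xmp or just not being applied, it can help to inspect the sidecar’s history stack directly. A hedged sketch: it assumes each history entry is an `rdf:li` element carrying a `darktable:operation` attribute (which matches the sidecars I’ve seen, but check yours), and again just pattern-matches the text rather than parsing the XMP properly.

```python
# Sketch: list the history-stack operations recorded in a darktable
# sidecar, to confirm a "basecurve" entry really is in the .xmp when dt
# appears to ignore it. Assumes entries have a darktable:operation="..."
# attribute -- verify against one of your own sidecars.
import re
from pathlib import Path

def history_operations(xmp_path: Path) -> list[str]:
    """Return the darktable:operation names found in a sidecar."""
    text = xmp_path.read_text(encoding="utf-8", errors="replace")
    return re.findall(r'darktable:operation="([^"]+)"', text)

def has_basecurve(xmp_path: Path) -> bool:
    """True if the sidecar's history stack contains a basecurve entry."""
    return "basecurve" in history_operations(xmp_path)
```

If `has_basecurve` is true but the module shows up unset in the darkroom, the entry is there and the problem is on the reading side.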
Edit: I did a quick edit starting from a blank history using the latest tools in dt 4.2 (it won’t work on older versions).
I was able to get a pretty good match quite quickly after a little thought - I was pleased to be able to replicate your pleasing edit reasonably well using the newer tools. P5020039_03.ORF.xmp (9.9 KB)
It seems to work fine with those suggested settings. The “selective copy” and “selective paste” in lighttable come in handy.
Now I just have to figure out a way to restore my Nikon NEF files. So far, the base curve for them is “Kodak easyshare like” without preserved colors and not the “Nikon like” or “Nikon like alternative”. I find that a bit strange.
As input profile, “standard color matrix” seems best for them, in combination with the mentioned base curve.
Thanks for the update.
As you say, it’s great that you can apply changes in bulk. I’m still puzzled as to why this has happened, but I guess it doesn’t happen very often that one moves from a pre-3.0 version straight to a new version, and maybe there is a bug in there somewhere. At least the original settings are available.
The files were edited in DT a couple of years ago and I did not export most of them to JPEGs then. The reason was that I wanted to keep the option of re-editing or exporting them in the future with the best algorithms possible, which would be whatever the current release of DT offers at the time. If I can rely on my edits being readable in the future, I may not export most of the files to JPEGs, because that would require new hard drives.
It could be argued that by staying with the base curve and tone curve modules, and the old ‘sharpen’ module you’re not getting the latest and arguably best bits of dt. My workflow would involve sigmoid, color balance rgb, tone eq and diffuse and sharpen…
On the other hand, what matters is that your images are how you want them to look so it’s very much up to you. I think ‘best’ is very hard to define anyway.
I prefer to export all my best images to jpg, often in a slightly lower quality, so that if, say, I find myself without a dt installation, I can still access those best images. Not really sure how much point there is though!
Can you please refer me to a guide on how to transition or move old editing to new/better modules? I guess the regression testing (mentioned above) and the introduction of new modules that practically replace the old modules do not cover an automatic transition of previously edited files.
Why would you want to transition existing edits to the new modules, if you are happy with the “old” result?
I’m not even sure a transition to scene-referred with identical rendering can be done (at least not automatically).
And if the rendering is not identical, do you want it done automatically? A lot will depend on how involved your edits are, as well.
TBH so much has changed that it’s easier to start again with the new modules (perhaps using the guide in the documentation as a starting point). As @paperdigits says, make a duplicate and start editing. I would only add “discard history” as an additional step before you start editing. You can use the original image as a guide, but I found so much improvement when I moved to the scene-referred modules that it only served as a guide to how bad my edits were before.
haha, careful! i might use opinions like this as an excuse to not promise long term compatibility in vkdt. i always thought the “we don’t break legacy edits” promise of dt was really important. it’s this thing about the data a user produces is more important than the application and will still be valuable after the application cannot run any more etc. also otherwise the history stacks might break any time, destroying work that you had done only very recently too.
but i feel the same about old images: i usually re-edit with newer tools and a different eye and want it to look different.
The thing here, from my analysis, is that the images in DT from the old and new xmp looked identical, so something about the exporting must have introduced the difference. I’m not sure what version the older image was exported from, but to fully confirm you would have to go back, install that version and export, confirming equivalent settings at the same time… Or someone with really old edits should see if they see the same thing, i.e. open, export and compare.
The whole discussion makes me consider how many images I should really keep. Just managing them, or even viewing them all, takes days, if not weeks.
Duplicates or close to duplicates are no-brainers - just delete them.
Great shots - simply keep.
Anything in-between - now it gets trickier.
One way to at least simplify the storage and management is to only keep raw files of the best shots and export JPEGs of the rest.
What’s your approach?
I realize this could deserve being a new topic.
It is important! For me at least. Not long ago I was able to extract location data from Google (that I did not have access to before), and I re-exported 1800 images with the only change being the metadata.
Also, going back to old processed images and re-exporting at a different resolution is common for me.
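Re-exports like that can also be scripted around `darktable-cli`, which takes a raw plus its sidecar and renders it headlessly. A sketch of a helper that just builds the command line; the `--width`/`--height` flags cap the output dimensions, but options have shifted between versions, so check `darktable-cli --help` on your install before relying on this.

```python
# Sketch: build a darktable-cli invocation to re-export an edited raw
# (raw + .xmp sidecar) at a different resolution. Flag names are as
# documented for recent darktable versions; verify with
# `darktable-cli --help` on your own install.
from pathlib import Path

def reexport_cmd(raw: Path, out: Path, width: int, height: int) -> list[str]:
    """Command list to export `raw` using its sidecar, resized to fit."""
    xmp = raw.with_name(raw.name + ".xmp")  # darktable's sidecar naming
    return [
        "darktable-cli", str(raw), str(xmp), str(out),
        "--width", str(width), "--height", str(height),
    ]
```

The returned list can be handed to `subprocess.run`, or looped over a whole folder of rated keepers for a batch re-export.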
I make heavy use of the grouping/sorting by tags in the lighttable, so my collections are dynamic and change over time.
In turn, they produce albums or sub-albums for different purposes: external sites like Instagram and Flickr, sharing with friends, and TV display. I can certainly think of uses that may come in the future but aren’t present currently, because this has happened more than once for me.
I really don’t know about this, because I don’t have a particularly good system myself.
I think all the others have made better points than I can really.
To start with I think I would be content with the original processing. The photos you showed us look good. If there’s any images where you feel they don’t look as good as they should you could try reprocessing. But in terms of outright ‘quality’ you’re not going to see much difference anyway, by moving to the modern set of modules. Some of the improvements are more about flexibility in the workflow and that kind of thing, as opposed to ‘this makes much better images’.
Again, for what it’s worth, dt does take backwards compatibility seriously, so I think for the foreseeable future you should be safe relying on dt to preserve your edits. Glitches aside…
My approach, for what it’s worth, (not much!) is to keep all the images as raw files, good and bad, on two external hard drives (a 2TB drive doesn’t really cost that much and holds loads of images). One is my ‘working’ copy, and the other is a backup which only gets plugged in to copy files to it, then stored away again.
In each folder of photos, I have a folder called ‘processed’ where I export medium quality jpgs (around 1-2 MB each) of all the images with more than 2 stars. That’s it. Those exports go to my website/Flickr/email/print or nothing at all. If I want a higher quality version I will simply re-export it from dt.
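A workflow like that (a per-folder ‘processed’ subfolder of exported keepers) is easy to audit with a script that finds rated images that haven’t been exported yet. This sketch reuses the same assumption as before, that the sidecar carries an `xmp:Rating` attribute, and matches exports by file stem, so adjust it if your naming differs.

```python
# Sketch: in one photo folder, list keepers (rated above 2 stars in the
# .xmp sidecars) that have no jpg yet in the 'processed' subfolder --
# i.e. exports still to do. Assumes darktable's xmp:Rating attribute
# and that exports keep the raw's base name.
import re
from pathlib import Path

def missing_exports(folder: Path, min_stars: int = 3) -> list[Path]:
    """Raw files rated >= min_stars with no matching jpg in 'processed'."""
    processed = folder / "processed"
    exported = {p.stem.lower() for p in processed.glob("*.jpg")} if processed.is_dir() else set()
    todo = []
    for xmp in folder.glob("*.xmp"):
        m = re.search(r'xmp:Rating="(\d+)"', xmp.read_text(errors="replace"))
        if m and int(m.group(1)) >= min_stars:
            raw = xmp.with_suffix("")  # a.ORF.xmp -> a.ORF
            if raw.stem.lower() not in exported:
                todo.append(raw)
    return sorted(todo)
```

Run per folder, it tells you exactly which keepers still need an export pass.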