Building from the master branch and data versioning?

I would expect that each successive release build's runtime performs whatever upgrades are required to move to the next release version (e.g. 3.8 to 4.0) of the database, any setup or configuration data/files, and the XMP files. I initially installed 3.8 and later 4.0, and everything appears to work fine.

So I'm wondering how this works when building incremental changes from the master branch. Is it reasonably safe to assume the code and corresponding data are coherent and can be used with minimal risk?

Clearly there are things that can be backed up, like darktablerc and various related settings files, but changes to the XMP format seem more at risk. Is there anything else to do?

Thanks for any info.

In practice this might mostly be fine, but you should not assume so. Master is for development: things can change and move around, and your edits are not guaranteed to be preserved.

You need to check the commits merged into master; changes that affect the database can usually be identified easily.
The same goes for functional updates of existing modules, but there you only lose the module-specific edits.
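One way to do that check, assuming you have a darktable source checkout and that the schema-upgrade code still lives in `src/common/database.c` (its location in the current repo layout; verify before relying on it):

```shell
# From the root of a darktable git checkout: list recent commits that touched
# the database upgrade code, to spot schema changes before updating a master build.
git log --oneline --since="3 months ago" -- src/common/database.c
```

An empty result suggests no schema changes landed in that window, but grepping merged PR titles for "database" or "schema" is a useful second check.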

I think most people who really want to preserve the old information in the database make sure that a new test build runs with a different config directory, so it has its own config and its own database. That means you need to (re)import the images you want to work on, but for testing things out you get a completely separate instance.

You need to be careful if you write XMP files, though: those can be overwritten, since they live next to your image files rather than in a config directory. But most metadata and edit information goes into the database anyway.
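Since the sidecars sit alongside the images, a simple safeguard is to archive them before pointing a test build at your library. A sketch, with hypothetical paths you would adjust to your setup:

```shell
# Back up all XMP sidecars under the photo library into one dated archive,
# so edits can be restored if a master build rewrites them in a newer format.
PHOTO_DIR="$HOME/Pictures"                                # example library root
BACKUP="$HOME/xmp-backup-$(date +%Y%m%d).tar.gz"          # example archive name
find "$PHOTO_DIR" -name '*.xmp' -print0 | tar -czf "$BACKUP" --null -T -
```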

Thanks for your responses. For now I'll stick with trying to build 4.0 on my MacBook M1, to see what a native build brings in performance and to learn a bit more about dt.