Compiling Darktable from source breaks sudo apt install

I’ve enjoyed compiling Darktable from source for a few years now. Somewhere on GitHub there is a script that helps update the shared libraries needed to compile the latest sources successfully.

But as much fun as that process is, it almost inevitably ends up breaking the Debian apt install system: subsequent sudo apt install attempts for other software lead to the dreaded “error: try apt --fix-broken” …(which almost never works).

At that point I usually reinstall the OS. I’ve been using Mint recently. I have /home mounted on a separate disc, so reinstalling the OS is almost second nature at this point. I can do it in an hour or two.

But still. This is awkward. I guess that’s what flatpak is for.

However, now my question: is there a way to upgrade the shared libraries needed for a successful latest-and-greatest Darktable compile that also avoids breaking the sudo apt install mechanism?

Or should I throw in the compile towel and work only with flatpak?

I don’t understand. I’ve built darktable from master for years and never had an issue like what you describe. Do you build a deb package and install that? I just build and put the compiled binaries in ~/darktable-master, without building and installing a deb package.
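
Roughly like this, though the exact CMake options and paths are just an example of the approach, run from the top of an existing darktable source checkout:

```
# configure with an install prefix in $HOME so nothing distro-managed is touched
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release \
      -DCMAKE_INSTALL_PREFIX="$HOME/darktable-master"
cmake --build build -j"$(nproc)"
cmake --install build    # no sudo needed, everything lands under $HOME
```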


@pittendrigh, I’m curious what the problematic shared libraries are. I haven’t had any problems, and I take no special action regarding shared libraries (I wouldn’t know how to, anyway). I’m using Ubuntu 20.04.

The only extra step I’m aware of needing would be adding a repository to get CR3 raws working, but I haven’t done that, funnily enough, because I wanted to keep my system clean/safe(r). And the new CR3 support is in Ubuntu 22.04, I believe, so I’ll have it eventually.

I’m a Debian user, have been for years, and also build darktable myself. Because I do/did a lot of testing, I build, install and have access to 4 darktable versions at any given time, all self-built and self-installed. I never ran into the problem you are describing.

I don’t need to update any libraries myself before building darktable successfully; Debian keeps everything up to date for me. The only problem I’ve run into in the last few weeks is GCC, but that’s because I’m lazy and haven’t updated to Debian 11 yet; GCC 8.3 is basically too old. Building with clang instead fixes that issue, though. So even a rather old and almost deprecated (~90 days to go) Debian version satisfies the requirements to build and install darktable.

I’m rather curious what your answer will be to @kofa’s question.

One thing that makes me frown: the “script that helps update shared libraries” you mentioned.

Why would you need to run that? Because this might be at the base of your issue. Shared libraries are in Debian’s hands, unless they belong to a very specific set of self-compiled programs and are installed outside of the normal paths. Everything in /usr/lib* and /var/lib needs to be kept up to date by Debian. Don’t go messing with those libraries unless you really know what you are doing.
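
If you are ever unsure whether a given library file belongs to a Debian package, you can ask dpkg; the path and package name below are only illustrations:

```
# does a Debian package own this file?
dpkg -S /usr/lib/x86_64-linux-gnu/libexiv2.so.27
# which version is installed, and from which repository did it come?
apt-cache policy libexiv2-27
```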

Also: where are the files installed after your build? I do hope you use /opt or /usr/local (if you want darktable accessible system-wide), or somewhere in $HOME (which also negates the need to use sudo). Don’t use /var or /usr.

All libraries provided by the system are in /lib and /usr/lib.

As long as you compile stuff from source and place it in /usr/local/lib, you shouldn’t be able to break any of the distro packages. You might need to take care during compilation that the versions in /usr/local are preferred over the ones in /usr, but that can often be done with a simple pkg-config setting.
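
A sketch of what that can look like when configuring such a build (the paths are the usual defaults, but may differ on your system):

```
# prefer headers/libs installed under /usr/local over the distro copies in /usr
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:/usr/local/share/pkgconfig:$PKG_CONFIG_PATH
export CMAKE_PREFIX_PATH=/usr/local:$CMAKE_PREFIX_PATH
# at run time, only if the dynamic linker does not find the new libs on its own:
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
```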

Overwriting files from a distro package also should not result in the errors you describe. Apt may start complaining about not wanting to overwrite files that are already there, but that can be forced.
The errors you describe come from dependency issues.
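
To see what exactly is broken, something like the following is purely diagnostic and changes nothing on the system:

```
apt-get check          # reports broken dependencies in the package database
dpkg --audit           # lists packages left in a half-installed/broken state
apt-get -s -f install  # simulate what --fix-broken would actually do
```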

It sounds like you installed packages not meant for, or not compatible with, your distro (or your distro version).
Especially if you ‘use a script from GitHub’, it may very well just be installing stuff from a newer Ubuntu version into an older one, or something like that.

Or from a rolling Debian installation, where your installed versions are too far behind the rolling release.

There is never a clear-cut, simple answer in the Linux world, but I’d say stick with packages meant for your distro and distro version. Do not cross-install Debian stuff into Ubuntu, or take a .deb file from a newer Ubuntu version into an older one.
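
If you suspect packages from the wrong distro or version have crept in, a check along these lines can help spot them (aptitude is not installed by default everywhere):

```
# list the configured package sources and their priorities
apt-cache policy
# list installed packages that no configured repository provides,
# often leftovers from foreign repos or hand-installed .debs
aptitude search '~o'
```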

If you want a newer version of a package or library, compile it from source and put it into /usr/local (which is the default for from-source installs 99% of the time) or somewhere in /opt, places meant for stuff outside distro control. Then you’ll never ruin distro-managed files and, more importantly, never end up in dependency hell.
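
For a typical autotools-based project that route looks roughly like this, with /usr/local being the default prefix:

```
./configure              # defaults to --prefix=/usr/local
make -j"$(nproc)"
sudo make install
sudo ldconfig            # refresh the shared-library cache so the new lib is found
```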

Sometimes I’d take a deb source package from a newer Ubuntu version and then use the dpkg tools to build a version that is compiled on my system.
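
As a rough sketch of that route (the package name is only an example, and you need deb-src lines enabled in your sources list):

```
apt-get source libavif            # fetch and unpack the source package
sudo apt-get build-dep libavif    # install its build dependencies
cd libavif-*/
dpkg-buildpackage -us -uc -b      # build unsigned binary .debs against your system
sudo dpkg -i ../libavif*.deb
```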

But that can still overwrite files from the distro version, and it often requires some hacking in the build scripts to disable checks or disable features.

But I’d suggest compiling the libraries you want updated from source. The real question is which libraries you actually need at the latest version. For the whole GTK/GLib stack I don’t see the point; stick to distro-supplied stuff.
The same goes for libraries with few noticeable new features (libpng/zlib/libjpeg). I can understand it for things like AVIF and other newer, still actively developed dependencies.
But there should be no need to recompile everything. Often, just darktable itself with its git submodules is enough.
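
Keeping the darktable source and its submodules in sync is usually all that is needed, roughly:

```
git clone --recurse-submodules https://github.com/darktable-org/darktable.git
cd darktable
# after later updates:
git pull --rebase
git submodule update --init --recursive
```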

I should not have posted this question because I’m buried right now and cannot do what it takes to diagnose the problem (work with suggestions above). Perhaps later this month.

The “sudo apt-get install” version right now is 3.8.1, which is more than new enough for me.
It has been further out of date in the past.

I think maybe the OP is referring to these instructions from the wiki to get the dependencies…

So maybe they did something like this??