darktable 3.2 will be released in August

I’m still new to the darktable community, but I’ve been truly impressed with what’s been accomplished just since I started using it. Many thanks to everyone who’s been working on it!

I’m not very good with Darktable yet, but I’m happier every day with my decision to commit to it, and the community is a significant piece of that.

Going to have to compile master on-the-side and check it out!

5 Likes

I suggest you check out master and then compile it, the other way round usually doesn’t work that well /s

2 Likes

I got my own style ; )

Brilliant. My top priority would be a speed increase in Darkroom, especially with profiled denoise on.

Don’t expect too much there - denoising will remain a performance-critical task even with the performance improvements currently planned for 3.4, especially when zoomed in.
But there’s a workaround: simply activate denoising at the end of your editing process, so darktable stays quite responsive while you edit …

5 Likes

To add to this: if you do not need to preview the denoised image, check out the style option of the “export selected” lighttable module. It can be used to apply a denoising style temporarily, at export time only.
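If you export from the command line rather than the lighttable, the same idea works via darktable-cli’s `--style` option. A minimal sketch, assuming you have already created a style named “denoise profiled” yourself (that name is just an example, it is not built in):

```shell
# Export a raw with a user-created denoising style applied at export time.
# "denoise profiled" is an example style name you would create yourself.
darktable-cli input.raw output.jpg \
    --style "denoise profiled" \
    --style-overwrite
# --style-overwrite lets the style replace conflicting history entries
# instead of stacking on top of them.
```

This keeps the darkroom responsive (the module stays off while editing) while the exported file still gets denoised.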

2 Likes

Denoising already uses OpenCL and is fairly optimized. It’s just that the computation uses a lot of neighbouring pixels, so there are lots of per-pixel computations. If the computation is heavy, the program is slow; there is only so much we can optimize. Optimization means “making it as fast as possible”, not “making heavy algos lightweight” - that would be “changing the algo”.
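To make the cost concrete, here is a toy sketch (plain Python, a simple mean filter - NOT darktable’s profiled-denoise algorithm) showing why neighbourhood-based denoising is expensive: each output pixel has to read every pixel in its search window, so the work grows with the window area, not the image area alone.

```python
# Toy illustration of neighbourhood-based denoising cost.
# This is a plain mean filter over a square window, not darktable's
# algorithm; it just counts how many neighbour reads each pixel needs.

def mean_filter(img, radius):
    """Smooth `img` (list of lists of floats) by averaging each pixel
    with its neighbours inside a (2*radius+1)^2 window.
    Returns the filtered image and the total number of pixel reads."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    reads = 0  # count neighbour lookups to expose the cost
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
                        reads += 1
            out[y][x] = acc / n
    return out, reads

flat = [[1.0] * 16 for _ in range(16)]
_, reads_r1 = mean_filter(flat, 1)  # 3x3 window
_, reads_r3 = mean_filter(flat, 3)  # 7x7 window
print(reads_r1, reads_r3)  # the 7x7 window does several times more work
```

Real denoisers (non-local means, wavelets) do far more per neighbour than a single add, but the scaling story is the same: a bigger search window multiplies the per-pixel work, which is exactly what you feel when zoomed in at 100%.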

4 Likes

I can confirm, that’s what has always been done, that’s what I will do for 3.2.

3 Likes

Also if you want to follow what is yet to be done before 3.2:

One thing everyone (building master) can do is check that the modules are working as expected, and check the interface for anything wrong or missing translated strings. QA is delicate, as there are many modules and ways to interact with darktable, and no automated framework.

6 Likes

Perfect, thanks @Pascal_Obry!

Does this mean that forward compatibility of the database and XMP files is (more or less) guaranteed in master during this phase, up to the 3.2 release?

The reason I ask: I’d like to use 3.2 as my main program. I can work around bugs or wait for them to be fixed, but what I don’t want is files edited now that turn out to be incompatible with future stable versions.

2 Likes

AVIF YAY!

Will be compiling soon (based on when I complete my current project, with my current stable release, for obvious reasons), and will check for AVIF support in my current exif/gThumb apps. (I had to compile the latest exif, gThumb, and darktable from source last time to get metadata to work with WebP.)

Yes, of course, like for all other releases. We are really talking about a release, not a random master code snapshot :slight_smile:

2 Likes

Hm, thinking about it: what about having a couple of user-selectable example regions (small crops from the image) shown 1:1 in the module that preview the denoising without requiring the computation for too many pixels? I only ever check one or two regions (e.g. one eye and the background) to adjust denoising, but adjusting on the full screen can become slow.

1 Like

That might be a solution when tweaking denoise settings. But as denoising sits very early in the pipe, later modules might need several areas in a zoomed view for tweaking retouch, masks, etc., so a recalculation might be necessary for different areas.
So I prefer getting rid of performance-critical calculations that aren’t really needed while editing. And if a module needs several recalculations, like retouch, I do it very early in my workflow and disable the module until I have finished my edit…

1 Like

Great. I’ll install it during the weekend :+1:

Hm, I was thinking only about the denoise module, and the preview would only be visible when the module is expanded. The computation would therefore only be done for the one or two little 100% previews while the settings are tweaked, but it would be a preview of the complete pipe. This would even allow adjusting the settings at the end of editing, which may be beneficial for the end result anyway, since heavy shadow lifting may require more denoising, etc. The general preview would not be affected: even with the module on, denoising would only be computed on export.

1 Like

Btw, I think I found 2 or 3 bugs, but I was kind of too lazy to report them. One of them was filmic sometimes creating CA-like fringes along the outlines of mountains. The other two I don’t remember exactly any more. They weren’t very serious anyway.

Well, there is one issue: exiv2 doesn’t support ISOBMFF containers yet. So you won’t be able to import Exif data from AVIF files …

I don’t have time to rewrite the darktable code so that exiv2 doesn’t read the file itself; instead, darktable would read it once and hand the Exif blob to exiv2 for parsing.

2 Likes

Despite the thread already being quite long, I just wanted to share some thoughts.

Until now I was slowly getting annoyed with the sluggishness of the application (especially the lighttable), which had been causing some frustration while using darktable. Having an older system, despite a few upgrades, and a larger collection (around 100k photos) certainly didn’t help. Only now was I able to compile and test-drive the latest changes, and what a breath of fresh air it was. Looking forward to the final release. Kudos to all who submitted changes, especially the lighttable redesign!

Now please carry on with the ongoing topic :slight_smile:

5 Likes

After various redesigns of the lighttable by Aurélien and myself, the latest work is from AlicVB, and yes, you can congratulate him. The full rework was a huge (yes, really huge!) amount of work, and AlicVB carried it through to the final touch like a master.

17 Likes