PhotoFlow version 0.2.6 is out

[quote=“Carmelo_DrRaw, post:16, topic:1082”]
I suppose the crash happened when you double-clicked on a layer name in the layer list? Is this reproducible? If yes, could you shortly describe the steps to reproduce the crash so that I can try myself?
[/quote]Not really. I tried it two more times and it didn’t happen anymore. :confused:

Here is the result of playing around with Photo Flow.

I generally like the idea behind it. I think a graph would be a bit clearer to work with than the layers concept, but that’s probably just me. :slight_smile: For now I find it too slow to use in practice. With darktable even complex edits rarely take more than a few seconds to update, while with PhotoFlow it was in some cases minutes. :confused: But keep up the good work on the project, I’m really curious to see how it will evolve. :slight_smile:

This would indeed be the most natural way of representing the processing pipeline. A node representation might work as well… However, the layer concept has the advantage of being more compact.

I have some ideas for providing a node-based view of the filter chain, but that’s not for tomorrow, unless somebody knows of a ready-to-use UI library for that.

If you could share the RAW and PFI files that are very slow to process, I’d be glad to have a look and see what the real bottlenecks are. Are you using some of the G’MIC smoothing filters? Some of them are known to be complex and CPU-intensive.

Thanks for the feedback!

In GTK land, flowcanvas is the only library I know of. But you are definitely right, the graph view still has its drawbacks and can become quite fiddly. [quote=“Carmelo_DrRaw, post:22, topic:1082”]
If you could share the RAW and PFI files that are very slow to process, I’d be glad to have a look and see what the real bottlenecks are. Are you using some of the G’MIC smoothing filters? Some of them are known to be complex and CPU-intensive.
[/quote]Yes, but even then they should only be evaluated once, right? If I change a layer after that it should pull from the cache, right?

It depends… it is actually not possible to keep a memory cache of each layer, because otherwise the total amount of required memory could become arbitrarily large if many layers are used.

So the solution adopted is to let all the fast filters do their job on-the-fly, without any caching. This is the case for example of curves, gradients, inversion and many others.

Some of the slow tools are automatically cached to disk in the background, so that as soon as the caching is finished the pre-computed data is used.
If any layer below a cached one is modified, then the caching process is re-started automatically. During the caching phase, the slow tools have to be computed on-the-fly and therefore can temporarily slow down the preview.
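
As an illustration of this idea, here is a heavily simplified sketch (not the actual PhotoFlow code; all class and member names are hypothetical):

```cpp
// Hypothetical sketch of the caching policy described above; the real
// PhotoFlow classes and member names are different.
#include <cstddef>
#include <string>
#include <vector>

struct Layer {
  std::string name;
  bool is_slow = false;      // slow filters (e.g. G'MIC smoothing) get a disk cache
  bool cache_valid = false;  // becomes true once the background caching finishes
  std::string cache_file;    // path of the tile cache on disk
};

// When the layer at index 'edited' changes, every cached layer at or above it
// must be re-cached, because its input data is no longer the same.
void invalidate_caches_above(std::vector<Layer>& stack, std::size_t edited) {
  for (std::size_t i = edited; i < stack.size(); ++i) {
    if (stack[i].is_slow)
      stack[i].cache_valid = false;  // the background thread will rebuild it
  }
}

// While cache_valid is false, the slow filter is evaluated on-the-fly,
// which is what temporarily slows down the preview.
```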

It might be that some of the slow tools are mistakenly not cached, and that’s the main reason why I asked which exact tools you have been using.

The scheme is not perfect, and I’m trying to improve things further. For example, I’m considering introducing “permanent” cache files that are not deleted when the application is closed, so that the next time a given file is opened the existing cache can be re-used to speed up loading. But I’m still not sure if I want to go in this direction…

If you do implement permanent cache files, make sure to validate the source image the next time you open it, at least for non-raw images.

darktable makes the assumption that the file doesn’t change, and that makes it annoying for editing anything except raws.

That’s exactly why I hesitate to go in this direction… one could compare the timestamps of the image and cache files, or store the MD5 sum of the image file in the corresponding cache data… in any case, it is not trivial.
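
For illustration only, the two options could look roughly like this (the GLib checksum call and std::filesystem are real APIs; everything else is a hypothetical sketch, not PhotoFlow code):

```cpp
// Two illustrative ways a permanent cache file could be validated against
// its source image before being re-used.
#include <filesystem>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>
#include <glib.h>  // g_compute_checksum_for_data()

namespace fs = std::filesystem;

// Option 1: the cache is considered stale if the image was modified
// after the cache file was written. Cheap, but fooled by touch/copy.
bool cache_is_fresh(const fs::path& image, const fs::path& cache) {
  return fs::exists(cache) &&
         fs::last_write_time(cache) >= fs::last_write_time(image);
}

// Option 2: store the MD5 of the image inside the cache metadata and
// compare it on load. More robust, but requires reading the whole file.
std::string md5_of_file(const fs::path& image) {
  std::ifstream in(image, std::ios::binary);
  std::vector<char> buf((std::istreambuf_iterator<char>(in)),
                        std::istreambuf_iterator<char>());
  gchar* sum = g_compute_checksum_for_data(
      G_CHECKSUM_MD5, reinterpret_cast<const guchar*>(buf.data()), buf.size());
  std::string result(sum);
  g_free(sum);
  return result;
}
```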

I didn’t know that DT has a permanent caching mechanism…

[quote=“Carmelo_DrRaw, post:24, topic:1082”]
It depends… it is actually not possible to keep a memory cache of each layer, because otherwise the total amount of required memory could become arbitrarily large if many layers are used.
[/quote]In my case it was a 24 Mpixel file, and I have 32 GB of RAM, so that should last for quite a few layers. But yes, I was using the G’MIC noise reduction and yep, it was writing to the disk. But it looked like it didn’t read from the cache again (it just kept saying “caching” for a long time).

I tried it again now with the following setup: a 24 Mpixel 16-bit TIFF, a single layer of wavelet denoising with 10 iterations, 10 scales, threshold 1. In GIMP it completed in ‘a few seconds’. PhotoFlow is still running. :confused:

The TIFF is ~150 MB, but if you think it will help you reproduce the problem I can upload it for you.

Edit: I killed PF, restarted it, tried again, and it worked this time. Smells like race conditions/synchronization trouble. :confused:

Cheers,
Jonas

This should be enough to try to reproduce the problem. Was this with the official package or the one you compiled yourself? In the latter case, did you compile a debugging version?

Thanks!

[quote=“Carmelo_DrRaw, post:28, topic:1082”]
This should be enough to try to reproduce the problem. Was this with the official package or the one you compiled yourself? In the latter case, did you compile a debugging version?
[/quote]It’s a debug GTK3 build. The other weird thing is that it seems to re-evaluate the wavelets even if I just have curves on top.

Another question regarding the curves: what are you using to fit the curves to the points? It almost looks like some kind of polynomial regression. I find it quite hard to get the curves I want with that, but maybe I’m missing something.

Debug builds will run slower, because compiler optimisations are turned off… but then it is easier to catch the exact line of code where crashes occur.
I will check if there is still some bug in the cache updating logic. Normally it should only re-cache a layer when some input is changed.

I am using a spline curve interpolation that, as far as I remember, I derived from GIMP curves.
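
Roughly speaking, a smooth spline through control points can be evaluated like this (a Catmull-Rom sketch for illustration only; it is not necessarily the exact formulation PhotoFlow or GIMP use):

```cpp
// Illustrative Catmull-Rom interpolation between curve control points.
// This is just one common smooth-interpolation scheme, shown as an example.
#include <algorithm>
#include <cstddef>
#include <vector>

struct Point { double x, y; };

// Evaluate the curve at 'x', given at least two control points sorted by x.
double eval_curve(const std::vector<Point>& pts, double x) {
  if (pts.size() < 2) return pts.empty() ? 0.0 : pts[0].y;
  // Find the segment [p1, p2] that contains x.
  std::size_t i = 1;
  while (i < pts.size() - 1 && pts[i].x < x) ++i;
  const Point& p1 = pts[i - 1];
  const Point& p2 = pts[i];
  // Neighbouring points (clamped at the ends) define the tangents.
  const Point& p0 = (i >= 2) ? pts[i - 2] : p1;
  const Point& p3 = (i + 1 < pts.size()) ? pts[i + 1] : p2;
  double t = (x - p1.x) / (p2.x - p1.x);
  double t2 = t * t, t3 = t2 * t;
  // Standard Catmull-Rom basis applied to the y coordinates.
  double y = 0.5 * (2.0 * p1.y +
                    (-p0.y + p2.y) * t +
                    (2.0 * p0.y - 5.0 * p1.y + 4.0 * p2.y - p3.y) * t2 +
                    (-p0.y + 3.0 * p1.y - 3.0 * p2.y + p3.y) * t3);
  return std::clamp(y, 0.0, 1.0);  // keep the result in the usual [0,1] range
}
```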

[quote=“Carmelo_DrRaw, post:30, topic:1082”]
Debug builds will run slower, because compiler optimisations are turned off… but then it is easier to catch the exact line of code where crashes occur. I will check if there is still some bug in the cache updating logic. Normally it should only re-cache a layer when some input is changed.
[/quote]Aaaaah, completely forgot about that! :slight_smile:

What are you talking about? darktable assumes nothing about source images. We always reload the source and reprocess it when opening it in darkroom. There is no permanent cache whatsoever.

Always? The whole reason I showed up on the IRC channel a week ago was because it wasn’t doing that. I restarted the program twice and then it updated.

Are you sure that you are talking about darkroom? I am 100% sure that we don’t write its cache to disk.

Now it works maximized. However, it performs VERY poorly at 1:1 scale for an a7rII file. Could you at least cache the full image at original resolution, so that whatever gets computed as you pan around doesn’t need to be recomputed if you pan back?

Also, it seems like some stuff is cut off in the UI:

And shortly after taking this screenshot, in which I had dragged the gaussian blur layer above the layer clone, it crashed with:

```
Gdk:ERROR:/build/gtk+2.0-gl_hAC/gtk+2.0-2.24.28/gdk/gdkregion-generic.c:1110:miUnionNonO: assertion failed: (y1 < y2)
*** Error in `photoflow': double free or corruption (fasttop): 0x00000000037152e0 ***
Aborted (core dumped)
```

The output of the RAW development phase gets cached in the background. This is indicated by the orange square below the preview area, with a “caching” label next to it. When caching is completed the square turns green.

Also, the speed of the gaussian blur strongly depends on the zoom factor. At 1:2 zoom, the blur is applied with half the radius to the scaled-down image, and is therefore more than a factor of two faster… for radii larger than 20px, the gaussian blur should get cached as well.
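
To make the zoom dependence concrete, here is a minimal sketch (the vips_gaussblur() call is real libvips; scaling the blur strength by the zoom factor and the 20px threshold come from the description above, the rest is hypothetical):

```cpp
// Sketch: scaling a Gaussian blur with the preview zoom factor.
#include <vips/vips.h>

int blur_preview(VipsImage* preview, VipsImage** out,
                 double sigma_full_res,  // blur strength at 1:1
                 double zoom)            // e.g. 0.5 for a 1:2 preview
{
  // A 1:2 preview has half the resolution, so the same visual result is
  // obtained with half the blur strength -> far fewer pixels are touched
  // per output pixel, hence the more-than-2x speed-up mentioned above.
  double sigma = sigma_full_res * zoom;

  // For radii above ~20px the blur is slow even on the scaled-down image,
  // so its output would additionally be cached in the background rather
  // than recomputed on every redraw.
  return vips_gaussblur(preview, out, sigma, NULL);
}
```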

Concerning the crash, I will try to reproduce it on my system. Was the caching still ongoing when you dragged the gaussian blur layer?

Thanks for looking!

It caches the result immediately after raw development, but it doesn’t cache the result after all the processing. As soon as it moves off-screen it gets forgotten, and it also doesn’t seem able to ‘cancel’ in-progress computations that are no longer on-screen.

Of PhotoFlow, RawTherapee, and darktable, only PhotoFlow remembers what’s currently on the screen when you pan by small amounts. But I feel it could be even better if, when nothing is being adjusted, the background processing kept filling in around the visible area until the whole image has been computed.

The PhotoFlow GIMP plugin is working, but I also downloaded the standalone version and it only sees JPEG image files, not the NEF raw versions, on my Windows 10 computer.

You have to select “All files” in the dialog to get PhotoFlow to show you the RAW files.

There is indeed still an issue with the MIME database under Windows… I’m trying to find the right fix.

@paynekj: thanks for the good temporary solution!