G'MIC and pc processing power

Hi,
I wonder what the minimum processing power is for a Windows-based PC to run G’MIC (as a plug-in for GIMP). Some of the G’MIC filters can take up to a minute on a ‘normal’ jpg (size 4922 x 3282). These include filters such as Tone Enhance, Sharpen [texture] and Local Normalisation.
My PC is running Windows 10 Pro 64-bit, with an AMD Phenom II X4 955 3.2 GHz CPU, 8 GB of RAM, and an AMD Radeon HD 5670 video card with 1 GB of memory.
I’m not certain whether this slowness was there before I upgraded to Windows 10, as I’ve only started using G’MIC seriously very recently (before that it was just the odd film emulation).

Not being that techie, is there perhaps a Windows setting that I should be checking?

So far as I’m aware (and that’s not very far :wink:), no, there are no settings to check as far as speed goes. From what I’ve read and understood, G’MIC trades execution speed for ease of filter development.

There is no “minimal processing power” requirement.
G’MIC has hundreds of image processing filters; some of them are reasonably fast, others are slow, and others are really slow :slight_smile: It all depends on the image resolution and the complexity of the algorithm behind the filter. Is the algorithm parallelized? How many cores does it use? And so on.
Note that from an image processing point of view, a 4922x3282 image is definitely not a ‘normal’ jpg; it can be considered quite large. That’s roughly 16 megapixels, more than 60 times the pixel count of the 512x512 images that recent papers in this field still use to illustrate their algorithms…
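Just to illustrate how much the pixel count alone matters, here is a rough timing sketch in Python. It uses OpenCV’s bilateral filter only as a stand-in for a G’MIC filter, and the sizes and filter parameters are made up for the example:

```python
# Informal sketch: time one bilateral filtering pass at a "paper" size
# and at a "photo" size, to show how pixel count drives the runtime up.
import time
import numpy as np
import cv2  # pip install opencv-python

def time_bilateral(height, width):
    """Time a single bilateral filtering pass on a random 3-channel image."""
    img = np.random.randint(0, 256, (height, width, 3), dtype=np.uint8)
    start = time.perf_counter()
    cv2.bilateralFilter(img, d=9, sigmaColor=75, sigmaSpace=75)
    return time.perf_counter() - start

print(f"512x512   : {time_bilateral(512, 512):.2f} s")    # ~0.26 megapixels
print(f"4922x3282 : {time_bilateral(3282, 4922):.2f} s")  # ~16 megapixels
```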

I can speak at least for the Sharpen [texture] filter: it is mainly based on the so-called Rolling Guidance Filter, which requires several iterations (about a dozen) of guided bilateral filtering on the input image. So even though the bilateral filter in G’MIC uses multiple cores when available, the overall complexity of the algorithm is clearly high.
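For the curious, here is a minimal Python sketch of the rolling guidance idea. It is not G’MIC’s actual implementation; the sigma values, iteration count, amplification factor and file names are placeholder assumptions, and it relies on the joint bilateral filter from opencv-contrib:

```python
# Minimal sketch of the Rolling Guidance Filter idea, NOT G'MIC's code:
# small structures are removed by a Gaussian blur, then strong edges are
# iteratively recovered by joint-bilateral-filtering the input image
# against the current estimate used as the guide.
import cv2  # pip install opencv-contrib-python (needed for cv2.ximgproc)

def rolling_guidance(img, sigma_s=5.0, sigma_r=25.0, iterations=12):
    # Step 1: the initial guide is a Gaussian-blurred copy (kills texture).
    guide = cv2.GaussianBlur(img, (0, 0), sigma_s)
    # Step 2: repeatedly filter the ORIGINAL image, guided by the previous
    # result, so large edges sharpen back up while texture stays smoothed.
    for _ in range(iterations):
        guide = cv2.ximgproc.jointBilateralFilter(
            guide, img, d=-1, sigmaColor=sigma_r, sigmaSpace=sigma_s)
    return guide

img = cv2.imread("input.jpg")  # hypothetical input file
base = rolling_guidance(img)
# A 'Sharpen [texture]'-style result: re-add the removed texture, amplified.
sharpened = cv2.addWeighted(img, 2.0, base, -1.0, 0)
cv2.imwrite("output.jpg", sharpened)
```

With a dozen joint bilateral passes over a 16-megapixel image, it is easy to see where the runtime goes, even with a parallelized bilateral filter.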

It’s actually interesting to see the difference in perception between photographers and people creating advanced image processing algorithms about what a ‘small’ image is :slight_smile: A 4000x3000 image is really small for a photographer, while it is huge for the others. The complexity of most advanced IP algorithms is actually too high for such image sizes (at least for real-time use, or computation within seconds). That’s probably why most image retouching programs do not have these kinds of algorithms implemented (except when they can be efficiently ported to the GPU, which is not always possible, by the way).

EDIT: That’s also why I still think that having only an on-canvas preview for image filters in an image retouching program is not a good idea. You cannot expect every interesting IP filter to run fast enough on all kinds of images.


So, a scientist in the field of image processing has to be patient. I can imagine it took days to run an algorithm on a 128x128 image 40-50 years ago, with a supercomputer. :smile:

I know we now use (a lot of) algorithms from that period in real time.

I’ve just tested the ‘Sharpen [texture]’ filter on my machine, with a 4922x3282 color image, and it actually took less than 4 seconds to compute the result.
I admit I have quite a powerful machine (24 cores, 32 GB of RAM, running Ubuntu Linux), but it is still surprising that there is such a difference.

I dream of such computing power!

I’ve also tested on my virtualized Windows, with 4 cores, and it also runs in less than 10 seconds.
So more than 30 seconds for the ‘Sharpen [texture]’ filter seems excessive in any case.
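If you want to check informally how much a filter of this kind benefits from extra cores on your own machine, here is a rough Python sketch. Again, OpenCV’s bilateral filter is only a stand-in for G’MIC’s, and the image size and parameters are made up:

```python
# Informal core-scaling check: time the same bilateral filtering pass
# while capping the number of threads OpenCV is allowed to use.
import time
import numpy as np
import cv2  # pip install opencv-python

img = np.random.randint(0, 256, (3282, 4922, 3), dtype=np.uint8)  # ~16 MP

for threads in (1, 2, 4, 8):
    cv2.setNumThreads(threads)
    start = time.perf_counter()
    cv2.bilateralFilter(img, d=9, sigmaColor=75, sigmaSpace=75)
    print(f"{threads} thread(s): {time.perf_counter() - start:.2f} s")
```

And since G’MIC is parallelized with OpenMP, setting the OMP_NUM_THREADS environment variable before launching GIMP should let you check whether all four cores of the Phenom II are actually being used by the plug-in.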