There is no “minimal processing power” requirement.
G’MIC has hundreds of image processing filters: some of them are reasonably fast, others are slow, others are really slow. It all depends on the image resolution and the complexity of the algorithm behind the filter to apply. Is the algorithm parallelized? How many cores does it use? etc…
Note that from an image processing point of view, a 4922x3282 image is definitely not a ‘normal’ jpg; it can be considered quite large (look at the recent papers in this field: they still illustrate the results of their algorithms with 512x512 images…).
I can speak at least for the Sharpen [texture] filter: this filter is mainly based on the so-called Rolling Guidance filter, which requires several iterations (about a dozen) of guided bilateral filtering on the input image. So even if the bilateral filter in G’MIC uses multiple cores when available, the overall complexity of the algorithm is clearly high.
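For the curious, here is a minimal Python sketch of the Rolling Guidance idea, using OpenCV’s ximgproc module (from opencv-contrib); the sigma values and iteration count are illustrative guesses, not the actual parameters used by the G’MIC filter:

```python
import cv2

def rolling_guidance(img, sigma_s=5.0, sigma_r=25.0, iterations=12):
    # Iteration 0: a plain Gaussian blur removes the small-scale texture
    # (together with the edges, for now).
    guide = cv2.GaussianBlur(img, (0, 0), sigma_s)
    for _ in range(iterations):
        # Each pass filters the *original* image, guided by the previous
        # result: large edges are progressively recovered while the
        # small-scale texture stays smoothed away.
        guide = cv2.ximgproc.jointBilateralFilter(
            guide, img, d=-1, sigmaColor=sigma_r, sigmaSpace=sigma_s)
    return guide
```

A texture-sharpening filter would then typically boost the residual `img - guide` and add it back. Every one of those passes runs a full bilateral kernel over every pixel, which is why the cost grows so quickly with image size, multi-core or not.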
It’s actually interesting to see the difference of perception about what a ‘small’ image is for photographers and for people creating advanced image processing algorithms. A 4000x3000 image is really small for a photographer, while it is huge for the others. The complexity of most advanced IP algorithms is actually too high for such image sizes (at least for real-time use, or for computation within seconds). That’s probably why most image retouching software does not have these kinds of algorithms implemented (except when they can be efficiently GPU-ized, which is not always possible, by the way).
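A quick back-of-envelope calculation shows the scale of the problem; the window size and iteration count below are assumed round numbers, not measurements:

```python
# Brute-force bilateral filtering cost on a "small" photographer's image.
pixels = 4000 * 3000          # 12 MP, small by photography standards
window = 15 * 15              # neighborhood examined per pixel (assumed)
iterations = 12               # roughly what a rolling-guidance scheme needs
ops = pixels * window * iterations
print(f"{ops:.1e} weight evaluations")  # ~3.2e+10
```

Tens of billions of per-pixel weight evaluations is simply not something you can hide behind an interactive slider, even spread across eight cores.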
EDIT: That’s also why I still think that having only an on-canvas preview for image filters in image retouching software is not a good idea. You cannot expect every interesting IP filter to run fast enough for all kinds of images.