Hi All,
When using wavelet decomposition in GIMP 2.10, I noticed that the frequency layer is blended in “legacy” mode rather than in “default” mode like the other layers.
Can someone explain why? If I switch it to default mode, there is a slight difference from the original image.
Hi paperdigits,
the script could be outdated, but if I follow the procedure manually (duplicate the layer; blur it; set it to Grain Extract mode; create a new layer from visible and set it to Grain Merge mode; switch the blurred layer back to Normal mode) and set the high-frequency layer to legacy, the resulting pixels of blurred + high-frequency layer are identical to the original layer; if I set it to default, there is a small difference.
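To show why the manual round trip reproduces the original exactly, here is a small numpy sketch of the steps above (the data and names are mine, not GIMP's; Grain Extract/Merge are modelled with their usual formulas, base − layer + 0.5 and base + layer − 0.5, with clipping omitted for clarity):

```python
import numpy as np

# A hypothetical 1-D "image" with values in [0, 1]
base = np.array([0.10, 0.30, 0.80, 0.60, 0.40, 0.20])

# Steps 1-2: duplicate the layer and blur it (a 3-tap box blur
# stands in for the Gaussian blur used in wavelet decompose)
blurred = np.convolve(base, np.ones(3) / 3, mode='same')

# Step 3: Grain Extract -> residual = base - blurred + 0.5
residual = base - blurred + 0.5

# Steps 4-5: Grain Merge the residual back onto the blurred copy
reconstructed = blurred + residual - 0.5

# As long as extract and merge use the same formula in the same
# blending space, the round trip is exact by simple algebra:
print(np.allclose(reconstructed, base))  # True
```

The reconstruction is exact no matter what blur is used, because the blurred term cancels out; the only way the sum can differ from the original is if extract and merge are computed in different spaces (which is what legacy vs. default changes).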
There is also a native “Wavelet Decompose” function available using GEGL. It’s probably a better option to use. (I’m not near my computer right now to actually find it in the menus, but take a look!)
Hi afre,
what I mean is that when I use the wavelet-decompose filter, it creates the Grain Merge layer in legacy mode.
That way, if I place a sample point, it shows the same value on the base layer and on the sum of the split layers.
If I change the Grain Merge layer to default, the sample point value is different.
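As I understand the 2.10 behaviour (this is my reading, so treat it as an assumption), “legacy” modes blend on the stored, gamma-encoded values while “default” modes blend on linearized values. A numpy sketch of that model shows why the sample point moves: a residual extracted in one space does not merge back exactly in the other.

```python
import numpy as np

def srgb_to_linear(v):
    # Standard sRGB electro-optical transfer function
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(v):
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

def grain_extract(base, layer):
    return np.clip(base - layer + 0.5, 0.0, 1.0)

def grain_merge(base, layer):
    return np.clip(base + layer - 0.5, 0.0, 1.0)

base = np.array([0.2, 0.5, 0.8])      # original pixel values (gamma-encoded)
blurred = np.array([0.25, 0.45, 0.75])  # a hypothetical blurred copy

# Residual created by the script with Grain Extract on encoded values
residual = grain_extract(base, blurred)

# Legacy merge: same formula on the same encoded values -> exact
legacy_result = grain_merge(blurred, residual)

# "Default"-like merge: the same formula applied on linearized values
# (my model of 2.10's default modes, not GIMP source code)
default_like_result = linear_to_srgb(
    grain_merge(srgb_to_linear(blurred), srgb_to_linear(residual))
)

print(np.allclose(legacy_result, base))        # True
print(np.allclose(default_like_result, base))  # False
```

So the small difference you see with default mode is consistent with the two layers being combined in a different blending space than the one the residual was created in.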
Hi, trying this on several images, I found that the new Grain Merge layer is created in legacy or default mode depending on something I haven’t been able to identify.
with foglia.tif (37.0 MB) I have legacy mode
with _DSC9196.tif (57.4 MB) I have default mode
If I change the mode from the one proposed, the sample point value changes.
So my question is: what determines whether the Grain Merge layer is created as legacy or as default?