blur produces stripes

Hello,

I ran:
1000,1000,1,1,1
blur[0] 750,0,0

and got a lot of stripes.

Daniel

blur is known to produce stripes when the standard deviation is similar in size to the image's largest dimension, but which version are you using (not just the version number, but also whether it's a plugin for another program or the standalone CLI)? Can we see the output images you're getting?

I use 2.9.0 (Windows). I call it from Cygwin.

1000,1000,1,1,1
blur[0] 750,0,0
-o[0] f.tif

f.tif (3.8 MB)

The latest stable version is 2.9.1, though. Anyway, I've found the lines you were talking about in the image you posted, and using local variance normalisation I've made them much more obvious. Ignore the halo effects towards the top and bottom; they come from the large-scale blurring. The dark borders should also be ignored, as they're an artefact of the normalisation.
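
If you want to reproduce that inspection, something along these lines should work; this is only a sketch, assuming the stock normalize_local command from the standard library with its default parameters, and stripes.png is just a placeholder output name:

gmic f.tif normalize_local normalize 0,255 output stripes.png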

Change the third parameter to 1. That would give you a true Gaussian.

'kernel' can be { 0=quasi-gaussian (faster) | 1=gaussian }.
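
Concretely, your original pipeline with the third parameter set to 1 (true Gaussian) would read:

1000,1000,1,1,1
blur[0] 750,0,1
-o[0] f.tif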

I have another problem:

4600,3300,1,1,1
blur[-1] 2850,0,1
cut[-1] 0.15,0.2

The result is somehow off-centered.

DanielDD

That was unexpected. I would instead do:

gmic 690,500 gaussian 100% cut 5%,60% normalize 0,255 output eg.png

If you want an oval, do:

gmic 690,500 gaussian 150%,100% cut 5%,60% normalize 0,255 output egwh.png

Play around with the parameters.
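
For instance, tightening the cut range changes the contrast of the blob (eg2.png is a hypothetical output name):

gmic 690,500 gaussian 100% cut 10%,40% normalize 0,255 output eg2.png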

This is due to the numerical precision (single-precision float) used in the blur code.
I've switched to double precision by default, with this commit:

This slows down the computation a little (about 5%) but produces a correct result even for large sigmas.
It will be available in the next version, G'MIC 2.9.2.
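
Once 2.9.2 is available, rerunning the original pipeline should give a properly centred result. A quick check, where check.png is just a placeholder output name:

gmic 4600,3300,1,1,1 blur[-1] 2850,0,1 cut[-1] 0.15,0.2 normalize 0,255 output check.png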

Thanks.