stacking with RICE compression gets different results on the same data each time it's run

Hi,

Just a little interesting observation.

A few of us were testing Nazstronomy’s new smartscope stacking script and we noticed something a bit odd:

If we ran the exact same data through it with RICE compression, it produced slightly different results on each run (same data, same settings), as if the lossy step has a randomized element. (Yes, we cleared the process data each time.)

We also noticed that the resulting image size may change by +/- 1 pixel, and the image can be shifted a pixel and registered slightly differently on each run. We think it's because the compressed calibrated files are used for registration, weighting, and so forth: if RICE outputs slightly different values each time for the same images, there's a small chance it will trigger a cascade of small downstream differences.

So the resulting images are of the same quality, but when comparing them digitally, as one might during regression testing, you get numerically different results and even a different image size by +/- 1 pixel.

Just an interesting observation if you ever do any automated regression testing. It also made me wonder: does RICE have a randomization element? Perhaps it seeds a random number generator, maybe for carrying any truncated low-order bits or something?

The other thing I've wondered, from my programming days years ago: sometimes this is a clue to an uninitialized variable somewhere in an algorithm.

How to test? Just turn on RICE compression with the quantization level set to 16, run the OSC_Preprocessing_WithoutDBF script on some data multiple times, and compare the results with PixelMath… you'll see low-order residual differences. Then if you do it with plate solving and registration/stacking with -framing=max, you may even see the image shifted a bit, or the output image differ in size by 1 pixel.
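For scripted regression checks, the PixelMath comparison above can be sketched in Python instead. This is a hypothetical helper, not part of any Siril script; it just reports the two symptoms described: a size mismatch, or low-order residual differences between two runs (load the two results with, e.g., astropy's fits.getdata and pass them in):

```python
import numpy as np

def compare_stacks(a, b):
    """Compare two stacking results pixel by pixel.

    Returns either ("size mismatch", shape_a, shape_b) for the +/- 1
    pixel case, or ("residuals", max_abs_diff, n_differing_pixels).
    """
    if a.shape != b.shape:
        return ("size mismatch", a.shape, b.shape)
    diff = np.abs(a.astype(np.float64) - b.astype(np.float64))
    return ("residuals", float(diff.max()), int(np.count_nonzero(diff)))

# Usage, assuming two result files from repeated runs (names are
# placeholders):
#   from astropy.io import fits
#   print(compare_stacks(fits.getdata("run1.fits"),
#                        fits.getdata("run2.fits")))
```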

This may or may not be interesting to you, but I thought I should tell you.

Cheers!

  • Steven

Hi Steven,

There is indeed some randomization when using RICE on 32-bit images, because the RICE algorithm was written for integer images. So the image is first quantized using dithering (we use the subtractive_dither_2 method in Siril so that 0 values are kept as exactly 0). More on this in the cfitsio docs, section 5.6, Image Compression.
So I guess what you have observed is to be expected, both the slightly different result and the different final size. From what I've seen, the images produced by the Seestar are quite noisy and only a handful of stars are detected in each frame. So registration may be affected when trying to match a short list of stars between each image and the reference, or when plate solving. Small position errors cannot even out because of the limited number of stars, and may be larger in magnitude if the image is very noisy.
Hope this helps, thanks for reaching out,

Cheers

C.


Ah, mystery solved!

I was an imaging scientist at HP, developing algorithms for the printing pipeline. We would take documents, render them into RGB (often using lossless and lossy compression in portions of the pipeline), then do the colorspace conversion and color matching into CMYK (or CcMmYk on many six-color photo printers). During this process we had algorithms that dealt with limited bit-depth precision (e.g. 8 bits per pixel), but we didn't want to lose any levels during the numerical transformations; losses often occurred during compression, or when using a low-order bit (of the yellow channel, in this case) as a flag bit.

We found that if you had to truncate a low-order bit, you could recover the lost levels by either randomly dithering the data before the process or carrying the lost bit to the adjacent pixel, because you spatially recreate the levels lost with a single bit. A pseudo-randomness was useful because it ensured that if the incoming data had geometric structure (say it had already been dithered with a Bayer matrix), no beat patterns would be created.
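A toy illustration of that trick (not HP's actual pipeline code): truncating the low bit of 8-bit data destroys the odd levels entirely, but adding a random 0/1 before truncation preserves them spatially, as the average over neighboring pixels:

```python
import numpy as np

rng = np.random.default_rng(0)
img = np.full((256, 256), 101, dtype=np.uint8)  # a flat field at an odd level

# Plain truncation: the low-order bit is simply lost, 101 -> 100 everywhere.
trunc = img & 0xFE

# Dithered truncation: add 0 or 1 at random, then clear the low bit,
# so each pixel lands on 100 or 102 with equal probability.
dithered = (img.astype(np.uint16) + rng.integers(0, 2, img.shape)) & 0xFE
dithered = np.clip(dithered, 0, 255).astype(np.uint8)

print(trunc.mean())     # 100.0: the odd level is gone everywhere
print(dithered.mean())  # ~101: the level is recovered spatially
```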

I couldn’t help but think of my experiences with developing our imaging pipeline when I was reporting on this.