Should I use PNG file format or not

Since it’s related to this discussion: I just stumbled upon a comparison of PNG compression levels in the GIMP forum, posted by @Ofnuts.

Summary: Increasing the PNG compression level beyond 4 yields only a small improvement in compression but significantly increases compute time.
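You can see this diminishing-returns curve yourself with Python’s zlib (the same DEFLATE library PNG uses) on synthetic data — a toy stand-in for a real PNG encode, not the GIMP benchmark itself:

```python
import random
import time
import zlib

# Toy stand-in for PNG: zlib (DEFLATE) on 1 MB of synthetic, mildly
# compressible data. Real PNG adds row filters on top, but the
# level-vs-time trade-off comes from these same zlib levels.
random.seed(42)
data = bytes(random.randrange(256) & 0xF8 for _ in range(1_000_000))

for level in (1, 4, 6, 9):
    t0 = time.perf_counter()
    size = len(zlib.compress(data, level))
    dt = time.perf_counter() - t0
    print(f"level {level}: {size:,} bytes in {dt:.3f}s")
```

On typical data the size barely drops after the middle levels while the time keeps climbing, matching the linked benchmark’s conclusion.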


Ah, that would make sense.

Thanks for that! I guess I can stop worrying about the fact that RawTherapee forces level 6 now. (They probably hardcoded that after reaching similar conclusions.)

This is actually the zlib default value IIRC.
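A quick check of that, assuming Python’s zlib binding: `Z_DEFAULT_COMPRESSION` is -1, which zlib maps internally to level 6, so the default and an explicit level 6 produce byte-identical output.

```python
import zlib

data = b"the zlib default compression level" * 1000

# Z_DEFAULT_COMPRESSION is -1; zlib maps it to level 6 internally,
# so the two outputs should be byte-identical.
assert zlib.Z_DEFAULT_COMPRESSION == -1
assert zlib.compress(data, zlib.Z_DEFAULT_COMPRESSION) == zlib.compress(data, 6)
print("default level behaves like level 6")
```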


I found the post about JXL effort/size. This was in February 2024, using the libjxl available at that time (libjxl.x86_64 1:0.8.2-3.fc39). It might be better now.

Using a RW2 image (5200x3900) at full scale, I looked at the timestamps from the end of the pixelpipe to export complete. Quality = 100 (lossless):

* e1 19.0320 - 18.1609 = 0.8711 seconds | 24MB
* e2 705.0124 - 704.0106 = 1.0018 seconds | 22.5MB
* e3 158.8485 - 157.6353 = 1.2132 seconds | 19.8MB
* e5 299.7103 - 254.4952 = 45.2151 seconds | 19.0MB
* e7 622.9569 - 490.2849 = 132.672 seconds | 18.7MB
* e9 2174.5955 - 941.1407 = 1233.4548 seconds | 18.4MB
* tif uncompressed 2520.4391 - 2519.6864 = 0.7527 seconds | 58.5MB

Therefore, for Q=100, an effort higher than 3 is not worth the extra encoding time for the small amount of size saved.
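To put that conclusion in numbers, here is the marginal cost of each effort step — extra encode seconds per MB saved — computed purely from the figures in the table above:

```python
# Marginal cost of each JXL effort step, from the lossless (Q=100)
# numbers posted above: extra encode seconds per MB saved.
results = {  # effort: (seconds, megabytes)
    1: (0.8711, 24.0),
    2: (1.0018, 22.5),
    3: (1.2132, 19.8),
    5: (45.2151, 19.0),
    7: (132.672, 18.7),
    9: (1233.4548, 18.4),
}

efforts = sorted(results)
for lo, hi in zip(efforts, efforts[1:]):
    dt = results[hi][0] - results[lo][0]   # extra seconds
    dmb = results[lo][1] - results[hi][1]  # MB saved
    print(f"e{lo} -> e{hi}: +{dt:.1f}s for {dmb:.1f} MB ({dt / dmb:.1f} s/MB)")
```

Up to e3 each MB saved costs well under a second; from e3 to e5 the price jumps to roughly 55 s/MB, and it only gets worse from there.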

Same image at full scale, but with quality = 80 (lossy, like jpg), plus a jpg export for comparison:

* e1 25.6337 - 25.2424 = 0.3913 seconds | 1.0MB
* e3 35.1569 - 34.767 = 0.3899 seconds | 937kB
* e7 46.5521 - 43.6494 = 2.9027 seconds | 912kB
* e9 99.1974 - 52.7649 = 46.4325 seconds | 841.3kB
* jpg q=80, chroma=auto 197.1039 - 196.7244 = 0.3795 seconds | 1.7MB

The size differences from lossless are significant, and compared with jpg it’s also almost half the size. e3 matches jpg’s encoding speed. Comparing e1, e3 and e9, I can’t see a difference at 400% zoom; I might see some minor differences at 800%. So I think an effort of 3 is also good for the lossy exports.


Perhaps, but it is written in the code nonetheless. I found it by searching for this weird PNG_FILTER_PAETH constant name, because… it appears verbatim when I run the CLI with -h. :rofl: “Compression is hard-coded to PNG_FILTER_PAETH, Z_RLE.” :alien:
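For context: PNG_FILTER_PAETH is a libpng row-filter flag, while Z_RLE is a zlib strategy that only looks for byte runs — very fast, and well suited to filtered scanlines. A small sketch of what the Z_RLE strategy does, using Python’s zlib on toy data (this is not RawTherapee’s actual code, which is C):

```python
import zlib

# Toy data with long byte runs, roughly what a Paeth-filtered border
# region produces. Not real PNG scanlines.
data = b"\x00" * 5000 + bytes(range(256)) * 20 + b"\xff" * 5000

def deflate(strategy):
    # compressobj lets us pick the strategy; zlib.compress() does not.
    c = zlib.compressobj(6, zlib.DEFLATED, zlib.MAX_WBITS,
                         zlib.DEF_MEM_LEVEL, strategy)
    return c.compress(data) + c.flush()

rle = deflate(zlib.Z_RLE)           # run-length matches only: fast
default = deflate(zlib.Z_DEFAULT_STRATEGY)  # full LZ77 search: slower
print(f"Z_RLE: {len(rle)} bytes, default: {len(default)} bytes")
```

Z_RLE usually compresses slightly worse than the default strategy but much faster, which fits a tool that hard-codes it for speed.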

Unlikely.

The file size is not directly related to the image’s bit depth.

There is a lot of white border involved. I presume (only presume) that PNG compresses this information more efficiently. Also, a bit depth of 8, 16 or 32 didn’t affect the file size in my tests; I tried that as well.
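That presumption is easy to sanity-check with a toy model: run Python’s zlib (which PNG relies on) over raw pixel buffers. A uniform white strip compresses almost to nothing at either bit depth, so the border contributes very little to the file size either way.

```python
import zlib

width, height = 2000, 200  # a hypothetical white border strip

white_8bit = b"\xff" * (width * height * 3)       # 8-bit RGB, 1.2 MB raw
white_16bit = b"\xff" * (width * height * 3 * 2)  # 16-bit RGB, 2.4 MB raw

c8 = len(zlib.compress(white_8bit, 6))
c16 = len(zlib.compress(white_16bit, 6))
print(f"8-bit:  {len(white_8bit):,} -> {c8:,} bytes")
print(f"16-bit: {len(white_16bit):,} -> {c16:,} bytes")
```

Both compress by a factor of roughly a thousand, so in a real file the image content, not the border or the bit depth of uniform areas, dominates the size.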

This exercise has been very informative for me, and based on the results I have changed my TIFFs to compressed and my PNGs to uncompressed. BTW, some of my TIFFs are not readable by certain programs, including Canon’s own editing software DPP, and DPP doesn’t recognize the PNGs at all. So there are potential issues with both file formats to be aware of. However, for me PNGs are only an intermediary step in the workflow, and having a unique file extension makes them easy to identify as such.


Screenshot of DPP. Note the white borders I spoke of, and that not all TIFFs are read correctly.

Could it be a case of “file too heavy, no thumbnail generation to save resources”? Did you try actually opening them, or just looked at this global listing thingy?
(Not using Canon’s app myself ’cause not cross-platform and I prefer sticking with RT anyway.)

I have had this problem numerous times with TIFF files, and it is not based on file size. TIFF files can be very complex and include layers or even small thumbnails of the image. When opened in GIMP, the offending TIFF file sometimes has two layers: one a full-size image and the other a small thumbnail.
