In the DT preview I get a strong effect from the haze removal module (negative values, i.e. adding haze), but in the exported JPG the effect is much weaker. What could be the cause?
darktable usually operates on the displayed image at limited resolution (just what's needed to fill the frame on your display) to speed up processing. If you're in 100% view, the result should match.
You might activate "high quality" processing in the darkroom. Then the calculation is done at full resolution, at the cost of speed.
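As a side note, the same full-resolution behaviour can be requested when exporting from the command line: `darktable-cli` has an `--hq` option controlling high quality resampling. A minimal sketch (the file names and dimensions here are placeholders, not from the thread):

```shell
# Export with high quality (full resolution) resampling enabled.
# input.raw / output.jpg are placeholder names; adjust to your files.
darktable-cli input.raw output.jpg --hq true --width 2048 --height 2048
```

If the exported result with `--hq true` matches the 100% darkroom view but the normal export does not, that points at the preview-resolution issue described above.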
If you have an integrated GPU, watch your RAM usage when you click it (DT fills my RAM and crashes if I have many other things open; not sure if that's supposed to happen).
Are you using the 'unrestricted' darktable resources setting?
With that setting, darktable intentionally doesn't take the memory requirements of other software into account, so you need to manage that yourself. This side effect is explained in the manual.
Nope, it's set to 'large'. Even the OpenCL settings are set not to use all memory.
There have been a lot of improvements in the 4.9 branch. I'm not sure whether you are working on that branch, or whether you are running on Windows with little RAM and a not-too-fancy GPU (both of which apply to my system).
You may still want to have a look at Culling and zooming to 100% puts the system to full load / crash of dt / Windows blue screen · Issue #17684 · darktable-org/darktable · GitHub, which also includes a hint from a developer to set a larger timeout for the GPU, to prevent darktable from falling back to the CPU too early. That issue concerned the lighttable view, but perhaps the hint also applies to your case.
No, I run 4.8.1
16 GB RAM with an integrated GPU on Win 11. I actually haven't tested it yet with my RAM half empty; I could try that to see if it also crashes then.
That was the reason. I didn't know that; I see I'm lacking in the basics. Thank you.