color calibration DT4.0

Currently just off master… I did merge the new HR branch, but that was after I tested it… I use MSYS2 and I just use the build script to build it, i.e. build.sh…

I will try to look at the best way to summarize my build and OS settings in detail in case that helps…

I have this too, and have it since 3.8+ somewhere

Never thought much of it.

I don’t know if my Win10 system has it, but my current Win11 system has it. Always Nvidia based though (GTX 1060 desktop or RTX 3050 Ti laptop).

Does changing your display profile make any difference? Just throwing things out there… I will redo what I did in the video with debug on. That should provide a log that differs from a failing run’s log, at least one would think… It still might not show much if the issue is an OS setting. I know I went through my OS when I got the new PC with Win11 and made sure all the color management settings were manually selected by me, to be sure I was getting from the OS what I was expecting…

Just tried my old PC at work and again no issue like this on the few images I tried from a PlayRaw folder I have… it’s an older Intel Core i7 with Win10 Pro, 16GB memory, and an old Nvidia GTX 745 card with 4GB…

I can toggle back and forth in CC and I can’t trigger the black screen…

@priort can you try with denoise wavelet on? Or anyone else on windows?

Ref: https://github.com/darktable-org/darktable/issues/12197#issuecomment-1200338607

Still no issue. I went back and downloaded the image and the original XMP from the OP and found no problems… I see that it was using non-local means… switching it to the default wavelets did not cause any issues either. Sorry, I haven’t had time to do a debug log; I might later today, and I will post it so you can compare with the sections of yours that you suspect…

… I see this with

this is darktable 4.0.0
copyright (c) 2009-2022 johannes hanika
darktable-dev@lists.darktable.org

compile options:
  bit depth is 64 bit
  normal build
  SSE2 optimized codepath enabled
  OpenMP support enabled
  OpenCL support enabled
  Lua support enabled, API version 8.0.0
  Colord support enabled
  gPhoto2 support enabled
  GraphicsMagick support enabled
  ImageMagick support disabled
  OpenEXR support enabled

on

System:
  Host: Rechner Kernel: 4.19.0-20-amd64 x86_64 bits: 64 compiler: gcc 
  v: 8.3.0 Desktop: Cinnamon 3.8.8 Distro: Debian GNU/Linux 10 (buster) 

Example is with “detect from image surface”

CAT16 with 3948K (invalid) on the right … image straight after open on the left (scene-referred, modern color)

“detect from image edges” works with this image:

The problem is not the algorithm yielding a weird cast; the thread is about it turning the image completely black or blue… like unrecognizable.

I hardly use denoise and I do have the issue, so denoising with wavelets or not has nothing to do with it in my mind.

The suspicion was ‘Nvidia on Windows’, but priort seems to rule that out (did you try resetting color calibration and then retrying one of the auto-detect options?).

Could still be Nvidia-on-Windows related, because so far the people describing the issue are on a GTX 1000 series card or newer. priort’s 700 series card is quite a different architecture.

And also no issue with my 3060Ti

@jorismak and @priort The issue happens to me with OpenCL turned off (CPU path).

Todd, can you pastebin a pacman -Qe from your build environment?

By the way, I’m reading channelmixerrgb.c trying to understand how the AI works. This comment on line 1181 is hilarious:

// note : it will only be luck if I didn't mess-up the computation somewhere

Will do. Also, I have seen log output after crashes and from the command line, but if I do -d all and run a GUI instance, where is the output saved… I know I should know…

build config.txt (1.9 KB)

priort saying no issue with his 3060ti means my theory is already broken.

Your message that it also happens with OpenCL turned off means it’s completely derailed :wink: .

Bit flabbergasted now though. It would hint at a problem in gtk / gdk internals of the Windows port… but since some Windows people are not having this issue while others do… :thinking: . Maybe it’s a nasty combination of packages in my msys2 installation.

I actually tried again (it’s been ages and I just don’t touch those auto-detect options, because they never gave good results for me). It works fine now on a .ORF file (although both options make the colors way too warm).

But if I try them on the DNG I normally use (of the same file), the first ‘ai’ seems to work, but gives a completely different color cast (very wrong, quite blue / green), and the second option gives a complete black screen.

If I change the white balance from ‘camera reference’ to ‘daylight’ (not that much difference visibly) and then try the auto detect again, the first ‘ai’ option now suddenly gives a way too warm result (just as with the source ORF file), but the second ‘ai’ option still gives a black screen.

With the ‘black screen’ syndrome, the hue and chroma sliders show no value, so they appear to hold a value way outside the normal range. The moment I double-click on both of them (so they show a value again), the image is back.

Messing with the white balance module can make one of the AI options suddenly ‘work or not’, but it also affects the outcome (??). And messing with the black correction in the exposure module also seems to affect it.

But this is all with a DNG file, which has its values rewritten to lie between 0 and 65535 instead of the original black/white values of the RAW file.

As I said, the original RAW file works fine. Maybe it isn’t Windows related at all, but has to do with the black/white levels and white-balance presets of the file, and those can differ camera to camera / shot to shot.

I think the last paragraph is the most interesting part in this thread. You could check data for differences in your orf and dng’s.

BTW my cam writes DNG files out of the box and I have never seen such a problem.

I’m seeing the problem in CR2 and ORF files too.

It seems to me that whatever the AI function is doing yields values for the hue angle and temperature that the sliders can’t handle. The history shows NaN for 3 values (NaN = not a number).
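For what it’s worth, the “sliders show no value” symptom is consistent with how NaN behaves: NaN compares false with everything, so a naive range check never catches it. A minimal Python illustration (the darktable code itself is C, but the IEEE 754 behavior is the same):

```python
import math

nan = float("nan")

# NaN fails every ordinary comparison, including with itself, so a guard
# like `if value < MIN or value > MAX` silently lets a NaN slider value pass.
print(nan < -180.0)   # False
print(nan > 180.0)    # False
print(nan == nan)     # False

# The robust check has to ask explicitly:
print(math.isnan(nan))  # True
```

This is why such a value has to be detected with an explicit isnan-style test rather than a min/max clamp comparison.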

For me the issue happens more often with the edges (borders) detection. I started to read through the code, but I’m not sure if I’m looking at the right section in the 4,000 lines. The math seems complex (radians to degrees and a lot of division).
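To make the “radians to degrees and a lot of division” concern concrete, here is a hypothetical reduction of that kind of computation (not the actual channelmixerrgb.c code): a chromaticity offset (u, v) turned into a hue angle in degrees. In C, `acos(0.0 / 0.0)` silently produces NaN; Python would raise instead, so the degenerate case is made explicit:

```python
import math

def hue_degrees(u, v):
    """Hue angle in degrees from a 2D chromaticity offset (illustrative only)."""
    chroma = math.hypot(u, v)
    if chroma == 0.0:
        # In C, u / chroma here would be 0.0 / 0.0 == NaN, and acos(NaN) == NaN;
        # that NaN would then flow straight into the hue slider.
        return float("nan")
    h = math.degrees(math.acos(u / chroma))
    return h if v >= 0.0 else -h

print(hue_degrees(0.0, 1.0))  # 90.0
print(hue_degrees(0.0, 0.0))  # nan
```

A perfectly neutral (or fully clipped) detection region is exactly the kind of input that would hit the degenerate branch.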

The DNG is a linear DNG, written by another program, so it is already demosaiced / denoised / sharpened / lens-corrected. And although the color space isn’t altered (no profile or matrix applied yet), the data is scaled, so the black level is always 0 and the white level is always 65535.
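Assuming the writer does a simple linear remap (the program’s actual formula is unknown), the rescaling described above would look something like this, with the camera’s black level mapping to 0 and its white level to 65535:

```python
def rescale_to_16bit(raw_value, black_level, white_level):
    # Linear remap: black_level -> 0, white_level -> 65535.
    norm = (raw_value - black_level) / (white_level - black_level)
    return round(norm * 65535)

# Example with made-up levels for a typical 14-bit sensor:
print(rescale_to_16bit(512, 512, 16383))    # 0
print(rescale_to_16bit(16383, 512, 16383))  # 65535
```

After such a remap, the per-camera black/white metadata darktable normally uses no longer matches the pixel data, which could plausibly change how the auto-detection behaves.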

‘Comparing data’ seems a bit difficult in this case.

Since it does happen with other files that are true raw files, I have a feeling, given the NaN notices that others have, that something just isn’t properly clipped in the calculations.

Yes, NaN pushed into the pipeline would generate exactly such issues. Cairo, writing to the surface, doesn’t like NaN at all.
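If the cause really is unclipped values, the usual remedy is to sanitize the buffer before it reaches the drawing code. A minimal sketch of that idea (an assumption about the kind of fix, not an actual darktable patch):

```python
import math

def sanitize_pixel(rgb, lo=0.0, hi=1.0):
    # Replace NaN/inf with 0 and clamp to [lo, hi] so nothing out-of-range
    # reaches the surface-drawing code (which chokes on NaN).
    return [min(max(v if math.isfinite(v) else 0.0, lo), hi) for v in rgb]

print(sanitize_pixel([float("nan"), 2.0, -0.5]))  # [0.0, 1.0, 0.0]
```

Note the finiteness test must come before the clamp, for the reason discussed above: NaN sails through min/max comparisons untouched.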

Thanks for sending the build config. I checked and we both match, except that you have mingw-w64-x86_64-portmidi 1~2.0.3-1, which I don’t use. I also have the UCRT packages since I’m building on UCRT now.
