Nikon: a specific raw sample wanted.

Hi everyone.

Pretext blurb

As you may or may not know, Nikon cameras are a rather interesting piece of work.
The compression is configurable (three different types), and the bit depth is configurable (two variants).
That already results in six different raw types.

If only it were that simple. :confused:

Under some unknown conditions, some Nikon cameras, when configured to store lossy compressed raw files, actually produce a strange variation of raw file that is internally split in two, with slightly different compression between the two halves. I'm not aware of any official documentation about it, but I suppose it is most often referred to as "lossy after split".

That oddity does not seem to be stated in the EXIF, only deep within the Nikon makernotes, and, if decoded properly, there is no difference between normal lossy compressed raws and these raws with a split. Right now I'm aware of the Nikon 1 J5 and the D3400 producing such raws.

We always create our own problems, don't we?

After the maintainership of the rawspeed library, which is used by darktable to load raw files, was transferred to us late last year (2016-12), the RPU was created/resurrected right after that. And then the "CPR" on the library began.

Now, here is the problem: due to the huge [unnecessary] rush, which partially led to a gross code review mishap, the support for these NEFs with split raws was broken. Since RPU was very new back then, and due to that same [unnecessary] rush, there was no such sample, so the breakage was not detected. Later it was finally caught by Wolfgang Goetz, and nowadays we have two samples.

That is great, and I will finally be able to work on unbreaking that support. The one small caveat is that both of these samples are 12-bit.
It would be really nice to also find a 14-bit raw with the same problem.

HOW TO DETECT IF THAT IS THE RAW WE ARE LOOKING FOR

Now, as I have already said, I'm not aware of any way to auto-detect such a raw easily using exiv2/exiftool. The following command can help you check the compression and bit depth of your raws.

$ find -iname \*.nef -print0 | sort -z | xargs -0 exiftool -compression* -nefcompression -bitspersample
======== name.nef
Compression                     : Nikon NEF Compressed
NEF Compression                 : Lossy (type 2)
Bits Per Sample                 : 14
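
If you have a lot of raws, here is a rough sketch of a shortcut to list only the candidate files. It assumes a reasonably recent exiftool with the -if option, and the directory is just a placeholder:

# list only the NEFs that report "Lossy (type 2)" compression at 14 bits per sample
$ exiftool -r -ext nef -if '$NEFCompression =~ /Lossy \(type 2\)/ and $BitsPerSample == 14' -directory -filename /path/to/photos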

If the output matches this example, please try opening that raw in a fresh darktable (built from git master, NOT 2.2.x or earlier). If it fails to load and you see the following message in the console, you have found it!

[rawspeed] (name.nef) void rawspeed::HuffmanTable::setCodeValues(const rawspeed::Buffer &), line 176: Corrupt Huffman. Code value 92 is bigger than 16
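
If you normally start darktable from a menu, the easiest way to see that message is to launch it from a terminal with the raw in question (the path below is only an example; darktable's -d debug switches can give even more output):

# run darktable from a terminal so rawspeed's messages are visible on stderr
$ darktable /path/to/name.nef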

Please help! :slight_smile:


I've found quite a few (Lossy type 2, 14 bits per sample) but I'm not comfortable (experienced) with building a fresh darktable. If you can provide some detailed direction, I'd be happy to give it a shot. In fact, I've always wanted to be able to try the latest and greatest features before general release!

FWIW, the original photo for this PlayRaw is Lossy type 2, 14 bits per sample. I don't have time right now to throw up a fresh DT to see whether it has the problem.

Sadly (fortunately?), that raw is not one of these split raws; it loads fine.

You'll have to be more specific than that.
Building: for Fedora/openSUSE there are packages by @darix; otherwise, a rough sketch of a manual build from git master follows.
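
This is only a minimal sketch, assuming the usual build dependencies (git, cmake, a C++ compiler, and darktable's development libraries) are already installed; the install prefix is just an example:

$ git clone --recursive https://github.com/darktable-org/darktable.git
$ cd darktable
$ mkdir build && cd build
# configure a release build that installs next to, not over, your packaged darktable
$ cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=/opt/darktable ..
$ make -j$(nproc)
$ sudo make install
# run the freshly built version
$ /opt/darktable/bin/darktable

Installing to a separate prefix keeps the test build from interfering with the packaged 2.2.x.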

I have found some on my hard drive; the good news is that all of those raws open fine with current master.

Thank you for checking! Hopefully someone will have worse news than that :slight_smile:

Up. Still looking :slight_smile:

I think that such split raws are obtained when using a slow flash card (for example, Class 10) for long burst shooting. Maybe someone could try that?

@LebedevRI-
I have 2.2.5 installed.

I pulled 2.4.0 RC2 from git master.

I have at least two images that are 14-bit lossy Nikon RAWs that fail to load using 2.4.0 RC2, but load and have been edited in 2.2.x.

The images were taken with a Nikon D5500.

Here's an exiftool snippet:

======== ./DSC_0248.NEF
Compression                     : Nikon NEF Compressed
NEF Compression                 : Lossy (type 2)
Bits Per Sample                 : 14
======== ./DSC_0258.NEF
Compression                     : Nikon NEF Compressed
NEF Compression                 : Lossy (type 2)
Bits Per Sample                 : 14

Here is a snippet of the errors I saw with 2.4.0 RC2:

[rawspeed] (DSC_0248.NEF) void rawspeed::HuffmanTable::setCodeValues(const rawspeed::Buffer&), line 176: Corrupt Huffman. Code value 92 is bigger than 16
[temperature] failed to read camera white balance information from `DSC_0248.NEF'!
allocation failed???
[rawspeed] (DSC_0258.NEF) void rawspeed::HuffmanTable::setCodeValues(const rawspeed::Buffer&), line 176: Corrupt Huffman. Code value 92 is bigger than 16
[temperature] failed to read camera white balance information from `DSC_0258.NEF'!
allocation failed???

@mikrom-
The images were taken with a SanDisk Extreme Pro UHS-I SDXC, so I can at least confirm that it doesn't only occur with slow cards.


@lptech
Hi!

Oh great, that sounds just like what I was looking for!

Could you please contribute at least one of these raws to https://raw.pixls.us/? :slight_smile:

@LebedevRI-

I uploaded one of the RAWs.

Do you expect to have this fixed before the 2.4.0 official release, as a 2.4.x patch, or in a later release entirely?

Thank you!


Well, given that 2.4.0 has happened already… :slight_smile:

Thanks, but… I don't suppose there is some other raw without people in it?

@LebedevRI-

I don't see any images in my library that appear to be 14-bit "lossy after split" Nikon RAWs that don't have people in them.

While it perhaps isn't the best for long-term public verification/testing (given the uploader rules as I read them), could you use the image for development and hope someone uploads another sometime in the future?

Thanks!

I'm afraid I can't really give any credible time estimates. "Once it's done."

OK then, this is not too bad either.
I'm sure this will bite someone else and there will be more samples :slight_smile:

Thank you for this sample!

I have a (probably) controversial suggestion: what about detecting these photos in darktable (or any other program using rawspeed and under your control), and showing a dialog once the user imports such a wanted photo?

And more generally, what about checking whether the camera/lens combination has a record in RPU, and if not, kindly asking the user whether they would like to share some photos? Of course this could be seen as a breach of darktable's anonymity of use.

By now there are enough samples (~5, including 2 elusive 14-bit ones) with this problem,
so there is no further need to actively look for them.

Believe me, I did think about it :slight_smile: I do not see any sane way to do this.
Fetching the 'missing camera list' from RPU would require network access and would need to be implemented in dt. And just bundling the current list of missing cameras would, at the very least, quickly become outdated.

So no, I don't think it is feasible.

Yep!

@LebedevRI-

I have another likely 14-bit "lossy after split" RAW sample from a Nikon D750.

The image does have one person in it. Should I upload it?

Thank you, but not that one; there are two samples already (one without people at all).

Hi.

I'm using a Nikon D500 and have encountered some files that fail to load in darktable, due to a failure to load the camera's white balance.

I have done some testing, and it looks like I am able to "force" the camera white balance failure within darktable for images with 14-bit lossy compression if the ISO value is higher than 25600.
Lossless compression, or uncompressed, does not seem to suffer from the same problem.

I can not say that this is the cause of the problem, but it could be worth pursuing.
Hopefully someone else can test with another camera and see if they have the same problem.

My darktable version is 2.4.1 (had the same problem with earlier versions too.)
D500 firmware: C=1.13 / LD=2.016 / W=3.00

Four images with different ISO settings show the following exiftool output:

$ find -iname \*.nef -print0 | sort -z | xargs -0 exiftool -compression* -nefcompression -bitspersample -iso
======== ./_5001196.NEF
Compression                     : Nikon NEF Compressed
NEF Compression                 : Lossy (type 2)
Bits Per Sample                 : 14
ISO                             : 64508
======== ./_5001197.NEF
Compression                     : Nikon NEF Compressed
NEF Compression                 : Lossy (type 2)
Bits Per Sample                 : 14
ISO                             : 51200
======== ./_5001198.NEF
Compression                     : Nikon NEF Compressed
NEF Compression                 : Lossy (type 2)
Bits Per Sample                 : 14
ISO                             : 25600
======== ./_5001199.NEF
Compression                     : Nikon NEF Compressed
NEF Compression                 : Lossy (type 2)
Bits Per Sample                 : 14
ISO                             : 6400

Upon opening darktable this is shown in the console:

[rawspeed] (_5001196.NEF) void rawspeed::HuffmanTable::setCodeValues(const rawspeed::Buffer&), line 176: Corrupt Huffman. Code value 92 is bigger than 16
[temperature] failed to read camera white balance information from `_5001196.NEF'!
allocation failed???
[rawspeed] (_5001197.NEF) void rawspeed::HuffmanTable::setCodeValues(const rawspeed::Buffer&), line 176: Corrupt Huffman. Code value 92 is bigger than 16
[temperature] failed to read camera white balance information from `_5001197.NEF'!
allocation failed???

As we can see, ISO 25600 and below does not show an error. Higher than 25600 produces the error, and the file will not open in darktable.
Is the same code (rawspeed::HuffmanTable::setCodeValues(const rawspeed::Buffer&)) executed if the file is losslessly compressed, or 12-bit?
Lossless and 12-bit seem to work with higher ISO values, so the problem looks a bit inconsistent.

I can upload the 4 images referred to above if it will help resolve this issue.
Just let me know.
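
For anyone who wants to check their own files against this ISO hypothesis, something like the following might help. It is only a sketch: the path is a placeholder and the 25600 cut-off is just my guess from the four files above.

# list 14-bit lossy NEFs shot above ISO 25600
$ exiftool -r -ext nef -if '$NEFCompression =~ /Lossy \(type 2\)/ and $BitsPerSample == 14 and $ISO > 25600' -directory -filename -iso /path/to/photos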