Scene-referred workflow: file format question

Hello everyone, this is my first post here.

I switched to Darktable at the end of last year, and recently I have been exploring how to enhance my current workflow by switching to scene-referred processing.

I’ve read a few tutorials and I do believe I get the gist of it, but there’s one detail I never see mentioned: saving the file.

For over a decade I have used a raw to 16 bit ProPhotoRGB TIFF workflow. I did this in Lightroom and continued in Darktable.

Now that I have switched Darktable’s settings to scene-referred, I understand that I have optimized the raw conversion. Meaning I need the original raw file, the XMP, and a compatible version of Darktable to read both and give me the image that I can output as desired.

But if I want to save the image in its processed state so that it is independent of Darktable—i.e. an archival version of the post-processed result—should I continue saving as 16 bit TIFF but in the RIMM colorspace? Or 32 bit?

Would saving in ProPhotoRGB defeat the purpose of using scene-referred?

Or should I use the OpenEXR format (which I have tried, but it introduces contrast and saturation shifts when opened in GIMP)?

Thanks for any enlightenment you can provide. :slight_smile:

I don’t see what ProPhoto gets you in this situation. The image is most likely tone-mapped and prepared for an SDR display, so just use something like AdobeRGB or sRGB. If you want to be a little bit ready for the future, maybe something like P3.

1 Like

I don’t understand color profiles much, so I can’t give you an explanation. But from my experiments with my dt-nind-denoise workflow, I don’t need to specify --icc-type when exporting to TIFF from darktable-cli. It’s more important to set the input color profile of the TIFF file when it is re-imported.

TIFF 16-bit should work fine for most cases, but I found that data from highlight reconstruction is only retained when exported as 32-bit TIFF. For my workflow, the TIFF files are only temporary and get deleted later, so setting them to huge-filesize TIFF32 has almost no downside for me while ensuring no data is lost.

I’ve also experimented with OpenEXR but gave up shortly, mostly because there’s no Python library that supports OpenEXR extensively (one even throws a warning about a current security issue in the OpenEXR format). Besides, EXIF is not embedded in OpenEXR, so you’ll lose all the needed info about the shot.
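For reference, my export step looks roughly like this (filenames are placeholders, and the conf key is just what works on my install—check `darktable-cli --help` on yours):

```shell
# Export a raw + its XMP to a 32-bit float TIFF without specifying --icc-type.
# The --core --conf override sets the TIFF bit depth for this one invocation.
darktable-cli IMG_1234.CR2 IMG_1234.CR2.xmp IMG_1234.tif \
    --core --conf plugins/imageio/format/tiff/bpp=32
```

With bpp=16 instead, the reconstructed highlight data above 1.0 gets clipped on export, which is why I stick with 32.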

1 Like

The original raw data was at best 14-bit (are there any 16-bit cameras out there yet?), so 16-bit TIFF doesn’t lose anything you captured tone-wise.

Color-wise, if your archival copy isn’t meant to be viewed, I think you could save it in most any wide-gamut colorspace as long as you make sure a corresponding profile is embedded for the downstream software to use. If one of the intents is viewing, then a rendition-gamut colorspace can work, as long as you recognize that the gamut has been compressed way down from the original capture.

Me, I do it backwards; I first do a batch-conversion to small JPEG ‘proof’ images. Then, if I need an intermediate for further processing (about 5% of the time), I’ll re-open the raw with the proof’s processing and build the intermediate from there.

1 Like

I defer to more knowledgeable people here, but for me I export JPGs in sRGB for web display. I do a 16-bit TIFF in the Adobe RGB profile for archival storage of the edit. I welcome criticism of this approach if I am wrong. I don’t bother with ProPhotoRGB as I can always go back to the RAW if I need the extra color space.

3 Likes

As far as I know, there’s no reason to change your exporting habits just because you switched to a scene-referred workflow; the final step in the pipeline is the same for both workflows: conversion to the selected output color profile. And in both workflows there’s a conversion to display-referred somewhere.

What changes is where you do the editing: before or after the conversion to display-referred… And that is largely transparent for the user, as the order in which you apply edits is independent of the pipeline order.
Caveat: in a pure display-referred workflow, you have to be careful to keep all pixel values within the range 0…1 all the time, unless you know what you are doing (any kind of tone/color mapping can cause issues for values outside 0…1, as the results are not defined)

Not claiming to be more knowledgeable, but that seems the most reasonable as it fits the intended usage. OP was talking about archival storage, and that’s a more complicated issue, as it implies future proofing and optimising data preservation. (I would suggest a floating point format in the working colour space, linear Rec2020 RGB, but that might be overkill, or have issues I’m not aware of).

5 Likes

Yeah, got me to thinking about the use cases of ‘archival’, seems that there’d be two objectives:

  1. Preserve as much information and resolution as possible of the capture;
  2. Be able to read it.

Sounds to me like a 32-bit TIFF image encoded in linear ProPhoto with the corresponding profile embedded would meet the need. It wouldn’t be what you’d want to display, but renditions for that purpose could be readily made.

2 Likes

Indeed, after reading all the replies (thanks everyone!) and doing extra research, I think a 32-bit TIFF in linear ProPhotoRGB is likely the solution.

Just to think outside the box a bit: I’ve been archiving only the original RAW files for 15 years now (about 4TB collection), not the processed files (except the final exported JPG).

I do keep the XMPs together with the RAW files. Of course, too-old versions of the XMP won’t re-apply correctly with new versions of DT, but I always start from scratch whenever I need to pull a RAW file from the archive, simply because the tools got better, and my taste also changed over the years (hopefully for the better).

For example, I used to tweak sharpening and profiled denoise (multiple instances for chroma and luma noise), and I was proud of my work :smile: But none of that can compare to using nind-denoise and RL-deconvolution now: ISO 10,000 looks as clean as ISO 200, details reconstructed without halos. Who’s to say the scene-referred workflow is the final best solution? Something better will come out eventually.

You should archive the RAW files for sure, but why archive another, even larger copy that can be derived from the RAW? Can you explain more about what you’re trying to archive, and/or the reasoning behind it?

2 Likes

Quite frankly, I think it is an artistic thing. The tiff is the end result of the artistic process and vision.

I strive to have a software-independent collection of images (I’d been planning to dump Adobe for over a decade, but had to stick with it for work reasons), so being dependent on the Lightroom catalog or the Darktable XMPs is a weak link in the chain.

I enjoy the tech aspect of storing all the image data, and making sure my collection is archival in that sense too, and, as they say, storage is cheap. I guess I don’t worry about hard drives much.

1 Like

Same here, I just keep buying bigger HDs.

That said, my archival format is the original raw, owing mostly to my aerospace background where we relied on the original recording of telemetry as the data of record. I also wrote my own raw processing software, so I’m pretty confident of it working through my life. What my kids do with all my crap after I croak is their business… :laughing:

5 Likes

So far my whole collection is mostly a journal of my daughter’s life, but she has so many activities (theater, karate, dance, …) that the collection has been growing non-stop, especially with the addition of HD videos lately. We’ve been revisiting those albums from time to time, so at least they’ve been useful.

I keep 3 backup copies (one off-site). I do a read-verify occasionally and replace failing drives with new ones. 2 TB+ HDDs only became available recently (mostly the slow SMR kind), so I had to come up with a scheme involving multiple smaller drives and a hacky bash script for rsync (basically a freezer-fridge-cooking model).
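The read-verify part doesn’t need anything fancy; mine boils down to stored checksums (paths below are placeholders, not my actual script):

```shell
# One-time: record a checksum for every file on the backup drive.
find /mnt/backup/photos -type f -print0 | xargs -0 sha256sum > ~/photos.sha256

# Each verify pass: re-read every file and compare against the stored list.
# A non-zero exit (or any "FAILED" line) means the drive is going bad.
sha256sum --quiet -c ~/photos.sha256
```

Forcing a full read of every file is the point: it’s how silent bit rot or a dying sector gets noticed before the next rotation.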

1 Like

I use rsync a lot for file transfer, not just for backup. I maintain a site for the railcar restoration that holds what I call ‘engineering’ photos, mainly to record how things were put together before they take them apart. I have an rsync script that syncs only the JPEG images with the website, then syncs both JPEGs and raws with my backup computer. I just wish it played better with Microsoft filesystems.
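Roughly, the JPEG-only pass is just rsync include/exclude filters; a sketch (hosts and paths here are placeholders for what’s in my actual script):

```shell
# Pass 1: push only the JPEGs to the web host.
# '*/' keeps descending into directories; everything not matching the
# jpg includes is excluded, and empty directories are pruned.
rsync -av --prune-empty-dirs \
    --include='*/' --include='*.jpg' --include='*.JPG' --exclude='*' \
    ~/Pictures/railcar/ web:/var/www/engineering/

# Pass 2: mirror everything (JPEGs and raws) to the backup computer.
rsync -av ~/Pictures/railcar/ backup:/srv/photo-backup/railcar/
```

Filter rules are order-sensitive: the first rule that matches wins, which is why the catch-all `--exclude='*'` has to come last.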

1 Like

Out of curiosity, what problems are you having? I use rsync from several Linux systems and one Win 11 system to another Linux system. The only problem I have run into is with file names that contain colons (like log files, whose names I could easily change).

My concern might be dated, and it also may be due to the particular file system I used, but when I tried rsync with portable hard drives formatted for Windows, it didn’t resolve the modify dates well enough to copy only the changed files. It may be because I used a FAT filesystem; I don’t remember.

I think rsync has added something to handle that scenario:

       --modify-window=NUM, -@
              When comparing two timestamps, rsync treats the timestamps
              as being equal if they differ by no more than the modify-
              window value.  The default is 0, which matches just integer
              seconds.  If you specify a negative value (and the receiver
              is at least version 3.1.3) then nanoseconds will also be
              taken into account.  Specifying 1 is useful for copies
              to/from MS Windows FAT filesystems, because FAT represents
              times with a 2-second resolution (allowing times to differ
              from the original by up to 1 second).

              If you want all your transfers to default to comparing
              nanoseconds, you can create a ~/.popt file and put these
              lines in it:

                  rsync alias -a -a@-1
                  rsync alias -t -t@-1

              With that as the default, you'd need to specify --modify-
              window=0 (aka -@0) to override it and ignore nanoseconds,
              e.g. if you're copying between ext3 and ext4, or if the
              receiving rsync is older than 3.1.3.

(rsync(1) - Linux manual page)

1 Like

For me, with Daylight Saving Time, I even have to set --modify-window=3601, as the clock shifts 1 hour twice a year :frowning:

1 Like

Okay, that might work. Thanks!

1 Like