Why RAWs instead of, say, TIFFs?

You could start a PlayRaw where you share a difficult RAW and corresponding JPEG. Maybe you get some inspiration from the edits of others on how to approach your processing?


Yep, that’s the thing that kept me shooting JPEG, although not so much duplicating its results as getting a result that’s as good. But that’s the edge of the rabbit hole; it takes a bit of teasing apart the process to understand what it takes, and then you can deftly put the parts together into pleasing renditions.

The realization that crystallized it for me was that tone and color are the two fundamental things we change to go from the flat RGB after demosaic to an acceptable rendition. They’re related in that changing one usually affects the other, but treating them as distinct operations helps one work intuitively to good effect.
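To make that concrete, here’s a minimal sketch in Python/numpy of treating tone and color as two separate operations on flat linear RGB. The power curve and saturation factor are illustrative stand-ins, not rawproc’s actual tools:

```python
import numpy as np

def apply_tone(rgb, gamma=1.0 / 2.2):
    # Tone: a simple power curve to lift the flat linear data.
    return np.clip(rgb, 0.0, 1.0) ** gamma

def apply_color(rgb, saturation=1.2):
    # Color: scale each pixel's distance from its own gray value.
    gray = rgb.mean(axis=-1, keepdims=True)
    return np.clip(gray + saturation * (rgb - gray), 0.0, 1.0)

rgb = np.random.rand(4, 4, 3)              # stand-in for demosaiced linear RGB
rendition = apply_color(apply_tone(rgb))   # tone first, then color
```

Note how the tone step alone already shifts the channel ratios, and with them the perceived saturation; that’s the coupling mentioned above.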

Also, I stopped thinking of JPEGs as anything but final renditions for a specific medium and purpose. My current workflow for everything, from family snapshots to carefully considered landscapes, is to shoot raw, then batch-process to 800x600 JPEGs for proofs. For my family, that’s usually all they need. I will go through the proofs and re-do the ones that need, say, an exposure adjustment or a shadow lift, but I do that by re-opening the raw, applying the proof processing, and using that as a starting point. I never edit a JPEG anymore.
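For illustration, a batch-proofing pass like that could be sketched with the rawpy and Pillow libraries; this is an assumed stand-in, not rawproc’s own batch tooling, and the .NEF pattern is just an example:

```python
import glob
import rawpy                  # raw decoding
from PIL import Image         # resizing and JPEG output

for path in glob.glob("*.NEF"):                        # hypothetical raw files
    with rawpy.imread(path) as raw:
        rgb = raw.postprocess(use_camera_wb=True)      # default rendering
    proof = Image.fromarray(rgb)
    proof.thumbnail((800, 600))                        # fit within proof size
    proof.save(path.rsplit(".", 1)[0] + "_proof.jpg", quality=85)
```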

rawproc!!!

That’s exactly what my hack software does. When I use either rawproc interactively or the command-line img to create an output JPEG, TIFF, or PNG, the software stores the toolchain in EXIF:ImageDescription. Then, rawproc has a special “File → Open Source…” menu selection: when you select, say, a proof JPEG with that information in its ImageDescription, it’ll open the source file and re-apply the processing to put you at the starting point for subsequent work. You can either modify a tool that’s already in the chain, or add new tools anywhere in the chain.
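For anyone curious what that looks like from outside rawproc, here’s a rough Python sketch that pulls the stored text back out of a proof’s EXIF ImageDescription (tag 270); the filename is hypothetical, and re-applying the tools is left to rawproc itself:

```python
from PIL import Image

TAG_IMAGE_DESCRIPTION = 270   # standard EXIF tag ID for ImageDescription

proof = Image.open("proof.jpg")                        # hypothetical proof file
toolchain = proof.getexif().get(TAG_IMAGE_DESCRIPTION, "")
print(toolchain)              # the stored tool chain, as the writer recorded it
```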

Sorry for the blatant marketing, but doggonit, it really works well now, at least for me…


Well, strictly speaking, if you use a Canon camera with CR2… IIRC that uses a TIFF container format :stuck_out_tongue:


@ggbutcher Glenn, I scanned the git repo for rawproc very quickly… the instructions are to build for Linux, but I think you mentioned you have compiled it on Windows?? 32-bit?? Are there any specific instructions for doing so??

The github wiki has a somewhat dated page on compiling rawproc:

I made this back in the day when Ubuntu’s packages weren’t at recent-enough versions to support rawproc; with 19.01 you can apt-get all the dependencies.

Well, I haven’t tested wxWidgets that way. I like to statically link it anyway, as most folk would have to install wxWidgets just for rawproc. So, here’s what I generally do for both Linux and msys2:

  1. Get and compile wxWidgets. Here’s my configure (in a separate build directory):
    $ ../configure --enable-unicode --disable-shared --disable-debug
  2. Get, compile, and install librtprocess (not yet packaged in any distro)
  3. Install the other dependencies (libjpeg, libtiff, libpng, liblcms2, libraw, lensfun)
  4. From there, do the rawproc compile as described in the README

There are a few different angles here:

Why do camera manufacturers deliver RAW files instead of demosaiced TIFFs?

Because your computer can run more powerful algorithms on the raw data (especially for denoising and demosaicing) than the camera can. I guess this gap has narrowed a bit, but it is very likely still relevant.

Also, if I’m not completely mistaken, white balance is usually performed before steps like demosaicing and denoising. This makes intuitive sense as well: changing the white balance might actually reveal edges that were not visible before.
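A toy numpy sketch of that ordering: scale each sensel of an RGGB mosaic by its channel’s multiplier before any interpolation. The multipliers here are made up; real ones come from the raw’s metadata:

```python
import numpy as np

def white_balance_bayer(mosaic, r_mul, g_mul, b_mul):
    # Scale each sensel by its channel's multiplier (RGGB layout assumed).
    out = mosaic.astype(np.float64)
    out[0::2, 0::2] *= r_mul   # R sensels
    out[0::2, 1::2] *= g_mul   # G sensels, even rows
    out[1::2, 0::2] *= g_mul   # G sensels, odd rows
    out[1::2, 1::2] *= b_mul   # B sensels
    return out

mosaic = np.random.randint(0, 2**14, size=(8, 8))      # fake 14-bit raw data
balanced = white_balance_bayer(mosaic, r_mul=2.1, g_mul=1.0, b_mul=1.5)
```

Balancing the channels first means the demosaic interpolates values that are already on a common scale, so channel-dependent edges don’t get smeared.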

If you just want slight tweaks and the in-camera rendering has already compressed the dynamic range the way you want, you might actually get away with editing JPEGs, especially if they were captured with a high-resolution camera and you’re targeting ‘web’ resolutions.

But I usually struggle to recreate colors and sharpness as pleasing as the camera’s JPEGs.

From my experience this also depends a lot on the camera. My D810 seems to be nice and linear; just using the standard color matrix gives me usable colors. My A6000 is the opposite: the colors out of the camera are usually off, and it requires a color profile and/or fidgeting to become usable (and is still far from perfect). The out-of-camera JPEGs from the Sony look fine, if not entirely realistic.

In the end I’m 100% with you: getting decent high(ish) bit depth out-of-camera images would be wonderful, especially for quick snapshots where nothing more than a quick edit is desired.


darktable does the same if you enable the option to export processing history to JPG (it ends up in the XMP.darktable.history tag)
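If you want to check whether an export carries that history, a crude sketch is to scan the file for the embedded XMP packet; a proper reader would parse the XMP instead of string-matching, and the filename is hypothetical:

```python
def read_xmp_packet(path):
    # Scan the file bytes for an embedded XMP packet.
    data = open(path, "rb").read()
    start = data.find(b"<x:xmpmeta")
    end = data.find(b"</x:xmpmeta>")
    if start == -1 or end == -1:
        return None
    return data[start:end + len(b"</x:xmpmeta>")].decode("utf-8", "replace")

xmp = read_xmp_packet("export.jpg")                    # hypothetical export
if xmp and "darktable:history" in xmp:
    print("history stack found in the embedded XMP")
```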


Oh, 32-bit… it’s easily compiled/built, but the real problem with 32-bit is the 4GB memory-space limit. rawproc makes a full-sized copy of the image for each added tool (unless you group tools…), and you can run out of memory really quickly if you only have 4GB addressable…
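Some back-of-the-envelope arithmetic, assuming a 24 MP image held as 3-channel 32-bit float (the resolution is assumed, and rawproc’s internal format may differ):

```python
megapixels = 24                             # assumed sensor resolution
bytes_per_pixel = 3 * 4                     # 3 channels of 32-bit float
copy_mib = megapixels * 1e6 * bytes_per_pixel / 2**20
print(f"{copy_mib:.0f} MiB per tool copy")                  # ~275 MiB
print(f"~{int(4 * 1024 / copy_mib)} tools fill a 4 GiB address space")
```

So on that assumption, a dozen or so ungrouped tools is enough to exhaust a 32-bit process.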

Why would you still run 32-bit Windows in 2020?

Pretty close to rawproc, except it’s the history stack, not the applied toolchain. It probably could be made equivalent if the history stack is compressed, but the module order looks to still be defined elsewhere…

Yes. I guess that it will be addressed once the module order presets are implemented.

That is a brilliant idea! I will do that tonight!

No idea, I thought that was what I read in the README notes; that is why I asked the question.


Thanks Glenn, my question was not worded the best… I meant in general, were there instructions to build for Windows… I was not actually looking to do a 32-bit build… thanks for responding…


The first post will do, then. For msys2, just pacman -S each package…

Great thanks will try to find some time to give it a whirl…

Btw, the most recent DNG spec, 1.5, also allows for the storage of an “enhanced” image in high precision, which by definition is at least demosaiced, and possibly denoised and sharpened. However, it should stay in the camera color space, so quite a bit of camera-specific processing is still needed.

Unfortunately, it seems there is no requirement to disclose and record exactly what was done to the raw pixels in order to produce that enhanced image.

This is the closest thing I can find in any metadata spec for storing such information:

Exiv2 - Image metadata library and tools, scroll down to xmpMM:History. I think I read somewhere recently that darktable can optionally store the history stack here.

All the information gathered by the sensor (Bayer/X-Trans; I’m not speaking of the 3-channel Foveon) is available as one color value per sensel (usually 10 to 16 bits, depending on the camera, and often losslessly compressed). You can go from that data to demosaiced data, but you cannot easily go back. That means that by storing demosaiced 16-bit TIFF data instead of the raw data, you would lose valuable information, increase the size of the file, and also increase in-camera processing time (reducing burst speed).
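Rough numbers for an assumed 24 MP sensor with 14-bit sensels make the size penalty obvious:

```python
pixels = 24e6                               # assumed 24 MP sensor
raw_mib = pixels * 14 / 8 / 2**20           # one 14-bit value per sensel
tiff_mib = pixels * 3 * 16 / 8 / 2**20      # three 16-bit values per pixel
print(f"raw : {raw_mib:.0f} MiB (before lossless compression)")
print(f"tiff: {tiff_mib:.0f} MiB")
```

That’s roughly 40 MiB of raw sensel data versus about 137 MiB of demosaiced 16-bit TIFF, before any compression on either side.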

Yes, I don’t see any need for such in a raw file, if one supposes that the raw data comes unmodified. In most raws I’ve inspected, the Makernotes do include the Picture Control settings (sorry, Nikon nomenclature) that produced the embedded JPEGs.

But a rendition, e.g., a TIFF for use in GIMP, or just a JPEG for some audience, does have a use case for containing the tool chain that produced it. That’s how I open most files in rawproc; I select a rendition, and rawproc finds the source file, opens it, and re-applies the toolchain to start me off with the rendition as-processed. From there, I can change any of the existing tools in the chain, or add/delete tools. Once done, I can choose to either save the new rendition over the old one, or save to a new filename.

I can’t imagine working as efficiently any other way…