Why RAWs instead of, say, TIFFs?

That’s what I’ve been trying to get at, yes. Actually, my Fuji files come very close in most circumstances. But being JPEGs, they are not very amenable to editing. Which is why I would love me some “editable JPEGs”.

That’s how it went for me, too, on my Nikon D7000, Ricoh GR, and Pentax Q7. The Fuji X-E3, however, is a different matter. That’s actually the source of my question: I find the Fuji’s images very hard to recreate. My current workaround involves mostly editing the JPEGs and fiddling with LUTs where that’s not an option.

But that solution is not satisfying. Hence my wish for editable JPEGs (which I erroneously called “TIFFs” in the original post).

But perhaps that’s really more of an issue with that particular camera and less with file formats.

Most cameras use a ‘Bayer’ color filter array on their sensor; Fuji takes a different approach with ‘X-Trans’.
Maybe that’s the reason why you got different results with the Fuji.

Indeed, that (stupid) X-Trans filter array is a headache.

But my particular problem is not one of demosaicing and sharpness. It’s the rendering of colors and tones that I find hard to recreate.

You could start a PlayRaw where you share a difficult RAW and its corresponding JPEG. Maybe you’ll get some inspiration from others’ edits on how to approach your processing?


Yep, that’s the thing that kept me shooting JPEG, although not so much to duplicate its results as to get a result just as good. But that’s the edge of the rabbit hole; it takes a bit of teasing apart the process to understand what’s involved, and then you can deftly put the parts together into pleasing renditions.

The realization that crystallized it for me was that tone and color are the two fundamental things we change to go from the flat RGB after demosaic to an acceptable rendition. They’re related in that changing one usually affects the other, but treating them as distinct operations helps one work intuitively to good effect.
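
To make the “two distinct operations” idea concrete, here’s a minimal sketch (purely illustrative, not anyone’s actual pipeline code; the curve, luma weights, and saturation factor are my assumptions):

```python
# Illustrative sketch: treating tone and color as two distinct
# operations on a flat, linear RGB pixel after demosaic.

def apply_tone(rgb, gamma=2.2):
    """Tone: a simple gamma curve lifting the flat linear rendition."""
    return tuple(ch ** (1.0 / gamma) for ch in rgb)

def apply_color(rgb, saturation=1.2):
    """Color: scale each channel's distance from the luminance."""
    # Rec. 709 luma weights
    luma = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
    return tuple(luma + saturation * (ch - luma) for ch in rgb)

# The two interact: the tone curve compresses channel differences,
# so the same saturation factor behaves differently after tone.
flat = (0.10, 0.30, 0.05)       # flat linear RGB after demosaic
toned = apply_tone(flat)
rendered = apply_color(toned)
```

Treating them separately like this is what makes the work intuitive: you can reason about the tone curve and the saturation move independently, even though each changes the other’s result.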

Also, I stopped thinking of JPEGs as anything but final renditions for a specific medium and purpose. My current workflow for everything, family snapshots to carefully considered landscapes, is to shoot raw, then batch process to 800x600 JPEGs for proofs. For my family, that’s usually all they need. I will go through the proofs and re-do any that need, say, an exposure adjustment or a shadow lift, but I do that by re-opening the raw, applying the proof processing, and using that as a starting point. I never edit a JPEG anymore.

rawproc!!!

That’s exactly what my hack software does. When I use either rawproc interactively or the command-line img to create an output JPEG, TIFF, or PNG, the software stores the toolchain in EXIF:ImageDescription. rawproc then has a special ‘File → Open Source…’ menu selection: when you select, say, a proof JPEG with that information in ImageDescription, it’ll open the source file and re-apply the processing to put you at the starting point for subsequent work. You can either modify a tool that’s already in the chain, or add new tools anywhere in the chain.
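
The round-trip idea can be sketched like this (hedged: rawproc’s actual toolchain syntax and EXIF handling surely differ; the tool names and separator here are made up just to illustrate carrying the recipe inside the output file’s metadata):

```python
# Sketch of storing a processing recipe in metadata and replaying it.

def serialize_toolchain(tools):
    """Pack an ordered list of (tool, params) into one metadata string."""
    return ";".join(f"{name}:{params}" for name, params in tools)

def parse_toolchain(description):
    """Recover the ordered toolchain from the stored string."""
    steps = []
    for step in description.split(";"):
        name, _, params = step.partition(":")
        steps.append((name, params))
    return steps

# Write side: the recipe travels inside the proof JPEG's metadata.
chain = [("demosaic", "ahd"), ("blackwhitepoint", "auto"),
         ("curve", "0,0,128,140,255,255")]
image_description = serialize_toolchain(chain)

# "Open Source" side: reopen the raw and replay the stored chain,
# then modify or extend it from there.
assert parse_toolchain(image_description) == chain
```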

Sorry for the blatant marketing, but doggonit, it really works well now, at least for me…


Well, strictly speaking, if you use a Canon camera with CR2 files… IIRC, that uses a TIFF container format :stuck_out_tongue:


@ggbutcher Glenn, I scanned the git repo for rawproc very quickly… the instructions are to build for Linux, but I think you mentioned you have compiled on Windows? 32-bit? Are there any specific instructions for doing so?

The github wiki has a somewhat dated page on compiling rawproc:

I made this back in the day when Ubuntu’s packages weren’t at sufficient versions to support rawproc; with 19.01 you can apt-get all the dependencies.

Well, I haven’t tested wxWidgets that way. I like to statically link it anyway, as most folk would have to install wxWidgets just for rawproc. So, here’s what I generally do for both Linux and msys2:

  1. Get and compile wxWidgets. Here’s my configure (in a separate build directory):
    $ ../configure --enable-unicode --disable-shared --disable-debug
  2. Get, compile, and install librtprocess (not yet packaged in any distro)
  3. Install the other dependencies (libjpeg, libtiff, libpng, liblcms2, libraw, lensfun)
  4. From there, do the rawproc compile as described in the README

There are a few different angles here:

Why do camera manufacturers deliver RAW files instead of demosaiced TIFFs?

Because your computer can run more powerful algorithms on the RAW (especially denoising and demosaicing) than the camera can. I guess this gap has become a bit smaller, but it is very likely still relevant.

Also, if I’m not completely mistaken, white balance is usually performed before steps like demosaic and denoise. This makes intuitive sense as well: changing the white balance might actually reveal edges that were not visible before.
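
A minimal sketch of that ordering, assuming an RGGB Bayer layout and hypothetical per-channel multipliers: white balance is applied on the mosaic itself, before any demosaic, so the interpolation already sees balanced values.

```python
# White balance on the raw Bayer mosaic, before demosaic (RGGB pattern).

def white_balance_bayer(mosaic, r_mul, g_mul, b_mul):
    """Scale each photosite by its channel's multiplier."""
    out = [row[:] for row in mosaic]
    for y in range(len(mosaic)):
        for x in range(len(mosaic[0])):
            if y % 2 == 0 and x % 2 == 0:
                out[y][x] *= r_mul          # red site
            elif y % 2 == 1 and x % 2 == 1:
                out[y][x] *= b_mul          # blue site
            else:
                out[y][x] *= g_mul          # green sites
    return out

# A 2x2 RGGB tile; typical daylight multipliers boost red and blue.
tile = [[100, 200],
        [200, 100]]
balanced = white_balance_bayer(tile, r_mul=2.0, g_mul=1.0, b_mul=1.5)
# balanced == [[200, 200], [200, 150]]
```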

If you just want slight tweaks and the dynamic range has already been compressed in camera, you might actually get away with editing JPEGs, especially if they were captured with a high-resolution camera and you are targeting ‘web’ resolutions.

But I usually struggle to recreate colors and sharpness as pleasing as the camera’s JPEGs.

From my experience this also depends a lot on the camera. My D810 seems to be nice and linear; just using the standard color matrix gives me usable colors. My A6000 is the opposite: the colors out of the camera are usually off, and it requires a color profile and/or fiddling to become usable (if still far from perfect). The out-of-camera JPEGs from the Sony look fine, if not entirely realistic.

In the end I’m 100% with you: getting decent high(ish)-bit-depth out-of-camera images would be wonderful, especially for quick snapshots where nothing more than a quick edit is desired.


darktable does the same if you enable the option to export the processing history to JPG (it ends up in the XMP.darktable.history tag).


Oh, 32-bit… it’s easily compiled/built, but the real problem with 32-bit is the 4 GB address space limit. rawproc makes a full-sized copy of the image for each added tool (unless you group tools…), and you can run out of memory really quickly with only 4 GB addressable…
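
Some back-of-the-envelope arithmetic shows why (assumptions mine: a 24 MP sensor and three 32-bit float channels per pixel, a common internal working format):

```python
# Why per-tool full-sized copies exhaust a 4 GB address space quickly.

MEGAPIXELS = 24
BYTES_PER_PIXEL = 3 * 4                      # 3 float32 channels

copy_bytes = MEGAPIXELS * 1_000_000 * BYTES_PER_PIXEL
copy_mb = copy_bytes / (1024 ** 2)
tools_in_4gb = (4 * 1024 ** 3) // copy_bytes

print(f"{copy_mb:.0f} MB per tool copy, "
      f"~{tools_in_4gb} copies fit in 4 GB")
```

Under these assumptions each copy is roughly 275 MB, so only about fourteen tool copies fit before the 32-bit address space is gone, and that’s before counting the program itself and any other allocations.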

Why would you still run 32bit windows in 2020?

Pretty close to rawproc, except it’s the history stack, not the applied toolchain. It probably could be made equivalent if the history stack is compressed, but the module order looks to still be defined elsewhere…

Yes. I guess that it will be addressed once the module order presets are implemented.

That is a brilliant idea! I will do that tonight!

No idea; I thought that’s what I read in the README notes, which is why I asked the question.


Thanks Glenn, my question was not worded the best… I did mean in general: were there instructions to build for Windows? I was not actually looking to do a 32-bit build… thanks for responding…


The first post will do, then. For msys2, just pacman -S each package…

Great thanks will try to find some time to give it a whirl…

Btw, the most recent DNG spec (1.5) also allows for the storage of an “enhanced” image in high precision, which is at least demosaiced by definition, and possibly denoised and sharpened. However, it should stay in the camera color space, so quite a bit of camera-specific processing is still needed.

Unfortunately, it seems there is no requirement to disclose and record exactly what was done to the raw pixels in order to produce that enhanced image.