My sharpening workflow for base ISO images with the new features in RT (aka extreme pixel peeping)

@gimbal I’m also a noob regarding astrophotography :frowning: I read some stuff about drizzle but don’t know whether it’s done on demosaiced or CFA data.

This is going in really interesting directions. Admirable.


It’s quite easy to add other demosaicers. So

  1. RCD+VNG4
  2. DCB+VNG4

are easy to add.

I’m a bit reluctant to add free combinations because that would confuse users and increase the number of issues, which would eat up more of our scarce development time.

I have no objections to adding the two methods mentioned above, but I would also like to hear RT users’ and contributors’ opinions before I add RCD+VNG4 and DCB+VNG4.
For RCD+VNG4 I see the use case, while for DCB+VNG4 I need at least one example to be convinced.

Ingo

I knew all this was coming. The automation and different combinations. Thanks @heckflosse for getting the ball rolling. :+1: I look forward to RCD+VNG4. I rarely use DCB (mostly AMaZE) but I am not as experienced as @samuelchia. :slight_smile:


Next time please tell me. Then I’ll need less time to think about what’s possible :wink:


I think you are referring to Bayer drizzle, as opposed to the plain drizzle process. Bayer drizzle is much more involved, apparently. I’ve not done this myself but understand it in theory only, from reading about it. Bayer drizzle does indeed work on non-demosaiced data, but the data must be dithered. That is, the camera must be moved randomly in all directions by a sub-pixel amount of displacement. You need specialized motorized trackers that can do this for you, and you also need very many exposures for it to work (say 20-30 at the very minimum, but more is better). For normal astro processing with normal drizzle (you don’t necessarily need drizzle for all astro processing work), you have to demosaic first, before alignment, stacking and averaging.
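To make the idea concrete, here is a very rough sketch of the accumulation step at the heart of drizzle (my own illustration, not code from RT or any astro package). It assumes each frame’s sub-pixel offset is already known from star alignment and uses a simple point footprint; real drizzle implementations weight by the overlap area of a shrunken pixel footprint, and Bayer drizzle would keep a separate accumulator per CFA colour instead of working on grayscale data.

```cpp
// Very rough drizzle sketch -- illustration only, not RT or astro-package code.
#include <cmath>
#include <cstddef>
#include <vector>

struct Frame {
    std::vector<float> pix; // grayscale pixels, row-major, w * h
    float dx, dy;           // sub-pixel offset of this frame vs. the reference
};

// Drop N dithered frames onto an output grid upscaled by 'scale'.
std::vector<float> drizzle(const std::vector<Frame> &frames, int w, int h, int scale)
{
    const int W = w * scale, H = h * scale;
    std::vector<float> acc(static_cast<size_t>(W) * H, 0.f);
    std::vector<float> wgt(static_cast<size_t>(W) * H, 0.f);

    for (const Frame &f : frames) {
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                // Map the input pixel centre onto the fine output grid.
                const int ox = static_cast<int>(std::lround((x + f.dx) * scale));
                const int oy = static_cast<int>(std::lround((y + f.dy) * scale));
                if (ox < 0 || ox >= W || oy < 0 || oy >= H) {
                    continue;
                }
                acc[static_cast<size_t>(oy) * W + ox] += f.pix[static_cast<size_t>(y) * w + x];
                wgt[static_cast<size_t>(oy) * W + ox] += 1.f;
            }
        }
    }

    // Normalise; the random dithering is what (eventually) fills every output cell.
    for (size_t i = 0; i < acc.size(); ++i) {
        if (wgt[i] > 0.f) {
            acc[i] /= wgt[i];
        }
    }
    return acc;
}
```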

I do what people call astrolandscape photography, so I don’t focus only on shooting deep-sky objects; I want to pair the sky with landscapes in the foreground. It’s much harder to carry all the extra gear on the trail, so I pack a more portable tracker which cannot do dithering. Because I image a lot of the sky (wide-field) and I don’t want to lose too many stars at the seam where the sky joins the landscape from the blurring induced by the tracker, I have to work fast, yet I also need time to gather enough photons. It’s a balancing act to get results that are good enough. I cannot afford to stack hundreds of exposures. Per camera position, 8 frames at 32 seconds each is what I have determined to be optimal for my current workflow.
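(For the curious: shot noise averages down roughly with the square root of the frame count, so stacking 8 frames gains about √8 ≈ 2.8× in SNR over a single 32-second exposure, while doubling to 16 frames would only add another ~1.4×. The returns diminish quickly against a tight time budget.)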

So I have to demosaic first regardless, and the difference in quality is not so dramatic in print for wide-field imaging work. I also like using RT to demosaic and output a TIFF, so I can use my own custom camera profiles I built using Anders Torger’s Lumariver Profile Designer software. I’ve not yet tried out the linear processing workflow (Roger Clark calls it traditional processing), which in theory is supposedly the more “correct” way to approach astro image processing, combining lights, darks, flats, etc. At some point we are counting angels on the head of a pin. It’s far more work for marginally better image quality, invisible to most people, and impossible anyway for the kind of approach I want to take: a combination of deep-sky techniques for the sky composited with the landscape, plus multi-row stitching for ultra-high resolution at that, all in one session. Not faking a sky from a different location, on a different night, and compositing it into a landscape from who knows where.

I too am quite new to this, and perhaps I’ll think differently as I learn more about astro processing techniques. Right now it seems as if the astro processing talk in the upper echelons is more about math than aesthetics! We still don’t know the best way to represent colour in our astro images. Very often people show wildly impossible nighttime colours, or greatly exaggerated hues.


Thanks for the example! Here’s one of mine. The difference in the bright star in the bottom right is perhaps the most obvious. One can also see in the star cluster on the upper left that neither AMaZE nor RCD shows a clear advantage. Generally speaking, RCD usually produces fewer false-colour artifacts overall, but not always. Some stars shift in hue with RCD, from white or off-white to yellow-green, for example. Indeed, stacking and averaging would smooth and round out the stars more, so it doesn’t matter as much. But since we have RCD available, why not use it? :slight_smile:

@plaven Thanks Peter for showing how the demosaicing methods are displayed in the dev version, and for the link to the tutorial on how to compile RT!

I’m in full agreement on this. It would not be good to allow free combinations where someone could pick a combo that would give a bad result, like VNG4 for high-contrast regions and AMaZE for low-contrast ones. Yikes!

Oops! LOL. I wanted to say that I really appreciate having this feature available in RT. Currently, I have to output two versions of each raw file and blend them together myself in Photoshop. It’s time well saved by doing it in RT.

Hi Ingo,

I quite often head out and do astro/Milky Way landscape photography as a single image.

I’m more than happy to provide a range of Canon 5D Mark III raw images for your testing if you like. Mostly long-exposure ISO 3200, plus some ISO 4000 images.

Peter


@plaven or anyone else who might be able to help: I am trying to follow the instructions for compiling RT on my Windows PC. I’m stuck at this step:

Afterwards the Makefile needs to be opened using a text editor to remove iptc and docs from the lists named SUBDIRS and DIST_SUBDIRS, as building or installing will fail otherwise:

$ nano Makefile

Replace

DIST_SUBDIRS = m4 libiptcdata po iptc docs win python

with

DIST_SUBDIRS = m4 libiptcdata po win python

And replace

SUBDIRS = m4 libiptcdata po iptc docs win $(MAYBE_PYTHONLIB)

with

SUBDIRS = m4 libiptcdata po win $(MAYBE_PYTHONLIB)

I’m not sure what I am supposed to do. When I tried to find the Makefile in the msys64 folder, there were so many of them that I wasn’t sure which one I should be modifying with a text editor. And when I looked into what looks like the correct Makefile, I could find neither the DIST_SUBDIRS nor the SUBDIRS parts that I need to replace.

I see that the “$” sign is used to indicate something that I should be typing into the command line of the MSYS2 app, so I then tried typing in “nano Makefile”. And lo, it returns this and I’m not sure what to do with it:
(screenshot of the nano editor window)

I would greatly appreciate a little help with this, thanks! After that I hope to be able to prepare an example for @heckflosse which shows why DCB+VNG4 is needed. Though I am not sure why the examples I have provided in this thread do not suffice to prove that the diagonal artifacts with AMaZE are both real and visible.

@samuelchia
You are editing the appropriate Makefile.
If you are not used to nano, you can do as follows in another window:

If you followed the instructions for building iptc exactly, you should find the appropriate “Makefile” file in C:\msys64\home\<USER>\libiptcdata-1.0.4

Then use an editor to remove iptc and docs in the Makefile.

When building RT, do use the MINGW64 shell.

I’m afraid I’ve never compiled on Windows, so I don’t have any experience to assist with that. I use Linux exclusively at home these days. Generally speaking, compiling from source is more straightforward there, given that all the tools you need are just an install command away.

@gaaned92
Thanks for the guidance! Good to know I was doing it correctly at that step, at least. Just out of curiosity, how do I edit in “nano”? I’m afraid I do not know what that means. I see that I can use the arrow keys to navigate, delete text and type in new text, but I do not see how I can easily find the “DIST_SUBDIRS…” part and so on. I see there are some kind of key combos at the bottom of the window, but when I press those keys, it just types the characters out in the window.

I’m not quite sure what you mean by that?

@heckflosse Why not allow free combinations according to a predefined ranking of algorithm sharpness? Simply check that the high-contrast algorithm is “sharper” than the low-contrast algorithm, and users won’t get weird results. It would save you the trouble of finding use cases for all the combinations, and make it easy to add more algorithms in the future.
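To illustrate what I mean, here is a little sketch (just my idea of how it could work, not RT’s actual code; the method names and ranks below are made up) of a sharpness-ordered validity check on top of a contrast-mask blend:

```cpp
// Sketch of a sharpness-ordered dual-demosaic blend -- my idea only, not
// RT's actual code; the method names and ranks below are hypothetical.
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical ranking: higher = better suited to high-contrast detail.
int sharpnessRank(const std::string &method)
{
    if (method == "amaze" || method == "dcb") return 3;
    if (method == "rcd") return 2;
    if (method == "vng4") return 1;
    return 0;
}

// Only allow combos where the high-contrast method is at least as sharp
// as the low-contrast one, so users can't pick a backwards pairing.
bool validCombination(const std::string &highContrast, const std::string &lowContrast)
{
    return sharpnessRank(highContrast) >= sharpnessRank(lowContrast);
}

// Blend two demosaiced images per pixel: 'sharp' where the local-contrast
// mask (values in [0,1]) is high, 'smooth' elsewhere.
void blendDual(const std::vector<float> &sharp, const std::vector<float> &smooth,
               const std::vector<float> &mask, std::vector<float> &out)
{
    for (size_t i = 0; i < out.size(); ++i) {
        out[i] = mask[i] * sharp[i] + (1.f - mask[i]) * smooth[i];
    }
}
```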

Thanks to @gaaned92 I was able to finish compiling RT for my own use, and thank you also to @plaven for turning me on to it. I’m still very much a noob in this regard, although I have been using the stable releases of RT for years now from the download page. Please excuse me! Finally I can show an AMaZE+VNG4 vs DCB comparison.

AMaZE+VNG4 on the left, DCB on the right.

Here is the raw file again (I’ve forgotten if it was still on the other thread or not)
_MG_1571.CR2 (25.3 MB)

The screenshot shows the sharpening settings that were used. Indeed, less aggressive sharpening would exaggerate the diagonal artifacts less, but the artifacts are nonetheless present. DCB just looks much better. @heckflosse Ingo, I hope this example helps rest the case. :slight_smile:

@samuelchia
I regularly make W64 builds of all live development branches, available at RTW64NightlyBuilds/ – Keybase.pub

@gaaned92
That’s awesome, thank you!

I added DCB+VNG4 and RCD+VNG4


Since there is the contrast threshold adjuster for sharpening, I use much more aggressive settings :slight_smile:
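For anyone wondering why the threshold makes aggressive settings safe, here is a toy sketch of the idea (my own illustration, not RT’s implementation): build a soft mask from local contrast and only let the sharpening through where the mask is high, so flat, noisy areas stay untouched.

```cpp
// Toy sketch of a contrast-threshold mask for sharpening -- illustration
// only, not RT's implementation. Assumes threshold > 0.
#include <algorithm>
#include <cstddef>
#include <vector>

// Build a soft mask from a per-pixel local-contrast measure (e.g. local
// standard deviation): 0 in flat areas, 1 well above the threshold.
void contrastMask(const std::vector<float> &localContrast, float threshold,
                  std::vector<float> &mask)
{
    for (size_t i = 0; i < mask.size(); ++i) {
        // Smoothstep around the threshold avoids hard mask edges.
        float t = (localContrast[i] - 0.8f * threshold) / (0.4f * threshold);
        t = std::clamp(t, 0.f, 1.f);
        mask[i] = t * t * (3.f - 2.f * t);
    }
}

// Final output: let the sharpened image through only where the mask allows,
// so aggressive settings don't amplify noise in flat regions.
void applyMaskedSharpening(const std::vector<float> &base,
                           const std::vector<float> &sharpened,
                           const std::vector<float> &mask, std::vector<float> &out)
{
    for (size_t i = 0; i < out.size(); ++i) {
        out[i] = base[i] + mask[i] * (sharpened[i] - base[i]);
    }
}
```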


Thanks!
In this image (noisy, as usually are mine), I attacked noise before applying your steps. That is, I worked in Amaze as I’d normally do and applied either Denoise and Wavelet Denoise and Refine. Denoising an image such as that ends up with a significant lost of detail.
After applying your steps, the detail is back almost entirely, and without any apparent noise increase.
Screenshot with before (left, neutral profile) and after (denoise + heckflosse sharpen):


Just in case, here’s the Creative Commons, By-Attribution, Share-Alike licensed raw: CRW_3783.DNG (17.8 MB)
CRW_3785.DNG.pp3 (10.3 KB)

My question is: since this relies on changing the demosaic method, should all these steps be applied at the beginning of the workflow? Or, as usual, should they be applied at the end of the raw workflow? Or should I change only the demosaic method first, then do the usual edits, and finally do the rest of the heckflosse sharpening steps?