Noise reduction washes out colour - some random experiments without conclusion ...

Well, I can see it! And the point is that noisy regions are desaturated, not randomly more saturated, and not sometimes lighter/darker/value-shifted. Sounds a bit nitpicky, but it relates to your second effect: the detail that is removed is specifically saturation. The algorithm used seems to desaturate noisy saturated patches. Whether or not this is real… it is at least perceived as such.

To find out, I propose an image pair from a tripod: one low ISO, one high ISO, with and without NR applied. After that, define 3 px by 3 px patches in both pics with mean and SD values for hue, saturation and value. Or, simpler, take the difference of both pics’ ab/UV/CbCr planes. If the residual has a consistent negative value where the image was saturated before… that should tell more than an eyedropper. Would that setup suffice? Did I miss something?
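A minimal Python sketch of that measurement, assuming the two shots are already aligned and loaded as rows of RGB tuples in [0, 1] (image loading and patch selection are left out; the toy patches below are made up stand-ins):

```python
import colorsys
import statistics

def patch_saturation_stats(rgb, x, y, size=3):
    """Mean and standard deviation of HSV saturation over a size x size
    patch whose top-left corner is (x, y); rgb is rows of (r, g, b)
    tuples with components in [0, 1]."""
    sats = [colorsys.rgb_to_hsv(*rgb[y + j][x + i])[1]
            for j in range(size) for i in range(size)]
    return statistics.mean(sats), statistics.stdev(sats)

# Toy stand-ins for the same 3x3 patch before and after NR.
before = [[(0.8, 0.2, 0.2)] * 3 for _ in range(3)]
after  = [[(0.6, 0.4, 0.4)] * 3 for _ in range(3)]

mean_before, _ = patch_saturation_stats(before, 0, 0)
mean_after, _  = patch_saturation_stats(after, 0, 0)
residual = mean_after - mean_before   # negative => NR desaturated the patch
```

A consistently negative residual over many such patches would support the desaturation hypothesis.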

The comments are a bit long. :slight_smile:

What I would say is that when you are removing noise there will always be a softening effect because you are essentially smoothing the image. This will reduce features and flatten contrast and hues locally and globally. The goal is to find ways to recover from this. Of course, finding a suitable algorithm and optimal parameters would help. The key is to be careful not to remove too much signal while removing noise. If you nuke the image, you won’t have anything left to recover.

PS Upload your image. I am sure plenty of people can provide you with a *.pp3 that will perfectly address the issue. :wink:

1 Like

I had some NR brick issues a while ago. See this thread.

1 Like

Hi, sorry for not responding. I hope tomorrow I will have time to react to all the great responses; in particular, I would like to try some of the provided hints.

Here is my original raw photo, sorry for not uploading it in the first post. Generally I have no problem uploading files.

regards!

DSC09955.ARW (20.2 MB)

Ah, so by “intensity” perhaps @nullnull meant “saturation”? I can’t see it (dodgy eyes or screen or whatever) but I agree it is there.

f:\web\im>%IMG7%magick e04c1c4c7b9d934f5a9f68516abe48a6c6df1384.jpeg ^
  -crop 37x18+38+78 +repage -colorspace HCL -channel 1 -separate -format %[fx:mean] info:

0.0970853

f:\web\im>%IMG7%magick 3904708ea9fe41d2b6712dc5ed20b0631c8b0fcc.jpeg ^
  -crop 37x18+38+78 +repage -colorspace HCL -channel 1 -separate -format %[fx:mean] info:

0.0752635

The denoising has reduced saturation, i.e. the chroma channel of HCL. Well, that’s not surprising. Consider a photo of a neutral gray card, so the true chroma is zero. In any pixel, the chroma cannot decrease below zero: noise will either increase it or leave it unchanged. So any noise will increase the average chroma.

Your photo isn’t of a neutral gray card but, like any normal photo, the chroma is low: about 0.1 on a scale of 0.0 to 1.0. So adding noise will increase chroma, and denoising will lower chroma.

EDIT: My misattribution of a quote. Sorry.

I am using one of G’MIC’s built-in filters as a base, so I am not entirely sure what is going on inside. However, it is a patch-based noise reduction, so I assume that it is using the full RGB vector to compare the difference between patches. In fact, I use an RGB guide image that I desaturate somewhat to improve chroma noise reduction, so I guess that confirms my assumption.
I’m definitely not denoising the channels independently.
One downside to this approach is that it does not remove as much chroma noise as other approaches.

1 Like

I never talked about intensity, that was @nullnull . Some mixup there.

Every actual photo has noise. Every actual color photo thus has chroma noise (even a grey card’s chroma noise is not zero). Adding noise will leave the mean of a group of pixels unchanged. Any objections to this reasoning?

The decrease in saturation is a factor of 0.0752635/0.0970853 ≈ 0.775. It’s fair to call this an artifact. And yes, artifacts will happen during NR, no question.

Thank you for the answer! I guess if it’s patch-based it should be the full RGB vector; it would make less sense to do a channel-wise patch aggregation, as that has the highest computational cost.

Have you tried different guide images? I mean different methods for generating the guide?

Cheers!

@PhotoPhysicsGuy and @nullnull: Sorry for my misattribution above, now corrected.

I agree with your reasoning. Adding noise to any channel will leave the mean of that channel unchanged, more or less. (But almost certainly not exactly.)

In my fairly limited experience of photographic noise, it isn’t an addition to a chroma (or saturation) channel. Rather, it is an addition to each of the camera RGB channels. If the chroma was low, this noise increases chroma. If chroma was high, this noise decreases it.
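One mechanism for the high-chroma case is channel clipping: a maximally saturated colour can only be pulled toward gray. A sketch of the same simulation run on pure red (the clipping model and noise level are my assumptions, not snibgo’s measurements):

```python
import colorsys
import random

random.seed(0)
clip = lambda v: min(1.0, max(0.0, v))

red = (1.0, 0.0, 0.0)                  # maximally saturated: HSV saturation = 1
noisy = [tuple(clip(c + random.gauss(0, 0.05)) for c in red)
         for _ in range(10_000)]

sat_clean = colorsys.rgb_to_hsv(*red)[1]                               # 1.0
sat_noisy = sum(colorsys.rgb_to_hsv(*p)[1] for p in noisy) / len(noisy)
# sat_noisy < sat_clean: noise has, on average, desaturated the patch
```

The red channel cannot rise above 1 and green/blue cannot fall below 0, so every perturbation that survives clipping moves the pixel off the saturation ceiling.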

Late to the thread, but this quote resonates with something I do: most of my final images are downsized. To that end, I generally don’t denoise until I’ve had a chance to view the image at its final resolution, as the reduction interpolation really does mitigate a lot of the noise expressed at the higher resolution. And if I do decide to insert a denoise step after that, it is usually less aggressive than if I’d done it straight away.

One thing I haven’t played with much is the introduction of a gaussian blur just prior to resize. That is supposed to tame the resizing artifacts from noise, but it might not mitigate the color changes being discussed.

FFT… (food for thought)

You can probably take a look at
CImg/plugins/nlmeans.h at master · GreycLab/CImg · GitHub
CImg/plugins/chlpca.h at master · GreycLab/CImg · GitHub

Yes, the guide image in my filter is user definable, so I have tried many things.

Blurring the guide image can be useful for noisy images, and I included a control for a very light blurring in the filter. Blurring the guide does not degrade the result as much as you might expect.

Another approach is applying a different method of noise reduction to the guide (you could use the same kind of NR if you want). This improves finding similar patches. Using frequency domain noise reduction on the guide can be quite good because it can recover patterns that are swamped by noise much more easily than patch-based methods.

However, I have found that the best use of frequency domain NR in conjunction with patch-based is to do a slight over-smoothing with the patch-based, and then use frequency domain NR on the difference to recover residual patterns.
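A toy 1-D illustration of that residual trick (not the actual G’MIC filter): over-smooth a noisy signal with a box filter standing in for the patch-based denoiser, then hard-threshold the DCT of the residual to recover the pattern the smoothing destroyed. The threshold constant is a made-up heuristic (about twice the per-coefficient noise standard deviation):

```python
import math
import random

def dct(x):
    """Naive DCT-II."""
    n = len(x)
    return [sum(x[i] * math.cos(math.pi * k * (i + 0.5) / n)
                for i in range(n)) for k in range(n)]

def idct(X):
    """Inverse of dct() above (scaled DCT-III)."""
    n = len(X)
    return [(X[0] / 2 + sum(X[k] * math.cos(math.pi * k * (i + 0.5) / n)
                            for k in range(1, n))) * 2 / n for i in range(n)]

def box_smooth(x, r=4):
    """Crude stand-in for an over-smoothing patch-based denoiser."""
    n = len(x)
    return [sum(x[max(0, i - r):min(n, i + r + 1)])
            / len(x[max(0, i - r):min(n, i + r + 1)]) for i in range(n)]

random.seed(1)
n = 64
signal = [math.sin(2 * math.pi * 6 * i / n) for i in range(n)]  # fine pattern
noisy = [s + random.gauss(0, 0.3) for s in signal]

smooth = box_smooth(noisy)                    # over-smoothed: pattern mostly gone
residual = [a - b for a, b in zip(noisy, smooth)]

# Frequency-domain NR on the residual: zero out small DCT coefficients.
coeffs = dct(residual)
thresh = 0.3 * math.sqrt(2 * n)   # heuristic threshold, an assumption here
recovered = idct([c if abs(c) > thresh else 0.0 for c in coeffs])

result = [a + b for a, b in zip(smooth, recovered)]
```

The coherent sine survives in a few large DCT coefficients of the residual, while the incoherent noise spreads thinly across all of them, which is why the thresholding can separate the two.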

Oh! Almost forgot. You can also draw on the guide image to emphasise edges etc. that you know are there but are overwhelmed in noise.

If that is your experience, then are there colorspaces which show less desaturation artifacting than this example case?

Oh gosh! So the nlmeans has it (I can read commented code, yay!) and the chlpca probably too, but my reading skills of C (I am assuming this is C) are not developed enough. :frowning:

That is a really sexy trick! And you do that normalized to local lightness if I remember correctly?!

A lot of food for thought, indeed!

I normalise the noise levels in the residual locally before applying the frequency domain NR. Then I undo the normalisation to get back to ‘normal’.
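A minimal sketch of that normalise → denoise → denormalise round trip (1-D, with a windowed standard deviation as the local noise estimate; all of this is illustrative, not the filter’s actual code):

```python
import statistics

def local_sigma(x, i, r=8):
    """Sample standard deviation of x in a window of radius r around i."""
    lo, hi = max(0, i - r), min(len(x), i + r + 1)
    return statistics.stdev(x[lo:hi]) or 1e-12   # guard against flat windows

def normalise(x):
    """Divide each sample by its local noise estimate; return the
    normalised signal plus the factors needed to undo it."""
    sigmas = [local_sigma(x, i) for i in range(len(x))]
    return [v / s for v, s in zip(x, sigmas)], sigmas

def denormalise(x, sigmas):
    return [v * s for v, s in zip(x, sigmas)]

residual = [0.5, -1.2, 0.3, 2.0, -0.4, 0.1, -0.8, 1.5, -0.2, 0.6]
flat, sigmas = normalise(residual)
# ... the frequency domain NR would run on `flat` here ...
back = denormalise(flat, sigmas)   # undoing recovers the input
```

Working on the normalised residual lets a single global threshold behave like a locally adaptive one.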

You might be interested in this paper which uses a combination of patch-based and frequency domain NR:

Data Adaptive Dual Domain Denoising: a Method to Boost State of the Art Denoising Algorithms

1 Like

Not as far as I know. I suppose the amount of desaturation will depend on both the cause of the noise, and the denoising method. But I suspect that denoising of ordinary photos (which have low saturation) generally results in saturation decrease.

When a photo has high saturation, I get virtually no change in saturation. See examples at Camera noise.

As I mentioned, I generally denoise the R,G0,G1,B channels independently, before demosaicing. With DSC09955.ARW, going through the process but with no denoising, we get:

c_bay_den
I have cropped to some bricks near top-left. For this, the C channel of HCL has a mean of 0.148742. The Cz channel of JzCzhz has mean 0.00341112.

With my dnsePsh2Mn denoising, we get:
c_bdp_den

The C channel of HCL has a mean of 0.111013. The Cz channel of JzCzhz has mean 0.00283577. So chroma has decreased by denoising. We can always boost chroma if we want:

%IMG7%magick ^
  c_bdp_den.png ^
  -colorspace HCL ^
  -channel 1 -evaluate Multiply %%[fx:0.148742/0.111013] +channel ^
  -colorspace sRGB ^
  c_bdp_den2.png

c_bdp_den2

EDIT: Bother, the images aren’t showing. They are at:

Non denoised:
http://snibgo.com/imforums/c_bay_deg

Denoised:
http://snibgo.com/imforums/c_bdp_den

Denoised, with adjusted chroma:
http://snibgo.com/imforums/c_bdp_den2
… but put “.png” at the end of each.

EDIT2: Uploaded images with drag-drop.

EDIT3: Now the original images show. I’ve been notified: “system 1 hour — downloaded local copies of images”. Hooray. Perhaps it was a sidekiq problem, as mentioned on [Friendly reminder] Limit JPEGs to maybe full HD resolution - #40 by Thomas_Do , and now fixed.

@snibgo is it possible you have some sort of hotlink protection?

I dunno. Nothing in my pixls.us profile looks like a problem. I did my usual practice: upload images to snibgo.com/imforums, and paste the urls here between [ img ] and [ /img ] tags. After a few seconds, the forum software usually copies the images from snibgo.com, but not this time.

When I paste the URLs with [ img ] tags, the forum software doesn’t show that text, but a “image unknown” symbol.

Creating a local HTML page with <img src="http to snibgo.com" /> works fine. I can’t find a control panel setting at my web host that might create problems.

It’s probably a bug somewhere. Possibly in my brain.

I’m going to give a reminder of the history of noise processing in RT and before.

Before 2010, Emil Martinec, Manuel Llorens and I worked on Dcraw (a modified version, not the original), then on PerfectRaw.
Emil Martinec had written a very good treatise on noise, and we came up with several types of tools: wavelet, DCT (Fourier), median, bilateral filter, …
In the same era, Emil Martinec worked on AMaZE, and I brought LMMSE to RT.

When RawTherapee appeared, and independently of the tools themselves, the first question was: where to put the processing, at the beginning or at the end?
The first version had it… at the end.

Indeed there are advantages and disadvantages to both positions, and everyone weighs them differently.
Now, denoise is at the beginning.

One of the important things Emil brought is the notion of MAD (Median Absolute Deviation), which takes into account the (wavelet) signal per level and, in a way, the “histogram” of this signal (whether for luminance or chrominance).

This “MAD” acts as a filter (or as a guide): for the same adjustment (slider, curve, etc.) it will modulate the action of the shrink function.
This allowed me, around 2013, to develop an “automatic” mode for chroma noise.
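The MAD idea can be sketched in a few lines of Python (a generic illustration of MAD-based wavelet shrinkage, not RT’s actual code; the 2× sigma threshold is my assumption):

```python
import statistics

def mad_sigma(coeffs):
    """Estimate the noise sigma of wavelet detail coefficients via the
    Median Absolute Deviation: sigma ~= MAD / 0.6745 for Gaussian noise."""
    med = statistics.median(coeffs)
    mad = statistics.median([abs(c - med) for c in coeffs])
    return mad / 0.6745

def soft_shrink(coeffs, t):
    """Soft-threshold shrink function: pull every coefficient toward 0 by t;
    anything smaller than t in magnitude becomes 0."""
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

# Toy detail level: mostly small "noise" coefficients plus two real edges.
level = [0.1, -0.2, 0.15, -0.1, 0.05, 4.0, -0.12, 0.08, -3.5, 0.1]
t = 2 * mad_sigma(level)          # threshold scaling is an assumption here
shrunk = soft_shrink(level, t)
```

Because the median is robust, the two large “edge” coefficients barely influence the noise estimate, so they survive shrinkage while the small coefficients are zeroed.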
I also introduced a Lab mode for denoise: Lab is better with “normal” noise… the real difference comes from the internal gamma (3.0) used for Lab, which you can bypass with the gamma slider.

But, because denoise is complex, there is always a “but”, and we didn’t solve the before/after problem.
When we apply processing that brings out the shadows or acts on contrast (shadows/highlights, tone mapping (Fattal, Mantiuk, Desmis…), log encoding, etc.), the noise increases considerably… and that happens after “denoise”!
Moreover, it can be interesting to keep some noise and to be able to differentiate the action.

The first action of this type was in 2014, with the introduction of luminance denoise in “Wavelet levels”, which acts at the end of the processing. But, two remarks: it’s global, and it only acts on luminance.
Other actions to improve quality originate from Ingo @heckflosse: the very good Capture Sharpening, and dual demosaicing (e.g. AMaZE + VNG4).

But the latest action is the creation of “Local adjustments” with its “Denoise” module. This module allows:

  • of course, local denoise (with the 4 delimiters)
  • changing the action with “scope”, for example you can denoise only the “reds”
  • I have kept the same good algorithms from Emil, but I added:
    ** the possibility to differentiate the action according to the level of detail: for luminance, 4 levels instead of 1; for chrominance, 2 levels instead of 1
    ** DCT (Discrete Cosine Transform) for chroma (uses a lot of resources)
    ** other settings to improve the DCT and differentiate the action between shadows/highlights, reds or blues…
    ** the possibility to use masks
    ** etc.

Of course, nothing is perfect; noise processing is complex. Maybe one day artificial intelligence will bring more, but it will always be necessary to take into account the variations of luminance, the illuminants, the perception of the photographer and his wishes, etc.

I think a “good” treatment is one that mixes before and after:

  • before (with the main “Denoise”) to allow sharpening and reduce some of the defects: these settings must be kept to a minimum
  • after (with “Local adjustments”) to refine the treatment the user wishes, according to the colors and the parts of the image

Jacques

3 Likes

I remember reading this at some point (reading is not understanding, but still).
I can only recommend going to the ipol.im homepage and playing with the demos, not only of this paper but all the other available demos… quite fascinating what denoising with a modern algorithm can achieve. The DA3D paper mentioned also lets you play with a multiscale DCT version as a guide for DA3D: super fast, and not much different from an NL-Bayes guide image for DA3D!

Two things that I forgot to mention that are important:

  1. You can choose the type of wavelet most suitable for your image. RT chose to use the Daubechies wavelets, and you can choose the number of “taps”, i.e. the shape and the number of moments of the wavelet.
    In theory, the higher the number, the better the “edge performance”. In practice this is not always true, and nothing beats trying.
    You can choose (in Settings) between: D2 (Haar), D4, D6, D10, D14 (default = D4)

  2. You can get an overview of the action of the various parameters on the noise:

  • “luminance denoise” by wavelet level (0, 1, 2, 3 and beyond, up to 6)
  • “luminance detail recovery” (DCT / Fourier)
  • “equalizer” white / black
  • “chroma fine” by level (0, 1, 2, 3, 4)
  • “chroma coarse” by level (5, 6)
  • “chroma detail recovery” (DCT / Fourier)
  • “detail threshold” (recovery, luminance and chroma)
  • “equalizer” blue / red

And of course “Scope”, the deltaE function, to choose the range of colors to be treated, and “Transition” to smooth the result.
This deltaE takes into account: a) the spot reference (the circle at the center of the spot) with hue = href, chroma = cref, luma = lref; b) all pixels in the selection, with their own hue, chroma and luma.
DeltaE is calculated as something like: SQR(href - hue) + SQR(cref - chroma) + SQR(lref - luma)
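A tiny Python sketch of that deltaE scope test. Only the squared-difference formula comes from the post; the `scope_weight` falloff shape and its parameters are my invention, standing in for whatever RT actually does with “scope” and “transition”:

```python
def delta_e2(ref, pixel):
    """Squared deltaE as described: (href-h)^2 + (cref-c)^2 + (lref-l)^2.
    ref and pixel are (hue, chroma, luma) triples."""
    return sum((a - b) ** 2 for a, b in zip(ref, pixel))

def scope_weight(de2, scope, transition=0.5):
    """Hypothetical smooth falloff: full effect inside the scope,
    linear fade over the transition band, zero outside."""
    inner = scope * (1 - transition)
    if de2 <= inner:
        return 1.0
    if de2 >= scope:
        return 0.0
    return (scope - de2) / (scope - inner)

ref = (0.1, 0.3, 0.5)            # spot reference: href, cref, lref
close = (0.12, 0.28, 0.5)        # similar pixel: treated fully
far = (0.9, 0.8, 0.2)            # dissimilar pixel: left untouched

w_close = scope_weight(delta_e2(ref, close), 0.1)
w_far = scope_weight(delta_e2(ref, far), 0.1)
```

Pixels close to the spot reference get the full denoise, dissimilar ones none, with a smooth transition in between.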

For that you have to activate, in “1+* Mask and modifications - Smooth Blur & Denoise”, one of the 3 choices usable for this:

  • “show modifications without mask”, “Show modifications with mask” or “Preview selection deltaE”, according to usage.

Of course, the result is not exactly reality; I amplify the signal so that the results are more visible.

jacques