I have a stacked astrophotography image which is very dark but has 32-bit depth. I noticed that I tend to lose bit depth, or something like it, when I brighten my image using the grey point of the levels tool.
I don’t know whether this is a problem with darktable (latest version) or whether I don’t understand the theory of how the tools work.
I have a really clear example of what I’m talking about. Here is the result of what happens if I use the levels and the tone curve tool:
I think that without access to your image it will be very difficult to give a thorough explanation of what happens. Also, your screenshots do not show the history stack with the modules applied, so one cannot follow your full workflow.
In your place, I would first of all try to get an overview of the raw data as detected by the sensor and then demosaiced. Do not apply modules like base curve or tone curve at this point. Go back step by step in your history stack to find the “real” distribution of intensities in your image. With this knowledge you can go forward, module by module, to explore the influence of each individual module and its parameters.
Sorry, I forgot to say that the only active modules are the ones shown in the screenshots.
Here I’ve uploaded the image (cropped to reduce file size) with four sidecar files: A, B, C and D.
A: Looks OK. Has “tone curve” and “rgb levels”
B: Looks bad. It has exactly the same “tone curve” and “rgb levels” as A, but it has an “rgb curve” enabled with a linear curve (i.e. it should do nothing)
C: Looks bad. Has “rgb levels” and “rgb curve”
D: Looks bad. Has “tone curve” and “levels”. It can be fixed by setting “preserve colors: sum RGB” in the “tone curve” module.
So in the end it looks like “rgb curve” does some kind of quantization and breaks the image just by being enabled. But this is not the only problem; it looks like I have to be careful in lots of places when editing my astrophotography images (e.g. the “preserve colors” settings).
I’m new to image editing. I normally use a combination of 3-5 levels/curves to adjust the contrast of my astrophotography images.
As you can see if you open my image, it is very dark. It is not just a matter of adding exposure: I have to add exposure/set levels, but I also have to create steep curves to differentiate nebulae from “empty space”. But yes, I can live with curves only if that is the right thing to do.
Anyway, my question is about what causes that “loss of bit depth”:
Is it a bug?
Am I using “rgb curve” wrong because it is meant for … and not for …?
Am I using “preserve colors” wrong because X means …?
Do I have to be careful with tools X, Y and Z because they …?
The tone curve module has a working space of LAB, which is probably not what you want to use. I’m not sure about RGB curves.
Your TIFF file doesn’t have a lot of metadata written to it… is this file linear or gamma-encoded? What profile is assigned to it? I don’t have any astro experience, so YMMV.
I used two instances of the exposure module, both pushed to the max. Then I tried the filmic module to get some contrast into the file.
You might want to talk to the Siril guys to make sure you’re processing correctly in Siril.
Let me start by saying that I have no experience in astrophotography.
Are you certain your original data has a “smooth” histogram? My impression is that there seem to be well-defined levels of intensity in the objects. Editing your image using exposure and rgb curve I get
with this histogram
If I additionally desaturate the image the histogram changes to
To find out what the “real” data in the file looks like, I would use a tool like Matlab, Octave, or Python/NumPy to plot the intensity distribution of the pixels in your file. In other words: create a histogram whose binning equals the number of grey levels in your image.
Only with knowledge of your real input can you decide whether your image processing is working as you expect.
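A minimal NumPy sketch of that idea: count the distinct intensity values actually present in the file, which gives a rough estimate of the “real” information depth. The synthetic array below stands in for your data (loading the actual TIFF, e.g. with the `tifffile` package, is left as a comment since I don’t have your file).

```python
import numpy as np

def count_grey_levels(arr):
    """Count the distinct intensity values present in an image --
    a quick proxy for the 'real' information depth in bits."""
    n = len(np.unique(arr))
    return n, np.log2(n)

# Synthetic stand-in: a 32-bit float image whose values were actually
# quantized to 8 bits somewhere upstream (e.g. during stacking/export).
rng = np.random.default_rng(0)
data = rng.integers(0, 256, size=(512, 512)).astype(np.float32) / 255.0

n, bits = count_grey_levels(data)
print(f"{n} distinct levels, about {bits:.1f} bits of real depth")

# With the real file you would do something like:
#   data = tifffile.imread("your_image.tif")   # hypothetical filename
```

If the count comes out near 256 rather than in the millions, the file carries only 8 bits of usable depth despite its 32-bit container.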
Thank you very much, your images look quite OK. I think your histograms show the real bit depth of my image. I was also able to make my image look OK, at least to me:
This is what I wanted to show with the histograms. Your image data is encoded in a 32-bit TIFF file, but the “real” information depth of your data is about 7 or 8 bits (or less?). To get something visible on your screen you have to apply a very steep tone curve, which means an extreme spreading of the histogram in the range of interest. And with the spreading I used in my example, you can clearly resolve the discrete “grey levels” of your input data.
In my opinion your problem is not image processing but the “quality” of your input data. As I said before: analyze your input data in detail before you start processing it, otherwise processing is like “poking in the fog”.
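To illustrate why the steep curve exposes the discrete levels, here is a small NumPy sketch (the 0.05 dark range and the 8-bit quantization are assumptions for the example, not measurements of your file). A stretch spreads the existing values apart but cannot invent new ones, so the histogram turns into isolated spikes:

```python
import numpy as np

# Assume data quantized to 8 bits upstream, stored as 32-bit float.
# Only the 8-bit codes 0..13 fall into the dark range [0, 0.05].
codes = np.arange(14)
dark = (codes / 255.0).astype(np.float32)

# A steep "tone curve": map [0, 0.05] onto the full display range [0, 1].
stretched = np.clip(dark / 0.05, 0.0, 1.0)

# Stretching creates no new values -- the histogram now shows
# 14 isolated spikes with wide gaps between them (banding).
print(len(np.unique(stretched)))          # still 14 levels
print(np.diff(np.unique(stretched))[:3])  # gaps of roughly 0.078
```

The wider those gaps, the more visible the banding becomes in the final image, no matter how many bits the container format offers.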