Hugin pano/HDR + darktable in scene-referred

Aha! Back to the drawing board! Thanks for your most valuable guidance.

More importantly, perhaps, you did counsel me: “First off, I suggest you work on one problem at a time.” Point taken: I’ll revert to simply trying with JPEGs to the point where I can correctly drive Hugin to produce a stitched panorama.

I have not tested it, but in the lighttable I would create a style with the required set of modules, and then apply that style to the rest of the pictures in “Overwrite” mode.

What about ghosting management with hugin?
I have a set of bracketed images taken in Florence, Italy, over the Arno river.
There are a couple of boats in the river, which of course result in ghosting in the final, merged image.
Is there a way to manage that in hugin during the HDR blending process?

You can use exclude/include masks.

Yes, but I don’t understand how they work.

I use them like this:

  • If I have moving objects in the overlapping part of the pano, I exclude them in the picture that adds them.
  • If I have a sharp version of a non-moving object in one image and a blurry version in another image (often on the edge, as my lens isn’t that sharp there), I draw an include mask around the sharper version.

If you exclude objects, make sure there is at least one image that can still provide that area, otherwise there will be a hole in the resulting pano.
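To make the idea a bit more concrete, here is a rough numpy sketch of the principle (made-up weights, not Hugin/enblend’s actual multi-resolution blending): an exclude mask just drops an image’s weight to zero in that region, so another overlapping image has to fill it.

```python
import numpy as np

def blend_with_masks(images, weights, exclude_masks):
    """Toy weighted blend of aligned, equally sized images.

    images:        list of (H, W, 3) float arrays in [0, 1]
    weights:       list of (H, W) float arrays, e.g. feathered overlap weights
    exclude_masks: list of (H, W) bool arrays; True = "do not use this image here"

    Illustration only -- Hugin/enblend do multi-resolution seam blending,
    not a plain weighted average.
    """
    num = np.zeros_like(images[0])
    den = np.zeros(images[0].shape[:2])

    for img, w, excl in zip(images, weights, exclude_masks):
        w = np.where(excl, 0.0, w)          # exclude mask: weight becomes zero here
        num += img * w[..., None]
        den += w

    hole = den == 0                         # no image left to cover this area
    den[hole] = 1.0
    out = num / den[..., None]
    out[hole] = 0.0                         # -> a hole in the resulting pano
    return out, hole
```

The `hole` array is exactly the warning above: if you exclude a region from every image that covers it, nothing is left to fill it.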

Tutorial

HTH,
Flössie

Ok, I will try.
Thank you!
Only thing: in my case I don’t have a pano, just a “plain” 7-shot bracket of a single composition, but I don’t think that changes the approach to ghosting.

I’m clearly doing something wrong here :sweat_smile:

The only way to make it work is to set the blender to “enblend”, using the 2022 version of Hugin:
[screenshot]

(I’m on Ubuntu 22.10, btw)
But I’m not really satisfied with how the mask I set is blended.
In the crop here it’s clearly visible.
Is there a way to blend the borders more subtly?

You have at least two moving parts in your image: the boats and the water. So it’s likely that any mask will be visible in the final result: wherever you have several water layers, you’ll get ghosting in the water. That’s not very objectionable if it’s the same over the whole image, but any local change in the number of blended layers is likely to be visible.
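A toy illustration of that last point, with made-up numbers: averaging a different number of “water” layers changes the texture/noise amplitude even when the mean brightness stays the same, so a mask border shows up as a change in statistics rather than in brightness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend each exposure records the same mean water brightness plus random wave texture.
layers = 0.4 + 0.05 * rng.standard_normal((3, 256, 256))

blended = layers.mean(axis=0)   # region where all 3 layers contribute
single  = layers[0]             # region where a mask keeps only 1 layer

# Same mean, different texture amplitude -> the boundary is visible even on "flat" water.
print(f"std of blended region: {blended.std():.4f}   std of single layer: {single.std():.4f}")
```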

You don’t say why you had to bracket for that image (nor did you show the complete image), so giving more detailed advice is hard…

This is the composition: sunset behind Ponte Vecchio in Florence, Italy. There was a lot of dynamic range in the scene, and I wanted to preserve one frame of the boat in the river without ghosting.
I tried masking other parts of the image to see if any “blend lines” were visible there too, and yes, there always are, even on flat colours.
(the image below is a screenshot of the raw file from darktable)
[screenshot]

I think you only need one (long exposure) shot in the sequence for the lower part, so there is no need to mask the boat and the water around it, and one (short) shot for the sky.

I was there last week and cannot take the shot again. Also, I was shooting handheld (there was a very strong wind, so a tripod would have been useless).
I have to work with these shots.
Anyway, I’d like to know how to manage this kind of situation with ghosting. Also, the long exposure wouldn’t be able to freeze the boat, which adds an element to the composition.

That screenshot is already quite promising. You’ll have to check in dt where there is sensor clipping, but you may have an image in your stack that is good enough.
“Good enough” will also depend on what use you plan for the image: screen use will have far fewer noise problems than a large print, as you will downsize the image a lot for screen viewing. So you have more leeway in brightening the shadows (here: buildings and water).
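If you want to check that outside of dt’s raw overexposure indicator, a quick sketch with the rawpy library does the counting (file name hypothetical; the white level is whatever the camera reports):

```python
import numpy as np
import rawpy

with rawpy.imread("IMG_0001.dng") as raw:        # hypothetical file name
    data = raw.raw_image_visible.astype(np.float64)
    white = float(raw.white_level)               # camera's saturation value
    clipped = np.count_nonzero(data >= 0.99 * white)
    print(f"clipped photosites: {100.0 * clipped / data.size:.2f} %")
```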

If you have to combine images, try with two or three rather far apart in exposure, and without masking (to let hugin do the blending, which in HDR work includes selecting areas of the image). Reason: you have two clear zones in this scene, so one image per zone should get you a decent starting point.
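One way to try exactly that without touching any masks is to drive Hugin’s own command-line tools; a sketch using exposure fusion with align_image_stack and enfuse (file names hypothetical, and check the man pages for the exact options):

```python
import glob
import subprocess

# Align the handheld bracket; -a sets the prefix for the aligned output TIFFs.
shots = ["short.tif", "medium.tif", "long.tif"]   # two or three exposures, far apart
subprocess.run(["align_image_stack", "-a", "aligned_", *shots], check=True)

# Fuse the aligned layers; enfuse picks the well-exposed areas from each image.
aligned = sorted(glob.glob("aligned_*.tif"))
subprocess.run(["enfuse", "-o", "fused.tif", *aligned], check=True)
```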

If you really want to see what can be done with this scene, you can always post the raw images as a play-raw, with appropriate licence.

Thanks for the CC.

Masking is useful because of the boat in the water. There is a lot of ghosting without masking an “include area” in one of the images. I ended up masking the entire river in one of the shots where the exposure of the water is good enough and there is no blur on the boat.

I’m struggling with a magenta color cast when following my steps from Oct 22. I created the pano from 16-bit TIFFs in linear Rec.2020 with most of the modules disabled, including WB and color calibration. I used the advice from @flannelhead (steps 1 and 2) and exported the image. When applying the same WB coefficients and the same hue and chroma as in the original, there is a clear magenta cast.

Source image with D65 WB, and color calibration “as shot in camera”.

Same part of the resulting pano, when applying WB and color calibration with same parameters. The cast is clearly visible.
[screenshot]

It looks like Hugin (2022.0) is still doing some tone mapping, but I don’t see where I’m making a mistake.

The problem actually seems to be in dt. If I import one of the TIFFs and apply the same WB and CC as to the raw, I get a similar (though not exactly the same) magenta tint. I would have expected no difference.

I wonder if you could apply WB and CC already before exporting the linear Rec.2020 TIFFs. I think that should work as long as nothing is clipping (when checking against the linear Rec.2020 profile) and you use the same settings for each image. After all, this is just linear algebra on the RGB values…
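To spell that out with a toy example (plain numpy, made-up coefficients, nothing darktable-specific): WB is a per-channel multiplication and the CC step is a 3×3 matrix per pixel, and both commute with a linear blend as long as no value gets clipped in between.

```python
import numpy as np

rgb = np.random.rand(4, 4, 3)               # stand-in for linear camera RGB

wb = np.array([2.1, 1.0, 1.6])              # made-up white-balance multipliers
cc = np.array([[ 1.7, -0.5, -0.2],          # made-up 3x3 color-calibration matrix
               [-0.3,  1.4, -0.1],
               [ 0.0, -0.4,  1.4]])

balanced  = rgb * wb                         # WB: per-channel multiply
corrected = balanced @ cc.T                  # CC: 3x3 matrix applied to every pixel

# Both steps are linear in the pixel values, so applying them before or after a
# linear blend/weighted average gives the same result -- *provided* nothing is
# clipped to [0, 1] in between.
print(corrected.min(), corrected.max())
```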

Thinking more about this, maybe exporting a non-whitebalanced image in Rec.2020 can be destructive due to clipping of individual channels.

My thinking has been kind of just the opposite; if I don’t apply the WB, I’m not multiplying any of the channels, so there is less risk of clipping. All the coefficients in WB are >= 1, so they increase the risk of clipping.

But the color profile conversion to linear Rec.2020 is still being applied. I think that might throw some of the pixels out of the [0,1] range without white balancing.

Edit: and if things clip after the WB, you could always just apply a negative exposure.
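A small sketch of that last point, continuing with made-up numbers: after WB the peak value may exceed 1.0, and a single global scale factor (i.e. negative exposure compensation) brings it back without changing the colour ratios.

```python
import numpy as np

rgb = np.random.rand(4, 4, 3)               # stand-in for linear Rec.2020 data
wb  = np.array([2.1, 1.0, 1.6])             # made-up WB multipliers, all >= 1

balanced = rgb * wb
peak = balanced.max()

if peak > 1.0:
    balanced /= peak                         # one global factor, i.e. -log2(peak) EV
    print(f"applied {-np.log2(peak):.2f} EV of exposure compensation to avoid clipping")
```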