Astrophotography - First attempt

This is my first attempt at shooting stars using photo stacking to reduce noise. I captured Orion's Belt and the Orion Nebula.


30 stacked shots, Canon 1000D, Canon 50mm f/1.8, at f/3.5, 6 s, ISO 1600. Basic preprocessing of the RAWs with darktable. Aligned with Hugin and combined with enfuse. Final processing with darktable.


Here is an example JPG obtained from the RAW files, with only a small exposure and white balance correction applied.

I was hoping to record many more faint stars and nebulosity details. Unfortunately, light pollution in my region is quite high and the background luminosity is quite intense. I definitely need to change location: I hope to go to the Alps on one of the coming weekends, weather permitting.

Maybe there are some limitations with my budget equipment. I'm sure the 1000D sensor's performance is not up to par with modern high-end sensors.

I'm a complete beginner with astrophotography and stacking. Any comment or suggestion is welcome!


Looks pretty darn good! I know I'd be happy with results like this.

In fact, I now may need to go out and give this a try tonight.

Did you output tif/png from darktable, align in Hugin, combine with enfuse, then go back into darktable?

What are the options for stacking the raw data I wonder? A mean/median stack of direct raw data might be useful, no?

I know there's something like DeepSkyStacker, but is there a nice free-software option instead? Or is this a case where manual tif/png + hugin (align_image_stack) + enfuse/mean/median is the current best case?
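For what it's worth, once the frames are aligned (e.g. the TIFs that align_image_stack writes out), recent ImageMagick can already do a straight mean or median stack. A minimal sketch, assuming the aligned frames are named aligned_*.tif (hypothetical names):

    convert aligned_*.tif -evaluate-sequence mean stack_mean.tif
    convert aligned_*.tif -evaluate-sequence median stack_median.tif

The median tends to suppress hot pixels and satellite trails, while the mean keeps a bit more of the faint signal.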

Impressive difference! I'd love to see a tutorial on your process, since not much has been written about open source image stacking, and all the programs in your chain have somewhat steep learning curves.

I suspect you'd get a lot more nebulosity if you just keep doing what you're doing – your stars look very nice and your sky is very black – but shoot more of those photos. 30 x 6 seconds is only three minutes, and matching even a 20-minute exposure would take around 200 of those 6-second frames; in the film days, I don't remember many people shooting less than 20-minute exposures at ISO 1600, and many shots were an hour or several. (One reason astrophotography was so hard – try guiding for an hour without a break!)

You might get me trying astrophotography again. Last time I tried, I had a fight with my Rebel and the camera won. I need an "M and yes I really mean it" mode. I'm lucky enough to live in an area with very little light pollution, but even here I could benefit from stacking.


(longtime lurker here :slight_smile: )

This is actually a nice guide on astrophotography with free software:


It's technically a very nice photo and I like that the Orion Nebula is clearly visible.

[quote="patdavid, post:2, topic:858"]
I know there's something like DeepSkyStacker, but is there a nice free-software option instead? Or is this a case where manual tif/png + hugin (align_image_stack) + enfuse/mean/median is the current best case?
[/quote]As far as I know, Hugin is the nicest thing we have at the moment to align images, at least for 'astro landscapes'. After figuring out which options to use it's quite OK. I'm still planning to do a landscape astrophotography guide once the Milky Way returns around here. :slightly_smiling:

@Narayan That looks like free as in free beer software. :neutral_face:


Well, there is Siril, but I haven't tried it yet. Last time I went the Hugin way and used ImageMagick's median stacking, as you suggested in your PetaPixel article.


Light pollution is one contributing factor in why you don't see the fainter details. Even if there were no city lights (as you are hoping for in the Alps), a 6-second exposure simply doesn't suffice to catch those. I know you can't expose longer due to the star trails, but consider using a wider lens. The Samyang 16mm f/2 is really nice for astro on a crop sensor. With it you can expose for up to 18 seconds without needing a star tracker.
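As a rough sanity check with the common "500 rule" (assuming a 1.6x Canon crop): 500 / (16 mm × 1.6) ≈ 20 s, so 18 s leaves a small safety margin against trailing.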

True, but the resulting images are not summed up; instead, the per-pixel median is taken to reduce noise. Still, the resulting image is effectively only a 6-second exposure, and not enough photons from the fainter stars made it to the sensor.

Would it yield a better result if you summed up, let's say, groups of five images and did a median on those six groups?
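If you want to try that, here is a rough ImageMagick sketch, assuming 30 aligned frames named aligned_0000.tif to aligned_0029.tif (hypothetical names). It averages each group of five (a true sum would clip 16-bit TIFs) and then takes the median of the six group averages:

    for g in 0 1 2 3 4 5; do
        convert $(ls aligned_*.tif | sed -n "$((g*5+1)),$((g*5+5))p") \
                -evaluate-sequence mean group_$g.tif
    done
    convert group_*.tif -evaluate-sequence median hybrid.tif

Whether that beats a plain median over all 30 frames would have to be tested.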

Thanks! Love that you think it's a good result!

I output PNGs from darktable and fed them to align_image_stack using the default options. Then I used enfuse with --saturation-weight=0 in order to make the algorithm work only with exposure weights. I'm not really aware of what enfuse actually computes when stacking evenly exposed images. I will try to compare the results with the median filter of ImageMagick's convert. I did see that hot pixels were making trails in the fused image; the median filter could be a better option to tackle this problem.

I think it would definitely be useful. Intuitively, the rawer the data, the better the averages. I would love to be able to do that.

As @Jonas_Wagner said, the Hugin workflow seems to be the current best option. I didn't know about Siril, but it looks interesting.

@Jonas_Wagner, a guide from you with tips about capturing and processing the Milky Way would be really nice (I liked your nightscape from the August thread a lot).

Actually, I am not an expert in this workflow, but if you want to follow my route I can briefly summarize the process. Hopefully it will be useful to you.

  1. I waited for a clear-sky new-moon night, took my tripod and shot a reasonable number of photos. I found it particularly difficult to focus the stars properly: I'm shortsighted and there was no way I could focus well through the viewfinder. I found focusing in live view quite useful.
    I opened up the aperture of the lens and used the maximum ISO value of my camera. I didn't use the maximum aperture (f/1.8) because the optical quality of my lens degrades a lot wide open.
    I used the maximum exposure time I could without causing star trails. It can be estimated with the "rule of 500": 500 divided by the focal length of your lens (full-frame equivalent) gives you the exposure time in seconds (in my case, 500 / (50 mm × 1.6) ≈ 6 s).

  2. I imported all the RAWs into darktable and converted the images to an uncompressed format (PNG in my case); I avoided JPEG compression because it could harm the averaging process. I switched off nearly every module: I applied only a small white balance correction and a little exposure increase. I also switched on highlight recovery and hot pixel removal (with about 30-40 pixels corrected).

  3. I put all the PNGs in a folder and aligned all the images at once from the terminal using Hugin's align_image_stack; the command is:
    align_image_stack -a *.png
    After several minutes a series of TIF images was generated in the folder.

  4. I fused all the images with enfuse, running the following command in the same folder:
    enfuse --saturation-weight=0 *.tif
    Another TIF image was generated after a while (one way to script steps 2-4 is sketched after this list).

  5. I imported the fused image into darktable. I used the base curve and tone curve modules to bring the sky to black and to bring out the maximum number of stars. In the process I carefully checked not to lose too much detail in the Orion Nebula.
    The sharpen and equalizer modules were useful to enhance the smaller stars.
    I used a gradient filter to neutralize a brightness gradient in the image and to achieve a homogeneous black sky.
    Finally, subtle noise reduction was applied to clean up the residual noise in the sky. I used a parametric mask to denoise only the very dark portions of the image (I didn't want to lose any stars after all the hard work to capture them :wink:).
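For anyone who wants to script it, here is a rough sketch of steps 2-4 as a single shell run. It assumes a darktable build that ships darktable-cli (the GUI export I actually used works just as well) and RAW files ending in .CR2 in the current folder; the aligned_ prefix and stacked.tif are just placeholder names:

    # export each RAW to PNG, applying its darktable history (XMP sidecar) if present
    for f in *.CR2; do
        darktable-cli "$f" "${f%.CR2}.png"
    done

    # align all frames with hugin's align_image_stack, then fuse with enfuse
    align_image_stack -a aligned_ *.png
    enfuse --saturation-weight=0 --output=stacked.tif aligned_*.tif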

I'm not sure about that, but I can give it a try. It sounds like it would achieve a mixed result between an average and a median filter. Thank you for the idea.


I had the impression that hot pixel removal (in RT at least, but it should apply to DT as well) is a star killer. It is much better to take a darkframe. Don't know if DT handles darkframes, but RT does.

Here is a tip to update your camera's internal hot pixel database, which removes those pixels even in RAWs. I think it's Canon specific, and I've tested it successfully on my 600D:

  1. Detach your lens and attach the body cap.
  2. As a precaution darken the room or use the viewfinder cap.
  3. Start the manual sensor cleaning mode.
  4. Wait for one minute.
  5. Turn off the camera.
    That's all.

Me neither. :wink: I was thinking about it last night but didn't come to a final conclusion.

There is a feature request for HDRMerge pending. Perhaps someone like you or Morgan Hardwood could approach Ingo "heckflosse" Weyrich, who maintains the most recent fork of HDRMerge. But I guess HDRMerge's alignment algorithm will not suffice, so it would require using a star tracker.

[quote="floessie, post:11, topic:858"]
I had the impression that hot pixel removal (in RT at least, but it should apply to DT as well) is a star killer. It is much better to take a darkframe. Don't know if DT handles darkframes, but RT does.
[/quote]Darkframes are a much better solution, but the hot pixel removal should not affect stars too much. Hot pixels are usually very bright single pixels of the Bayer matrix; stars should cover more than that.

[quote="arctic, post:10, topic:858"]
I'm shortsighted and there was no way I could focus well through the viewfinder. I found focusing in live view quite useful.
[/quote]You could try building a Bahtinov mask. But at least for me live view was usually fine. If possible I try to get the lens 'prefocused' before it's totally dark. Taking a few test shots and slightly tweaking the focus can also help nail it.

Regarding your RAW conversion, try using RawTherapee for that. It has support for dark frames (shots taken with the lens cap on at the same ISO and exposure time) and flat fields (evenly lit frames that map vignetting and dust), which can be quite useful as they get rid of persistent noise that averaging won't help you with.


It's an ingenious thing, this Bahtinov mask!
I didn't know about the flat field and dark frame corrections in RawTherapee. Really nice, I'll try them!

@floessie Oh, I didn't know that I currently maintain the most recent HDRMerge fork :slightly_smiling:
I made the fork to improve the speed of HDRMerge and (more importantly for me) to solve some issues with Nikon D700 NEF files. Without the custom white level option I couldn't use HDRMerge with my D700 files because I got a lot of green artifacts in the highlights.

Ingo
