I’ve tried slowmoVideo, but that sometimes results in stars moving: they get pulled out of position by the moving polar lights.
I then tried basic crossfading with ImageMagick’s convert command and its “morph” option as per here, but the process gets killed after quickly exhausting all my RAM and swap, even if I set resource limits.
I’ve also tried this bash script, which uses ffmpeg and seems promising. The resulting video shows each source image for a second or two after each crossfade, though, and I’ve left a question for its author on how to remove that, as I don’t understand the commands used.
It looks like G’MIC might be able to provide for the crossfading, but its documentation scares me!
Does anyone have any ideas or guidance on how I could go about this? Many thanks.
Buy @David_Tschumperle a hot chocolate and press him for a simple G’MIC script (or any of the G’MIC filter authors for that matter).
Possibly doing something in Blender? I’m not sure how best to handle crossfading such a long sequence of images, but there might be a way (I’m not near my main Blender machine atm to test it).
The problem with ImageMagick is that it will try to load all of your images into memory(!) before it can start to create output. Have you considered doing it in batches, maybe? Like, one pair, then the next pair, and so on until you reach the end? (I used to have to do this until David fixed it for me in G’MIC.)
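To illustrate the batching idea, here’s a minimal sketch (file names and the in-between frame count are placeholders) that emits one ImageMagick `convert … -morph` command per consecutive pair of frames, so only two source images are ever loaded at once:

```shell
#!/bin/sh
# gen_morph_cmds: print one `convert ... -morph` command per consecutive
# pair of the file names given as arguments. Each command loads only two
# source images, sidestepping ImageMagick's all-in-memory behaviour.
gen_morph_cmds() {
    prev=""
    i=0
    for f in "$@"; do
        if [ -n "$prev" ]; then
            # -morph 5 inserts 5 interpolated frames between the pair
            printf 'convert %s %s -morph 5 pair_%04d_%%02d.png\n' \
                   "$prev" "$f" "$i"
            i=$((i + 1))
        fi
        prev="$f"
    done
}

# Dry run prints the commands; pipe to `sh` to actually convert, e.g.:
# gen_morph_cmds frame_*.jpg | sh
```

Printing the commands first lets you sanity-check the pairing before committing hours of CPU time.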
OK, I’ve done a quick hack, maybe you can test and tell me if that works for you.
It tries to minimize memory usage and computes a temporal fade across a series of images, in a streamed way.
First, you need to update your command definitions (only once):
$ gmic -update
Below is the command line that works for me with bash; if you are on Windows, there should be some differences (no need to backslash the *, I think). I’ve not tested on Windows, though.
Wow - thanks for that! It seems to be working well for me in Ubuntu 14.04; I installed G’MIC from the Ubuntu repository and then followed your instructions. Here’s my test result:
After a bit of experimenting, I decided to go with 7 intra-frames into 60fps video for my eight-second exposures. It’s not perfectly smooth of course (I suppose I should try to reduce the exposure length in future), but I’m pretty pleased overall. I’ll probably tweak the tones and darken the night sky slightly before adding some Sigur Rós music.
The frame_step parameter can be used to ‘skip’ some frames in the computation, e.g. considering only one image out of 3 in the sequence. It is not that useful when the input sequence is a list of image files, but it can be when the input sequence is given as a video file (using command -fade_video).
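When the input is a list of files, you can get the same effect as frame_step by thinning the file list yourself before handing it to G’MIC. A small sketch (the helper name `every_nth` is mine, not a G’MIC command):

```shell
#!/bin/sh
# every_nth: print every Nth file name from the arguments, starting with
# the first one -- the file-list equivalent of frame_step=N.
every_nth() {
    n=$1; shift
    i=0
    for f in "$@"; do
        [ $((i % n)) -eq 0 ] && printf '%s\n' "$f"
        i=$((i + 1))
    done
}

# e.g. keep one image out of 3:
# every_nth 3 img_*.jpg
```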
I’m happy to see this is working as expected! Your resulting video is truly amazing.
http://slowmovideo.granjow.net/ might be worth trying too. It calculates optical flow and then morphs the images. That said, your video does look very good already!
Actually, command -morph already does that, but it’s a bit memory-consuming right now (because it keeps all interpolated frames in memory). So, I’m working on a ‘streamed’ version of -morph.
Thanks for the feedback. I’ll let you know when I’ve got a proper edit done which I think will contain some other sequences, too.
I see - thanks.
That’s the one I initially tried, but I found that sometimes some stars would be moved along with the lights and then snap back into place on the next “keyframe”. I’d definitely be up for testing a streamed version of G’MIC’s morph!
OK, so I’ve added another pair of commands, -morph_files and -morph_video, which basically work like -fade_files and -fade_video, with some extra parameters for the morphing algorithm.
The morphing between two consecutive frames is done by estimating both the forward and backward motion vectors (using an optical-flow-like method), then interpolating the frames temporally using these two motion vector fields.
Needless to say, this can be very time-consuming (even if, like me, you have 24 cores)!
How to use it?
First, update your filters, and check the commands are recognized:
gmic -update
gmic -h morph_files
...(help should display here)...
An example of use, using image files as the input (here again, using bash on Linux, may be slightly different on Windows):
(you may want to replace output.png by output.avi to get an .avi video file as the output).
And that’s almost the same if you have an input video file, instead of a sequence of images:
The smoothness parameter is important to set correctly for the morphing algorithm. Basically, if the frames you want to interpolate can be well registered by a rigid motion (translation, shift, …), then the smoothness can be high (like 1 or 1.5). For non-rigid motions, try a lower value (0.1 is medium, 0.01 is low).
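As a concrete sketch (note: the argument order here, an input pattern, the number of interpolated frames, the smoothness, and an output file, is my assumption rather than the documented signature, so check `gmic -h morph_files` first), the command is only echoed below; pipe it to `sh` with G’MIC installed to actually run it:

```shell
#!/bin/sh
# A smoothness of 1.5 suits near-rigid motion (whole-frame drift);
# try 0.1 or lower for non-rigid motion such as rippling aurora.
SMOOTHNESS=1.5

# Hypothetical -morph_files invocation: the argument order is an
# assumption; verify with `gmic -h morph_files` before running.
# The backslash stops the shell expanding the * so G'MIC gets the pattern.
CMD="gmic -morph_files \\*.jpg,7,${SMOOTHNESS},output.avi"
echo "$CMD"
```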
After using the update command, the morph_files command doesn’t seem to be recognised.
$ gmic -update
[gmic]-0./ Start G'MIC interpreter.
[gmic]-0./ Update commands from the latest definition file on the G'MIC server.
[gmic]-0./ End G'MIC interpreter.
$ gmic -h morph_files
gmic: GREYC's Magic for Image Computing.
Version 1.7.3, Copyright (c) 2008-2016, David Tschumperle.
(http://gmic.eu)
[gmic] Command 'morph_files' has no description. Try 'gmic -h' for global help.
Hmm. Isn’t it because of some caching problem?
Try to get the update file directly from http://gmic.eu/update174.gmic, and put it in your $HOME/.config/gmic/ folder (replacing the older one).
Ah yes, sorry: if you have version 1.7.3 of G’MIC, you need to rename this file to $HOME/.config/gmic/cli_update173.gmic. It should work then (but in this case, don’t invoke $ gmic -update, otherwise the cli_update173.gmic file will be overwritten).
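Putting those two steps together for a 1.7.3 install (the cli_update&lt;version&gt;.gmic naming comes from David’s note above; if the download fails, just fetch the file manually in a browser):

```shell
#!/bin/sh
# Fetch the latest definition file and save it under the name that a
# G'MIC 1.7.3 binary looks for. Don't run `gmic -update` afterwards,
# or this file will be overwritten.
mkdir -p "$HOME/.config/gmic"
curl -fsSL http://gmic.eu/update174.gmic \
     -o "$HOME/.config/gmic/cli_update173.gmic" \
  || echo "download failed; fetch http://gmic.eu/update174.gmic manually"
```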