Since I got Entangle up and running on the Mac here, I thought I would try something I’ve always wanted to do. It’s a view of Mt. Hamilton with clouds going by and jiggly trees.
EF-50mm II
The ffmpeg command that creates a timelapse from stills for YouTube:
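Something of this shape does it; the glob pattern, frame rate, and x264 settings below are just placeholders to adjust:

ffmpeg -framerate 16 -pattern_type glob -i '*.JPG' \
-c:v libx264 -pix_fmt yuv420p -crf 18 timelapse.mkv

The yuv420p pixel format keeps YouTube and most players happy.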
Remember this thread where a timelapse was smoothed?
I might be able to add one in-between frame per exposure, but that comes at a bandwidth price. I was going for the basic illusion of cloud development, taking frames 4 s apart, exposing each for 1/250 s, and running them at 16 fps. Adding one interpolated frame would bring the time compression from 1/64 up to 1/32, which is closer to standard video. Or I could skip any interpolation and take shots every 2 s.
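If I do go the interpolation route, ffmpeg’s minterpolate filter is one way to synthesize those in-between frames; a sketch, with filenames as placeholders and mci being the motion-compensated mode:

ffmpeg -i timelapse16.mkv -vf "minterpolate=fps=32:mi_mode=mci" timelapse32.mkv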
Since we share the same camera model: why have you waited for Entangle on the Mac when there has almost always been Magic Lantern’s intervalometer?
I’ll have to give that a try one of these days. You see, the warnings about bricking thy camera with Magic Lantern have been taken close to heart, as the camera was a gift. One of the things I like about the T3i is that the interface was so simple I was able to immediately shoot manual RAWs without opening the superfluous documentation. If I were to give the camera away, which might happen when I go overseas, I’d want it to go as a base Canon. And if I do give it away, I might be gifted a better model; that’s just how our οικονομία (economy) works in my family.
Btw, Entangle isn’t really for the Mac; it’s a Docker hack (kd6kxr/entangle has 51 pulls already). I didn’t wait for it to come out; it simply emerged out of curiosity when @Tobias announced the new 2.0 version. I also enjoy the general exercise of getting something innocuously unsupported and experimental up and running.
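On a Linux host the container launch would look roughly like this; on the Mac the display goes through XQuartz and the USB passthrough needs extra plumbing, so apart from the image name everything below is an assumption:

docker run --rm -it \
--device=/dev/bus/usb \
-e DISPLAY=$DISPLAY \
-v /tmp/.X11-unix:/tmp/.X11-unix \
kd6kxr/entangle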
I wonder if Magic Lanterning one’s camera interferes with tethering.
I have had similar reservations about ML but took the plunge anyway because the features were irresistible. Especially for a tinkerer and programmer like yourself, it is an opportunity you wouldn’t want to miss!
You’re right, but seriously, I’m giving the camera away overseas to someone less techie than I am. When I acquire something newer, though, I will need to give ML a shot.
You need to be mindful of its compatibility. The ML folks haven’t been able to crack the dual-pixel tech; you would have to stick with the older models… unless you are willing to help; they seem to be in need of more devs and tinkerers.
The one I’m looking at is the 24MP T7i / 800D, for which ML says porting has been started, and there are a number of commits for the 800D in the source code, so that looks promising.
The only thing ML alters inside the camera is a boot flag to boot from SD. This can be undone in the ML installer. With a “reset to firmware defaults” the camera should be in a pristine state afterwards and ready to ship overseas.
I’m running the vintage ML 2.3 on this model and a nightly build from 2017 on my 6D (note to self: update!). No major problems so far, but I can understand your hesitation. It was a shocking moment when the display stayed black after the installation on the 6D. A few seconds later I gladly realized this was due to the factory reset I had performed to prepare for the installation; the 6D won’t show anything on screen by default.
BTW if you don’t know already, you click the little gear to access High Def video:
It’s not perfect: I converted from .cr2 with dcraw -h, so there are many stuck pixels, and there are a few hiccups where I restarted the 100-shot batch; I didn’t realize my battery would last for three hours. Yes, I don’t have an AC/DC adapter.
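The conversion itself was just dcraw -h over the batch, i.e. something of this shape (the -T TIFF output is a placeholder choice; dcraw’s -P option can mask known stuck pixels if you list their coordinates in a file):

for f in *.cr2; do dcraw -h -T "$f"; done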
Here’s a little explanation of the crossfade command.

ffmpeg -i ~/video/clipA.mkv \
^-- first clip, 6 seconds long
-i ~/video/clipB.mkv -an -filter_complex \
^-- second clip, 17 seconds long
"[0:v]trim=start=0:end=4,setpts=PTS-STARTPTS[firstclip];\
^-- clipA from 0s to (the end - 2s)
[1:v]trim=start=3:end=17,setpts=PTS-STARTPTS[secondclip];\
^-- clipB from the end of the fade-in to the end of the clip
[0:v]trim=start=4:end=6,setpts=PTS-STARTPTS[fadeoutsrc];\
^-- clipA fade-out part / last 2s
[1:v]trim=start=0:end=3,setpts=PTS-STARTPTS[fadeinsrc];\
^-- clipB fade-in part, 0s to 3s
[fadeinsrc]format=pix_fmts=yuva420p,fade=t=in:st=0:d=1:alpha=1[fadein];\
^-- generates the fade-in
[fadeoutsrc]format=pix_fmts=yuva420p,fade=t=out:st=0:d=1:alpha=1[fadeout];\
^-- generates the fade-out
[fadein]fifo[fadeinfifo];[fadeout]fifo[fadeoutfifo];\
^-- crossfade parts go through buffers
[fadeoutfifo][fadeinfifo]overlay[crossfade];\
^-- the video is crossfaded
[firstclip][crossfade][secondclip]concat=n=3[output]" -map "[output]" \
^-- three video streams combined into the output
-crf 5 ~/video/AplusB.mkv
^-- the output file
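As an aside, newer ffmpeg builds (4.3 and later) can do the whole crossfade in one step with the xfade filter; a rough equivalent, assuming the same clip lengths, with a 1 s fade starting 5 s into clipA:

ffmpeg -i ~/video/clipA.mkv -i ~/video/clipB.mkv -an -filter_complex \
"[0:v][1:v]xfade=transition=fade:duration=1:offset=5[output]" \
-map "[output]" -crf 5 ~/video/AplusB.mkv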