I've been meaning to do some time-lapse work for a while now, and last weekend I headed down to Embankment bridge. Looking south one can spot Big Ben, the Houses of Parliament and the London Eye.
I was particularly keen to capture the dazzling motion blur of car headlights. This meant long exposures, so I set the shutter speed to half the frame interval; conventional film and video cameras usually run with the shutter open for half of each frame's duration (the 180-degree shutter rule).
My intervalometer was a PocketWizard MultiMax. Normally these are used for wirelessly triggering flashes, but the bells'n'whistles MultiMax can also be connected to a camera's shutter release.
The video above demonstrates four time-lapse speeds (shutter/interval): 0.4sec/0.8sec, 2sec/4sec, 4sec/6sec and 4sec/8sec. Each clip is ten seconds long (250 frames at 25fps).
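Out of curiosity, the speed-up factor of each setting is simply the capture interval multiplied by the playback frame rate. A quick sketch in Python (the function name is mine, not part of any tool):

```python
# Speed-up of a time-lapse: real time captured per frame divided by
# playback time per frame (1/25 sec at 25fps).
PLAYBACK_FPS = 25

def speedup(interval_sec, playback_fps=PLAYBACK_FPS):
    """How many times faster than real life the clip plays back."""
    return interval_sec * playback_fps

# The four settings used above, as (shutter, interval) in seconds:
settings = [(0.4, 0.8), (2, 4), (4, 6), (4, 8)]
for shutter, interval in settings:
    print(f"{shutter}s/{interval}s -> {speedup(interval):g}x real time")
```

So the slowest setting compresses 1000 seconds of bridge traffic into a ten-second clip, a 200x speed-up.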
I've also added a subtle amount of "camera drift" in post. It's as though I am on the bridge holding the camera. A little movement definitely helps bring the scene to life.
I experimented with a variety of intervals, from 0.4sec to 8sec. At the fast end I ran into problems. One was that my camera - a Canon 5D MkI - couldn't write the JPEGs to card fast enough; the memory buffer would fill up, preventing the camera from taking more shots until space became available. I worked around this by setting the JPEG size to Small, thereby reducing the amount of data that had to be saved. A Small JPEG on the 5D MkI is still large enough to comfortably accommodate a 1080p HD video frame. The second problem was the lag of the camera's mirror mechanism. Although the shutter speed may be set to 0.2sec, additional time is required to flip the mirror up and return it to rest afterwards. With such a relatively long shutter speed, a frame interval of 0.4sec was insufficient: the camera missed every other shot.
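The mirror-lag problem boils down to simple arithmetic: the exposure plus the mechanical overhead has to fit within the frame interval. A rough sketch, where the 0.3sec overhead is my guess rather than a measured figure for the 5D MkI:

```python
def interval_is_feasible(shutter_sec, interval_sec, overhead_sec=0.3):
    """True if the camera can complete an exposure plus the mirror
    flip and return within one frame interval. overhead_sec is an
    assumed figure; the real value varies per camera body."""
    return shutter_sec + overhead_sec <= interval_sec

print(interval_is_feasible(0.2, 0.4))  # False: the failing case above
print(interval_is_feasible(0.4, 0.8))  # True: the fastest clip that worked
```

In practice you'd want to measure the overhead for your own body rather than trust a guessed constant.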
Another motivation for this test was to put my DSLR Stills -> Video -> Web workflow through its paces. The goal was to arrive at a colour-accurate 720p HD version online, in this case on Vimeo. The full details are beyond the scope of this short blog post but, needless to say, colour management was a major headache. In the film and video world the delivery target is usually better defined, so more assumptions can be made. Colour management is therefore approached rather differently and, some may say, more loosely. Final Cut Pro, for example, has no concept of colour management, instead relying on the operator to use monitors calibrated to a specific target. Apple Compressor likewise expects to receive video in a certain colour space.
Nevertheless, I spent a couple of days investigating the problem and managed to cobble together a solution. Further work is required to streamline the workflow, though, and to test it with some proper colour cards.