Category Archives: Tips

Tentacle Sync gives timecode a good name


Tentacle Sync in “green” mode feeding timecode to a Sound Devices MixPre3

When I made my first film about 7 years ago, I was sort of lucky. There was this little software app called PluralEyes that enabled me to do dual-system sound without knowing the first thing about timecode. That was cool, and it served me rather well for a long time. But recently I’ve been edging out of this friendly zone into more dangerous territory where I’m getting hired by directors who expect me to provide files with timecode. Because, well, timecode.

But I’ve been pleasantly surprised to discover that, with the right tools, timecode doesn’t have to be scary. In fact, it has really sped up my post-production workflow. But I’m getting ahead of myself. First, a reminder of why timecode can suck.

Timecode is complicated

When a director hired me earlier this year for a timecode shoot, I did what any self-respecting DP would do: I hired a professional sound recordist. They usually come with everything needed for timecode – a recorder/mixer that generates it, and the “lockit” boxes that feed it to each camera – a complete solution. But this time, the recordist told me her lockit box was broken, so I’d have to provide something myself.

I was able to rent a lockit box from Lensrentals, and I brought it to the shoot assuming the sound recordist would know what to do with it. She responded as if I’d just handed her a snake. Fifteen minutes later, she allowed that she didn’t know how to make this particular box work. I took one look at the long row of cryptically labeled DIP switches, and…told the director I’d record reference audio and he could use PluralEyes to sync the footage.

The look on his face told me I’d disappointed him.

So the next day, I did some research. Turns out there are a lot of solutions on the market that claim to be “industry standard.” Most of them are big, clunky, and expensive. For example, the lockit box I’d rented is about twice as big as a Sennheiser wireless mic receiver, and has to be somehow anchored to the camera. But my Sony FS5 doesn’t have a timecode-in port, so how do I work around that? Finally, it needs to be externally powered somehow…

But there’s a better way. It’s called Tentacle Sync.

Tentacle Sync is simple

Tentacle makes a tiny plastic box (less than half the size of a Sennheiser G3 wireless mic receiver) with built-in velcro for attaching to your camera. It doesn’t need external power, as it comes standard with a built-in lithium-ion battery that will last through even the longest day of shooting.


Tentacle Sync doesn’t take up much space on a Sony FS5

There is only one switch on the Tentacle – on or off. All other controls are accessed through an app, which connects wirelessly via Bluetooth from either an iOS or Android device.


Each Tentacle is named at the factory (or you can assign your own custom name) so you can tell the units apart

I use a Sound Devices MixPre-3 as my primary recorder when I’m not working with a pro recordist, and it does not generate timecode on its own. It does, however, accept timecode from an external generator. In practice, this means the MixPre-3 is a fine timecode-enabled recorder/mixer when paired with a Tentacle Sync. Used this way, you need one Tentacle Sync unit for the recorder, plus one for each camera you’ll be using on the shoot. A Tentacle Sync SYNC E Timecode Generator with Bluetooth sells for $289 as a single unit, but you can save money by purchasing a set: the dual set sells for $519.

I now own three of the boxes, and they allow me to pair two cameras with timecode generated by the third unit, which lives on the MixPre 3.


Tentacle Sync E comes with everything needed to get up and running, and more

How to configure a Tentacle Sync

Setting up the units is a breeze. Each comes with a unique name from Tentacle (and I’ve labeled mine so I can tell them apart easily). Press and hold the power switch until the light flashes green – that puts the unit in timecode-generator mode. Put this one on the recorder. Press and hold the others until the light turns blue – this puts them into timecode-receiving mode. Then briefly connect the master to each slave, which syncs the timecode.

The Tentacle Sync units come standard with a 1/8″-to-1/8″ TRS cable, which works great for feeding the aux input of the MixPre-3, which can be set to receive timecode. It also works great for sending timecode to DSLRs or mirrorless cameras via the audio jack.
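For the curious: timecode over an audio cable is still just linear timecode (LTC), an audio-rate encoding of an HH:MM:SS:FF counter, which is really a frame count in disguise. Here’s a minimal sketch of that relationship – my own illustration, not anything from Tentacle’s software, assuming non-drop-frame timecode at 24 fps:

```python
# Non-drop-frame timecode is just a frame counter in disguise.
# Illustrative sketch only, not Tentacle code.

def timecode_to_frames(tc: str, fps: int = 24) -> int:
    """Convert HH:MM:SS:FF (non-drop-frame) to an absolute frame number."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def frames_to_timecode(frames: int, fps: int = 24) -> str:
    """Convert an absolute frame number back to HH:MM:SS:FF."""
    f = frames % fps
    s = frames // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{f:02d}"

print(timecode_to_frames("01:00:00:00"))  # 86400 frames at 24 fps
print(frames_to_timecode(86423))          # 01:00:00:23
```

Two devices that agree on this counter (after the brief jam-sync handshake) label every frame identically, which is all post needs.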

Larger cameras like my Sony FS5 that don’t have a dedicated timecode-in port can also receive timecode on one of the two audio channels, although you’ll need a Tentacle Sync Tentacle to 3-Pin XLR Cable (16″), which they sell, to get the timecode out of the Tentacle box.

I have a Shogun Inferno, which has a dedicated Sync port for Linear Timecode (LTC) in, and for that, you’ll need a Tentacle Sync Tentacle to BNC Cable (Straight, 16″) or Tentacle Sync Tentacle to BNC Cable (Right-Angle, 16″).

A professional sound recordist I work with frequently, Scott Waters, loves the Tentacle boxes. He owns the Hirose-to-1/8″ cable required to sync timecode from his Sound Devices 633 to each Tentacle Sync on my cameras.

Tentacle workflow

It’s one thing to get timecode onto the audio port of your camera, but that’s a non-standard way of doing timecode. I’m not even sure DaVinci Resolve can read timecode from an audio channel if you’re using Resolve to batch-sync, as many productions do. So how do you take advantage of it?


Syncing is near instantaneous in Tentacle Sync Studio, and offers XML and file export options plus multicam mode

Enter Tentacle Sync Studio. This is drop-dead simple software designed with smart defaults: it looks for timecode on the audio channels of all media dropped into it. If it finds timecode in an audio channel, it automatically uses that to sync, rather than the timecode in the video file. Once you drop all of your footage and audio into the app, it automatically builds a sync map, which you can view to see which of your footage is synced.
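Under the hood, building a sync map from timecode is conceptually just interval overlap on frame numbers – no waveform analysis required, which is why it’s so fast. Here’s a hedged sketch of the idea (not Tentacle Sync Studio’s actual algorithm; the clip names and frame numbers are invented for illustration):

```python
# Sketch: timecode syncing as interval overlap on frame numbers.
# Not Tentacle Sync Studio's real algorithm; names/numbers invented.

def build_sync_map(video_clips, audio_clips):
    """Pair each video clip with audio clips whose timecode ranges overlap.

    Clips are (name, start_frame, end_frame) tuples; returns
    (video, audio, audio_offset_in_frames) triples.
    """
    pairs = []
    for vname, v0, v1 in video_clips:
        for aname, a0, a1 in audio_clips:
            if v0 < a1 and a0 < v1:  # timecode ranges overlap
                pairs.append((vname, aname, max(v0, a0) - a0))
    return pairs

video = [("A001C003", 86400, 87600)]  # a 50-second take at 24 fps
audio = [("T01.WAV", 86000, 90000)]   # the recorder rolled longer
print(build_sync_map(video, audio))   # [('A001C003', 'T01.WAV', 400)]
```

Comparing a handful of integers per clip pair is why syncing a whole day of footage takes seconds instead of the hours waveform matching can take.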

You can export footage via XML into your NLE of choice, or you can export new clips with the good audio embedded in them. You can optionally keep your reference audio, but the default is to replace the reference with the clean audio (which is what I almost always want).

Shogun’s 2.5 frame delay

I use Tentacle Sync with my Shogun Inferno via a BNC cable, and it works great, with a small caveat. For some reason, there is a 2.5 frame offset in the timecode, meaning the audio needs to be slipped 2.5 frames forward in the timeline to be perfectly in sync. Luckily, the Shogun Inferno has a timecode preference that allows you to set a timecode offset, but it only works in 1-frame increments. So I’ve set mine to -2 frames, and it’s close enough.
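To put that offset in perspective, here’s the arithmetic. The 23.976 fps rate below is my assumption for illustration; the math works the same at any frame rate:

```python
# How large is a 2.5-frame timecode offset in real time, and what
# remains after the Shogun's whole-frame -2 correction?
# The 23.976 fps rate is an assumption for illustration.

def frames_to_ms(frames: float, fps: float) -> float:
    """Convert a frame count to milliseconds at a given frame rate."""
    return frames / fps * 1000

fps = 24000 / 1001  # 23.976 fps
print(round(frames_to_ms(2.5, fps), 1))        # 104.3 ms raw offset
print(round(frames_to_ms(2.5 - 2.0, fps), 1))  # 20.9 ms residual after -2
```

A residual half frame (about 21 ms) is small enough that dialog still reads in sync to most viewers, which is why the -2 setting works in practice.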


Shogun Inferno’s timecode controls allow offsetting the TC signal. I find a -2 offset works best.

Tip: When sending timecode to your MixPre-3, if you use presets (and you should, because they’re a huge timesaver), make sure you configure each preset to accept timecode via the aux in. I forgot to do that, and when I chose a new preset, it reset to the default (timecode via HDMI) – I was very sad when I discovered in the edit that most of my files contained no timecode.

Why use timecode on your projects instead of PluralEyes? Because it saves time. Syncing based on audio waveforms is very processor intensive, and if you’re working on a large project, it can literally take hours. Furthermore, it’s messy (creating folders full of temporary files), and not always accurate. Some files just never sync, forcing you to align them one at a time, a very cumbersome process.

With Tentacle Sync Studio, timecode syncing is almost instantaneous. And the Tentacle Sync boxes themselves are unobtrusive, and can be attached with velcro (included in the kit) to any size of camera.

Since I’ve discovered the joys of syncing with Tentacle, I pretty much use it on every project now. What about you? How do you keep your clips in sync?


Exploring Apple’s new ProRes Raw with Sony FS5

Apple’s ProRes Raw is the most exciting thing out of NAB this year. Fancy new LEDs that imitate police lights are cool, but they will be surpassed by even cooler lights within a year or two. ProRes Raw, however, is going to be with us for a long time. And for good reason.

I’ve been playing with ProRes Raw for the last few days, using my Sony FS5 and Shogun Inferno, bringing the footage into Final Cut Pro X to learn how it works and what it can offer me and my clients.

First of all, let me just say, I’ve never been a big fan of 4K. Few of my clients ask for it, and when they do, it slows down my post-production by a factor of three or four. So it’s expensive. Everything takes forever to render, and in the end, we invariably end up delivering HD anyway. So why shoot it?

Well, ProRes Raw just gave us some pretty compelling reasons.

  1. First, it doesn’t slow your editing down nearly as much as before. I found that by setting playback to “Better Performance” in FCPX, I was able to play back a ProRes Raw 4K timeline on my late-2013 MacBook Pro without dropping any frames.
  2. Second, to get ProRes Raw from the FS5 and Shogun, you MUST acquire in 4K. There’s no option to record a 2K signal, apart from recording in high frame rates. So it would take longer to down-convert and transcode your footage to HD than to stay in 4K.
  3. Third, the options you get in post by acquiring raw 4K are actually pretty cool, given that they no longer slow you down the way they used to.

So, if you could have a Tesla for the same price as a boring gas car, which would you buy? (Only, you’re going to have to wait a long time for that Tesla, and you can have ProRes today.)

Consider this:

  • No transcoding = no need to make dailies. Your raw files ARE your dailies – and you can change your mind about camera LUTs later without re-rendering 🙂
  • 12-bit color depth. Woot.
  • File sizes aren’t much bigger than current flavors of ProRes

So how does it work?

To get started working on your ProRes Raw footage in FCPX, you need to show the “Info Inspector” and change the default view from Basic to General. This reveals the ProRes Raw-specific controls that allow you to debayer your footage.

RAW to Log Conversion controls which flavor of Log you want to debayer to (it’s also possible to choose None and work directly with RAW – more on that in a moment).

Camera LUT applies a LUT to the LOG signal before it’s sent out – akin to baking in a look on your recorder (but it allows you to change your mind later).

The first thing I wanted to test out was how the Shogun was interpreting the raw signal, and see if that was the same as the way Final Cut Pro X interprets it. Turns out, they’re different. And that has implications on how you shoot.

Shogun interprets the raw signal from a Sony FS5 like Slog-2 – only it’s not.

When you look at the raw signal on the Shogun, it doesn’t matter whether you send it Slog3 or Slog2 from your FS5 – the Shogun thinks it’s getting an Slog2 stream, at least for the purpose of applying LUTs. It’s NOT an Slog2 stream, though – it’s raw sensor data. But the Shogun receives metadata from the camera so it can interpret the signal, and whatever it’s doing, it’s doing in Slog2. So when you drop on a LUT designed for Slog3, it looks all wrong (way too crunchy). So until we have more options in FCPX, it’s important to stick with Slog2 LUTs for monitoring on the Shogun.

When you bring your ProRes Raw footage into Final Cut Pro X, however, there’s no option to debayer the footage as Slog2. Currently the only options for Sony cameras are Slog3 in two flavors – either SGamut3 or SGamut3.Cine.

You can also apply a camera LUT in FCPX at the same time you are debayering, but it’s important to note that if you do, it’s the same thing as applying a LUT at the beginning of your color pipeline, rather than at the end of it. Before ProRes Raw, this was generally the wrong place to do this, because if the camera LUT crunches the shadows or clips the highlights, you won’t be able to recover them when making adjustments to the clip later. But. And this is a big but: When you apply a camera LUT to ProRes Raw at the source, THERE IS NO RENDERING. Did you hear that correctly? No rendering.

Actually, with FCPX 10.4.1, it’s not only possible to apply a camera LUT to RAW, but you can also apply a camera LUT to any LOG footage.

Yep, you can play with LUTs to your heart’s content, find one that gives you a ballpark without clipping your highlights, and take it from there. So this really is a fantastic way to make dailies – but with the ability to change your mind later without incurring any rendering penalty. How cool is that?

But (a small one this time). Since you can’t shoot with Slog3 LUTs using the Shogun (or more correctly, since the Shogun won’t interpret your raw footage as anything other than Slog2), and since it doesn’t work to apply an Slog2 camera LUT in FCPX because the footage is debayered as Slog3, we’re at a bit of an impasse, aren’t we?


Matching Slog2 to Slog3

Turns out it’s not that difficult to overcome. It’s entirely possible to find an Slog2 LUT for shooting that comes fairly close to matching what you’ll see off the sensor in Slog3 in FCPX. I did this by placing my MacBook monitor on a stand next to my Shogun and trying on lots of different LUTs. Loading the SONY_EE_SL2_L709A-1.cube LUT onto the Shogun gets me pretty close to what I see when I open the clip in FCPX using the default Slog3 raw-to-log conversion and the default Slog3/SGamut3 camera LUT. I do look forward, though, to FCPX supporting more debayering options.

OK enough about LOG. Let’s roll up our sleeves and get straight to the real RAW, shall we? All you have to do is select None as the option on Raw to Log Conversion, and you get this:

You get an image that looks, well, raw. To work on it, you have to drop it into a timeline, and start applying color controls. That’s great and all, but now you have to deal with rendering. So right away you can see that there are some workflow reasons why you’d want to stick with Log. Having said that, you can do all the corrections you want to do and save them using “Save Effects Preset” so you can apply them in a single click later. Not the same as a LUT, but definitely a timesaver. But I digress.

I was blown away by how, within FCPX, you can pretty much do the same things I used to do to an image in Photoshop. You can recover highlights until forever! But you do have to be careful – pulling highlights down has an impact on the rest of the image, so you have to jack a contrast curve something crazy if you want to recover highlights outside a window. But you can do it:


Recovered highlights in a bright window

You’ll notice that the brightest highlights went a little crazy there, but good news – you have pinpoint control over color saturation using hue-sat curves in FCPX. After a little playing around, I was able to calm it down like so:


Tamed highlights

The bottom line is you quickly find yourself spending a LOT of time fussing with things when you get into grading the raw RAW. But it sure is a neat thing to be able to do it when you want to.

I think I’m going to find that I stick with a Log workflow for now, with the option to go straight into raw for certain challenging shots, where I really want the kind of control that Photoshop gives a still photographer.

Blowing the image up to 200 percent, a little noise becomes apparent after all the pushing around:


Noise in 12-bit raw

But dare I say, it’s filmic noise, not ugly digital noise. That right there is the benefit of shooting 12-bit, most def. Shooting 10-bit 4K on the Shogun, you definitely have noise issues in the shadows; with 12-bit, I’d say definitely less so.

Here’s a side-by side comparison of 10-bit SLOG2 footage with LUT applied next to RAW 12-bit:

4k enlarged 200 percent

There’s currently no tool in FCPX raw controls akin to the denoise setting in Adobe Camera Raw tools. So to denoise this, you’ll need a plugin like the excellent Neat Video Denoiser. And that, my friends, is where the Apple ProRes party slows down a bit. You’ll definitely want to apply that filter as the very last step of any project, because the render times with denoising are painfully long.

But never mind a little noise that has to be enlarged to be seen. ProRes Raw has me all fired up. This codec looks like a game changer to me. More workflows will emerge quickly, supported by more software and hardware tools, such as cameras that shoot direct to ProRes Raw. In the meantime, Apple has handed us Sony FS5 owners a way to stay on the bleeding edge for the foreseeable future (without upgrading to the Sony FS5 MKII, thank you very much).

 


Sennheiser AVX is a great wireless system – but not with DPA lavs


Tip: power AVX receiver all day with a cellphone backup battery

I recently took advantage of Sennheiser’s trade-in program for wireless mics in the 600MHz range. I got $100 from Sennheiser for trading in my soon-to-be-illegal G3 wireless package toward a fancy new Sennheiser AVX system, lured by its simplicity. I figured it would be a real beast paired with my DPA D:screet 4061 mic. And I was right. But not in a good way.

Turns out the DPA 4061 turns the AVX into a virtual theremin. If you move the cable anywhere near the antenna, you get humming and buzzing interference. That would be fine if you were creating sound effects for Gravity, but not so great if you just want clean dialog.

It appears the culprit is the thinly shielded cable on the DPA 4061, which is no match for the transmitter on the AVX. The problem disappears when you plug in the OEM mic from Sennheiser, which sports a thicker rubber coating.

Another gotcha with the AVX is that the receiver only lasts 3 hours before needing a recharge. So unless you plug in while recording, you’re likely to need extras of those $50 proprietary Sennheiser batteries. I thought I’d found a clever solution by powering the receiver from the same battery that powers my Sound Devices MixPre-3 mixer/recorder, but it turns out that causes interference too. So my solution is to use a portable cellphone backup battery to power the AVX receiver while I’m working. Luckily, the AVX receiver can charge this way while operating, so this works great, even if it is a little unwieldy with all the cables.

If I’d known all of this when I bought the mic, I probably would have waited for the forthcoming Sennheiser G4 wireless. Because you can’t beat the sound of the DPA D:Screet mics.

 

Sony FS5 shooting tip: toggle auto ND on and off

Sony’s variable ND filter on the FS5 is a killer feature. It gives you precise control over exposure by allowing you to dial ND levels up or down in steps that are fractions of a stop. Virtually all other cameras on the market today engage ND in 2-stop increments, which is like using a sledgehammer to set a thumbtack. And yet…

Sony’s variable ND isn’t truly stepless until you engage the auto ND

So if you want stepless exposure control, I definitely recommend turning it on, at least for brightly lit scenes that require ND.

But wait, isn’t cinematography all about control? Isn’t auto-anything giving up control to the camera instead of keeping it for yourself? Well, if you’re shooting outside, you’re likely going to be using ND anyway. And using auto ND simply lets you get into the ballpark of where you’d land manually, only much, much faster – and with the ability to seamlessly adjust to changing light if you want it to. I view this as the camera giving me MORE control, not less.

Here’s an example of a scene in which using auto ND gives a perfect result. Notice that I’ve set auto exposure to +1.5 because I’m shooting log (which should routinely be overexposed to reduce noise):


Auto ND enabled

As I pan my camera around through a door frame, leaving auto ND engaged would result in overexposure. So I simply toggle the auto ND off, which freezes the auto levels and gives me full manual control of everything.


Auto ND toggled off. Exposure is frozen at last auto setting, preventing our exposure from changing as we pan through dark foreground area.

I set up my #6 button (located on the inside of the grip) to toggle auto ND on and off. So as soon as I get my ballpark exposure, I turn it off. That way the exposure isn’t riding all over the place if I pan the lens behind a dark wall that I want to keep dark.


#6 custom button is located under the Sony FS5’s rotating grip.

My (wicked fast) auto-ND workflow

  1. Set aperture.
  2. Toggle auto ND on (boom – exposure is now perfect)
  3. Toggle auto ND off (freezes settings without changing them)
  4. Adjust exposure up or down as needed with variable ND wheel or other means.
  5. Repeat as needed.

Using this approach, I can be up and rolling almost instantly, and adjust to just about any lighting condition in real time. I don’t have to think about anything other than whether that auto ND button is on or off. Sometimes I want it on (if panning through a scene with different brightness levels, or traveling from one room to another, for example). Sometimes I want it off (when a dark object momentarily passes through the frame, for example, like panning through the doorway above).

Do you use auto ND to control exposure on your Sony FS5? Got any tips to share about how you use it?


Sony FS5 Tip: Keep your internal battery in when rocking an external battery


Sony FS5 with Vmount battery and most importantly, an internal battery on deck

I got all fancy and started using a v-mount battery with my Sony FS5 recently. I had to jump through some hoops to get it working, but with the help of a Wooden Camera V-Mount Battery Plate for Sony FS5, everything worked as if it were designed by Jesus.  I didn’t even need to put in the camera’s internal battery, and it saved weight to skip it. But this morning, I was shooting this timelapse…

It was a long timelapse, and the v-mount battery died during the shoot. No biggie, right? We know from experience that when an FS5’s internal battery dies, it saves the clip before shutting down. But when I opened the SD card on my laptop, lo and behold, the clip size was zero KB. Oh Shit.

File sizes of zero bytes are big trouble

Yep, it turns out the Sony FS5 likes having its external power cut while rolling about as much as a DJ likes a raver tripping over the cord to his mixer. It’s a train wreck, full stop.

So I thought, hmmm. Wouldn’t it be cool if you could insert the camera’s internal battery, and just have it automatically know, when you plug in that external power, that it should defer to that, while providing instant backup if the power is cut?

So I said like, a little prayer (to Sony engineering), rolled the camera with both batteries in place, put my finger on the vmount eject button, and pressed it. Off came the battery with a clunk, and… the LCD screen dimmed slightly, and… the camera kept right on rolling!

Bless you Sony engineering

In hindsight, it makes perfect sense. So maybe I was the last camera person in the world ignorant enough to make this mistake. I’m just grateful I learned this lesson on my own dime, rather than at the end of a long client interview.

It’s good to know that, with the camera’s internal battery in place, I can hot swap in and out as many vmount batteries as I like, and all my files will be recorded safely to the SD card.

Did you know to keep your internal battery loaded when powering your FS5 externally? Have you ever experienced a power-related data loss?

 


Plugging the Sony FS5 ND filter gap

Before I say anything bad about the built-in ND filter on the Sony FS5, let me begin by saying that it doesn’t suck. In fact, the Sony FS5 ND filter is so good, it’s part of the reason I switched from Canon to Sony a couple years ago. I’m still amazed by how easy it is to dial in precisely the level of ND you need every time. Well, almost every time.

When you need just a little bit of ND, there’s a gap. It’s not a huge gap – a tad more than 2 stops of light when you go from no ND to the lightest 1/4 setting. But it’s one I’ve encountered again and again when, for example, shooting an interview wide open.

A common situation

I often want to shoot interviews wide open to separate the subject from her background. So with an f/2.8 lens, I’m as wide as I can go. And what I often find is that the room is just a little too hot, about a stop over. When I engage the variable ND at its lowest setting, it takes the room down 2 1/3 stops. So now I’m a good stop underexposed.

A common workaround

The only in-camera solution that won’t degrade the image (apart from increasing the shutter speed, which I don’t want to do for aesthetic reasons) is to stop down to f/4. Note, if you’re shooting glass faster than f/2.8 (and you don’t mind really throwing your background out of focus), all you have to do is open to f/2.0 and engage the ND. Boom, done. But a lot of my glass is f/2.8, and that’s where I find this gap.

What the gap looks like on a monitor

f/2.8 wide open without any ND. I’m looking at those highlights and saying we’re 1 stop too hot.

But when I engage the ND at its lowest 1/4 setting, now we’re 2 1/3 stops underexposed. Too dark.

Turning off the ND and closing the aperture to f/4 gives me the exposure I want – but I want to shoot at f/2.8.

Finally, here’s the shot at f/2.8 with a Tiffen .3 Water White ND filter to bring us down one stop. Just right!

Closing the gap

I’ve found the right tool for plugging this gap is a .3 neutral density filter. The .3 gives you just one stop of light reduction, the least powerful ND filter that you can readily buy.
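For reference, here’s the stops arithmetic behind all this: an ND filter of optical density d transmits 10^-d of the light, and stops of reduction are the base-2 log of that attenuation. A quick sketch (note the FS5’s 1/4 setting is nominally 2 stops, a bit less than the 2 1/3 I see in practice):

```python
import math

# Stops of light reduction from an ND filter, two ways:
# by optical density (Tiffen/Lee markings) or by transmitted
# fraction (the FS5's internal ND labels). Illustrative sketch.

def density_to_stops(d: float) -> float:
    """Optical density d transmits 10**-d of the light."""
    return d / math.log10(2)

def fraction_to_stops(t: float) -> float:
    """Transmitted fraction t (e.g. 1/4) to stops."""
    return -math.log2(t)

print(round(density_to_stops(0.3), 2))  # 1.0 stop – the Tiffen 0.3
print(fraction_to_stops(1 / 4))         # 2.0 stops – nominal FS5 minimum
```

So a 0.3 filter takes the “stop too hot” room down exactly the one stop the gap calls for.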

The one I prefer is the Tiffen 77mm Water White Neutral Density 0.3 Filter. The quality of the glass is superb.

I’ve also got a 4×4″ Neutral Density 0.3 Resin Filter made by Lee, which is also excellent. But I like the 77mm screw-in best because, with the appropriate step-up ring, I can attach it to the front of most of my lenses very quickly without rocking a matte box.

Have you encountered this situation with your Sony FS5? How do you “mind the gap”?

 

Can I power my Sony FS5 with a V-mount battery? Yes, but it’s tricky or expensive


Sony FS5 with V-mount power solution

The best thing about the Sony FS5 is its small, ergonomically balanced design. It’s a lightweight camera that begs to be hand held. So why would you want to screw that up by adding a big battery?

Well, actually, there are some good reasons. I’ve recently discovered the magic of using a Teradek Serv Pro so clients can follow along during a shoot. And I almost always use an external monitor. If you’re powering those things independently, you begin to spend a lot of time changing batteries instead of making pictures.

So I picked up an adapter cable on eBay for $26, bought a Redrock Micro cheese plate, and started cobbling together a solution that could give me three or four hours between battery changes.


FS5 v-mount battery error

But when I plugged in one of my four v-lock batteries, the camera gave me an error (right):

Hmm. So what the heck? I tried another (also fully charged) battery. Same error. Then I tried a bunch of other more expensive things, like buying a battery plate from Wooden Camera. Same problem! Then I accidentally grabbed a battery that was partially discharged already, and boom! It fired up just fine.

After a little testing, it turns out that you have to discharge your battery for just a couple minutes, for the FS5 to recognize it. Don’t ask me why this is a thing, but it is.

To power your FS5 with a v-mount battery, buy an adapter and a battery plate. Then prime your batteries by discharging them for a few minutes – fire up the monitor and other devices before turning the camera on. Good to go!

A better (albeit more expensive) solution


Wooden Camera battery plate for Sony FS5

I purchased a Wooden Camera V-Mount Battery Plate for Sony PXW-FS5/FS7, to see if that would fix the battery priming issue. It costs $195. The first one I purchased didn’t work at all. The helpful staff at Wooden Camera arranged a quick return. After the repair, the unit works flawlessly – without requiring battery priming. But you do have to purchase a Wooden Camera battery slide, for $193.03. So by the time you get done paying the bill, this proper solution adds up to $388.03. The Wooden Camera battery slide is also a pound lighter than the Redrock Micro cheese plate.


Wooden Camera battery plate for Sony FS5, reverse side. The Teradek Serv Pro is attached to the back of the battery slide with two strips of velcro.

Footnote: It turns out that the way I’ve powered the camera in these photos is actually quite dangerous. As I discovered later, if the brick battery dies while you are rolling, your clips won’t be saved to disk, and you’ll lose the shot. Luckily, there’s a simple solution.


Two filmmaking tools I’m excited about right now

Maybe it’s because I’m over 50. I don’t know. But I haven’t been very excited about new gear lately. Nevertheless, I’ve recently made a couple of discoveries that I think are pretty special. Here’s what and here’s why:

  1. Kupo Medium Baby Kit Stand with square legs
  2. Aputure COB 300D LED light

I shot a project last week where I needed to light a two-person studio setup (similar to the “I’m a Mac” campaign) in front of a white background. For this I needed to rent a fair bit of light – two lights to evenly illuminate the background, and two bright key sources I could bounce into 4×4 sheets of foam core for nice, even key illumination on both subjects, with a little room for them to move around. Here are a couple of frames from the shoot:

I went with two Zylight F8s for the background lights – it’s tough to beat an 8″ fresnel with Wi-Fi control that allows both lights to be wirelessly linked. When you adjust the dimmer on one light, it automatically dims the other identically. A real timesaver for evenly illuminating backgrounds.

When the key lights I initially asked for weren’t available at my rental house (I think I had asked for Kino Tegras), the helpful tech at Glazers suggested I try the new Aputure COB 300Ds. I’d heard of them after they won Best Lighting Product at NAB last year. But this was my first opportunity to try them. So I jumped on it.

Wow. The 300Ds are *$# BRIGHT!

I went from expecting to be at f/2.8 to shooting at f/5.6. I’m serious – it’s like a Joker 800-watt light, but with LED ease of use. You can throw two or three into your car and roll out with minimal crew or grip. And by minimal grip, I mean you don’t even need c-stands to rock these bad boys. You DO need to soften these lights, though – hence the extra stands for holding bounce and modifiers.

Kupo: a bad-ass baby stand

Two Kupo Baby Kit Stands will fit perfectly in a Sachtler Flowtech 75 tripod bag, with room for short grip arms.

That’s where the Kupo Medium Baby Kit Stands come in. Sporting a 5/8″ spud, these aluminum stands are so light that you can fit four of them into the space you’d need for two c-stands, at a fraction of the weight. They reach 9.5 feet, which is plenty for just about everything I do. And their beefy, square legs will support up to 23 pounds.

By chance, it turns out that the tripod case that comes with the Sachtler Flowtech 75 is precisely the length and girth to fit two of these stands, along with room to carry a short grip arm for each.

Each bag containing two stands with grip heads weighs 23 pounds. I pack four of these stands in two Sachtler bags, which is plenty of stands for most of the jobs I roll out on.

The Kupo medium baby stands are perfect for supporting the Aputure lights, and much more. Adding 20″ grip heads, these stands accomplish 90 percent of what a traditional c-stand can do while weighing about half as much. They even have a leveling leg – something none of my c-stands has. All for about $100 each. Kudos to Kupo.

Bright but loud…

One thing to note about the Aputure COB 300D: the fans are loud. You really need to get the power supply away from the lamp head if you’re going to be rolling sound. The kit includes a high-quality 6-foot 4-pin XLR power cable, but it’s just not long enough. I don’t plan to purchase these lights – they’re cheap to rent and readily available, and six months from now there’s apt to be something brighter and better, the way LED lighting is changing these days. But because I’ll be renting them so much, I do need a longer cable, to allow placing that noisy power supply in a corner away from the filming area.

…So make your own cable

I found a custom cable maker, Show Me Cables, where I was able to build custom 30′ cables for $61 a pop. I ordered two of them with the following specs:

  • Connector A: XLR 4-pin male
  • Connector A label: “4-pin XLR male”
  • Cable type: 4-conductor cable
  • Cable length: 30 ft
  • Connector B: XLR 4-pin male
  • Connector B label: “4-pin XLR male”

In the comments field, specify “straight pinout.”


Behind the scenes on “a lifeline to mental health”

I just finished another film using my “never shoot interviews” approach.

You’ll notice there are no talking heads in this story. There IS an interview, of course, but it was recorded with a microphone, not a camera. This allowed me to break production into two parts: story development and video production.

I interviewed the subject first, using just a microphone. This keeps production costs low, and allows the interview to double as casting, to be sure that the subject is the right one to pull off the story. In this case, she nailed it, so we moved on to the next step, the radio edit.

In this step, I cut a story with the goal of making something that could air on the radio, complete with music and pacing. We haven’t even shot video yet. I submit this to the client for approval.

By eliminating video, you force the client to focus exclusively on the story, not on how somebody’s hair looks. Changes they make at this stage are very easy to accommodate, because there’s no video to screw up – you haven’t shot any yet.

Once you get approval, you schedule the shoot. At this point, you have the story in your head, and you know exactly what you need to shoot. This makes your shoot go much smoother and more quickly than if you are covering your butt because you don’t know what the story is going to be.

When you’ve wrapped, you move to rough cut. And now, the rough cut comes together really quickly. You’re already more than half done when you start, with the spine of your story already in your timeline.

If your client is an organization that has a lot of brass in the approval chain, this two-stage approach to story development has major benefits. By introducing the stakeholders to your story gently, first with the radio edit and then with the video, it removes the shock that clients always have when a rough cut lands on their desk. Because they’ve already approved the radio edit, when they see the rough cut, it’s like seeing an old friend wearing a different outfit. They will have some comments about the pants or the shirt, but they like what’s underneath it.

This approach paves the way for an easy approval process, makes the job easier every step of the way, and ultimately, makes clients happier.

5 ways ditching your camera leads to more cinematic storytelling

Turn on your TV. What’s the first thing you see? Somebody talking to the camera. Blah blah blah.  Open a film on Netflix. What’s the first thing you see? Action. The essence of cinematic storytelling is showing, not telling. So, if you aspire to cinematic storytelling with your documentary filmmaking, why film talking heads in the first place? Why not commit to showing instead of telling?

I’m a little hesitant to share this insight, because it’s valuable to me. Big medical organizations hire me to tell their most important stories, and it’s a financially rewarding gig. Why give away my secret? But ideas are cheap – it’s execution that matters. So here’s a cheap idea that, when applied, has proven invaluable to me and my clients.

Stop shooting interviews with a camera.

Wait, what? How can you say that – aren’t you a camera guy? Yes, I am. But my first commitment is to story. And I’ve observed that:

  1. Being on camera feels like a performance to most people; being on mic feels more like a conversation. Talking heads are boring; mic’d conversations are more authentic.
  2. Cameras need lights, grip and crew. Sound needs only a mic, a recorder, and a quiet place, so audio costs less.
  3. If the audio interview doesn’t move you, you move on. It’s easy to do that because you’re less invested when you use a mic.
  4. Not having any interview footage forces you (and your client) to choose a visually interesting character who provides cinematic b-roll opportunities.
  5. Recording interviews audio-only adds a “radio edit” step to your editing workflow, building client buy-in early in the process. This translates into fewer client changes at the rough cut stage, faster delivery, and a happier client.

Let’s unpack this.

Talking heads are boring

Most people aren’t actors. So, they are uncomfortable when a camera is pointed at them. There are ways of minimizing this, but the simplest, most effective way is to simply nix the camera. Instead, bring only a microphone. That way, you can be sure they’re never thinking “how do I look?” But that’s just the tip of the iceberg.

Great interviews are genuine, human conversations. And when cameras are not in the mix, everything gets easier. Not only for the subject, but also for me.  No part of my brain is thinking about the lighting, or the frame rate, or the ISO. I’m just thinking about the person I’m talking to. I’m fully present. And THAT is the foundation of an extraordinary interview.

Camera interviews are expensive

In the large-nonprofit productions I do, a typical interview involves a crew of 3-4 people and a station wagon full of equipment. If we split the equipment up between picture and sound, about three-quarters of the “stuff” we bring belongs in the picture category. By going audio-only with your interviews, you eliminate three-quarters of your stuff. This also allows you to cut your crew in half. You see where I’m going with this: Cameras are expensive; talk is cheap.

Documentaries need casting, too

Great documentary films are built from great stories. Great stories emerge from great interviews. And great interviews come from great characters. To find great characters, you have to do casting.

When you use a mic for your interviews, you turn your interviews into casting sessions. If the interview is weak, it’s easy to move on to another story candidate without incurring the high costs of camera production. It allows you to cut your losses quickly.

On the other hand, if it’s an extraordinary interview, it’s easy to get your client to buy in to the story, with a radio edit. And the only thing left to do is the b-roll. More on that below.

It forces you (and your client) to choose wisely

Too often, clients settle. They settle for a boring character and an uninspiring story that has few b-roll options. They do this because you let them. And you let them by shooting interviews.

When you shoot an interview, you can always cover your lack of good b-roll by cutting to the talking head, like they do on TV. But no matter how well you’ve lit the interview, it’s still a talking head.

When you have no talking head, you force both yourself and your client to pick a strong character, who will provide you with action to film. That is, you force yourself to show instead of tell. That’s sometimes an uncomfortable place to be. But trust me, it’s a good place to be.

The radio edit advantage

We’ve all been there. You work really hard on an edit, you present it to your client, and you hold your breath. Will they like it? How many changes will they make? What if they don’t like it?

I have discovered that if you take the time to introduce your client to a story with a “radio edit,”  you (and your client) will be able to breathe a lot easier when it comes time to deliver your rough cut.

A radio edit is an audio-only version of your film that would totally work if it aired on the radio, including music and pacing. The first time I did this, the client came back to me with a surprising comment: “Wow. This sounds like a This American Life piece. We can’t wait to see what it looks like with video!”

By giving them a polished radio edit first, you introduce them to your story gently. You invite them to buy in to the story, and to your approach. I find that clients have only a few changes at this stage. But those few are much easier and less expensive to make than they would be after you’ve added b-roll.

Here’s how it works

On this project, my client needed help launching a $2 billion fundraising campaign. They wanted to find a story that would inspire their audience to believe that heart disease was beatable, and that their donations could have a direct impact. Because of the big numbers at stake, the approval chain included a lot of brass, including the CEO. Red flag!

The first person we interviewed (mic only) didn’t move us. Neither did the second person. So we kept looking. Then we found Jim. Here’s his radio edit:

The client liked the radio edit, and suggested some minor changes. Here’s the final film:

When they saw the rough cut (for this and another film we made concurrently using the same approach), they sent me this email:

Dan. I don’t think we have ever given this little feedback on videos. They’re universally loved, and we think they’re going to work really well.

Werner Herzog once said “Good footage always cuts.” The visuals don’t have to literally match the dialog (as long as you have great story as a foundation). Somehow by choosing a character who inhabits an interesting environment, who does interesting things, the footage will always cut.

So. Next time you’re faced with a high-stakes, interview-based project, consider how ditching your camera can help you do more with less.