Category Archives: Tips

5 steps to color-perfect C100 footage using LUTs

Why is the C100mkii a ridiculously awesome camera? It’s not the fluid ergonomics, the built-in ND filters, or the camera’s snappy autofocus. Your audience can’t see how you got the footage; they just see that you did get the footage. And the footage is what’s epic. Canon’s color science is why we pay the medium bucks for the EOS Cinema series cameras. But getting the most from your footage requires a little work. By following these steps, you’ll get 12 beautiful stops of highlight-holding dynamic range every time you press record. Here’s how.

Before shooting:

1. Set minimum ISO to 850. Unlike with a traditional DSLR, choosing a lower ISO will NOT result in a better image on this camera. Instead, it will rob you of dynamic range. So always always always set a minimum ISO of 850. In low light you can go higher – much higher, in fact, and get great results. But never go lower.

2. Shoot Canon Log (CP Cinema Locked). This is the only way to get the full 12 stops of dynamic range out of your camera. The footage will initially appear flat when you view it. But you have plenty of quick options for giving it snap, crackle and pop by applying LUTs. More about that momentarily.

3. Enable View Assist. The image on your LCD will appear flat; to fix that and get an approximation of the final image while shooting, you’ll need to turn on View Assist. It’s located in the LCD menu. TIP: I add this and several other settings to the custom menu, which makes finding them much quicker than hunting through the menus.

After shooting:

4. Use a LUT. A LUT (Look Up Table) is an automatic color correction designed specifically for your camera, which is applied to your footage in post. Which LUT to use? I recommend these free EOS Cinema LUTs from AbelCine. To apply them to your footage, you’ll need an inexpensive plugin like the $29 LUT Utility, which works in the NLE of your choice (e.g., Premiere, FCPX, etc.).

5. Adjust LUT intensity to get desired look. Using LUT Utility, you can adjust the effect from zero to 100 percent. I often find that dialing in 60-80 percent of the effect is just about right.
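If you’re curious what the LUT and the intensity slider are actually doing under the hood, the apply-then-blend idea is easy to sketch in Python. This is just a rough sketch, not what LUT Utility does internally: it assumes the colour-science and imageio packages, and the file names are placeholders for a frame you’ve exported and whichever .cube LUT you downloaded.

    # Rough sketch: apply a .cube LUT to a frame, then blend with the original
    # to mimic the "LUT intensity" slider. Assumes the colour-science and
    # imageio packages; file names are placeholders.
    import numpy as np
    import colour              # pip install colour-science
    import imageio.v3 as iio

    # Load an exported frame and normalize to 0-1 float RGB.
    frame = iio.imread("clog_frame.png").astype(np.float32) / 255.0

    # Read the LUT and apply it to every pixel.
    lut = colour.read_LUT("CxxxLog10toWideDR_Full.cube")
    graded = lut.apply(frame)

    # "Intensity" is just a linear blend between the original and graded frames.
    intensity = 0.7            # 70 percent, in that 60-80 percent sweet spot
    result = (1.0 - intensity) * frame + intensity * graded

    iio.imwrite("graded.png", (np.clip(result, 0.0, 1.0) * 255).astype(np.uint8))

The point isn’t to replace the plugin – it’s that “LUT intensity” is nothing more mysterious than a weighted mix of the corrected and uncorrected image.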

Here’s an example, an interview shot in front of a window. The challenge is we’d like to make her skin look awesome, while at the same time retaining as much highlight information as possible in the background. So we shoot in CP Locked, and…


Above: CP Locked footage looks flat prior to grading.


LUT applied (no other color correction). As you can see above, this instantly does wonders for our footage. But it still needs a little work.

Getting the most from LUTs

To get the most dynamic range (that is, visible detail in your image from the darkest shadows to the brightest highlights) choose the CxxxLog10toWideDR_Full option (the one applied above).

To get punchier results, and more life in the skin tones, try one of the other two (the ones beginning with ABNorm- and JR45-), continuing to choose the -WideDR_Full option for each.


ABNorm_CxxxLog10toWideDR_Full applied.

Adding this LUT makes our image appear too crushed, in my opinion, and slightly over-saturated. There are two ways to correct that while retaining the punchier color of the effect. You can lighten the image (in this case, by pushing up the midtones) or you can reduce the LUT intensity.

Screen Shot 2015-08-26 at 12.31.34 PM

Mid-tone brightness increased to compensate, above.

Finally, reducing the intensity of the ABNorm- or JR45- LUTs, which tend to over-saturate and crush the image, allows you to dial in just the right amount of correction.


The final results. Notice how lifelike the skin tones are, and how the important detail in the background is preserved. Besides applying the LUT, the only color correction I’ve applied to this frame is a slight boost in the midtones.

Conclusion: Using a LUT, we can very quickly get everything out of our C100 footage that the camera is capable of giving us: vibrant skin tones, crisp blacks, and 12 stops of dynamic range that let us hold detail in brightly lit areas of our frame, such as the background used in this example.

I don’t know about you, but I’d rather spend my time thinking about the content of my films than how to color grade them. Using a C100 and a LUT, you can have both.

Allen Institute for Artificial Intelligence looks for help to tackle the toughest AI challenges

My latest commercial piece is a recruiting video designed to attract talent to the Allen Institute for Artificial Intelligence, which has its offices tucked along the north shore of Lake Union here in Seattle.

I was thrilled to have the opportunity to work with the best gaffer in Seattle, Jeremy Mackie, on this project. But I got more than just his top-notch lighting skills. After many years lighting films like Eden and Fat Kid Rules the World, Jeremy is looking to step behind the camera and work as a DP. I was happy to give him that opportunity on this project, and I think the results speak for themselves. Having a DP who knows lighting inside out, backwards and forwards the way Jeremy does is just epic.

Rounding out the crew on this one were Mike Astle, grip; Justin Dolkiewicz-Kotsenas, AC; and Nickolas Abercrombie, sound. I did the drone work myself with a Phantom 3.

Ryan Schwalm was my camera assistant/grip on the pickup shots.

A couple things worth mentioning technically: As you can see in the photos above, we used a Dana Dolly for most of the b-roll. It really brought some life to the office scenes, which would have otherwise been pretty static. I had the dolly mounted on low-boy combo stands with rollers, which made it quick and easy for Jeremy to pick up the b-roll we needed while the rest of the crew was packing up at the end of the day.

And take a look at that hand-made LED light, hey? It’s a real piece of work, something Jeremy cobbled together from parts supplied by Lite Gear. It was a great fill source: portable and color-temperature controllable with very high CRI. I want to make one!

Finally, we used a pair of powerful HMI lights on this shoot, an M18 and an 800-watt Joker, to bring up the interview subjects to match the window light in the background. But in one case, this approach backfired. When the CEO saw the rough cut, he felt he was squinting too much as a result of all that light. So we had to reshoot him, using mostly natural window light with a single Kino Tegra for fill. It’s great to play with the big guns, but keeping it simple is sometimes best when it comes to working with non-actors.

5 reasons to crash your Phantom 2 immediately

I DP’d a short promo film for the 60 Second Film Festival recently, which involved quite a few drone shots. After nailing them all with my Phantom 2, I made a stupid mistake: I relaxed. With my eyes off the drone for a moment, a huge Douglas fir reached out and swallowed it. You gotta keep your eyes on those trees! A professional tree climber retrieved it a day later, but its gimbal was damaged. When I weighed the cost of repairing it against purchasing a new Phantom 3, I decided to upgrade.

After less than a week with my Phantom 3 (which I purchased from the fine pros at The Copter Shop in Woodinville), I can assure you that crashing my Phantom 2 was the best thing that could have happened to my aerial cinematography. Here are 5 reasons why.

1. The DJI Pilot app is ridiculously awesome. My head is still spinning with the mad power it puts at your fingertips: a map that shows you where the drone is and how it’s pointed, an HD monitor that gives you a beautiful real-time camera view, ability to override automatic exposures or temporarily lock them during a shot, ability to switch between stills and video, and so much more. With a finger swipe, you can hide the overlays and just see the gorgeous Lightbridge video stream. Going from Phantom 2 to this isn’t just an incremental upgrade – it’s an altogether different way of flying.

2. Full manual exposure controls. On the fly, using the Pilot app, you can assert full manual control over exposure. So, if you want to do a slow tilt up over dark green water to reveal a sun-drenched horizon, now you can do that without blowing out the horizon as it comes up. Try that on your GoPro!

3. Integrated 4K camera. The built-in camera is much easier to use than a GoPro Hero. For starters, you don’t have to go through the tedious step of replacing camera batteries – ever – because the camera is powered directly from the flight battery. Next, replacing the microSD card doesn’t require doing surgery on the camera with tools – just insert it directly into the side of the craft. The lens also has threaded filters, which will make it super easy to mount ND filters as soon as they hit the market (the fine folks at Snake River Prototyping assure me they are hard at work on compatible filters even as we speak). And the camera doesn’t have that silly fisheye problem that all GoPros suffer from. Sold!

4. Lightbridge. You get crystal clear HD video transmission in real time, and it allows you to fly far, far away. Like a mile or more, and still see clearly. For me, this makes full FPV flying a real option for the first time. The video transmits through semi-permeable objects like trees, no problem, while my Phantom 2 would go to static the moment I went behind a tree.


5. Instant fine-tuning of stick sensitivity. When you’re doing aerial cinematography, you want very sluggish response at the beginning of the stick’s travel, but normal response at the far end. That gives you a fighting chance when doing complicated moves like circling an object, or for a parallax shot. Or just for doing a slow, feathered start or stop to a move. On my Phantom 2, I tried all kinds of advanced calibration, and ultimately junked all of them because I couldn’t get it right. Part of the problem was that you had to connect to your computer to apply a calibration, then disconnect, test, and try again.

With the Phantom 3, you can set an exponential response curve for the sticks without connecting to a computer. All the configuration options are a few swipes away in the DJI Pilot app.
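If you’re wondering what an exponential curve actually does to your stick input, the general idea is simple enough to sketch. This is a generic expo blend for illustration only – not DJI’s exact formula.

    # Generic "expo" stick curve: a blend of linear and cubic response.
    # Illustration only - NOT DJI's actual formula.
    def expo_curve(stick: float, expo: float = 0.7) -> float:
        """Map raw stick input (-1..1) to an output rate (-1..1).

        expo = 0 is fully linear; higher values soften the center of the
        stick's travel while still reaching full rate at full deflection.
        """
        return (1.0 - expo) * stick + expo * stick ** 3

    for s in (0.1, 0.5, 1.0):
        print(f"stick {s:.2f} -> rate {expo_curve(s):.3f}")
    # Near center (0.1) the output is tiny (~0.03), so moves start feathered;
    # at full deflection the output is still 1.0, so you keep full-speed response.

That’s exactly the feel you want: slow, feathered starts and stops near the center of the stick, with the full rate still available when you need to whip the craft around.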

After just one evening of test flying and stick calibrating, I was able to pull off a very difficult circling-while-climbing shot that I never would have attempted on my Phantom 2. And it feels like just the beginning of what’s possible with this craft.

Canon C100 mkii configuration guide

The C100 mkii is an amazing documentary camera. It’s capable of 12 stops of dynamic range and it gives you everything you need (ND filters, EVF, phantom power, etc.) without additional rigging. But to get the most cinematic performance out of it, it’s important to set it up correctly and remap one of the buttons. Here’s how I configure my camera before a big shoot.

First of all, do an auto black balance. Canon recommends you do this every time you change the ISO, and it’s especially important if you’ll be shooting at high ISOs. From the Camera Setup menu, select ABB. Make sure the lens cap is on. Press OK to perform the balance.

Auto Focus

1. Reassign One-Shot Autofocus to button #7. Button 15, the default, is in a very awkward location if you plan to use autofocus regularly, like I do.


This is an awkward button location for routine focus grabbing. So we’ll remap it.

 

Button 7 is a great choice for one-shot autofocus, especially if you’re used to working with pro Canon DSLRs, since it puts the control in roughly the same spot as the autofocus button on those bodies.

To remap the button, select Other Functions > Assignable Buttons. Then choose button 7.


Button 7 is set to Magnify by default.

 


Press the joystick and scroll to select One-Shot AF

Picture Style

2. Set to record Canon Log. Under the Camera menu, select CP Cinema Locked and set it to On. This enables Canon Log, which gives you a flat file that grades beautifully in post and preserves all the options available to you there. Using this setting in combination with the C100 lookup tables supplied by AbelCine is a speedy way to get amazing-looking footage.

Monitor

3. Enable View Assist. Under the OLED/VF Setup menu, select View Assist and set it to On. This makes the image on screen look closer to how the footage will look after grading, in terms of contrast and exposure. If you don’t enable this, the on-screen image will look very flat, making it difficult to judge exposure by eye.

ISO

4. ISO [850]. To get the maximum dynamic range out of this camera, set the ISO to 850. If you set a lower ISO, you will lose information in the highlights.

Auto lens compensation

5. Peripheral Illumination Correction. Under Camera Setup, set Peripheral Illumination Correction to On (if available for your lens). This will automatically fix vignetting (light falloff toward the corners of the frame) on supported Canon lenses. It works like magic!


Codec

6. Select AVCHD 24 Mbps. Under the Other Functions menu, select Movie Format and choose AVCHD. Then, under the AVCHD menu, select Bit Rate 24 Mbps LPCM (LPCM lets you record uncompressed audio – the best quality). AVCHD is slightly better than MP4 in terms of quality.
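If you’re wondering what that bit rate does to card space, the math is quick. A back-of-the-envelope sketch, assuming the 24 Mbps figure is the video stream and the LPCM audio is 2-channel, 16-bit, 48 kHz:

    # Back-of-the-envelope card-space math for AVCHD 24 Mbps + LPCM audio.
    # Assumptions: 24 Mbps is the video stream; LPCM is 2ch x 16-bit x 48 kHz.
    video_mbps = 24.0
    audio_mbps = 2 * 16 * 48_000 / 1_000_000      # ~1.5 Mbps uncompressed stereo

    total_mbps = video_mbps + audio_mbps
    mb_per_minute = total_mbps / 8 * 60           # megabits -> megabytes, per minute
    gb_per_hour = mb_per_minute * 60 / 1000

    print(f"~{mb_per_minute:.0f} MB per minute")  # ~192 MB per minute
    print(f"~{gb_per_hour:.1f} GB per hour")      # ~11.5 GB per hour

Call it roughly 11–12 GB per hour of footage when you’re budgeting cards for a shoot.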

With these settings, you can rest assured that your footage will live up to everything this amazing camera is capable of. Have a great shoot!

4 essential plugins for improving GoPro drone footage

I just finished shooting an epic drone sequence on Seattle’s waterfront, in which I filmed a mammoth barge headed out to sea. Unfortunately I can’t share the footage, which my client is keeping under wraps for the moment. But I can share the 4 plugins I’ve found indispensable for cleaning up and presenting drone footage shot with my GoPro Hero 4 and Phantom 2.

1. Flicker Free.

Cleans up the nasty interference pattern caused by quadcopter props when the sun is a factor.

Any time you’re shooting into the sun, or when the sun is angled overhead in front of the camera, you can see this issue. It looks like a bad TV channel from the 60s: rapidly scanning lines caused by the shadow of the props on the lens. For the longest time I thought there was no way to fix this. One day I was using Flicker Free to clean up a time-lapse, and thought: hey, why wouldn’t this work for aerial footage? I tried it, and it worked like magic. I generally get best results with the “remove horizontal bands 2” setting, but if that doesn’t work for you, try the other settings as well until you find one that does.

2. Lock and Load X.

Stabilizes and reduces rolling shutter. Works especially well to make parallax shots look smooth and intentional. Something about how this plugin works just smooths out overcorrections in steering. It works far, far better for this than the built-in stabilizer in Final Cut Pro X or Adobe’s Warp Stabilizer. This is another one of those magical pieces of software that I apply to ALL of my drone footage, whether it needs it or not. It always makes it better. Try it yourself – they offer a fully functional 30-day trial.

3. FilmConvert Pro with GoPro camera pack.

There are two presets for use with ProTune footage (which I use on my GoPro) that instantly make your footage look great. To get the most out of this, you have to configure your GoPro Hero 4 the right way BEFORE you shoot. Here are the settings I use:

  • ProTune
  • Flat
  • Max ISO: 400
  • Sharpening: low

I also find that the smoothest, most cinematic drone footage often results from shooting at 48fps in 2.7K (conformed to 24p in post). That way, in addition to the option to slow things down, if you ultimately export to 1920×1080 you have a lot of extra frame that plugins such as Lock and Load and Fisheye Fixer can work with without losing any resolution. You can also crop in closer to your subject without losing resolution – which allows you to shoot a little loose, a little farther away from your subject. That improves the odds you’ll get your drone back safely!
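To put rough numbers on that extra frame, assuming the Hero 4’s 16:9 2.7K frame is 2704×1520:

    # Quick math on the 2.7K-to-1080p headroom and the 48fps-to-24p conform.
    # Assumption: the Hero 4's 16:9 2.7K frame is 2704 x 1520 pixels.
    src_w, src_h = 2704, 1520
    out_w, out_h = 1920, 1080

    headroom = src_w / out_w          # roughly the same ratio vertically
    print(f"Crop/stabilize headroom: {headroom:.2f}x "
          f"(~{(headroom - 1) * 100:.0f}% extra frame before dropping below 1080p)")

    shoot_fps, conform_fps = 48, 24
    print(f"{shoot_fps}fps conformed to {conform_fps}p = "
          f"{shoot_fps // conform_fps}x slow motion")

That extra ~40 percent of frame is what the stabilizer and fisheye correction quietly eat when they crop in – and what lets you punch in on a subject without going soft.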

4. Fisheye Fixer.

Straightens the horizon curvature that is always present with GoPro footage.

The curved horizon thing may look great for in-your-face sports action, but for the typical drone shot, it’s bullshit. Get rid of it!

Fisheye Fixer gives you fine-grained control over how much curvature to remove, so you can dial in the perfectly flat horizon that we like so much.

Bonus tip: Use an ND filter in front of your GoPro. This is like putting a pair of sunglasses on your lens, and it reduces the amount of light coming in so that the shutter speed can come down to something more cinematic. This allows movement to blur and look natural. It also goes a long way toward reducing the jello shutter that is exacerbated by high shutter speeds.

I recommend the Snake River Prototyping BlurFlix Air ND 4 (good for both cloudy days and sun). It’s currently the best on the market for drone use because of its light weight, which lets it be used without upsetting your delicate gimbal.

If you’ll be shooting in bright sunlight all the time, I’d recommend their ND 8 filter.
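Here’s the arithmetic behind the tip, if you like to sanity-check it. The target comes from the 180-degree rule (shutter at roughly 1/(2 × frame rate)); the no-filter shutter speed below is just an assumed example, not a measured GoPro value.

    # Rough 180-degree-rule / ND math for 24fps footage.
    # The no-filter shutter speed is an assumed example, not a measured value.
    fps = 24
    target = 1 / (2 * fps)              # 180-degree rule: about 1/48 s

    no_filter_shutter = 1 / 480         # assumed auto-exposed shutter on a bright day
    for name, stops in [("ND4", 2), ("ND8", 3)]:
        # Each stop of ND halves the light, letting the shutter stay open twice as long.
        with_nd = no_filter_shutter * 2 ** stops
        print(f"{name}: shutter goes from 1/{1 / no_filter_shutter:.0f} s "
              f"to about 1/{1 / with_nd:.0f} s (target ~1/{1 / target:.0f} s)")

The brighter the day, the faster that starting shutter speed and the more stops you need to claw back – which is why the ND 8 earns its keep in constant sunshine.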

Happy drone shooting!

5dmkiii Raw tip: install Magic Lantern on one SD card, then record to many CF cards

As I’ve moved more and more to a raw workflow with my 5dmkiii, I’ve now got 6 CF cards, all 64gig 1000x. Keeping the same version of Magic Lantern installed on all those cards has become a problem, especially since I like to keep all my cards current with the most recent nightly build. So I was thrilled to discover recently that it’s possible to install Magic Lantern on an SD card, which stays with the camera, and then just use empty CF cards for recording. A simple tip, but it’s been a big time saver for me. Thanks to the fine folks at Lensrentals.com for sharing this tip.

One ring to rule them all: seamless focus gears by mechanical engineer Sean McCurry

First you get a DSLR, then you get a follow focus unit. Then a bunch of stuff happens, and you end up with a pile of mismatched lens gears on your living room floor.

Today I’m happy to report that such band-aids for DSLR lenses are no longer necessary, thanks to a mechanical engineer named Sean McCurry, who is quietly revolutionizing the follow focus gear, one perfectly printed lens gear at a time. Wait, printed? But before we get into that…

I guess you could say that I’m a focus gear whore. I feel like I’ve tried just about everything on the market in hopes of finding one that worked seamlessly (so to speak) with my set of Zeiss/Contax primes. But every one I’ve tried has left me cold. To be specific:

Redrock Micro gears are nice because they give some autofocus lenses some much-needed extra throw. But with my Zeiss primes, I found the extra throw to be too much. And the ergonomics suck: too big to store in my lens case, they have to be assembled before every shoot. Major bummer. I want gears that I can buy and forget about.

Zacuto Gears are basically thin bands of plastic that have a big awkward bump. They get the job done, but I’ve had them slip off my lenses more than once while running and gunning, because the bit that holds the two ends together gets caught on things. Oh, and they aren’t cheap.

If you want cheap, you want Jag35 zip-tie gears. But like the Zacuto, they catch on things, and they don’t add any throw diameter to your lens, either.

Genus gears are one-size-fits-all, which makes them great for larger diameter lenses like my 300mm f/4 Nikon. It’s the only gear I could find to fit it. But they’re not at all great for more standard size lenses, where the tightening screw gets right in the middle of your business. Plus, they tend to loosen up during use, and you have to remember to keep retightening them.

One thing I have never tried: Duclos cine-mod. This is the gold standard of lens gears. And by gold, I mean $105 a pop. But what’s prevented me from going Duclos is that you have to send your lens away for an unknown length of time to have the mod done. That more than anything has been the deal-breaker for me. I need my lenses.

Above: iPhone pano of my set of Zeiss primes, with Sean’s gears.

So. Is it too much to ask for something as perfect as the Duclos mod, at a third of the price, that I can install myself without any tools?

Enter a mechanical engineer named Sean McCurry. I accidentally discovered his brilliant work while surfing on Ebay a few weeks ago, when I was startled to see a listing for “Seamless follow focus gears” specifically made for Contax-Zeiss primes. I have a lens set that ranges in size from 25mm – 135mm, and Sean had each of my focal lengths covered. For $35, I took a chance and ordered one for my 50 f/1.7 prime. It arrived three or four days later, and with great curiosity I took it out of the box. Four pages of instructions on lens fitting were included, but were unnecessary: the gear fit PERFECTLY. I simply had to very carefully and slowly wiggle the gear on, until it seated firmly into the spot where I wanted it to stay on the focus barrel of the lens. The fit is so tight that it doesn’t slip at all, doesn’t require glue, and feels like it was made for my lens. Which, in fact, it was.

I’m not 100 percent sure how Sean is able to make such killer gears. But I’m confident it’s because he’s 3D printing them. A close examination of the gears reveals telltale patterns: strata in the plastic that are consistent with 3D printing.

One great thing about these gears is that I was able to place them at approximately the same position on 4 of my 5 lenses, so that when swapping lenses, I don’t have to adjust the focus puller position on the rails. Also, my previous gears would ride up and down the lens as they came in and out of their foam Pelican case, requiring constant readjustment, often in the middle of a shoot. These gears stay put.

Need more amazing? Beyond the great ergonomics, these gears produce smoother, more predictable and repeatable pulls than I’m used to getting from my previous gears. Maybe it’s the extra gear depth, maybe it’s the precision of the printing, maybe it’s the Delrin they are made from. Whatever it is, these gears have taken my focus pulling to the next level.

Sean is currently making gears for popular DSLR lenses, including the following:

Sigma 18-35mm f/1.8

Set for Contax Zeiss Lenses

Canon 100mm Macro lens

Nikon 105mm f1.8 AI-S Lens

Canon 24-70mm Lens L Series F2.8

Canon 70-200mm f2.8 L IS Lens

Tokina 11-16mm f2.8 IF DX II Lens

Check the full list (currently 105 items) to see if your lenses are on it.

Don’t see your lens on the list? Sean welcomes custom orders. You can measure the circumference of your lens, and email your request to helicoptersean@gmail.com. Or contact him via his Ebay shop.

So here we are. Living in a world where the best stuff can come out of a printer. Welcome to the future.

Magic Lantern raw now supports audio recording with MLV format

Magic Lantern raw keeps getting better. It’s been a while since I downloaded and tested the latest nightly builds. Turns out there’s a LOT of progress being made: the latest releases include the option for a new recording file format called MLV (for Magic Lantern Video). This format allows for some really great stuff – most significantly for me, audio recording.

Post-production keeps getting better, too. There’s a new batch-conversion utility called MLV Mystic that allows Mac users to unpack MLV files into CinemaDNG files, which can then be opened directly in DaVinci Resolve 10.

HDMI monitoring is also vastly improved. Gone is the bug that caused pink-frame tearing when recording in 3x crop mode (which, at the press of a button, turns any lens into a macro). No more pink frame tearing! Also, the monitor display bugs that incorrectly drew the boundaries of the frame are fixed.

HDMI tip: I discovered this weekend while testing that it’s critical to connect your HDMI monitor to your camera in the correct order. First, with your camera off, power on your HDMI monitor and plug it into the camera. Then, power up the camera. This way, the camera will correctly draw the HDMI screen. Otherwise, you get some misaligned black bands encroaching into the display from top and bottom.

Another sweet thing, and this is a big one: I discovered after much testing that Magic Lantern’s focus peaking is just killer. It works like magic to help you find focus, and you can select from three settings: one for darkly lit subjects, one for brightly lit subjects, and one that is a bit of both, for average scenes. This peaking is the first I’ve ever used that actually works without oversharpening the image to death or otherwise unacceptably screwing with my monitor. I’ll be using it by default from now on.

All I can say is: thank you Magic Lantern team. You guys are tireless in continuing to unlock the potential of this amazing image making tool called the Canon DSLR. The raw video image coming out of this camera is breathtaking.

Shooting in a rain forest is a bit like shooting under water

I just returned from spending the longest night of the year in one of the darkest, dampest places in Washington: the Quinault Rain Forest. Lisa and I spent a couple of days there last year, and discovered it to be a magical wonderland for photography.

When we returned this time, we brought a Speedlight and a 20″ Glow Hexapop, made by Adorama. This is a small and very portable softbox, so it was easy to pack along. But we were really impressed with the results when we used it to backlight our subjects. Just as with underwater photography, subjects that otherwise look monochromatic in the eerie half light come to life with a little pop of strobe. Happy Solstice!

The invisible key to better documentary film interviews

I know the key to better documentary interviews. A silver-bullet technique that has enabled me to make award-winning films like The Coffinmaker and The Metalsmith (both Vimeo staff picks). In the photo above, Scott Berkun is using the technique to interview Martina Welke of Zealyst for We Make Seattle. It’s not difficult. In fact, it involves doing less than what you’re currently doing.

Huh, you ask? How can doing less make my films better? I’m going to share this technique with you in a minute. But first, some background.

When I began making documentary films five years ago, I was coming from the world of still photography. In that world, it’s possible to be a one-man band and do a great job. Not easy, mind you, but totally plausible. Film is a different animal.

Consider for a moment how many balls you must keep in the air to pull off the simplest of shoots, the interview:

  1. Camera (focus, batteries, monitoring subject movement within frame).
  2. Lighting (changing ambient light, placement of lights).
  3. Sound (levels, distance to subject, mic axis).
  4. Location (noise level, permissions).
  5. Subject (makeup, direction).
  6. Interview (preparation, full attention, questions, redirect).

And that’s just the production bit. If that all goes well, you get a bunch of footage and audio that you must store on a hard drive, then go to work on. That involves importing into your editing suite, watching it, listening to it, cutting the interview, carefully placing b-roll on top of that to hide your cut points, adding music, color correcting, audio mixing… Whew, it’s a lot to manage.

But let’s stay focused on the production piece. Now, consider that these variables don’t just have to be aligned for a split second, as with a photograph – with film everything has to STAY perfect for the duration of the shoot. If the sun comes out halfway through, you have to change exposure. If the subject gets excited and leans forward, you need to adjust focus. And, all the while you need to maintain human contact with the subject, so they feel you are present in the conversation with them.

It’s too much for one person to manage. Really, it is. Two people can swing it. But one? Forget about it.

Since I couldn’t do it all, I considered what I could NOT do, and still get the job done. Can you guess what that was? For a guy most comfortable with a camera, it was a tough one to swallow. I skipped the eyes and went for the ears.

If you had to watch 30 minutes of someone talking without sound, how long would you watch? Now, if you had to listen to 30 minutes of audio without video, how does that change things? A lot. But wait, you say, we’re making a film, not a radio program! Yes, but if you’re making a film, doesn’t that mean you want to show action? And does a person sitting in a chair really qualify as action?

The key to getting better documentary film interviews is: don’t bring a camera. You heard me right. Leave the camera at home. That way, you won’t be tempted to use it. Instead, you’ll free yourself to think about the story. You’ll connect better with the subject without your eyes constantly wandering away from theirs to check focus.

But what if you have plenty of crew? You should probably still skip it! Here’s why: most people are intimidated by cameras. They are distracted by thinking about how they look, their makeup, wardrobe, etc. Consider this: How often do you FaceTime someone when you want to call them? I think I’ve used it twice in the three years I’ve had an iPhone. It’s invasive. I’m more comfortable talking as opposed to acting. The same is true in your average documentary interview situation. Take away the camera, and you take away the self-consciousness. Take away the self-consciousness, and you get straight to the good stuff. The scary, emotional stuff.

There’s another benefit to doing interviews without a camera: it forces you to shoot better b-roll. In fact, it forces you to think differently about b-roll altogether. No longer is it filler to get you through – it becomes everything! So you have to think of action that can carry the story. And your film just got better.

One more benefit: you won’t be able to make the mistake of including too much talking head time, because you don’t have any!

It’s a big commitment. But try it once. You may be surprised with the results.