Category Archives: Tips

How to shape natural light for cinematic outdoor interviews

“Lighting is a subtle craft,” I told my students yesterday. Then we headed to a baseball field to see what that looks like in practice. Our assignment: to shape natural light into something cinematic for an outdoor interview using only two pieces of grip, a 4×4 foam core bounce and an 8×8′ Scrim Jim Cine Frame.

Our first challenge was to choose a background. We placed our subject on the edge of the field, and slowly walked in a circle around him to observe how the light fell on him in relationship to the background.

People rarely look good in direct sunlight. It makes them squint. So we placed our subject with the sun behind him, as you can see from the shadows below.

By doing this, we penciled him out from the background with natural rim light. This works best when you can find a background that is at least a stop darker than your subject. In this case, we could do that very easily, because there were lots of trees. Trees absorb light. So here’s our starting frame:

1. The camera left side of his face is visibly darker because there is a line of trees off camera in front of him on that side, and open field on the other.

This first frame shows the importance of considering how nearby objects affect your subject. If he had been standing in the middle of the field, his face would have been evenly illuminated. But because there is a line of trees behind us and to camera left, the light is wrapping in from camera right, which is open to the field.
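
A quick aside on the arithmetic behind that “a stop darker” guideline: each stop is a doubling or halving of light, so a background one stop darker than your subject is reflecting half as much light. Here’s a minimal sketch of the math (the luminance numbers are purely for illustration):

```python
import math

def stops_between(subject_lum: float, background_lum: float) -> float:
    """Difference in stops is the base-2 log of the luminance ratio."""
    return math.log2(subject_lum / background_lum)

# A background reflecting half the subject's light is one stop darker;
# a quarter of the light is two stops darker.
assert stops_between(100, 50) == 1.0
assert stops_between(100, 25) == 2.0
```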

Our goal here is to make the most of the tools we have to create a naturally lit interview that feels organic and makes our subject look great. So let’s start playing with our toys and see what each does for our shot.

2. Bounce 3/4 on camera left

Our first setup is to bounce light on the same side as the sun, 3/4 angle camera left. This evens out the light on his face, eliminating the dark areas created by the trees in front of him. But it’s pretty flat. Let’s move the bounce a little farther to the right…

3. Bounce under lens

Placing the bounce directly under the lens is a pretty common sight on film sets. It’s a great way to get a nearly invisible fill up into the eye sockets of your talent. But in this case, it still feels pretty flat.

4. Bounce 90° camera right

Placing the bounce on the camera right side gives us a nice dimensionality, but it doesn’t look organic, because the sun is coming from behind and to camera left. So it looks lit. No good. Let’s put down our bounce for a minute and see how the negative fill affects our shot.

5. Neg only

The neg by itself does a nice job of evening out the light on his face, but he’s now too dark overall. If we raise our exposure to compensate, the background will get too hot, and we want to leave it alone at a stop under. So let’s bring back our bounce, on the same side as the sun (called “same-side fill”), and see what happens.

6. All in: bounce camera left and neg camera right

Wow! This looks pretty good. Adding the bounce wraps the light of the sun around his face very naturally, and the neg on the other side gives us the 3-dimensionality that we’re always striving for in cinematic shooting. 

If anything, I’d say we could have raised the 4×4 bounce a little higher to do something about the shadow that’s forming a triangle between his camera right cheek and eye. 

What techniques do you use to shape available light for cinematic outdoor interviews?

How to choose the right hard drives for 4K video projects


One of my students at Seattle Film Institute asked me a question the other day: “How do you choose hard drives for 4K video?”

Most beginning filmmakers are on tight budgets. So my short answer was: “Buy the cheapest drives you can afford to store your media, and the most expensive drive you can afford to edit it.”

Let’s unpack what that means in today’s technology landscape.

When I get a new 4K project, I buy two hard drives big enough to hold all the project media. In my case, that’s generally 1 to 2 terabyte drives. At the end of each day of production, I lay off the files to both simultaneously. I use Hedge, which gives me two backups of the media from the get-go.

Drives I recommend

The drive I have most frequently chosen for this is the 2TB Backup Plus Slim Portable External USB 3.0 Hard Drive. It currently costs $65, and Blackmagic Disk Speed Test clocks it at 75 MB/s. That’s far too slow to edit 4K video on, of course, but we only need it for storage. The nice thing about USB 3.0 is that it’s compatible with just about any computer out there, Windows or Mac. So if your client wants the files at any point, you can simply hand them the drive.

If you have a computer with USB-C, however, I recommend the aPrime ineo rugged waterproof IP-66 certified drives. For $95, you’re getting a drive you can drop in the water, with rubber bumpers to break its fall, and a built-in USB-C cable. These drives clock for me at around 110 MB/s read and write speeds. So for a little more money, you get drives that are a LOT more rugged and a little bit faster. 

Both the above drives are about the size of a typical iPhone. And that matters to me – they will (hopefully) live out their lives in a drawer. I like that they won’t take up much space.

Small is the new big

So now let’s talk about the fun stuff: speedy editing drives. I used to rely on toaster-sized RAID arrays to get the speed and reliability I needed for editing. But with SSDs, that’s no longer the case. With solid-state media, I have found speed, reliability AND the ability to take entire projects with me wherever I go. With this freedom, the only time I’m cutting at a desk is when I’m doing audio passes with studio monitors. I’ll connect my laptop to a larger monitor at various stages of the project, but even then, I tend to park myself all over the house: the kitchen table, say, or the coffee table in the living room.

My tried-and-true favorite 4K editing drive is currently the 1TB T5 Portable Solid-State Drive (Black). I get read/write speeds of about 300 MB/s, which is more than fast enough to edit 4K video. This drive is the size of a business card (and only a little thicker). It is now available in a 2TB size for under $500, which seems like a bargain to me. But the landscape is changing.
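
A quick way to sanity-check whether a drive is fast enough for editing is to divide its measured throughput by your codec’s data rate. The figure below is approximate (Apple’s ProRes white paper puts ProRes 422 HQ at UHD 24p at roughly 88 MB/s), so treat this as back-of-the-envelope math:

```python
PRORES_422HQ_UHD_MB_S = 88  # approximate data rate, ProRes 422 HQ at UHD 24p

def streams_supported(drive_mb_s: float,
                      codec_mb_s: float = PRORES_422HQ_UHD_MB_S) -> int:
    """How many simultaneous streams of a codec a drive can sustain."""
    return int(drive_mb_s // codec_mb_s)

# The 75 MB/s storage drive can't sustain even one editing stream,
# while a ~300 MB/s SSD comfortably plays back three.
assert streams_supported(75) == 0
assert streams_supported(300) == 3
```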

Here comes Thunderbolt 3 

Most new Macs now support Thunderbolt 3. If you are one of the fortunate people who has one, I invite you to behold the 2TB X5 Portable SSD.

I hesitate to call it affordable at $1,400, but it gives wings to your 4K projects. I recently retired my late 2013 MacBook Pro and made the leap to a late 2018 MacBook Pro, so I finally have a computer that can keep up with such a beast.

I’ve been putting the X5 through its paces by editing multiple streams of ProRes Raw 4K DCI on a project that weighs in at 1.7 TB. With everything loaded on the drive, here’s how the X5 performs:

X5 Write/Read times with drive 3/4 full
The X5 clocks even faster read and write times in Blackmagic Disk Speed Test

It’s interesting to note that even though this drive is lightning quick, it’s still not nearly as fast as the internal drive of the 2018 MacBook Pro:

Late 2018 MacBook Pro internal drive speed test

Putting it all together

What these numbers tell me is that to get the absolute best performance from Final Cut Pro X, you want to keep your FCPX Library file on your local hard drive and store all of your media on the X5. Beyond speed, this has the added benefit of allowing automatic backups of your FCPX project files. To get near real-time backups, use an automatic cloud-based backup service like Backblaze. Because it runs in the background, Backblaze won’t slow you down at all, and you won’t have to remember to back up your project. Note, however, that Backblaze is not an efficient way to back up your media drives. But you’ve already got yourself covered there with those cheap backup drives.

For longer 4K projects like feature-length films, you’re of course still going to be living in the land of RAID when choosing hard drives for 4K video. But for smaller projects, I find this three-drive system, in which you back up your media on two cheap drives and edit from a single fast one, is a winning formula.

What hard drives for 4K video are you using? 


To infinity and beyond: A fix for the focus issues with Contax-Zeiss and Metabones Speedbooster

Well, that sure was simple. Turns out the solution to fixing the infinity focus issue on vintage Contax-Zeiss with Metabones Speedbooster doesn’t involve using a Dremel tool. And no, you don’t have to recalibrate the lenses, either.

You just have to loosen a tiny set screw on the Metabones Speedbooster, and rotate its lens element (in my case, counter clockwise). Boom! Everything comes into focus.

Each of my vintage CY Zeiss lenses has a slightly different infinity focus point, however, so that means setting the infinity focus to the lens that is the farthest off. This means the rest of the lenses now focus beyond infinity, which isn’t really a problem. All of my modern AF Canon lenses do that by design. But it does mean the witness marks are off to varying degrees.

There’s one other issue with this fix that affects my Sony FS5: rotating out the rear element on the Speedbooster causes it to protrude further (see image below). As a result, when the Speedbooster is screwed into the FS5, it makes contact around the edges of the ND filter mechanism. That’s not ideal, but it’s a slight contact, and the ND filter functions normally. So I’m going to chase the shallow look to infinity and beyond.

I love the vintage look of these lenses so much that, before I discovered this solution, I was shooting pretty much every interview with them. That wasn’t an issue, because interviews never happen at infinity. But now that I have the full range of focus, I’m a heck of a lot more likely to shoot my b-roll with this glass, too.

Want to see for yourself how amazing this vintage look is? You can rent my infinitely focusable 5-lens set of Contax-Zeiss cine-mod lenses on ShareGrid Seattle for $60/day. Set includes modified Speedbooster for use with Sony E-mount cameras.


Teradek Serv Pro turns iPhones into superb client monitors


Teradek Serv Pro, ready for action

For smaller documentary film productions, a client monitor is a luxury that’s rarely on the table. Not only are such setups expensive, they’re complicated: first you need to set up a radio network, and then a monitor. It’s costly and time-consuming.

But Teradek has changed that equation with the Serv Pro, a small camera-mountable box that creates a Wi-Fi network that up to 10 iOS or Android devices can join to monitor video. It’s been a game-changer for me and my clients. Here’s why it’s my new favorite tool on location.

I’m often shooting projects where up to three client representatives are on location. Client reps like to have something to do besides just watching you work, but they also need and want to stay out of my way. On larger sets, they’re usually clustered around the monitor. The Serv Pro gives them that experience: the opportunity to see what’s going on, and to offer feedback if they want, based on the image they’re seeing on their own phone or tablet.

I was skeptical at first that the quality would be that good. I imagined having to spend time disclaiming the image, telling them it would look better in post. That turns out not to be the case. On an iPhone 7 or newer, the image is not only top-notch; I find it’s as good as, and in some cases better than, the image coming to my Shogun and SmallHD monitors.

HD video monitoring with professional controls

Not only is the image bright and sharp, but all of the controls you would expect to find on a SmallHD monitor are included in the free Vuer app! You read that correctly: you can apply a LUT, view peaking or zebras, even false color if you want. You can look at a vectorscope, a histogram, and so on. But clients generally just want the image to look good, and the Serv Pro has that covered, too.

I generally shoot in Slog, so before the project I email my clients a LUT designed for client viewing (with slightly crushed blacks), then on the morning of the shoot I show them how to load it into the app. It takes just a few clicks, and boom, the image they’re seeing looks fantastic.

Shogun’s problem and solution

When I first started shooting with a Shogun Inferno, I was disappointed to find that I couldn’t get an HD signal out of the Inferno while shooting RAW, which pretty much meant I couldn’t use the Serv Pro. But the latest firmware (version 9, which also supports ProRes Raw) has fixed this issue! It’s now possible to record raw to the Shogun Inferno AND send an HD signal out to the Serv Pro.

Wooden Camera battery plate for Sony FS5

Serv Pro attached with velcro to the back of a battery plate

Mounting the Serv Pro to camera

It’s possible to attach the Serv Pro to the camera using the 1/4″ mounting holes drilled into the unit in two places. But what I’ve found works great for me is to attach two strips of velcro and mount it to the flat back of a battery plate. This doesn’t cause the unit to overheat, and it keeps a low profile on the camera while shooting.

Battery life

It’s possible to power the Serv Pro from a smaller battery with a P-Tap, but I find that using full-size V-lock batteries is the way to go. A 98Wh battery will power my camera, Serv Pro, and Shogun for 2.5 hours.
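
That 2.5-hour figure squares with simple watt-hour arithmetic: runtime is battery capacity divided by total draw. The draw numbers below are my rough assumptions, not measured specs:

```python
def runtime_hours(battery_wh: float, loads_w: list[float]) -> float:
    """Runtime in hours = capacity (Wh) / total draw (W)."""
    return battery_wh / sum(loads_w)

# Assumed draws: FS5 ~20 W, Shogun Inferno ~14 W, Serv Pro ~5 W
hours = runtime_hours(98, [20, 14, 5])
assert round(hours, 1) == 2.5
```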

The Serv Pro costs $1,800. I rent it to my productions for $105/day. My clients love it so much they call me before shoots to make sure I’m bringing it. So it’s become a standard line item in most of my productions, and I will have it paid off by the end of the year.

You can rent my Serv Pro and check it out for yourself! My kit, which includes HDMI, SDI and power cables in road case as pictured above, is available on ShareGrid for $105 for the day or weekend for productions in the Seattle area.

Rent my Teradek Serv Pro on ShareGrid for $105.

Do you provide monitoring for your clients?

Tentacle Sync gives timecode a good name


Tentacle Sync in “green” mode feeding timecode to a Sound Devices MixPre3

When I made my first film about 7 years ago, I was sort of lucky. There was this little software app called PluralEyes that enabled me to do dual-system sound without knowing the first thing about timecode. That was cool, and it served me rather well for a long time. But recently I’ve been edging out of this friendly zone into more dangerous territory where I’m getting hired by directors who expect me to provide files with timecode. Because, well, timecode.

But I’ve been pleasantly surprised to discover that, with the right tools, timecode doesn’t have to be scary. In fact, it has really sped up my post-production workflow. But I’m getting ahead of myself. First, a reminder of why timecode can suck.

Timecode is complicated

When a director hired me earlier this year for a timecode shoot, I did what any self-respecting DP would do: I hired a professional sound recordist. They usually come with everything needed for timecode – a recorder/mixer that generates timecode, and the “lockit” boxes that plug into cameras that accept timecode, for a complete solution. But this time, the recordist told me her lockit box was broken, so I’d have to provide something myself.

I was able to rent a lockit box from Lensrentals, and I brought it to the shoot assuming the sound recordist would know what to do with it. She responded as if I’d just handed her a snake. Fifteen minutes later, she allowed that she didn’t know how to make this particular box work. I took one look at the long row of cryptically labeled dip switches, and…told the director I’d record reference audio and he could use PluralEyes to sync the footage.

The look on his face told me I’d disappointed him.

So the next day, I did some research. Turns out there are a lot of solutions that claim to be “industry standard” on the market. Most of them are big, clunky, and expensive. For example, the lockit box I’d rented is about twice as big as a Sennheiser wireless mic receiver, and has to be somehow anchored to the camera. But my Sony FS5 doesn’t have a timecode in port, so how do I work around that? Finally, it needs to be externally powered somehow…

But there’s a better way. It’s called Tentacle Sync.

Tentacle Sync is simple

Tentacle makes a tiny plastic box (less than half the size of a Sennheiser G3 wireless mic receiver) that has built-in velcro for attaching to your camera. It doesn’t need external power, as it comes standard with a built-in lithium ion battery that will last for even the longest day of shooting.


Tentacle Sync doesn’t take up much space on a Sony FS5

There is only one switch on the Tentacle: on/off. All other controls are accessed through an app, which connects wirelessly over Bluetooth from either iOS or Android.


Each Tentacle is named at the factory (or you can assign your own custom name) to tell the units apart

I use a Sound Devices MixPre-3 as my primary recorder when I’m not working with a pro recordist, and it does not generate timecode on its own. It does, however, support timecode, meaning it will accept timecode from a generator. In practice, this makes the MixPre-3 a fine timecode-capable recorder/mixer when paired with a Tentacle Sync. Used this way, you need one Tentacle Sync unit for the recorder and one for each camera you’ll be using on the shoot. A Tentacle Sync E Timecode Generator with Bluetooth (Single Unit) sells for $289 apiece, but you can save money by purchasing a set: the Dual Set sells for $519.

I now own three of the boxes, and they allow me to pair two cameras with timecode generated by the third unit, which lives on the MixPre 3.

Tentacle Sync E comes with everything needed to get up and running, and more

Tentacle Sync E comes with everything needed to get up and running, and more

How to configure a Tentacle Sync

Setting up the units is a breeze. Each comes with a unique name from Tentacle (I’ve also labeled mine so I can tell them apart easily). Press and hold the on switch until the light flashes green: that puts the unit in timecode generator mode. Put this one on the recorder. Press and hold the others until the light turns blue, which puts them into timecode receiving mode. Then briefly connect the master to each slave to sync the timecode.

The Tentacle Sync units come standard with a 1/8″-to-1/8″ TRS cable, which works great for feeding the auxiliary input of the MixPre-3 (which can be set to receive timecode). It also works great for sending timecode to DSLRs or mirrorless cameras via the audio jack.

Larger cameras like my Sony FS5 that don’t have a dedicated timecode-in port can also receive timecode via one of the two audio channels, although you’ll need a Tentacle Sync Tentacle to 3-Pin XLR Cable (16″), which they sell, to get the timecode out of the Tentacle box.

I have a Shogun Inferno, which has a dedicated Sync port for Linear Timecode (LTC) in, and for that, you’ll need a Tentacle Sync Tentacle to BNC Cable (Straight, 16″) or Tentacle Sync Tentacle to BNC Cable (Right-Angle, 16″).

A professional sound recordist I work with frequently, Scott Waters, loves the Tentacle boxes. He owns the Hirose-to-1/8″ cable required to sync timecode from his Sound Devices 633 to the Tentacle Sync on each of my cameras.

Tentacle workflow

It’s one thing to get timecode onto an audio channel of your camera, but that’s a non-standard way of doing timecode. I’m not even sure DaVinci Resolve can read timecode from an audio channel if you’re using Resolve to batch-sync, as many productions do. So how do you take advantage of it?


Syncing is near instantaneous in Tentacle Sync Studio, and offers XML and file export options plus multicam mode

Enter Tentacle Sync Studio. This is drop-dead simple software designed with smart defaults, which looks for timecode on one of the audio channels of all media dropped into it. If it finds timecode in an audio channel, it automatically uses that to sync, rather than the timecode in the video file. Once you drop all of your footage and audio into the app, it automatically builds a sync map, which you can view to see which of your footage is synced.

You can export footage via XML into your NLE of choice, or you can export new clips with the good audio embedded in them. You can optionally keep your reference audio, but the default is to replace the reference with the clean audio (which is what I almost always want).

Shogun’s 2.5 frame delay

I use Tentacle Sync with my Shogun Inferno via a BNC cable, and it works great, with a small caveat. For some reason, there is a 2.5 frame offset in the timecode, meaning the audio needs to be slipped 2.5 frames forward in the timeline to be perfectly in sync. Luckily, the Shogun Inferno has a timecode preference that allows you to set a timecode offset, but it only works in 1-frame increments. So I’ve set mine to -2 frames, and it’s close enough.


Shogun Inferno’s timecode controls allow offsetting the TC signal. I find a -2 offset works best.
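
To see why a -2 offset is “close enough,” it helps to convert frames to milliseconds: the half-frame residual left after the correction is on the order of 20 ms, right around the threshold where lip-sync error starts to become noticeable. A quick sketch, assuming 23.976 fps playback (the math is the same at any rate):

```python
def frames_to_ms(frames: float, fps: float) -> float:
    """Convert a timecode offset in frames to milliseconds."""
    return frames * 1000.0 / fps

fps = 23.976
assert round(frames_to_ms(2.5, fps)) == 104  # the full observed Shogun delay
assert round(frames_to_ms(0.5, fps)) == 21   # residual after the -2 correction
```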

Tip: When sending timecode to your MixPre-3, if you use presets (and you should, because they are a huge timesaver), make sure you configure each preset to accept timecode via the aux in. I forgot to do that, and when I chose a new preset it reset to the default (timecode in via HDMI). I was very sad when I discovered in the edit that most of my files contained no timecode.

Why use timecode on your projects instead of PluralEyes? Because it saves time. Syncing based on audio waveforms is very processor-intensive, and on a large project it can literally take hours. It’s also messy (creating folders full of temporary files) and not always accurate. Some files just never sync, forcing you to align them one at a time, a very cumbersome process.

With Tentacle Sync Studio, timecode syncing is almost instantaneous. And the Tentacle Sync boxes themselves are unobtrusive, and can be attached with velcro (included in the kit) to any size of camera.
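
The reason it’s so fast is that timecode turns syncing into pure arithmetic: every clip’s start timecode pins it to a shared clock, so matching audio to video is a subtraction rather than a waveform search. A toy illustration (non-drop-frame timecode; the function names are mine, not Tentacle’s):

```python
def tc_to_frames(tc: str, fps: int) -> int:
    """Convert HH:MM:SS:FF non-drop-frame timecode to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def sync_offset(video_start: str, audio_start: str, fps: int) -> int:
    """How many frames to slip the audio so it lines up with the video."""
    return tc_to_frames(video_start, fps) - tc_to_frames(audio_start, fps)

# The camera rolled 3 seconds and 10 frames after the recorder did,
# so at 24 fps the audio leads the video by 82 frames.
assert sync_offset("01:00:03:10", "01:00:00:00", fps=24) == 82
```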

Since I’ve discovered the joys of syncing with Tentacle, I pretty much use it on every project now. What about you? How do you keep your clips in sync?


Exploring Apple’s new ProRes Raw with Sony FS5

Apple’s ProRes Raw is the most exciting thing to come out of NAB this year. Fancy new LEDs that imitate police lights are cool, but they will be surpassed by even cooler lights within a year or two. ProRes Raw, however, is going to be with us for a long time. And for good reason.

I’ve been playing with ProRes Raw for the last few days, using my Sony FS5 and Shogun Inferno, bringing the footage into Final Cut Pro X to learn how it works and what it can offer me and my clients.

First of all, let me just say, I’ve never been a big fan of 4K. Few of my clients ask for it, and when they do, it slows me down in my post-production by a factor of three or four. So it’s expensive. Everything takes forever to render, and in the end, we invariably end up delivering HD anyway. So why shoot it?

Well, ProRes Raw just gave us some pretty compelling reasons.

  1. First, it doesn’t slow your editing down nearly as much as before. I found that by setting playback to “Better Performance” in FCPX, I was able to play back a ProRes Raw 4K timeline on my late 2013 MacBook Pro without any frames dropping.
  2. Secondly, to get ProRes Raw from the FS5 and Shogun, you MUST acquire in 4K. There’s no option to record a 2K signal, apart from recording in high frame rates. So it would take longer to down-convert and transcode your footage to HD than to stay in 4K.
  3. Thirdly, the options you get in post by acquiring raw in 4K are actually pretty cool, given that they no longer slow you down the way they used to.

So, if you could have a Tesla for the same price as a boring gas car, which would you buy? (Only, you’re going to have to wait a long time for that Tesla, and you can have ProRes today.)

Consider this:

  • No transcoding = no need to make dailies. Your raw files ARE your dailies – and you can change your mind about camera LUTs later without re-rendering 🙂
  • 12-bit color depth. Woot.
  • File sizes aren’t much bigger than current flavors of ProRes

So how does it work?

To get started working on your ProRes Raw footage in FCPX, you need to show the “Info Inspector” and change the default view from Basic to General. This reveals the ProRes Raw-specific controls that allow you to debayer your footage.

RAW to Log Conversion controls which flavor of Log you want to debayer to (it’s also possible to choose None and work directly with RAW – more on that in a moment).

Camera LUT applies a LUT to the LOG signal before it’s sent out – akin to baking in a look on your recorder (but it allows you to change your mind later).

The first thing I wanted to test out was how the Shogun was interpreting the raw signal, and see if that was the same as the way Final Cut Pro X interprets it. Turns out, they’re different. And that has implications on how you shoot.

Shogun interprets the raw signal from a Sony FS5 like Slog-2 – only it’s not.

When you look at the raw signal on the Shogun, it doesn’t matter whether you send it Slog3 or Slog2 from your FS5: the Shogun treats it as an Slog2 stream, at least for the purpose of applying LUTs. It’s NOT an Slog2 stream, though; it’s raw sensor data. But the Shogun receives metadata from the camera so it can interpret the signal, and whatever it’s doing, it’s doing in Slog2. So when you drop on a LUT designed for Slog3, it looks all wrong (way too crunchy). Until we have more options in FCPX, it’s important to stick with Slog2 LUTs for monitoring on the Shogun.

When you bring your ProRes Raw footage into Final Cut Pro X, however, there’s no option to debayer the footage as SLOG2. Currently the only options for Sony cameras are Slog3 in two flavors – either Sgamma3 or Sgamma3.cine.

You can also apply a camera LUT in FCPX at the same time you are debayering, but it’s important to note that if you do, it’s the same thing as applying a LUT at the beginning of your color pipeline, rather than at the end of it. Before ProRes Raw, this was generally the wrong place to do this, because if the camera LUT crunches the shadows or clips the highlights, you won’t be able to recover them when making adjustments to the clip later. But. And this is a big but: When you apply a camera LUT to ProRes Raw at the source, THERE IS NO RENDERING. Did you hear that correctly? No rendering.

Actually, with FCPX 10.4.1, it’s not only possible to apply a camera LUT to RAW, but you can also apply a camera LUT to any LOG footage.

Yep, you can play with LUTs to your heart’s content, find one that gives you a ballpark without clipping your highlights, and take it from there. So this really is a fantastic way to make dailies – but with the ability to change your mind later without incurring any rendering penalty. How cool is that?

But (a small one this time). Since you can’t monitor with Slog3 LUTs on the Shogun (or more correctly, since the Shogun won’t interpret your raw footage as anything other than Slog2), and since applying an Slog2 camera LUT in FCPX doesn’t work because the footage is debayered as Slog3, we’re at a bit of an impasse, aren’t we?


Matching Slog2 to Slog3

Turns out it’s not that difficult to overcome. It’s entirely possible to find an Slog2 LUT for shooting that comes fairly close to matching what you’ll see off the sensor in Slog3 in FCPX. I did this by placing my MacBook on a stand next to my Shogun and trying lots of different LUTs. By loading the SONY_EE_SL2_L709A-1.cube LUT onto the Shogun, I’m able to get pretty close to what I see when I open the clip in FCPX using the default Slog3 raw-to-log conversion and the default Slog3/Sgamma3 camera LUT. I do look forward, though, to FCPX supporting more debayering options.

OK enough about LOG. Let’s roll up our sleeves and get straight to the real RAW, shall we? All you have to do is select None as the option on Raw to Log Conversion, and you get this:

You get an image that looks, well, raw. To work on it, you have to drop it into a timeline, and start applying color controls. That’s great and all, but now you have to deal with rendering. So right away you can see that there are some workflow reasons why you’d want to stick with Log. Having said that, you can do all the corrections you want to do and save them using “Save Effects Preset” so you can apply them in a single click later. Not the same as a LUT, but definitely a timesaver. But I digress.

I was blown away by how, within FCPX, you can do pretty much the same things I’m used to doing to an image in Photoshop. You can recover highlights until forever! But you do have to be careful: pulling highlights down affects the rest of the image, so you have to jack a contrast curve something crazy if you want to recover highlights outside a window. But you can do it:


Recovered highlights in a bright window

You’ll notice that the brightest highlights went a little crazy there, but good news – you have pinpoint control over color saturation using hue-sat curves in FCPX. After a little playing around, I was able to calm it down like so:


Tamed highlights

The bottom line is you quickly find yourself spending a LOT of time fussing with things when you get into grading the raw RAW. But it sure is a neat thing to be able to do it when you want to.

I think I’m going to find that I stick with a Log workflow for now, but with the option to go straight into raw for some challenging shots, where I really want the kind of control that Photoshop gives a still photographer.

Blowing the image up to 200 percent, a little noise becomes apparent after all the pushing around:


Noise in 12-bit raw

But dare I say, it’s filmic noise, not ugly digital noise. That right there is the benefit of shooting 12-bit, most def. Shooting 10-bit 4K on the Shogun, you definitely get noise issues in the shadows; with 12-bit, noticeably less so.

Here’s a side-by side comparison of 10-bit SLOG2 footage with LUT applied next to RAW 12-bit:

4K enlarged 200 percent

There’s currently no tool in FCPX raw controls akin to the denoise setting in Adobe Camera Raw tools. So to denoise this, you’ll need a plugin like the excellent Neat Video Denoiser. And that, my friends, is where the Apple ProRes party slows down a bit. You’ll definitely want to apply that filter as the very last step of any project, because the render times with denoising are painfully long.

But never mind a little noise that has to be enlarged to be seen. ProRes Raw has me all fired up. This codec looks like a game changer to me. More workflows will emerge quickly, supported by more software and hardware tools, such as cameras that shoot direct to ProRes Raw. In the meantime, Apple has handed us Sony FS5 owners a way to stay on the bleeding edge for the foreseeable future (without upgrading to the Sony FS5 MKII, thank you very much).

 


Sennheiser AVX is a great wireless system – but not with DPA lavs

Tip: power AVX receiver all day with a cellphone backup battery

I recently took advantage of Sennheiser’s trade-in program for wireless mics in the 600MHz range. I got $100 from Sennheiser for trading in my soon-to-be-illegal G3 wireless package toward a fancy new Sennheiser AVX system, lured by its simplicity. I figured it would be a real beast paired with my DPA D:screet 4061 mic. And I was right. But not in a good way.

Turns out the DPA 4061 turns the AVX into a virtual theremin. Move the cable anywhere near the antenna and you get humming and buzzing interference. That would be fine if you were creating sound effects for Gravity, but not so great if you just want clean dialog.

It appears the culprit is the thinly shielded cable on the DPA 4061, which is no match for the transmitter on the AVX. The problem disappears when you plug in the OEM mic from Sennheiser, which sports a thicker rubber coating.

Another gotcha with the AVX is that the receiver only lasts 3 hours before needing a recharge. So unless you plug in while recording, you’re likely to need extras of those $50 proprietary Sennheiser batteries. I thought I’d found a clever solution by powering the receiver from the same battery that powers my Sound Devices MixPre-3 mixer/recorder, but it turns out that causes interference too. So my solution is to use a portable cellphone battery stick to power the AVX receiver while I’m working. Luckily, the AVX receiver can take a charge this way while it is operating, so this works great, even if it is a little unwieldy with all the cables.

If I’d known all of this when I bought the mic, I probably would have waited for the forthcoming Sennheiser G4 wireless. Because you can’t beat the sound of the DPA D:Screet mics.

 

Sony FS5 shooting tip: toggle auto ND on and off

Sony’s variable ND filter on the Sony FS5 is a killer feature. It gives you precise control over exposure levels by allowing you to dial ND levels up or down, in steps that are fractions of a stop. Virtually every other camera on the market today engages ND in 2-stop increments, which is like using a sledgehammer to set a thumbtack. And yet…

Sony’s variable ND isn’t truly stepless until you engage the auto ND

So if you want stepless exposure control, I definitely recommend turning it on, at least for brightly lit scenes that require ND.

But wait, isn’t cinematography all about control? Isn’t auto-anything giving up control to the camera instead of keeping it for yourself? Well, if you’re shooting outside, you’re likely going to be using ND anyway. And using auto-ND simply lets you get into the ballpark of where you’d get manually, only much, much faster – and with the ability to seamlessly adjust to changing light if you want it to. I view this as the camera giving me MORE control, not less.

Here’s an example of a scene in which auto ND gives a perfect result. Notice that I’ve set auto exposure to +1.5 because I’m shooting log (which should routinely be overexposed to reduce noise):

Auto ND enabled on the Sony FS5
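If you’re curious what that +1.5 offset means in terms of actual light, each stop doubles what hits the sensor. A quick back-of-envelope sketch (my own arithmetic for illustration, not anything the FS5 exposes):

```python
# Illustrative arithmetic only: what a +1.5 EV exposure offset means in light terms.
ev_offset = 1.5                  # auto-exposure compensation dialed in for log
light_multiple = 2 ** ev_offset  # each stop doubles the light
print(round(light_multiple, 2))  # ~2.83x a "neutral" meter reading
```

That extra light is what pushes the log signal up out of the noisy shadows, to be pulled back down in the grade.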

As I pan my camera around through a door frame, leaving auto-ND engaged would result in overexposure. So I simply toggle auto-ND off, which freezes the auto levels and gives me full manual control of everything.

Auto ND toggled off. Exposure is frozen at the last auto setting, preventing our exposure from changing as we pan through the dark foreground area.

I set up my #6 button (located on the inside of the grip) to toggle auto ND on and off. So as soon as I get my ballpark exposure, I turn it off. That way the exposure isn’t riding all over the place if I pan the lens behind a dark wall that I want to keep dark.

#6 custom button is located under the Sony FS5’s rotating grip.

My (wicked fast) auto-ND workflow

  1. Set aperture.
  2. Toggle auto ND on (boom – exposure is now perfect)
  3. Toggle auto ND off (freezes settings without changing them)
  4. Adjust exposure up or down as needed with variable ND wheel or other means.
  5. Repeat as needed.

Using this approach, I can be up and rolling almost instantly, and adjust to just about any lighting condition in real time. I don’t have to think about anything other than whether that auto ND button is on or off. Sometimes I want it on (when panning through a scene with different brightness levels, or traveling from one room to another, for example). Sometimes I want it off (when a dark object momentarily passes through the frame, like panning through the doorway above).

Do you use auto ND to control exposure on your Sony FS5? Got any tips to share about how you use it?


Sony FS5 Tip: Keep your internal battery in when rocking an external battery

Sony FS5 with V-mount battery and, most importantly, its internal battery on deck

I got all fancy and started using a v-mount battery with my Sony FS5 recently. I had to jump through some hoops to get it working, but with the help of a Wooden Camera V-Mount Battery Plate for Sony FS5, everything worked as if it were designed by Jesus.  I didn’t even need to put in the camera’s internal battery, and it saved weight to skip it. But this morning, I was shooting this timelapse…

It was a long timelapse, and the v-mount battery died during the shoot. No biggie, right? We know from experience that when an FS5’s internal battery dies, it saves the clip before shutting down. But when I opened the SD card on my laptop, lo and behold, the clip size was zero KB. Oh shit.

File sizes of zero bytes are big trouble

Yep, it turns out the Sony FS5 likes having its external power cut while rolling about as much as a DJ likes it when a raver trips over the cord to his mixer. It’s a train wreck, full stop.

So I thought, hmmm. Wouldn’t it be cool if you could insert the camera’s internal battery, and just have it automatically know, when you plug in that external power, that it should defer to that, while providing instant backup if the power is cut?

So I said a little prayer (to Sony engineering), rolled the camera with both batteries in place, put my finger on the v-mount eject button, and pressed it. Off came the battery with a clunk, and… the LCD screen dimmed slightly, and… the camera kept right on rolling!

Bless you Sony engineering

In hindsight, it makes perfect sense. So maybe I was the last camera person in the world ignorant enough to make this mistake.  I’m just grateful I was able to learn this lesson on my own dime, rather than at the end of a long client interview.

It’s good to know that, with the camera’s internal battery in place, I can hot-swap as many v-mount batteries as I like, and all my files will be recorded safely to the SD card.

Did you know to keep your internal battery loaded when powering your FS5 externally? Have you ever experienced a power-related data loss?

 


Plugging the Sony FS5 ND filter gap

Before I say anything bad about the built-in ND filter on the Sony FS5, let me begin by saying that it doesn’t suck. In fact, the Sony FS5 ND filter is so good, it’s part of the reason I switched from Canon to Sony a couple years ago. I’m still amazed by how easy it is to dial in precisely the level of ND you need every time. Well, almost every time.

When you need just a little bit of ND, there’s a gap. It’s not a huge gap: a tad more than 2 stops of light between no ND and the lightest 1/4 setting. But it’s one I’ve encountered again and again when, for example, shooting an interview wide open.

A common situation

I often want to shoot interviews wide open to separate the subject from the background. So with an f/2.8 lens, I’m as wide as I can go. And what I often find is that the room is just a little too hot, about a stop over. When I engage the variable ND at its lowest setting, it takes the room down 2 1/3 stops, leaving me a good stop underexposed.

A common workaround

The only in-camera solution that won’t degrade the image (apart from increasing the shutter speed, which I don’t want to do for aesthetic reasons) is to stop down to f/4. Note: if you’re shooting glass faster than f/2.8 (and you don’t mind really throwing your background out of focus), all you have to do is open to f/2.0 and engage the ND. Boom, done. But a lot of my glass is f/2.8, and that’s where I find this gap.
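For anyone who likes to sanity-check the stop math behind the gap, here’s a quick sketch (my own arithmetic, not a camera feature; note the FS5’s “1/4” setting is nominally 2 stops, though in practice it measures closer to 2 1/3):

```python
import math

def aperture_stops(n_from, n_to):
    """Stops of light lost stopping down from f/n_from to f/n_to."""
    return 2 * math.log2(n_to / n_from)

def nd_stops(transmission):
    """Stops of light reduction for an ND of a given transmission fraction."""
    return -math.log2(transmission)

print(round(aperture_stops(2.8, 4.0), 2))  # ~1.03: f/2.8 -> f/4 loses about a stop
print(nd_stops(1 / 4))                     # 2.0: nominal "1/4" ND setting
```

So a scene that’s a stop hot gets pushed a good stop under by the lightest ND setting, while stopping down to f/4 lands exposure right but costs the wide-open look.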

What the gap looks like on a monitor

f/2.8 wide open without any ND. I’m looking at those highlights and saying we’re 1 stop too hot.

But when I engage the ND at its lowest 1/4 setting, now we’re 2 1/3 stops underexposed. Too dark.

Turning off the ND and closing the aperture to f/4 gives me the exposure I want – but I want to shoot at f/2.8.

Finally, here’s the shot at f/2.8 with a Tiffen .3 Water White ND filter to bring us down one stop. Just right!

Closing the gap

I’ve found the right tool for plugging this gap is a .3 neutral density filter. The .3 gives you just one stop of light reduction, the least powerful ND filter that you can readily buy.
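ND naming can be confusing, because filters are labeled by optical density rather than stops. The conversion is standard photographic math (nothing brand-specific), and a one-liner makes it obvious why .3 is the one-stop filter:

```python
import math

def density_to_stops(density):
    """Convert ND optical density (0.3, 0.6, 0.9, ...) to stops of light loss."""
    return density / math.log10(2)

for d in (0.3, 0.6, 0.9, 1.2):
    print(d, "->", round(density_to_stops(d), 1), "stops")
# 0.3 -> 1.0, 0.6 -> 2.0, 0.9 -> 3.0, 1.2 -> 4.0 stops
```

Each 0.3 of density cuts the light in half, which is exactly one stop.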

The one I prefer is the Tiffen 77mm Water White Neutral Density 0.3 Filter. The quality of the glass is superb.

I’ve also got a 4×4″ Neutral Density 0.3 Resin Filter made by Lee, which is also excellent. But I like the 77mm screw-in best because, with an appropriate step ring, I can screw it onto the front of most of my lenses very quickly without rocking a matte box.

Have you encountered this situation with your Sony FS5? How do you “mind the gap”?