Aug. 27, 2020

Emmy-nominated DP Greig Fraser ACS, ASC on Disney’s “The Mandalorian”

Cinematographer Greig Fraser ACS, ASC takes us behind the scenes of Disney’s groundbreaking live-action Star Wars series “The Mandalorian.” He discusses shooting in-camera effects with ALEXA LF on the series, which is Emmy-nominated in 15 categories.

We caught up with Greig Fraser in Los Angeles to talk about the Disney+ series, “The Mandalorian,” the first live-action series in the Star Wars saga. Unlike previous Star Wars efforts, which relied heavily on location shoots or green or bluescreen technology, “The Mandalorian” used real-time, in-camera compositing. This involved shooting on a stage in front of a structure termed the Volume: a concave video wall comprising 1,326 LED screens, enveloping the set with photo-real digital backgrounds representing any number of different planetary landscapes or interiors. The series was captured with ALEXA LF cameras and is Emmy-nominated in 15 categories, including outstanding drama series, cinematography, and visual effects.

What was the look and style of “The Mandalorian”?

[Showrunner] Jon Favreau and I are both big Star Wars fans, and I had done a Star Wars film before with “Rogue One.” I had studied a lot of Star Wars and believe it or not, there is a very classic Star Wars style of filmmaking, which involves beautiful big wide shots, nice tight shots, and small camera moves. If you look back to how “Star Wars: Episode IV” was made, you’ll see Star Wars has a particular style, and we drew upon that in the beginning.

Jon also wanted me to watch a number of Westerns and Samurai films because of their influence on the gunslinger idea of “Mando.” We started there and figured out the path through. We also knew we were going to be shooting this on new technology, the LED Volume, so we had to create a look that would work for that environment. It meant we didn’t do too much handheld, because the Volume doesn’t love that, but we made sure we had enough slow moves, beautiful dusk shots, and landscape shots to work for the Volume and also for our aesthetic.

You can see the influence of Western films in the opening scene of the series. In Westerns, it’s very common to have the gunslinger walking into the bar or walking into the town; the door opens to the bar, the gunslinger walks in, and everybody stops. It’s a classic trope and is fantastic because we all know what we are watching when we see something like that. The idea for having that as the opening of “The Mandalorian” sets the tone for the rest of the series. So, he walks into the bar, gets into a bar fight, grabs his prey, and then leaves. It's very matter-of-fact and sets the tone perfectly by using these Western references.

How did the Star Wars cinematic style influence your gear choices?

As a cinematographer, I respond to what I would call very simple visual cues. For example, one of the things that makes Star Wars—and it’s not exclusive to Star Wars—is widescreen. The framing of 2.40:1 is very much a Star Wars thing. Early on, we discussed maybe making “The Mandalorian” 16:9, since it was for TV. But we looked at it in 16:9, and it just didn’t feel like Star Wars to us. That’s not to say Star Wars couldn’t ever be 16:9, but for us, the use of a widescreen aspect ratio was one of the things that really made it Star Wars.

Another thing was the anamorphic format. We had the very good fortune to be able to use the ALEXA 65 on “Rogue One” and then the ALEXA LF on “The Mandalorian,” both times with anamorphic lenses. We used different squeeze ratios for the two shows, and they both had anamorphic qualities to them. Again, you don’t have to shoot on anamorphic lenses for a Star Wars show—that’s been proven fantastically by Bradford [Young ASC] on “Solo.” But on “The Mandalorian,” it was just one of those things that added little spices of Star Wars, little drops of Star Wars.

What is it about that combination of the ALEXA LF paired with an anamorphic lens that cinematographers love?

I think it’s the focus fall-off. As we all know, if you go to a larger sensor, then the same focal length will give you a wider field of view. If you shoot with a 50 mm anamorphic on an ALEXA LF, then it acts as a wide lens without the distortion or the bowing that a wide lens normally has. You get the effect of a wider field of view, which I think brings the audience in. For me, if you get a wider field of view, then you have to move in closer. So, suddenly your 50 mm is closer than where it would have been on Super 35, which means your proximity of the camera to the actor is closer, but you’re not wider. You don’t get that distortion or that feeling of putting a wide lens in someone’s face, so it’s a beautiful look. I think it brings you into your subject more and brings you closer to your story.
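As a rough illustration of the point Fraser makes (this sketch is not from the interview; the sensor widths are approximate published values for spherical capture, ignoring the anamorphic squeeze), the wider field of view a larger sensor gives at the same focal length follows from basic trigonometry:

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal angle of view for a given sensor width and focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Approximate horizontal sensor widths in mm (assumed values for illustration):
SUPER_35_WIDTH = 24.9   # typical Super 35 image area
ALEXA_LF_WIDTH = 36.7   # ARRI ALEXA LF large-format sensor

s35 = horizontal_fov_deg(SUPER_35_WIDTH, 50)   # ~28 degrees
lf = horizontal_fov_deg(ALEXA_LF_WIDTH, 50)    # ~40 degrees
print(f"50 mm on Super 35: {s35:.1f} deg; on ALEXA LF: {lf:.1f} deg")
```

A 50 mm lens on the LF sees roughly as wide as a 34 mm would on Super 35, which is why the camera can move in closer to the actor without the distortion of an actually shorter focal length.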

Can you talk about your approach in creating the world of “The Mandalorian” and specifically using real-time, in-camera compositing with the Volume?

The process of shooting in the Volume, or the way we did it on season one of “The Mandalorian,” is very untraditional. You’re scouting with the director 12 weeks out [from an episode], and you’re building your loads (images for the Volume) 12 weeks out. There was a very close relationship between Baz Idoine, the other DP, and me. He was physically present on the other episodes, but he also helped me by being on the floor of episode one when I was off lighting episode two backgrounds. It was very much a joint, communal effort and very unusual for standard TV, but I think quite successful. It allowed us to work in sync as a partnership so that he could be photographing on a day when I was lighting, or vice versa.

There is a lot of buzz about real-time, in-camera compositing being groundbreaking. How does it compare to green or bluescreen technology?

I think this technology is the most groundbreaking, revolutionary breakthrough in maybe 50 to 70 years, or even since sound. I mean, when process screens came along as a technology, that was sort of a breakthrough, but they looked, to be frank, a little hokey in the early days. You know, I still question bluescreen’s effectiveness because it’s not a lighting tool. For me, I have major contentious issues with process screens as they stand. With in-camera VFX, if you are trying to light a set, everything around you is effectively a lighting tool for you to use.

I’m always having conversations about where this technology will be in 10 years and what it will do. We were able to put a bounty hunter on a desert planet, but really the technology is in its infancy. As a cinematographer, it allows you more control than a bluescreen. It even allows you more control in other ways that you wouldn’t have thought. For example, if a production designer loves the look of a location but the location is impossible for a film crew to get to, you can send a photographer out to get photos and build this location in 3D. It just opens the door to hundreds of possibilities that we haven’t thought about.