Infinity War VFX: The Josh Puppet

© Disney

This small piece of nerdery from the visual effects process of Avengers: Infinity War struck me as weird, poetic and interesting:

“We used an actor puppet as part of our process of solving the facial performance,” said [Weta Digital Visual Effects Supervisor] Matt Aitken. “We had Josh Brolin’s face-cam footage, which we tracked. In the past, we would have taken that tracked motion and solved it straight onto the CG character. But, at that point, you’re always guessing how accurately you’ve captured the actor’s original performance. How much of what we’re seeing on Thanos are inaccuracies that have crept into our processes? So we introduced this intermediary stage, which was a digital version of Josh Brolin. We would first solve our captured performance onto that, so we could see how accurate it was. Once we were happy with that, we did a simple migration of that motion from the Josh puppet to the Thanos puppet. (…).”
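The intermediary stage Aitken describes can be thought of as retargeting with a validation step: solve the captured motion onto a digital double of the actor, check the reconstruction error against the face-cam footage there, and only then carry the solved values over to the creature rig. Here is a minimal sketch of that idea, assuming a toy linear blendshape model with orthogonal shapes; all names and numbers are illustrative, not Weta's actual pipeline.

```python
# Toy sketch of the "intermediary puppet" step: solve onto the actor's
# digital double first, verify, then migrate the weights to the creature
# rig. The linear blendshape model and all values are assumptions.

def solve_weights(frame, neutral, shapes):
    """Per-frame fit of blendshape weights, assuming the shape offset
    vectors are orthogonal (which keeps the projection a one-liner)."""
    offsets = [m - n for m, n in zip(frame, neutral)]
    weights = []
    for shape in shapes:
        num = sum(o * s for o, s in zip(offsets, shape))
        den = sum(s * s for s in shape)
        weights.append(num / den)
    return weights

def apply_weights(weights, neutral, shapes):
    """Reconstruct marker positions on any rig sharing the same
    ordered set of shapes (e.g. jaw-open, smile)."""
    out = list(neutral)
    for w, shape in zip(weights, shapes):
        out = [p + w * s for p, s in zip(out, shape)]
    return out

# "Josh" rig: two markers, two orthogonal shapes.
josh_neutral = [0.0, 0.0]
josh_shapes = [[1.0, 0.0], [0.0, 1.0]]

captured_frame = [0.3, 0.7]  # tracked face-cam marker values
w = solve_weights(captured_frame, josh_neutral, josh_shapes)

# Validation on the actor puppet: does the solve reproduce the capture?
recon = apply_weights(w, josh_neutral, josh_shapes)
error = max(abs(a - b) for a, b in zip(recon, captured_frame))
assert error < 1e-9  # happy with the Josh solve...

# ...so migrate the same weights to the (differently shaped) Thanos rig.
thanos_neutral = [0.0, 0.0]
thanos_shapes = [[2.0, 0.0], [0.0, 1.5]]  # exaggerated creature shapes
thanos_frame = apply_weights(w, thanos_neutral, thanos_shapes)
```

The point of the detour is exactly what Aitken says: on the actor's own double, any residual error can only come from the capture and solve, not from the creature's different proportions.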

from the article by Jody Duncan in Cinefex 159

Solo: Screens as props and environments

Explaining the effects work on Solo, Cinefex issue 160 describes two ways the team used screens as carriers of ersatz reality. The first makes use of a tablet to simulate a window:

The coaxium containers have windows through which the liquid material can be seen sloshing around. Rob Bredow shot footage of ferrofluid which the props team puppeteered using magnets; ILM stitched the plates into seamless loops. BLIND fitted a Microsoft Surface Pro tablet inside a prop container, on which the coaxium footage was displayed.

The second way recalls how Gravity created Sandra Bullock’s surroundings and constitutes a sort of advanced rear projection, with screens showing the space around the “Millennium Falcon” cockpit set:

Immersive environment specialist Lux Machina surrounded the cockpit with a 180-degree rear projection screen illuminated by multiple 4K projectors in portrait mode. To feed the projectors, ILM finaled visual effects backgrounds prior to principal photography. “We generated wraparound content just as if we were working on a simulator film, with beats that either looped or were much longer than if you were just doing the two or three seconds that end up in a shot.” The rear projection approach – also used for scenes inside Dryden’s yacht – enabled [DP] Bradford Young to capture cockpit shots in camera, backgrounds and all, using the screen as his primary lighting tool.

FX Guide has more on the immersive cockpit set, including some amazing images.

Marvel Studios’ Global Pipeline

Ever wonder how Marvel manages to deliver their movies on time despite tight schedules and 2,500+ effects shots? Executive producer Victoria Alonso explains:

We’ve had anywhere from 12 vendors to 24 vendors, which is madness. It’s a challenge, but when you have that many shots, you have to divide the work among many different teams. If we relied on one vendor, we would choke that vendor. And by having visual effects teams from around the world, in different time zones, we essentially get a 36-hour day. That extra time allows us to constantly feed the beast.

VFX Supervisor Jake Morrison goes into more detail in a different interview:

On Thor: Ragnarok we had 18 vendors, so our day would start with calls to Germany and then sweep right across the planet chasing the sun until we finished in Australia. The tools that have been built to allow for all this data to slosh around the world on a nightly basis are breathtaking.
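The "36-hour day" is figurative, but the underlying arithmetic is easy to check: teams working ordinary local hours in time zones spread from Europe to Australia to North America leave almost no UTC hour uncovered. A back-of-the-envelope sketch, with illustrative sites, offsets and a 10-hour local workday (all assumptions, not Marvel's actual vendor list):

```python
# Rough sketch of follow-the-sun coverage: how many UTC wall-clock
# hours have at least one site online? Sites, UTC offsets, and the
# 9-to-19 local workday are illustrative assumptions.

def coverage(sites, start=9, end=19):
    """Count UTC hours covered by at least one site's local workday
    (local windows may wrap around midnight UTC)."""
    hours = set()
    for offset in sites.values():
        for h in range(start, end):
            hours.add((h - offset) % 24)
    return len(hours)

SITES = {"Sydney": 10, "Berlin": 1, "Los Angeles": -8}
print(coverage(SITES))  # -> 24: every UTC hour has a team working
```

With just three well-spaced sites the pipeline never sleeps, which is why finished work can "sweep across the planet" overnight the way Morrison describes.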

Both interviews can be found in Cinefex issue 158.