
ILM's New Chief Talks Deepfakes, A.I. and the Virtual Reality Used on 'Solo'


Rob Bredow, who will deliver the keynote address at Siggraph on Monday, opens up about deepfakes, the AI-driven technique that superimposes one person's likeness onto video of another, and the promise of new tech like augmented reality.

Rob Bredow, the visual effects supervisor on Solo: A Star Wars Story, took over the reins at Lucasfilm's Industrial Light & Magic when he was named senior vp, executive creative director and head of ILM in May. Having previously served as the company's chief technology officer, he brings expertise in both the art and science of VFX, which he will share when he delivers the Aug. 13 keynote address at Siggraph, the annual conference on computer graphics and interactive techniques, in Vancouver.

Speaking with THR, he talked about the promise of new tech like augmented reality and the AI-driven technique known as deepfakes.

What role do you think machine learning and artificial intelligence could play in the further advancement of VFX?

Especially over the next few years, they're going to really change the landscape. We can train a computer to actually do something. One of the most interesting examples is some of the work that's being done in the area of face-swapping. From a visual effects practitioner's perspective, we, of course, know how to do this with all the manual steps, but what these machine-learning models are showing is that they could do those steps for you.
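Bredow's point about machine-learning models handling the manual face-swap steps refers to the approach popularized by consumer deepfake tools: a single shared encoder is trained alongside one decoder per identity, so swapping decoders at inference time maps one person's performance onto another's face. The PyTorch sketch below illustrates that general idea only; the layer sizes, toy training loop and random stand-in data are assumptions for demonstration, not ILM's pipeline.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: maps a 64x64 face crop to a latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: reconstructs a face from the shared latent."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity
params = (list(encoder.parameters())
          + list(decoder_a.parameters())
          + list(decoder_b.parameters()))
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.L1Loss()

# Random tensors stand in for batches of aligned face crops of each person.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(100):  # toy loop; real training runs far longer on real data
    opt.zero_grad()
    # Each decoder learns to reconstruct only its own identity,
    # forcing the shared encoder to capture identity-agnostic structure.
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    loss.backward()
    opt.step()

# The swap: encode person A's performance, decode with B's decoder.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

Because the encoder never sees identity labels, it learns pose and expression features common to both faces; handing its output to the other decoder is what produces the swap.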

Deepfakes — which superimpose one person's likeness onto video of another — aren't being used yet in production, but do their algorithms have potential?

Deepfakes aren't of high enough quality for us to just run a deepfake algorithm and put it in one of our films. But we've identified what those limitations are, and it won't be too long before we'll be able to use something like that at much higher quality [in motion picture production]. I don't think we're 10 years away from that. We may only be one or two.

Digital "beauty work," the motion picture equivalent of retouching still photographs, has become common today. Is it going in the right direction?

Beauty work can be a positive thing, and it can be taken too far. We've seen it in magazines for years, and there are cases where it's done tastefully and cases where it's done over-the-top. There is some responsibility in how we use these tools and in how far directors, producers — and even the actors themselves, who are interested in making sure they look their best — take it.

At ILMxLAB, you've worked with augmented reality. What have you learned?

Augmented reality is really interesting because it allows you to stay engaged with the world around you while also adding components to it. And it can be an increasingly social experience, which is really important. We see a lot of applications both in filmmaking and outside of it, but from the filmmaking perspective, there's nothing like getting people to walk through a virtual environment that is actually at scale [before a practical or digital set is built]. You can do that in virtual reality, but augmented reality has even more potential because you can still see the people around you and interact with them.

Can you cite an example of how you've used virtual reality on a production?

Yes, on Solo, we were working on the scale for that train heist sequence. We had a pretty good sense of what that train car scale needed to be, but VR really helped us tune those parameters. [Lucasfilm design supervisor] James Clyne was trying to figure out the spacing between the cars. He put the headset on, we had the virtual trains set up for him, and he actually tried the jump himself. We adjusted until he thought it was a challenging jump but not too challenging, and then he just got back in and continued to inform his design.