In this exclusive fireside chat, we hear from Oscar-winner Paul Franklin of movie effects house DNEG and effects expert Ed Thomas of Dimension Studio. DNEG is responsible for the look of films you’re definitely familiar with: Interstellar, Inception, The Dark Knight trilogy and more. They sat down with us at the first Beyond Games online event to discuss modern techniques for rendering real-time movie VFX.
With technological advancements like Unreal Engine being used in filmmaking today, what are the possibilities of real-time VFX production? With elements no longer being added later on in post-production, how will content creation change moving forward? Discover all this and more, with two of the most talented figures from behind the scenes of modern movies.
Immersive tech built in real-time
At the online conference Beyond Games #1 (May 2021), Paul Franklin (DNEG) and Ed Thomas (Dimension Studio) talk about what goes on behind the scenes on movie sets and virtual stages, and discuss the different scenarios in which creators utilise real-time technologies. LED walls, for instance, can bring two very different worlds to the same screen, letting filmmakers break down physical walls and have two sets exist in the same shot. Of course, LED walls are just one part of virtual production, which remains a very loose term.
Real-time VFX also includes creators making lighting and blocking changes as they shoot scenes. In turn, everyone can make informed decisions more collaboratively, without waiting for post-production. In this video, Franklin and Thomas also touch upon the creative possibilities of capturing volumetric video, as well as being able to evaluate shots for cinematic value as you go along.
Looking to the future
Of course, the tech here is still continuously developing, and it needs to become more robust. The pair share valuable insights on when, how and why to use it, as well as the need to share that knowledge with the industry as a whole. The panel also touches upon effects that, while improving greatly, remain infamously difficult to pull off: simulating gushing water, creating facial detail and rendering hair.
Is creation truly democratized now? Does new VFX tech allow other companies and other types of creators to participate in telling these stories? In particular, can the indie side benefit more from these assets? Towards the end of this video session, the panel answers a few hand-picked questions from the audience to cap off the talk.
This 40-minute session is ideal for anyone who’s always been curious about the possibilities of seamless integration between the real-time process, games tech, traditional VFX and filmmaking. (It’s also a great watch for anyone who’s eager to see if the haptics of Ready Player One will happen sooner than we think!)