While films and video games have been exploring and exploiting real-time visual effects (VFX) technology for years now, these two media verticals have achieved similar results along very different paths. Technologies have been shared, but each platform has put them to different use. Of late, however, games have become increasingly visually stunning and VFX-heavy movies increasingly immersive, and the two worlds have finally begun to overlap.
That overlap has taken the form of real-time rendering in VFX. Gamers will be intimately familiar with how this works: you enter a new arena, scene, map, or location, start moving around, and everything is simply there, from buildings to objects to characters. Depending on the game and the player's perspective, many of these elements react to specific interactions too, and we don't think twice about how all of it is displayed in real time. Now, this same technology is coming into film production, and this turn of events has one catalyst – Epic Games' Unreal Engine.
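To make that idea concrete, here is a minimal sketch of the loop that sits at the heart of any real-time engine. The function names (updateScene, renderFrame) are hypothetical stand-ins, not actual Unreal Engine API; the point is the structure: the world updates in response to input, and the entire scene is redrawn, many times per second.

```cpp
#include <chrono>
#include <cstdio>

// Hypothetical stand-ins for engine internals, shown only to
// illustrate the shape of a real-time loop.
static void updateScene(double dt) {
    // Advance physics, animation, and AI by dt seconds so objects
    // react to the player's actions before the next frame is drawn.
    (void)dt;
}

static void renderFrame(int frame) {
    // Compute lighting, shading, and compositing for the current
    // scene state and present it on screen.
    std::printf("rendered frame %d\n", frame);
}

int main() {
    using Clock = std::chrono::steady_clock;
    auto last = Clock::now();

    for (int frame = 0; frame < 60; ++frame) {  // roughly one second at 60 fps
        auto now = Clock::now();
        double dt = std::chrono::duration<double>(now - last).count();
        last = now;

        updateScene(dt);     // the world reacts...
        renderFrame(frame);  // ...and is redrawn, every single frame
    }
    return 0;
}
```

The contrast with traditional film VFX is the time budget: a pre-rendered shot can take minutes or hours per frame, while a loop like this has roughly 16 milliseconds per frame at 60 fps to do everything.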
Unreal Engine and how it all began
Typically, games aren't anywhere near as visually complex as films, as the computing power required to render everything we see in real life, in real time, would be enormous. Besides this, the man-hours required to fine-tune basics like lighting, depth of field, and compositing would be off the charts and unfeasible for game-makers. Epic Games changed all that with the launch of Unreal Engine. And while game developers have been using Unreal Engine since its launch to achieve higher-quality rendering in real time, filmmakers are catching on, too.
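For a flavour of what working in Unreal looks like, here is a minimal sketch of an Unreal Engine Actor in C++. It assumes a standard UE C++ project; the class name AMySpinningProp is hypothetical. The engine calls Tick() once per rendered frame, so the object animates live, with no pre-rendering step.

```cpp
// MySpinningProp.h -- a minimal Unreal Engine Actor sketch.
// Assumes a standard UE C++ project; the class name is hypothetical.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MySpinningProp.generated.h"

UCLASS()
class AMySpinningProp : public AActor
{
    GENERATED_BODY()

public:
    AMySpinningProp()
    {
        // Ask the engine to call Tick() once per rendered frame.
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaTime) override
    {
        Super::Tick(DeltaTime);
        // Rotate 45 degrees of yaw per second; scaling by DeltaTime
        // keeps the motion smooth at any frame rate.
        AddActorLocalRotation(FRotator(0.f, 45.f * DeltaTime, 0.f));
    }
};
```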
- The first example of this crossover was a short film called The Human Race, which mixed live-action footage with a pair of CG cars rendered in real time, blending the two so seamlessly that it took viewers' breath away. See for yourself.
- The cars themselves are another victory for Unreal and real-time rendering, and they harness the powers of a vehicle called The Blackbird. The Blackbird is a fully drivable electric car whose length, width, and performance characteristics can be adjusted to mimic the car you want it to stand in for, without needing the actual vehicle. Its LIDAR, sensors, and cameras work together to capture reflections, lighting changes, and more. We'll let its makers explain it in this exciting video from Top Gear.
Rogue One and K-2SO
Moving away from the automotive world and onto a full-fledged film set, Rogue One: A Star Wars Story used real-time rendering in VFX to create arguably the film's most iconic character, K-2SO. The droid had some great scenes, stole the hearts of millions of Star Wars fans, and was rendered in real time using Unreal.
An article on Polygon sheds more light on how it all happened, summarizing what John Knoll, the film's VFX supervisor, said in his GDC 2017 keynote: Epic's engine allowed the ILM team to render the beloved, sarcastic droid K-2SO in real time, bypassing the pre-rendering process. As a result, the team could see the finished K-2SO on screen while shooting a scene, rather than adding the rendered droid after the fact. Knoll explained that achieving final pixels on screen helped the production of Rogue One – and that it marked the first time the studio had worked with CGI in this way.
Here are some highlights from the droid's appearances in the film so you can see the final product –
Real-time Rendering in Ready Player One
Unreal Engine and real-time rendering in VFX were also used to bring the sets of the dystopian sci-fi film Ready Player One to life. The Spielberg-directed movie gained widespread recognition for its cutting-edge graphics, particularly its depiction of the virtual world – the OASIS.
As this Studio Daily piece reports, the legendary director himself said this was one of his most challenging projects:
“The layers we had to achieve to put the OASIS on screen made it one of the most complicated things I’ve ever done,” said Spielberg. “There was motion-capture, live action, computer animation … It was really like making four movies at the same time.”
He used a combination of motion capture, virtual reality, and real-time rendering in VFX to create an entirely new world within the real one and make it look complex, immersive, and believable. As for the result – well, take a look and make up your own mind.
Expect more projects to harness Unreal Engine's capabilities and shift to real-time rendering in VFX to deliver even more spectacular results in the coming years. This technology will only grow with time, delighting film-makers and fans alike.
Need help with specialized and highly focused VFX solutions for a film, TV, or web series project? Contact us at Toolbox Studio, a pioneer in real-time visual effects outsourcing.