Become an expert on virtual production before you even get on set.
This is the first in a four-part series on virtual production. This part introduces some basic types of virtual production. The definitions used for VP terms in this series are moving targets in a rapidly evolving set of film-production tools, so they may change over time. Here's where virtual production currently fits into the film industry.
Virtual production will change the way we make films. This is no longer news. It’s something that people like me are actively working on to make a reality. Why?
- It lowers the barriers to entry for aspiring filmmakers
- It gives complete creative control over the environments and the world of the story
- It is an extremely cost-effective alternative to on-location filming logistics
But before we get into how we actually use it on and off set, let's set our framework: what is virtual production best used for?
First, let's start with the more obvious question: what the hell is virtual production? It's a collective term that refers to a lot of different things, but for the intents and purposes of this series of articles, here's what I mean when I use the term virtual production:
The use of LED volume stages for film and commercial productions is growing in popularity after the critical success of Disney's The Mandalorian, shot by cinematographer Greig Fraser.
For those of you who don't know what I'm talking about, I'd like to give you a brief introduction to shooting virtual productions and what to expect. Let's dive in!
Motion tracking and capture
While there has long been some form of mocap in film and gaming, using the tools available to capture the movement of an object, actor, or camera is a big part of what makes virtual production so exciting. It’s the fact that real human-like movements can be encoded into a 3D graphics engine in real time to do a variety of things, including:
- Capture human movement for an animated character
- Capture the movement of a prop or set piece to render it into a complex animation
- Capture the movement of a camera in 3D space so that the background of the set can be changed while maintaining camera movement and any visual distortion of that movement
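To make the camera-tracking case above a little more concrete, here is a minimal Python sketch of mapping a tracked stage-space pose onto a virtual camera. Everything here (the `TrackedPose` fields, the stage-origin convention) is my own illustration, not any tracking vendor's or engine's actual API:

```python
from dataclasses import dataclass

@dataclass
class TrackedPose:
    """One sample from a tracking system: position in meters, rotation in degrees."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float

def apply_to_virtual_camera(pose: TrackedPose, stage_origin=(0.0, 0.0, 0.0)):
    """Map a stage-space pose into engine world space by offsetting from
    wherever the virtual stage origin sits inside the 3D environment."""
    ox, oy, oz = stage_origin
    return {
        "location": (pose.x + ox, pose.y + oy, pose.z + oz),
        "rotation": (pose.pan, pose.tilt, pose.roll),
    }

# A camera 1 m right of stage center at lens height 1.8 m, panned 45 degrees,
# on a stage whose origin sits at (100, 50, 0) in the virtual world:
cam = apply_to_virtual_camera(
    TrackedPose(1.0, 0.0, 1.8, 45.0, 0.0, 0.0),
    stage_origin=(100.0, 50.0, 0.0),
)
```

The real systems run this mapping (plus lens distortion and latency compensation) every frame, but the core idea is just this coordinate-space hand-off.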
Previsualization and Scouting
Previs and scouting are two separate things, but in the context of virtual production they essentially use the same tools. Both use a real-time game engine such as Unreal Engine or Unity to get the results you want.
Virtual scouting involves integrating a real-world environment (or one created from scratch) into a game engine and remotely exploring it with your creative team to determine if that location is suitable for your production.
In a game engine, you can mimic different real-world lighting styles and solve some of the most complex logistical challenges without ever setting foot on site. Of course, this can save you the immense travel expenses – flights, meals, hotels, transportation, security, etc. – that are required to get a whole team of creatives to a remote location.
If the location doesn't already exist, it can be created from scratch in a studio. By virtually exploring the set in advance and blocking out the scenes that take place there, you can ensure that the set designer and construction team build it only once, with everything the production needs, and avoid costly rebuilds or changes once the set is ready for shooting.
Previsualization takes virtual scouting to the next level: you capture, edit, and export entire sequences or scenes of a movie before it's even time to roll a physical camera. This lets you lock in creative decisions before the shoot day, avoiding extra spending or costly delays on set.
While this technique is not new to the world of film and television, creating previs in real time inside a game engine makes the process accessible to members of the creative team who have no 3D software experience, so everyone can sign off on the final look before it would need to be redone on set.
LED volumetric filmmaking
Finally, we have LED volumetric filmmaking. This is what the world is currently excited about thanks to Disney's commercial and critical success The Mandalorian. It sits at the forefront of using the Unreal game engine to drive this type of real-time capture, making scenes shot on set in not-so-real-life settings more believable for the audience.
Let’s summarize what this is all about.
I’ll avoid being overly technical – although there is a very deep rabbit hole we could dive into, I’ll let you do your own research on the technology at play here if you’re interested.
On the hardware side, the most technical part of this process is rear-projecting "final pixel" imagery into your scene in real time, using LED panels as the background source for that image.
There are many different types of panels on the market, each with its own advantages and disadvantages. The ROE Black Pearl 2 (the panel used in filming The Mandalorian) is the one I've used the most. Since there is no official "standard," I would say it sits at the top of the list.
The common pipeline begins with the image being composed on the VP workstation. It is then rendered by a dedicated render machine at the required resolution and sent to a switcher, which passes the signal to the onboard color processor and finally to the panels themselves to display the picture.
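That chain is easier to see laid out as a sequence of stages. The sketch below is a toy model in Python; every function name and frame field is illustrative, not a real vendor API:

```python
# Toy model of the LED volume signal chain: workstation -> render node
# -> switcher -> color processor -> panels. Each stage just tags the
# frame to show what it would do in the real pipeline.

def compose(frame):
    # VP workstation: combine the 3D scene with the tracked camera pose
    frame["composed"] = True
    return frame

def render(frame):
    # Dedicated render node: draw the frame at the panels' native resolution
    frame["rendered_at"] = frame["target_resolution"]
    return frame

def switch(frame):
    # Switcher: route the video signal toward the wall
    frame["routed"] = True
    return frame

def color_process(frame):
    # Onboard color processor: match the panels' color space and brightness
    frame["color_corrected"] = True
    return frame

def display(frame):
    # LED panels: the frame finally appears behind the actors
    frame["displayed"] = True
    return frame

SIGNAL_CHAIN = [compose, render, switch, color_process, display]

def run_chain(frame):
    for stage in SIGNAL_CHAIN:
        frame = stage(frame)
    return frame

frame = run_chain({"target_resolution": (2816, 1408)})  # resolution is made up
```

The point of the sketch is the ordering: every frame must clear the whole chain within one refresh interval, which is why dedicated render hardware sits between the workstation and the wall.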
This is the part where things get really exciting. We use video game engines to render "final pixel" quality 3D environments in real time on these LED screens. The environments are often developed in these same engines, and this is where previs, scouting, and testing take place.
This is the element that holds it all together, and with the recent advances in GPU technology for professionals, this tool is available to any filmmaker who wants to learn it.
Both Unreal and Unity have toolkits for virtual production, and both seem keen to keep them free and accessible to anyone who wants to use them for film and television.
Personally, I’ve used both game engines and prefer Unreal for virtual production purposes because of its extensive community of developers and plug-in support. However, so much has been written about the debate between Unity and Unreal that you can decide for yourself.
Give both of them a try: they're free to learn, and both have amazing communities and libraries of how-to videos to get you started.
What actually makes this "volumetric"?
It is the fact that our camera, and sometimes props and actors, are tracked in real time in 3D space. Their position within the stage is processed in the game engine and mapped into the 3D environment, and the correct field of view and parallax for your camera position is displayed on the LED screen in the background, even during complex handheld and crane movements.
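To make that parallax idea concrete, here is a minimal sketch of the off-axis ("asymmetric frustum") projection math that keeps the image on the wall correct as the camera moves. This is plain illustrative geometry in Python, not any engine's actual API:

```python
def offaxis_frustum(cam, wall_center, wall_width, wall_height, near):
    """Return (left, right, bottom, top) frustum bounds at the near plane
    for a camera at `cam` looking at a flat LED wall parallel to the XY plane.

    cam, wall_center: (x, y, z) positions in meters; the camera must sit
    in front of the wall (cam z > wall z).
    """
    d = cam[2] - wall_center[2]   # perpendicular distance to the wall
    scale = near / d              # project wall edges onto the near plane
    left = (wall_center[0] - wall_width / 2 - cam[0]) * scale
    right = (wall_center[0] + wall_width / 2 - cam[0]) * scale
    bottom = (wall_center[1] - wall_height / 2 - cam[1]) * scale
    top = (wall_center[1] + wall_height / 2 - cam[1]) * scale
    return left, right, bottom, top

# A 4 m x 2 m wall centered at the origin, camera 2 m in front of it:
centered = offaxis_frustum((0, 0, 2), (0, 0, 0), 4, 2, 0.1)
shifted = offaxis_frustum((1, 0, 2), (0, 0, 0), 4, 2, 0.1)
```

With the camera dead center the frustum is symmetric; slide the camera a meter to the right and the bounds shift the opposite way, so the rendered background slides across the wall exactly as a real distant landscape would.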
There are many ways to track an object in 3D space, which I will describe in more detail in another article.
How does it all work?
Quick overview: you stand on a stage with a camera and film your scene. The camera is tracked in volumetric space; a game engine takes that tracking information and places you in a 3D environment. The background of your shot is filled in by the LED screens your stage is built around.
Not only does this create the illusion that you could be standing on the cliffs of Dover or the tundras of Mars, but it also lets you emulate the lighting of that environment using the LED screens as active light sources, something you would have to solve in post if you used a green screen instead.
There is a lot more to discover here. So if you're curious about the world of virtual production, I've included some links to videos and articles below that cover it in more detail.
This is also the first in a series of four articles I’ve written to help traditional filmmakers understand more about the world of VP and the convergence of games and film. In the next article I’ll answer the question, “What is the difference between virtual production and traditional production?”
Learn more about virtual production
Here are just a few resources you can check out to dive even deeper into virtual production and Unreal. Check them out!
Long – video
Short – audio
- Virtual Production: a podcast by Noah Kadner, 15 minutes per episode. Basic to intermediate knowledge.
Long – audio
What’s next? Keep learning
For filmmakers shooting on volumetric stages, things work differently, and new tools are available to you. Find out what cinematographers need to know to get started in virtual production. Then read about the differences between virtual and traditional production.