What do you need to know to get started in virtual production?
The use of LED volume stages for film and commercial production has surged in popularity following the critical success of Disney’s The Mandalorian, shot by cinematographer Greig Fraser.
This article is the third in a four-part series designed to help filmmakers get on board with this great new storytelling toolset. For a primer, see parts one and two.
LED volume filmmaking
Simply put, the difference between LED volume filmmaking and traditional filmmaking is that instead of filling the background of your footage with large sets, locations, or chroma keys, you use synchronized LED screens to capture a final-pixel image. This is helpful for a wide variety of productions, for reasons I’ve covered in previous articles.
Why does it work? The camera is tracked in 3D space, which allows the screens to react and shift their perspective in response. This creates parallax: objects appearing to move at different speeds depending on their distance from the viewer.
This is part of why Fraser and the team on The Mandalorian won seven Emmys, along with other awards and nominations, for their work in the field. It really changes what’s possible on set and opens up endless ways to tell the story.
Just got the gig
So you’re a cinematographer and you’ve just been hired on a production that wants to use LED volume methods to capture final pixel in camera. I’ve put together a quick list of tips and pointers to keep in mind as you dive into the project.
Cameras: Make sure the camera you’re using is capable of genlocking with the rest of the stage. Some cameras support genlock (like Sony’s Venice and ARRI’s Alexa Mini), but many don’t. It’s a broadcast-centric feature that usually requires fine-tuning between systems to get right.
Lenses: Anamorphic lenses have a harder time in these environments, especially on a smaller stage. This is due to the flaws and more organic approach to focus for which most anamorphic lenses are praised. Without getting too technical, anamorphic lenses have a tendency to “accidentally” bring the LED screens into focus when you don’t want them to, resulting in moiré.
That said, Fraser shot The Mandalorian entirely on Panavision anamorphics, proving that it is possible.
Aperture: Shoot wider stops. I’ve found that T2.8 is generally the deepest stop I can use without worrying about inducing moiré on the screens in the background, but this is all directly tied to your depth of field. So grab your first AC and pull out the DoF calculator, because even if it isn’t visible on your production monitor, you don’t want to get into post and find that a moiré pattern from the LED screens suddenly appears on a 40-foot theater screen.
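The depth-of-field check above can be sketched as a quick script. This is a minimal illustration using the classic thin-lens hyperfocal formulas; the focal length, circle of confusion, and distances below are assumed example values, not production numbers, and a real shoot should still rely on your AC's calculator and on-set tests.

```python
# Hypothetical DoF sanity check: is the LED wall inside the zone of
# acceptable focus (where moire becomes a risk)? All numbers are
# illustrative assumptions.

def hyperfocal_mm(focal_mm, t_stop, coc_mm):
    """Hyperfocal distance in mm (thin-lens approximation)."""
    return focal_mm ** 2 / (t_stop * coc_mm) + focal_mm

def dof_limits_mm(focal_mm, t_stop, coc_mm, subject_mm):
    """Near and far limits of acceptable focus, in mm."""
    h = hyperfocal_mm(focal_mm, t_stop, coc_mm)
    near = (h * subject_mm) / (h + (subject_mm - focal_mm))
    if subject_mm >= h:  # subject beyond hyperfocal: far limit is infinity
        return near, float("inf")
    far = (h * subject_mm) / (h - (subject_mm - focal_mm))
    return near, far

# Example: 50mm lens at T2.8, Super 35 CoC of ~0.025mm,
# subject 3m from camera, LED wall 6m from camera.
near, far = dof_limits_mm(50, 2.8, 0.025, 3000)
wall_mm = 6000
print(f"DoF: {near / 1000:.2f}m to {far / 1000:.2f}m")
if near <= wall_mm <= far:
    print("Wall inside DoF: moire risk!")
else:
    print("Wall safely out of focus.")
```

Re-run the check for each stop/focal-length combination you plan to use; closing down past a certain stop will pull the wall inside the far limit.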
Shutter angle and high frame rates: Your ability to shoot at high speed, or at a shutter angle other than the standard 24 fps / 180 degrees, depends on the refresh rate of the panels you’re using, which is controlled at the Brain Bar. In our tests with RoE BP2s refreshing at 96 Hz, we were able to shoot at 48 fps, or at 24 fps with a 90-degree shutter. Be sure to test this before committing to it on shoot day.
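A common rule of thumb here is that the camera's exposure time should cover a whole number of panel refresh cycles; the combinations from our tests fit that rule at 96 Hz. The sketch below encodes that assumption so you can pre-screen frame-rate/shutter combinations before testing them on the stage (the rule itself is an assumption to verify with your Brain Bar, not a guarantee).

```python
# Hypothetical pre-check: an exposure plays nicely with the LED wall
# when it spans an integer number of panel refresh cycles.

def exposure_s(fps, shutter_deg):
    """Exposure time in seconds for a given frame rate and shutter angle."""
    return (shutter_deg / 360.0) / fps

def is_compatible(fps, shutter_deg, panel_hz, tol=1e-9):
    """True if the exposure covers a whole number of refresh cycles."""
    cycles = exposure_s(fps, shutter_deg) * panel_hz
    return round(cycles) >= 1 and abs(cycles - round(cycles)) < tol

panel_hz = 96  # refresh rate from the BP2 test above
for fps, deg in [(24, 180), (48, 180), (24, 90), (60, 180)]:
    print(f"{fps} fps / {deg} deg:", is_compatible(fps, deg, panel_hz))
```

At 96 Hz, 24 fps / 180° (1/48 s) spans two cycles and 48 fps / 180° or 24 fps / 90° (1/96 s) spans one, while 60 fps / 180° lands between cycles and flags as incompatible.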
Tracking: Your camera needs to be tracked within the volume, and several solutions are currently available. In each case, this means either markers or a sensor placed on the camera to track its position in 3D space. You need to consider occlusion of these trackers when planning camera moves. Want the shot to push through some leaves? Make sure enough tracking points stay visible so you don’t lose the track and cause glitches in the frustum.
Viewing angle: Different LED panels can have different ideal viewing angles. Looking at the panels off-axis usually causes a slight, or sometimes dramatic, shift in their color. If you end up pointing a camera at a hard seam in the volume, the LED operator may need to adjust the color balance for that viewing angle. You can see an example of this below.
Color space and bit depth: One last thing to bring up with your Unreal team: “What color space are we finishing this project in? Is it DCI-P3, Rec. 2020, ACES?” Knowing this in advance lets you match the colors you see on the walls to the colors you want in the final grade.
Also make sure to talk to your volume/Unreal teams about what bit depth they’re sending out of the engine and on to the LED panels. RoE BP2s can display up to 12-bit color per pixel, but depending on the settings, your engine may be sending less information. This matters in prep, because you can get into your color grade and find that the background doesn’t carry all of the color data you need to flex it with the rest of the foreground image. Capturing a 16-bit RAW file won’t help if the color data captured from the background can’t match it.
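To see why the background's bit depth limits your grading headroom, consider a smooth gradient (a sky, say) quantized at different depths and then pushed in the grade. This is a generic illustration, not specific to any panel or processor; the two-stop push and ramp values are arbitrary assumptions.

```python
# Illustrative only: quantize a smooth ramp at 8 vs 12 bits, then push
# the shadows up two stops (gain of 4) as a grade might, and count how
# many distinct values survive. Fewer values = visible banding.

def quantize(x, bits):
    """Snap a 0..1 value to the nearest code at the given bit depth."""
    levels = 2 ** bits - 1
    return round(x * levels) / levels

N = 4096
ramp = [i / (N - 1) for i in range(N)]  # stand-in for a sky gradient

counts = {}
for bits in (8, 12):
    graded = {min(1.0, quantize(v, bits) * 4.0) for v in ramp if v <= 0.25}
    counts[bits] = len(graded)
    print(f"{bits}-bit source -> {counts[bits]} distinct graded values")
```

The 8-bit version has only a few dozen steps left after the push, while the 12-bit version keeps roughly a thousand; that gap is the banding you'd discover in the grade.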
It’s all one big light: One of the helpful things about shooting in an LED volume is that the entire room is one big softbox. This means all of your fill sources are taken care of. It’s also possible to use the LED panels alone for softer key and backlight. If you do have to bring in other fixtures, they should be harder or punchier sources than the panels can emulate.
Work smarter: Use light cards as often as possible and familiarize yourself with your prep options. Pre-lighting on the stage helps you understand how much light and shadow you can paint with the panels alone. Honestly, this will make your grip team very happy; it’s surely the lightest 20×20 butterfly out there.
Hard lighting: Hard sources (such as undiffused HMIs) are difficult to use in the volume for two reasons.
First, once you introduce physical lights into the volume, it becomes harder to match the tone and texture of the sunlight rendered in the engine.
Second, it takes a lot of grip equipment to control spill from those sources onto the LED panels so that you don’t wash them out. Some panels handle this better than others, but at their core they don’t like being hit by other light.
Frustum: The frustum can have its own independent exposure level, so the volume around it can be brought up brighter for lighting while the frustum compensates. It’s also unaffected by the light cards you place, so you can drop them onto areas of the wall that your camera will eventually pan across without any problems.
Best practices
- Familiarize yourself with Unreal (although this isn’t an absolute requirement) and learn the vocabulary for speaking with the Brain Bar technicians and the VFX team, so you can easily communicate your production needs. I’ve included a short glossary below.
- Take part in the environment-creation conversations. Unlike shooting in a chroma-key environment, the lighting of the digital assets now drives your work, not the other way around. Without knowledge of and input into that process, decisions made during the environment-creation phase can limit your ability to adjust the lighting on set.
- Prepare, prepare, prepare! While this is a given on almost any shoot, the sheer control available to you and the rest of the creative team here is still largely untapped potential. The rules aren’t set in stone yet, and like any good magic trick, the right flair in your setups can help you sell almost anything.
- Get to know the team of technicians you’ll work with at the “Brain Bar.” They are now an extension of your team and your efforts on set, and they can take a number of actions to influence the final image, including:
- Generating and placing light cards as light sources for talent and the set within the volume stage
- Adjusting the color of the sun and other in-engine light sources to match the fixtures you’re using on set
- Balancing the exposure of the frustum against the rest of your on-set lighting
- Controlling the refresh rate of the in-engine content and the walls for high-speed work
- Optimizing camera tracking with sensor and marker placement so the trackers on the camera are never occluded
Vocabulary
Frustum: The rendering window within the engine that tracks with the camera’s movement. It is projected onto the LED walls at the highest possible resolution, while the surrounding environment is rendered at lower resolution and serves as an active light source.
Light card: A 3D object rendered in “Unreal space” that acts as a light source or flag within the volume stage. Essentially, you take a block of panels and turn them into an off-camera source of light or shadow.
Occlusion: The act of blocking or obscuring something from view. It has become a common term for describing the interaction between objects within the volume.
Brain Bar: The group of technicians on set responsible for everything within Unreal and on the volume stage. This includes the Unreal operator, the LED volume playback operator, the camera tracking technician, and the digital gaffer or light card operator.
Final pixel: The term for imagery that is final as captured and will appear in the finished project. (As opposed to imagery shot with the intent of further post-processing.)
What’s next? Keep learning
Catch up on the differences between virtual production and traditional production, and the basics of virtual production, in parts one and two of this series.