What is virtual production & how does it work

Learn about the basics of virtual & extended reality production in our beginner’s guide.


Virtual & extended reality production merges the physical and virtual worlds to create immersive environments for entertainment, education and events. But how is it done? 

In our beginner’s guide, we take you through everything you need to know about virtual & xR production - step by step. Learn about the setup, systems and configurations required to produce next-level immersive content. 

Starting with the setup, let's take a closer look at the most prominent piece of the puzzle: the LED Volume.

 

The LED Volume

The LED Volume, also called an LED stage, is where the magic of virtual production happens nowadays. 

Think of it as the next-level green screen.


LED Volume vs Green screen

Green screen production requires the removal of green behind and around the person or object you are shooting, whether in real-time or during the post-production workflow. Considerations such as reflections on shiny objects or clothing have to be taken into account, as they reflect the green, adding to the green spill nightmare. 

Keying out the green can be a tedious process and requires a lot of work to make the final product look convincing. 

An LED stage workflow solves many of these issues.

The light produced by the LEDs provides realistic lighting on the subjects or objects you are filming. And because the virtual workflow is already integrated through in-camera visual effects, you are seeing 95% of your final footage on set, reducing post-production efforts to minimal or none.

 

What is an LED stage?

An LED stage or LED Volume is built using LED tiles. 


LED tiles are exactly what they sound like: tiles made of hundreds of LED lights. These can come in different pixel pitches. The smaller the pitch, the higher the resolution of the screen. 
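If you like numbers, here is a quick back-of-the-envelope sketch in Python of how pixel pitch translates into resolution. The wall size and pitch values are illustrative examples, not the specs of any particular product.

```python
# Back-of-the-envelope: how pixel pitch translates into wall resolution.
# Wall size and pitch values below are illustrative examples only.

def wall_resolution(wall_width_m, wall_height_m, pitch_mm):
    """Pixels across a wall of the given size at a given pixel pitch."""
    px_w = int(wall_width_m * 1000 / pitch_mm)
    px_h = int(wall_height_m * 1000 / pitch_mm)
    return px_w, px_h

# A 10 m x 5 m wall at a 2.6 mm pitch vs. a finer 1.5 mm pitch:
for pitch in (2.6, 1.5):
    w, h = wall_resolution(10, 5, pitch)
    print(f"{pitch} mm pitch -> {w} x {h} px (~{w * h / 1e6:.1f} megapixels)")
```

The smaller pitch roughly triples the pixel count of the same wall, which is exactly why finer pitches let the camera get closer to the screen without the pixels becoming visible.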

A stage is put together using those LED tiles, and stages can come in many different sizes and configurations.


They can have LED floors, LED ceilings, modular LED walls, physical floors, can be curved, and so on. Depending on your goals and the purpose of your content shoots, different LED stage configurations can be best for you.

The xR system is integrated with the LED stage, enabling content to be displayed on the screens and different systems used for virtual production to communicate with each other. 

The most famous example of an LED stage production is probably Disney's The Mandalorian, where the reflections of the LED wall on the Mandalorian's shiny armor helped create a convincing environment in real-time.

For simplicity's sake, we will stop here and leave it at this rough overview.


 

Virtual scene creation 

We have talked through the LED Volume, but an LED stage without any content to display is going to be just that: a stage made of LED lights.

There are different systems that can be used to create real-time 3D environments for LED screens.

The most common systems used in virtual productions are real-time cross-platform game engines Unreal Engine, developed by Epic Games, and Unity, developed by Unity Technologies.

Meptik’s experienced virtual scene designers are proficient in both Unreal Engine and Unity, and also work with a third tool called Notch. Notch is a real-time motion graphics tool that enables designers to create interactive and video content in one unified real-time environment.

The 3D environments can either be photorealistic virtual environments that recreate physical locations in utmost detail, or fictional virtual environments limited only by imagination.

The video content created in 3D is then pushed onto the LED wall in real-time, which means it is modifiable last-minute, on-set, and can be tweaked exactly to the client’s vision. 

 

In the virtual production studio

Once you combine the LED stage and the content, the (almost) only thing missing is your talent.

One or more people, or an object, can be placed in front of the LED wall - or on top of the LED floor, if your studio has one - and they are able to see the content around them in real-time.

Thomas Rhett's Apple Music performance on an arc LED wall. Photo by Ryan Green.


Stage Lighting

Just like in a traditional broadcast or film production studio, stage lights are set up on a truss or in the ceiling around the LED stage and are operated from a lighting desk. The lighting operator dials in the physical lighting so that the foreground matches the background and, most importantly, the light on the talent matches the virtual lighting in the scene behind them.

 

Photo by Ryan Green

Camera Tracking

Just like in traditional productions, a virtual production studio also has a physical camera, placed in front of or around the volume - on a jib, tripod, shoulder rig, or any way you prefer.

The difference from a traditional studio camera is the camera tracking system that is attached to this physical camera.

The system tracks the camera's position - its location and rotation - in real-time and sends that data back to the game engine of your choice, so that the physical and virtual cameras communicate with each other. Perspective, lens data, focus, zoom, sensor size - everything about the physical camera is sent as metadata into the render engine - Unreal Engine, Unity or Notch - so that the virtual camera's perspective is exactly one-to-one with the physical camera's.

This pretty tedious calibration process is something that the Meptik tech team has been perfecting over the years. 
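To give a rough idea of what that stream of tracking data looks like, here is a simplified Python sketch. Real tracking systems speak dedicated protocols such as FreeD; the JSON encoding, field names, and address below are hypothetical stand-ins for illustration only.

```python
import json
import socket
import time

# Hypothetical sketch: streaming camera-tracking data to a render engine
# over UDP. Real trackers use dedicated binary protocols (FreeD, for
# example); the JSON packet and field names here are illustrative only.

ENGINE_ADDR = ("127.0.0.1", 40000)  # assumed address/port of the render machine

def make_tracking_packet(frame):
    """Bundle camera pose and lens metadata for a single frame."""
    return json.dumps({
        "frame": frame,
        "position_m": {"x": 1.20, "y": 0.00, "z": 1.65},          # camera location
        "rotation_deg": {"pan": 12.5, "tilt": -3.0, "roll": 0.0},  # camera rotation
        "lens": {"focal_mm": 35.0, "focus_m": 4.2, "sensor_mm": [36.0, 24.0]},
    }).encode("utf-8")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for frame in range(3):  # a real tracker streams this continuously, every frame
    sock.sendto(make_tracking_packet(frame), ENGINE_ADDR)
    time.sleep(1 / 24)  # ~24 fps
```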

With this sort of camera tracking, the content that shows up on the LED wall looks like it’s being projected from the camera.

When the camera moves, a rectangle of content on the LED stage moves with it - that rectangle is what the camera sees, and it is called the inner frustum. Anything outside of it is called the outer frustum: usually a lower-quality version of the virtual content surrounding the rectangle, used for reflections and soft lighting on the talents' skin.
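As a rough illustration of how the inner frustum follows the camera, the little Python sketch below computes the size of the rectangle a camera covers on a flat wall from its field of view and distance. It ignores wall curvature and lens distortion, which a real system has to account for.

```python
import math

# Rough illustration: the size of the inner frustum on a flat LED wall,
# from the camera's field of view and its distance to the wall.
# Ignores wall curvature and lens distortion.

def frustum_on_wall(distance_m, hfov_deg, vfov_deg):
    """Width and height (metres) of the rectangle the camera sees on the wall."""
    w = 2 * distance_m * math.tan(math.radians(hfov_deg) / 2)
    h = 2 * distance_m * math.tan(math.radians(vfov_deg) / 2)
    return w, h

# The closer the camera, the smaller the rectangle it covers on the wall:
for d in (2.0, 4.0, 6.0):
    w, h = frustum_on_wall(d, hfov_deg=54.4, vfov_deg=37.8)  # ~35 mm lens, full frame
    print(f"{d} m away -> inner frustum ~ {w:.1f} m x {h:.1f} m")
```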

Combining the physical & the digital worlds

This is where virtual production blurs the line between the physical and the digital worlds: capturing the physical foreground elements in the virtual environment with in-camera visual effects in real-time.

Without the physical limitations of the environment, the virtual playground is yours. Recreate physical locations in perfect weather conditions or capture the perfect golden hour for 2 days straight - no need to reschedule. 

Stills from Emily Rowed's "Bloom" music video and Thomas Rhett's Apple Music performance with Meptik.

From the beach to Antarctica in seconds? Even that can be done with a press of a button. 

 

Additional options for virtual production

There are two additional options that help blend the physical and virtual together seamlessly.

Front plates

Front plates are Augmented Reality (AR) elements that are placed in front of the foreground elements (talent or object). In the example below, a virtual tree branch is placed in front of the talent, providing depth of field to the scene as a whole.

The front plate element is composited in real-time: it is keyed out of the game engine output and rendered as a separate pass on a separate machine, but it receives the same camera tracking data, so it respects the camera's lens and produces an accurate change in depth of field from front plate to back plate.
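Under the hood, layering the front plate pass over the camera feed is a standard "over" compositing operation. The NumPy sketch below shows the idea on a toy image; in an actual xR pipeline this happens per frame on the GPU.

```python
import numpy as np

# Minimal sketch of the "over" operation used to composite a front-plate
# (AR) render on top of the camera feed. Assumes the front plate comes
# with a straight (non-premultiplied) alpha channel.

def over(front_rgb, front_alpha, back_rgb):
    """Composite front over back: out = fg * a + bg * (1 - a), per pixel."""
    a = front_alpha[..., None]  # broadcast the alpha channel across RGB
    return front_rgb * a + back_rgb * (1.0 - a)

# Toy 2x2 example: a half-transparent red front plate over a grey camera feed.
front = np.zeros((2, 2, 3)); front[..., 0] = 1.0  # red AR layer
alpha = np.full((2, 2), 0.5)                      # 50% opaque
camera = np.full((2, 2, 3), 0.5)                  # mid-grey camera feed

print(over(front, alpha, camera))  # -> a 50/50 blend of red and grey
```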

There are many more use cases for front plates, especially for corporate virtual production settings, such as 3D text flying into the scene, bar charts or graphs, floating PiPs (picture in picture), visualization of products and more. 

Set extension

Set extension is mainly used in broadcast and corporate virtual productions and is, as the name suggests, an extension of the set/LED stage.

Without set extension, the camera is limited to seeing inside the LED volume, therefore every shot has to be framed within the parameters of the LED panels.

Set extension composites a front plate pass of the rest of the environment outside of the LED volume, basically extending the virtual environment outside of the LED panels. The camera can pull back further, pan from left to right, up from the ceiling to the floor - without leaving the virtual environment. 

This requires a highly detailed calibration, because every virtual pixel has to line up with the corresponding physical pixel of the LED wall - another task that the tech team at Meptik can help with.
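To give a feel for what lining up virtual and physical pixels involves, the idealized sketch below maps an LED pixel index to a position in studio space for a perfectly flat wall. A real calibration also has to solve for wall curvature, tile seams, and camera lens distortion; the pitch value here is illustrative.

```python
# Idealized sketch: where is LED pixel (column, row) in studio space?
# Assumes a perfectly flat wall whose bottom-left corner sits at the
# origin; real calibrations also handle curvature, tile offsets and
# lens distortion. The pitch value is an illustrative example.

PITCH_MM = 2.6

def pixel_to_world(col, row, pitch_mm=PITCH_MM):
    """World-space position (metres) of an LED pixel centre on a flat wall."""
    x = (col + 0.5) * pitch_mm / 1000.0  # along the wall
    y = (row + 0.5) * pitch_mm / 1000.0  # up the wall
    return x, y, 0.0                     # the wall lies in the z = 0 plane

print(pixel_to_world(0, 0))          # bottom-left pixel
print(pixel_to_world(3845, 1922))    # far corner of a ~10 m x 5 m wall
```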

Post Production

On set, the Directors and Directors of Photography get a very close representation of what the final product will look like. The final video feed from the camera can be color-graded in real-time and is essentially ready for distribution, but adding further color grading or additional VFX in post is still possible.

Thomas Rhett on Meptik's Arc Studio wall. Photo by Ryan Green.

For this, the camera's physical position, rotation and lens data are recorded in real-time and can be sent to visual effects houses, making their processing easier and eliminating the need for match-moving to figure out where the camera was positioned.
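As a simple illustration of what that recorded data could look like, here is a Python sketch that logs per-frame camera pose and lens values to a CSV file. The column names are hypothetical; production pipelines typically export richer formats such as FBX or USD that VFX software can ingest directly.

```python
import csv

# Hypothetical sketch: logging per-frame camera tracking data for post.
# Column names are illustrative; real pipelines often export richer
# formats (FBX, USD) that VFX software can read directly.

FIELDS = ["frame", "x", "y", "z", "pan", "tilt", "roll", "focal_mm", "focus_m"]

def write_camera_log(path, frames):
    """Write one row of pose and lens data per recorded frame."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(frames)

# Two example frames of tracked data:
write_camera_log("camera_track.csv", [
    {"frame": 1, "x": 1.20, "y": 0.0, "z": 1.65,
     "pan": 12.5, "tilt": -3.0, "roll": 0.0, "focal_mm": 35.0, "focus_m": 4.2},
    {"frame": 2, "x": 1.21, "y": 0.0, "z": 1.65,
     "pan": 12.7, "tilt": -3.0, "roll": 0.0, "focal_mm": 35.0, "focus_m": 4.2},
])
```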

 

The final product

Virtual production combines physical elements with virtual worlds to create entirely new, immersive environments. 

Not only does it open up possibilities that were not an option in the physical realm, but it also facilitates an easier and more efficient production process, saving both time and money in the long run.

Virtual production is not only used for Film & TV productions, it can be used for corporate presentations, education, immersive experiences, to access the metaverse, and for multi-channel campaigns.

 

For more reasons to go virtual, read our blog post here. 
