The Virtual Production Revolution, Pt. 2

Pt. 2 - How does it work?

Virtual Production is a revolutionary technique made possible by a combination of technological advances. First, you need the latest high-resolution LED screens (The Mandalorian was filmed with 2.8mm pixel pitch LED, but 2.3mm and even 1.5mm pixel pitches are now often used). These LED modules are built into wide, high walls that reproduce the background content at the required level of detail. The high-resolution LED also allows the camera to work relatively close to the screen without visible artifacts, such as moiré patterns, giving the game away.
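
To get a feel for the numbers, here's a quick back-of-the-envelope sketch (the wall dimensions are illustrative, not those of any particular stage) showing how pixel pitch translates into on-wall resolution:

```python
# Illustrative only: how pixel pitch determines the resolution of an
# LED volume wall of a given physical size.

def wall_resolution(width_m: float, height_m: float, pitch_mm: float) -> tuple[int, int]:
    """Number of LED pixels across and down a wall with the given pixel pitch."""
    return int(width_m * 1000 / pitch_mm), int(height_m * 1000 / pitch_mm)

# A hypothetical 20m x 6m wall, roughly the scale of a large LED volume:
for pitch in (2.8, 2.3, 1.5):
    w, h = wall_resolution(20, 6, pitch)
    print(f"{pitch}mm pitch -> {w} x {h} pixels")
```

The finer the pitch, the more pixels the wall packs into the same area – which is what lets the camera move in close without the grid of LEDs becoming visible.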

Image courtesy of Quite Brilliant - www.quitebrilliant.co.uk

Talking of games, the next piece of the technological puzzle is the real magic – the photorealistic content itself. This is designed, produced and controlled using the incredible power of today’s gaming engines (check out the latest version of Epic’s Unreal Engine for a taste of what they can do).

In 2022, video games industry revenue is expected to pass US$200 billion – more than the movie and music industries combined. With that kind of wealth fuelling technology and ingenuity, it’s no wonder that gaming leads the way in the creation of virtual worlds. In virtual production studios, the degree of power and skill behind this content generation and control is reflected in the nickname commonly given to the control centre – the Brain Bar.

To feed all that incredible HDR content to the screens, you need powerful, low-latency, genlocked processing. This is another area where technology has stepped up to deliver. London-based Brompton Technology was among the first to provide the quality required. Their Tessera processor range now offers solutions for VP at all scales: the top-of-the-range SX40 processor serves 4K screens at 60Hz with 12 bits per colour output. Processing features that are of particular use in VP include HDR and Dynamic Calibration, as well as Extended Bit Depth, HFR+ (High Frame Rate) and Frame Remapping.
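
As a rough illustration of why that processing muscle matters, consider the raw data rate of an uncompressed 4K stream at 60Hz with 12 bits per colour channel (a back-of-the-envelope figure, not a Brompton specification):

```python
# Back-of-the-envelope video bandwidth: raw pixel data for a 4K stream
# at 60Hz with 12 bits per colour channel (three channels: R, G, B).

def raw_bandwidth_gbps(width: int, height: int, fps: int,
                       bits_per_channel: int, channels: int = 3) -> float:
    """Uncompressed video data rate in gigabits per second."""
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"{raw_bandwidth_gbps(3840, 2160, 60, 12):.1f} Gbit/s")
```

That's roughly 18 Gbit/s of pixel data before any overhead – all of which has to arrive on the wall genlocked and with minimal delay.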

Once you have all that wonderful content up on the screen, you have to make it appear real to the viewer (i.e. the camera). This means you have to introduce the effect of parallax – the variation in the apparent position of objects in your field of vision as your line of sight changes. Look up from the page and move your head from side to side to see this effect in action.
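
The maths behind parallax is simple perspective projection. Here's a toy pinhole-camera sketch (the numbers are purely illustrative) showing that nearby objects shift across the image far more than distant ones when the camera moves sideways:

```python
# Toy pinhole-camera model: where a 3D point lands on the image plane
# depends on its depth, so sideways camera movement shifts near and
# far points by different amounts. Numbers are illustrative only.

def projected_x(point_x: float, point_z: float, camera_x: float,
                focal_length: float = 1.0) -> float:
    """Horizontal image-plane position of a point under pinhole projection."""
    return focal_length * (point_x - camera_x) / point_z

near, far = 2.0, 20.0  # depths in metres
for cam_x in (0.0, 0.5):  # camera slides 0.5m to the right
    print(cam_x, projected_x(0.0, near, cam_x), projected_x(0.0, far, cam_x))
```

Sliding the camera half a metre moves the near point ten times further across the image plane than the far one – exactly the depth cue the flat LED wall has to fake.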

This variation needs to be mimicked for the camera, and for this the system needs to know exactly where the camera is. This is done via motion-tracking technology, which accurately locates the camera within the 3D ‘volume’ of the set, then feeds this positional data to the gaming engine, which adjusts the viewpoint in the 3D world and alters the on-screen display output accordingly.
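
In code terms, the loop is conceptually tiny. The sketch below uses illustrative stand-in classes, not a real tracking or engine API (in Unreal, for instance, this role is played by systems such as Live Link and nDisplay):

```python
# A deliberately simplified sketch of the tracking-to-engine data flow.
# Class and method names are illustrative stand-ins, not a real API.

from dataclasses import dataclass

@dataclass
class CameraPose:
    """Position of the physical camera inside the volume (metres).
    A real system also tracks orientation and lens data."""
    x: float
    y: float
    z: float

class StubEngine:
    """Stand-in for the game engine: records the viewpoint it is given."""
    def __init__(self):
        self.viewpoint = CameraPose(0.0, 0.0, 0.0)

    def set_virtual_camera(self, pose: CameraPose) -> None:
        # In a real engine this would move the in-world camera and
        # trigger a re-render of the background from that viewpoint.
        self.viewpoint = pose

def on_tracker_update(engine: StubEngine, pose: CameraPose) -> None:
    """One tick: positional data from the tracker drives the engine."""
    engine.set_virtual_camera(pose)

engine = StubEngine()
on_tracker_update(engine, CameraPose(1.2, 0.0, 3.5))
print(engine.viewpoint)
```

Every tracker update nudges the virtual camera, and the engine re-renders the wall from the new viewpoint – the whole trick rests on this loop running fast enough.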

As the camera moves, the perspective of the scene shifts to match – and in real time, or as near as makes no difference (less than 10ms latency is considered critical in virtual production). Camera-tracking technology has long been used in green-screen production, and it’s now being enhanced and adapted to be an integral part of the virtual production process.
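
It's worth putting that 10ms figure next to some standard frame rates (the frame rates here are common production values; the 10ms budget is the figure quoted above):

```python
# How the ~10ms latency budget compares with the duration of one frame
# at common production frame rates.

def frame_time_ms(fps: float) -> float:
    """Duration of a single frame in milliseconds."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

Even at 60 fps a frame lasts about 16.7ms, so a sub-10ms system responds within a single frame – fast enough that the shifting perspective never visibly lags the camera.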

Lee Baldock

Lee Baldock has been involved in the live entertainment production industry since 1994 as a journalist, editor and public relations agent.

Previous

The Virtual Production Revolution, Pt. 1

Next

The Virtual Production Revolution, Pt. 3