
The Virtual Production Adventures of “Star Trek: Discovery”

Virtual production was used throughout the fourth season of the Paramount+ series.

The virtual production stage at VFX company Pixomondo (PXO) in Toronto is even called the “Holodeck.”

The stage measures 72 x 85 x 24 feet and is equipped with a 2,000-panel LED wall plus 750 LED panels on the ceiling. Tracking is handled by more than sixty OptiTrack cameras, and more than forty high-end GPUs running Unreal Engine’s nDisplay keep all 2,750-plus panels rendering in sync.
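For a sense of the scale those numbers imply, here is a back-of-the-envelope Python sketch. The panel and GPU counts come from the paragraph above; the per-panel resolution and the even split across render nodes are illustrative assumptions, not PXO’s actual nDisplay configuration.

```python
# Rough arithmetic on the Holodeck volume described above.
# Panel and node counts are from the article; PANEL_RES is an assumed
# per-panel pixel resolution and the even split is purely illustrative.

WALL_PANELS = 2000      # LED panels on the main wall
CEILING_PANELS = 750    # LED panels on the ceiling
RENDER_NODES = 40       # "more than forty high-end GPUs" running nDisplay
PANEL_RES = (176, 176)  # assumed pixels per panel; varies by LED product

total_panels = WALL_PANELS + CEILING_PANELS
total_pixels = total_panels * PANEL_RES[0] * PANEL_RES[1]

# Each nDisplay node renders one slice of the volume; assume an even split.
panels_per_node = total_panels / RENDER_NODES
pixels_per_node = total_pixels / RENDER_NODES

print(f"Total pixels driven:    {total_pixels / 1e6:.1f} MP")
print(f"Panels per render node: {panels_per_node:.0f}")
print(f"Pixels per render node: {pixels_per_node / 1e6:.1f} MP")
```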

A production case study on the Unreal Engine site notes that almost every episode of the 13-part season has at least one scene shot against the AR wall. The environments range from familiar settings, like the Discovery’s shuttle bay, to stranger ones, like the deep-underwater Kaminar Council Chamber.

Virtual Studio

In pre-production, PXO’s virtual art department creates a prototype of each new environment to be shown on the augmented reality (AR) wall. The team pulls pieces from the Epic Games Marketplace, modifies them as needed, and adds them to the scene to quickly rough out a version for blocking. The final elements are then built entirely in Unreal Engine.
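As a rough illustration of that blockout step, here is a minimal editor-scripting sketch using Unreal’s Python API. The asset path and placement values are hypothetical placeholders, not assets PXO actually used; the EditorLevelLibrary calls shown also have editor-subsystem equivalents in newer engine versions.

```python
# Minimal blockout sketch using Unreal's editor Python API.
# ASSET_PATH is a hypothetical placeholder, not an asset from the show.
import unreal

ASSET_PATH = "/Game/Marketplace/SciFiKit/SM_WallPanel"  # hypothetical Marketplace mesh

def place_blocking_piece(asset_path, location, yaw=0.0):
    """Load a mesh asset and drop it into the open level as rough blocking geometry."""
    asset = unreal.EditorAssetLibrary.load_asset(asset_path)
    if asset is None:
        unreal.log_warning(f"Asset not found: {asset_path}")
        return None
    return unreal.EditorLevelLibrary.spawn_actor_from_object(
        asset,
        unreal.Vector(location[0], location[1], location[2]),
        unreal.Rotator(yaw=yaw),
    )

# Rough out a corridor by repeating the same panel every 4 m along the X axis.
for i in range(6):
    place_blocking_piece(ASSET_PATH, (i * 400.0, 0.0, 0.0))
```

Run from the editor’s Python console, something like this drops placeholder geometry that is typically enough for blocking before the environment is rebuilt with final, art-directed assets.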

Unreal Engine 5 Virtual 3D set

Mahmoud Rahnama, VFX supervisor and head of studio at PXO, said, “Using a procedural approach directly in Unreal Engine turned out to be the best way. This not only let us make assets in UE with a much higher resolution, but it also gave us the same results as our offline render and cut our average offline render time by a lot.”


Virtual production also brings creative perks for the artists: the team can spend its time on the shots it is excited about rather than on tedious tasks.

The change is most dramatic for the lighting team. A light bake that could take 14 hours in Discovery’s first season now finishes in about 30 minutes with Unreal Engine’s GPU Lightmass.
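For context, GPU Lightmass ships as an optional engine plugin and requires a DirectX 12 RHI with ray tracing enabled in the project. A minimal DefaultEngine.ini sketch of those prerequisites is below; the keys reflect recent Unreal versions and are illustrative, not Discovery’s actual project settings.

```ini
; Prerequisites typically needed before the GPU Lightmass plugin can bake.
; Illustrative only -- exact keys can vary between engine versions.

[/Script/WindowsTargetPlatform.WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

[/Script/Engine.RendererSettings]
r.SkinCache.CompileShaders=True
r.RayTracing=True
```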

Another substantial change is less tangible. In earlier seasons, each artist was assigned their own tiles of an environment; once the tiles were complete, each was reviewed separately, and the full environment only came together for review at the very end.

For Season 4, PXO’s artists could work on the same scene simultaneously. This produced a more unified, consistent environment, and it let artists collaborate and inspire one another. Plus, according to Rahnama, “when artists know people will see their work every day, it tends to greatly enhance organization.”

Being able to shoot a sequence, cut it with 80% of the shots already completed, and see the final VFX without waiting months is a game-changer. Soon, turning work around and getting it onto viewers’ screens even faster will be the norm.

The Rise of Virtual Production in Hollywood

Virtual production was once the preserve of the most technologically advanced studios, but it now appears in a long list of shows and films, including The Mandalorian, Our Flag Means Death, Dune, Spider-Man: No Way Home, and The Matrix Resurrections.

Virtual Production History

Virtual production is having a “moment” in Hollywood and beyond, but VP technologies and processes aren’t new, cinematographer Neil Oseman writes. He claims that LED walls and LED volumes, both of which are important components of virtual production, can be traced back to front- and rear-projection techniques.


Oseman tells the story of virtual production, from movies like “North by Northwest” to Disney’s “The Mandalorian,” which was a big hit on streaming services. Along the way, he talks about the “LED Box” that Emmanuel Lubezki made for 2013’s VFX Oscar-winner Gravity, the hybrid green screen/LED screen sets that were used to film driving scenes for Netflix’s House of Cards, and the high-resolution projectors that were used on 2013’s Oblivion.

Other milestones he covers include Deepwater Horizon (2016), which used a 42×24-foot video wall built from 250 LED panels, along with Train to Busan (2016), Murder on the Orient Express (2017), Rogue One: A Star Wars Story (2016), The Jungle Book (2016), and The Lion King (2019). He then turns to more recent productions such as The Midnight Sky (2020), The Batman (2022), and the Paramount+ series Star Trek: Strange New Worlds.

Cine Live Lab debuts at the NAB Show

Cine Live Lab offers daily hands-on demonstrations of the newest tools and techniques for film and live broadcast production. It runs October 19 and 20 at the Javits Center.

Cine Live Lab includes three AbelCine seminars on film and TV production, covering the roles on a production team and how to manage multi-camera projects.

Attendees will learn how to set up, focus, frame, and work with lenses. Major media-technology companies will be on hand; partners include Sony, Fujinon, Riedel, and MultiDyne.


Cinematic storytelling techniques are increasingly used in live events and on TV, and these projects blend filmmaking with live production and distribution. Pete Abel, co-founder of AbelCine, says Cine Live Lab brings those creative communities together.

The Cine Consortium launches in Los Angeles in November 2021 to help the NAB Show and its related events educate and connect the film, production, post-production, and content-development industries, with members drawn from technology companies, studios, guilds, and societies.
