Framestore Sets Sail on ‘1899’ with New Virtual Production Tools

The award-winning VFX studio’s team, led by Connor Ling and Freddy Salazar, created tools integrated with Unreal Engine 5.1 to render and project realistic backgrounds onto an LED wall using 18 different machines with dual GPUs, for the Netflix series about an ocean liner diverted to answer a distress signal from a long-lost sister ship.

Leave it to the creators of the mind-bending, timeline-shifting series Dark to push the boundaries of virtual production innovation in their latest project. Moving on from that show's dreary forest and cave setting straight out of a Brothers Grimm fairy tale, their new Netflix series, 1899, takes place on a turn-of-the-century European ocean liner diverted from its course for New York City to answer the distress signal from a long-lost sister ship. Handling the virtual production was the Framestore duo of Connor Ling, Virtual Production Lead, and Freddy Salazar, Head of Virtual Art Department.

When Salazar began his career in visual effects, creating a photoreal picture was extremely difficult. "What I learned a long time ago taught me how to get a look that makes sense and not rely on the software to figure this out for me," he shares. "Also, I learned how to get a sense of volume and make things look deep around us even if we're on an LED flat screen." When 1899 went into production, Unreal Engine 5.1 was in its early release stage. "We had to develop a toolset that was both dynamic and efficient, as we were coming onto set and dealing with 18 different machines that had dual GPUs, so potentially 36 different viewports we were rendering and projecting onto an LED wall," explains Ling. "It was the same with content as well. Some of the tools that Freddy's team and others provided allowed the operators on set to interact with that content and utilize its real-time aspect, which is part of why virtual production is so powerful."

A number of tools were developed for the onstage shooting. "There are tools to interact with the cameras that are being projected onto the screens, for moving the worlds, and for interacting with the stage itself," Ling notes. "You have potentially hundreds of lights illuminating the scene, so you need a tool that is going to interact with those efficiently. What the DP doesn't want is to ask you to drop this light source by half a stop and then wait five minutes while you find it among the hundreds of assets that exist in this world. Color is hugely important for in-camera visual effects and is an aspect that separates 1899 from other shows." Noting it was particularly challenging to achieve the show's visual ambitions with the available technology, Salazar adds, "It was all about seeing how far we could go to get the maximum amount of creativity. This was more like VR than visual effects. In visual effects we have a rough idea about where we are and what the image is. But for us, when they moved the camera, we had to think about how to keep things live without technical limits."

Building everything from scratch on set in real-time was not an option. "Unfortunately for us, we had to build everything in advance!" laughs Salazar. "We worked closely with the production team to find out what kind of materials, lights, and mood they wanted. It was important for us to cover 80 to 90 percent of what was in the picture, with the last 10 to 20 percent done by Connor and his team on set. We built a library of things that could be adjusted in real-time."

As the use of sophisticated virtual production increases on shows like 1899, so does the front-end workload. "We don't measure the 3D content in minutes," remarks Ling. "When working entirely in real-time, if you want to shoot the sunset hour for multiple 12-hour days, you can. We're not waiting for the sun to come back around. It's there. Freddy and his team did a fantastic job of making sure that it was as optimized and dynamic as possible. That's a hard line to walk: allowing customization, but not to the detriment of how long it takes to render a frame. It did allow the production designer, DP, and director to make those creative changes while on set."

Numerous mood boards were created by the production art department. "We spent a lot of time with the director and production designer to go from a wide picture with the sky to developing more of the details," notes Salazar. "The mood boards were important because we needed a target to be able to create the vision." Co-creator Baran bo Odar directed the eight episodes while Nikolaus Summerer was the sole cinematographer for the series. "It means that the initial pre-light days are incredibly valuable because everyone can get a grip on what is possible, and you're not having to go through that learning process eight times, once for every episode. We were fortunate with everybody on set, particularly Nik, the DP, who picked up on the process and how we could fully utilize it along with his practical lighting. In doing so he quickly created some beautiful shots."

One major innovation for the on-set principal photography was the revolving stage. "It was a game-changer in terms of turnaround for shooting other angles," reveals Ling. "Rather than having to do a complete set redress, you can literally rotate the stage, and in five minutes you have the engine updating so the projection on the wall is at the correct rotation, and away you go. For a lot of the ship sequences it was handy because they were large set pieces and it would have taken quite a while for grips to go in and move all of that. It allowed them to capture a lot more content. Gratitude to Udo Kramer [production designer] and his team, because they really thought about the timeframe they needed to do this in and how these sets, some of which were incredibly complex, could be constructed as efficiently as possible. Some of the sets were like pizza slices that literally slide together on this massive rotating dial."

Protecting the LED screens meant that a complex rain system had to be implemented.  “There is a particular rain scene on the rear deck that was shot in camera and looks fantastic,” states Ling.  “Ventilation was important [in being able to protect the LED screens].  The amount of fog in there at times was incredible to the point that sometimes we were wondering if the [LED] wall was still up.  It is important for the foreground to help sell the immersion into the background. If you’re relying solely on the LED, it’s hard to make that look realistic and that’s where a lot of these atmospherics and foreground elements help to embed the actors and actresses in the setup and help sell the illusion.”

During the shoot, it became obvious the ocean simulations were realistic… and successful. “If people stand on the static set and what is being projected on the [LED] wall is making them feel seasick, then something has to be working,” Salazar muses. “You almost get to the VR element at that point.  Virtual scouting was utilized heavily on this show to make sure we were going to fit within the practical capabilities of the set as well as environment scouting.”      

Most of the LED wall content was CG, as opposed to plate photography augmented with digital skies. “We are selling an emotion with the visual effects work and that’s why it’s important to understand what you are trying to say,” remarks Salazar.  

He continues, "Outside, when we are on the boat, it seems to be simple because it's just water and sky. But wow, it's so difficult to make a picture with just water and sky! The dining room was not simple because of the complexity of the interior itself. There are a lot of colors, floors, chairs, and people. It was a nightmare to figure out how to light this. The engineering room was difficult because it was so dark. The fire is the main lighting source and we had to figure out what was dark and bright." Visually, the dining room ended up looking the best of all the environments. "They lit it fantastically, but that fidelity comes with a cost," observes Ling. "It was incredibly hard to make sure that we were hitting at least the minimum frame rate, so we're not rendering the same frame twice or more, while keeping that visual quality. When you're in close contact with something like the dining room, there was a lot of foreground set, and the blend of light and color was important on set going into Unreal Engine. There was that big light source that stretches across the entire dining room. It was expensive to light correctly. It was the scene everyone was happy with, and it looks incredible in the released show."

Trevor Hogg is a freelance video editor and writer best known for composing in-depth filmmaker and movie profiles for VFX Voice, Animation Magazine, and British Cinematographer.