
How Netflix is using the cloud to get productions over the finish line

Netflix's director of production strategy Jeff Shapiro, and Peter Cioni, director of production business development, talk to TVBEurope about how the streamer is using the cloud to help its vendor supply chain

At the end of 2022, Netflix announced a partnership with AWS to enable visual effects companies to use the cloud for rendering workflows. TVBEurope spoke to Jeff Shapiro, director of production strategy, and Peter Cioni, director of production business development at Netflix, to find out how the cloud can help the streamer create compelling content.

How does Netflix work with AWS?

Jeff Shapiro: We built workstations, compute and storage on the AWS backbone to provision to vendors and talent through a platform called NetFX. To put that in context, if you were thinking about Uber, AWS and the environment we built on top of it is our car, and we’re looking for drivers. The drivers are artists, they’re the ones taking the passenger, which is a production, to the finish line. We show up with the cars, we’ve got a network of drivers, and we bring the shows in on time. We serviced over 100 projects in 2022, across a broad network of individuals and companies. We’re hosting out of the Mumbai zone, and also US East 1 and US West 1, so we’re out of LA, Montreal, and Mumbai. 

We’ve been successful in being able to provide this giant safety net for our productions to be delivered on time. Not every region is as mature as the US or the UK, able to just service the volume of work that we have. We’ve got a lot of work going on in Korea, which is a big market for us, and there are very few suppliers there. So in the case where we have a large demand for work, we then have the spillover effect where we create what I call a transformer vendor: we get 26 vendors to plug into the same ecosystem to deliver thousands of visual effects shots in a very short period of time. It’s pretty novel. We’ve been using AWS since the pandemic started. It was a coincidence. We wanted to build this before the pandemic, which just shot it into the future. We have close to 2000 individual users on the platform, doing a variety of services. We provision infrastructure through AWS and we also provide a host of software that gets mounted on these workstations for the artists. 

You recently announced a partnership with AWS to bring rendering to the cloud, how does that help Netflix?

Peter Cioni: One of the most constrained resources right now in the vendor supply chain is visual effects, and so with increased demand, many more shows being made, timelines being extended, there’s just a lot of demand on visual effects vendors. We’ve seen challenges with the delivery of shows as a result of these crunches for resources. One of the biggest constrained moments in time is the rendering process. All the work is done, all the creative decisions are made, it’s just hitting the render button. All of the facilities have on-prem infrastructure, but in a crunch moment they may not have enough infrastructure to actually render all those shots and make the delivery. That’s where a cloud service provider like AWS is so powerful, because they have effectively unlimited compute capability. We recognise that by allowing and enabling more VFX vendors to leverage AWS infrastructure, the render process is not a cause for a delay in delivery. 
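The "burst rendering" pattern Cioni describes can be sketched in a few lines: fill fixed on-prem render capacity first, then overflow the remaining jobs to effectively unlimited cloud compute. This is a purely illustrative toy scheduler; the names (`RenderJob`, `schedule_burst`, the slot counts) are hypothetical and not any Netflix, AWS, or Conductor API.

```python
# Toy sketch of cloud-burst render scheduling, assuming a fixed pool of
# on-prem frame slots and unlimited cloud capacity for overflow.
# All names here are hypothetical illustrations, not a real render-farm API.

from dataclasses import dataclass


@dataclass
class RenderJob:
    shot: str
    frames: int  # frames this shot needs to render


def schedule_burst(jobs, on_prem_slots):
    """Keep whole jobs on-prem while capacity remains; burst the rest to cloud."""
    on_prem, cloud = [], []
    remaining = on_prem_slots
    for job in jobs:
        if job.frames <= remaining:
            on_prem.append(job)
            remaining -= job.frames
        else:
            cloud.append(job)  # overflow: send to the cloud render farm
    return on_prem, cloud


jobs = [RenderJob("sh010", 240), RenderJob("sh020", 480), RenderJob("sh030", 120)]
local, burst = schedule_burst(jobs, on_prem_slots=400)
print([j.shot for j in local], [j.shot for j in burst])
# → ['sh010', 'sh030'] ['sh020']
```

In practice a service like Conductor handles the overflow path, so the vendor never has to build or operate this scheduling layer themselves.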

Many VFX vendors are already leveraging AWS for the cloud. But when we work with certain vendors, some of them have not yet been able to connect to AWS, for a myriad of reasons: it could be budget, it could be technical know-how, it could be just the staffing resources that they have. So we recognised that this was a bit of a barrier to some of our VFX vendors being able to deliver shows for us. We partnered with Conductor, which is a Software-as-a-Service provider that leverages AWS infrastructure, and by working together, we’re able to introduce more VFX vendors to AWS’s capabilities. Conductor helps streamline the on-ramp to AWS. For those who don’t have the in-house team to develop special software or IT departments, they can make use of Conductor’s capability to make that process easier. The benefit to Netflix is now we can put more work in the hands of visual effects vendors, and the rendering process of visual effects is no longer a barrier to those vendors getting the work done, which means we can put more shows through that same facility. So we are able to get more capacity from a fixed supply of visual effects vendors.

You mentioned the pandemic, how did it change Netflix’s visual effects workflows?

PC: In terms of long-lasting takeaways, or kind of permanent changes in the industry, what comes to mind first is distributed workforce, the comfort that people now have with having a workforce everywhere. The idea of people working in different places all over the world and being OK with it, that’s a very long-lasting outcome. We have visual effects vendors who have talent all over the world. We’ve heard of examples where a VFX company that’s based in Montreal may have an artist or two based in Brazil. The empowerment and the enablement of technology makes that work. I think it’s pretty empowering that physical location is no longer a barrier to how big your workforce is. 

Michelle Yeoh stars in The Witcher: Blood Origin

JS: It’s a double-edged sword though, because now everyone that doesn’t have the right resources goes to Brazil, Mexico, or Korea. Netflix is a global company, we need those resources to work on local content. You can access anyone, anywhere now, but it also disrupts the local equilibrium.

The pandemic thrust a lot of virtual production workflows into the limelight. We’re still learning a lot about using LEDs as a technology to help create digital backgrounds and reflections. I think virtual production is a tool, and you have to make sure that the content is written to use the tool. It’s not something that you want to force on a production.

What new production technologies are you currently looking at that we haven’t mentioned yet?

JS: Volumetric capture is first and foremost a big catalyst for us because it helps us with transmedia, so scanning and capturing assets and moving them through linear content, games, whole worlds. I wouldn’t call it metaverse. That’s a loaded term. I would say 3D asset reuse and more interplay between the characters and environments we create on our IP and being able to share that in lots of different ways. I think we’re most excited about that.