Netflix Unveils Generative AI in Original Series for the First Time
Netflix recently announced that it has begun using generative artificial intelligence to produce scenes in its
original content. In a July 18, 2025 earnings call, co-CEO Ted Sarandos revealed that El Eternauta – a new Argentine sci-fi series – contained “the very first GenAI final footage to appear on screen” for Netflix. In
El Eternauta, the apocalypse arrives as a toxic snowstorm, and a key sequence shows a building in Buenos
Aires crumbling under the blizzard. Such a shot would normally require massive VFX resources, but Netflix’s in-house
team did it with AI. Sarandos said the generative model let them complete that collapse 10 times faster than traditional CGI would have allowed; without AI, the cost “just wouldn’t have been feasible” on the show’s budget. In
short, Netflix turned an impossible effect into reality – and did it far more quickly and cheaply than ever before.
A still from Netflix’s Argentine series El Eternauta: a snow-covered Buenos Aires scene (left) and AI-generated debris (right) combine in the collapse sequence, which Netflix says was rendered ten times faster and more cheaply than usual.
Sarandos was enthusiastic about the result. He told analysts that the creators, executives and even early
viewers were “thrilled with the result” – and insisted that AI in this case was about empowering
storytellers, not cutting corners for its own sake. “We remain convinced that AI represents an incredible
opportunity
to help creators make films and series better, not just cheaper,” he said. In other words, AI was a new tool in the
artists’ toolkit, like a paintbrush that can render complex scenes instantly, but still guided by human
vision. As Sarandos put it, “this is real people doing real work with better tools” – the AI “helps creators
expand the possibilities of storytelling on screen, and that is endlessly exciting”.
Efficiency and Creativity: 10× Faster on a Shoestring Budget
Netflix’s first use of generative AI highlights how the technology can turbocharge effects even on modest budgets.
Until recently, smashing a city building into rubble would have required weeks of modeling, simulation and rendering by
large VFX teams. Now a visual effects artist can give the AI a few key frames or instructions and let it generate
the
blast and debris. The result in El Eternauta looks seamless: Sarandos noted that the sequence was completed
ten times faster than with traditional VFX tools. More importantly, he said, the cost savings were
dramatic – so much so that without AI “the cost… just wouldn’t have been feasible for a show on that budget”.
This leap in efficiency doesn’t just save money; it lets smaller productions attempt effects previously reserved
for
blockbusters. Sarandos pointed out that Netflix can now include effects like de-aging or large-scale destruction even in series without blockbuster budgets. It is no coincidence that El Eternauta, a foreign-language thriller on a mid-tier budget, got this treatment. By offloading grunt work to AI, filmmakers can focus their crews on
creative decisions. As one VFX artist described it, AI becomes an assistant: it can instantly propose dozens of
visual
ideas for the collapse, generate detailed textures of broken concrete and flying glass, and fill in secondary chaos
(dust, sparks, debris) around the human-animated center of the shot. The team still fine-tunes and directs the
outcome
– it’s “real people doing real work with better tools,” as Sarandos emphasized. The AI never writes
the story; it just paints it more quickly.
Indeed, Netflix’s own executives are already touting the creative upside. Sarandos noted that beyond cost and
speed,
the AI tools let creators do things they couldn’t do before:
“our creators are already seeing the benefits in
production through pre-visualisation and shot planning work, and certainly visual effects,” he said. By “expanding
the
possibilities of storytelling on screen,” AI might allow new kinds of scenes and camera moves to be attempted,
rather
than diluting the story.
Netflix’s AI Beyond VFX: Personalized Search and Interactive Ads
Netflix isn’t just experimenting with AI for eye candy on screen. Co-CEO Greg Peters confirmed on the same call
that
generative AI is being deployed across the platform’s tech. For example, Netflix has quietly rolled out an
AI-powered search feature, where viewers can now speak natural-language queries – “I want to
watch a dark thriller from the 80s” – and let the AI parse the request into relevant titles. Peters noted
that
this kind of natural search would have been nearly impossible with old-school tech, but AI has “made it possible to
do
more and more.”
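Netflix has not detailed how this search works under the hood. Conceptually, though, matching a free-form request such as “a dark thriller from the 80s” against a catalog resembles embedding-based semantic retrieval, and a minimal sketch of that idea looks something like this (the model choice, the tiny catalog and the scoring below are illustrative assumptions, not Netflix’s implementation):

```python
# Illustrative sketch only: embedding-based matching of a natural-language query
# to show descriptions. The catalog and model name here are assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer

catalog = {
    "Stranger Things": "1980s kids in a small town uncover supernatural horrors and a secret lab.",
    "Mindhunter": "FBI agents interview serial killers to build criminal profiles; dark and slow-burning.",
    "Dark": "German time-travel mystery spanning three generations in a small town.",
}

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose text-embedding model
titles = list(catalog)
doc_vecs = model.encode(
    [f"{title}. {desc}" for title, desc in catalog.items()],
    normalize_embeddings=True,  # unit vectors, so a dot product equals cosine similarity
)

def search(query: str, k: int = 3) -> list[tuple[str, float]]:
    """Return the k catalog titles whose descriptions best match the query."""
    query_vec = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ query_vec
    top = np.argsort(scores)[::-1][:k]
    return [(titles[i], float(scores[i])) for i in top]

print(search("I want to watch a dark thriller from the 80s"))
```

A production system would layer personalization, availability and ranking signals on top of a match like this, but the core step of comparing a free-form sentence to title descriptions can be that simple.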
He also said Netflix plans to use AI for interactive advertising and personalization. In the coming
months,
Netflix expects to serve ads that can dynamically change based on viewer feedback or even let viewers ask questions
in
real time, all driven by generative models. Even today, the platform is leveraging AI behind the scenes: the
recommendation engine has become more sophisticated with machine learning, and Peters mentioned that ads and
personalized promos are being fine-tuned by AI to boost engagement. In short, generative AI is gradually seeping
into
everything from how you find a show to what commercials you see – on top of its new role in making the shows
themselves.
Industry Reactions and Ethical Concerns
Not everyone in Hollywood cheered Netflix’s announcement unreservedly. In fact, the use of AI in production was a
major flashpoint just two years ago in the 2023 actors’ and writers’ strikes. Unions won new protections to ensure
AI
would not replace writers or create “digital replicas” of performers without consent. The concern is real: many
creatives worry that once studios embrace AI widely, jobs could vanish. Visual effects and animation workers in
particular have protested what they see as a threat to their livelihood. (One labor advocacy group even conducted a
study showing that many VFX jobs could be at risk from automation.)
In public forums, the reactions have been mixed. Some industry observers hailed Netflix’s move as visionary,
arguing
it democratizes high-end effects for smaller studios. Others sounded alarms: on social media and news sites,
comments
ranged from “cool future tech” to “AI will steal our jobs.” Bloomberg and tech sites noted that
Lionsgate, Amazon and others are racing into AI, prompting “AI arms race”
headlines.
The guilds remain cautious: they stress that AI-generated footage must be closely supervised by humans.
Netflix’s leadership tried to address these fears head-on. Sarandos repeatedly emphasized that AI is not about
replacing creators. In his words, it’s about “helping creators, not replacing them.” He pointed out
that the AI handled only tasks whose output human artists still had to integrate and refine. As Sarandos put it, the
industry’s better future is with “real people doing real work with better tools”. By maintaining
that
stance, Netflix echoed the message of several AI innovators. For example, Naeem Talukdar, CEO of the AI studio Moonvalley – which is releasing a film-ready model called Marey – says bluntly: “We have to make sure that we’re building these
tools the right way: building with the filmmaker and the artist at the center of it, rather than trying to
automate
their job away”. In other words, the goal is augmentation, not automation.
There are also ethical worries about how the AI was trained. Industry creatives have pointed out that most
generative
models today are trained on huge datasets of copyrighted images and footage scraped from the internet. To some, that
feels like using others’ art without permission. Late in 2024, more than 400 actors, writers and directors even
signed
an open letter calling on policymakers to shield creative work from being grabbed by AI companies without consent.
Netflix has not publicized how the model behind El Eternauta’s scene was trained; one hopes it was on properly licensed data or assets created in-house. By contrast, some startups are explicitly using licensed material
to
train their models. For example, Moonvalley’s Marey is built on a library of footage cleared by filmmakers, so that
studios can use it without worrying about copyright lawsuits. (This approach – AI on a leash – has won praise from
early adopters.) The fact that Disney and Universal sued AI art company Midjourney in June 2025 for allegedly
infringing on their characters underscores the stakes. Hollywood is now writing the rules as it goes: California
even
passed new laws to restrict unauthorized AI “deepfakes” of real actors in ads, and studio unions have set strict
contract terms about using AI on set. The Netflix announcement comes amid this turbulence, reassuring some that at
least a major streamer is treating AI cautiously.
AI in Hollywood’s Broader Landscape
Netflix’s disclosure is a sign of a much larger trend. Across the industry, many studios are experimenting with AI
in
production. Amazon MGM (Prime Video) quietly reported that its Biblical epic House of
David
leaned heavily on AI-powered VFX. (An episode six Goliath sequence, for instance, was reportedly generated with AI
assistance.) Jon Erwin, the show’s creator and an animator by training, says his team used AI on 73 of the 850
visual-effects shots in the first season. “These tools are more human, more creative, and more collaborative
than any other tools I’ve ever used in a VFX process,” Erwin explained. In his view, the AI shots were not
about cutting corners but about enabling something that would have been “unimaginable” otherwise. Rather than cost
cuts, he insisted it actually “empowered [him] to be more creative” and made the show possible on
its
ambitious scale. Thanks to AI, House of David could employ “nearly 700 people” on a production
that
many experts had initially deemed beyond reach. As the show’s motto goes, “We do not compromise. We innovate.” – and
in this case, innovation meant smart use of AI.
Lionsgate, the studio behind John Wick and Hunger Games, is taking a different tack. In late 2024
they partnered with AI startup Runway to train a custom model on Lionsgate’s entire film archive. The idea is to
give
their filmmakers an AI that already understands Lionsgate’s style and characters. Lionsgate’s vice-chair Michael
Burns
said Runway is a “visionary, best-in-class partner” to help the studio create content more efficiently. In practical
terms, they plan to use the AI for storyboarding, concept art and even generating trailers or background scenes
before
a shoot. Burns believes the move “will save Lionsgate millions and millions of dollars” in development and
VFX costs. Clearly, the pressure to reduce budgets is driving creative experiments everywhere.
Hollywood’s tech giants are also watching closely. Netflix has explicitly declared YouTube its chief rival for TV viewing time, and traditional TV networks are uneasy that Netflix is gaining yet another edge. Meanwhile, newer streaming services like Peacock
and Max are reportedly exploring generative AI for marketing and content fillers. Even outside Hollywood, Chinese
filmmakers have launched AI trials – for example, a $5 million AI-made short film at a Chinese festival shows the
global reach of this trend. In sum, Netflix is far from alone. We are at the beginning of what many call an “AI arms
race” in entertainment: every studio is trying to figure out how to use the tools without burning bridges with
creators.
Striking a Balance: Tools for Creators, Not Replacements
As the dust settles on the news, most industry insiders seem to agree that the real winners will be audiences – and
storytellers – if these tools are used wisely. On one hand, generative AI promises dazzling sequences and new types
of
stories. Sarandos himself told Variety that Netflix’s success this quarter was “a result of great content, increased
pricing, and advertising momentum hitting all at once.” He and Greg Peters now see AI as just another technological
leap for filmmaking, akin to computer graphics or the Steadicam. In his words, it’s a way to “help creators
make films and series better”, not a magic money-saver.
On the other hand, Hollywood veterans stress that the vision must remain human. AI-company founders like Talukdar and supportive filmmakers like director Ángel Manuel Soto (who is advising an ethical AI startup)
emphasize that tools should be built around artists, not beneath them. In Soto’s view, a responsibly built
generative model is “a breakthrough… from streamlining studio workflows to empowering emerging creators in places
like
Puerto Rico and Dakar”. The consensus among many filmmakers is clear: AI might amplify imagination, but it cannot
originate it.
Netflix’s own message mirrors this: the AI-produced collapse in El Eternauta was only possible because
experienced VFX artists guided the model, checked each frame, and integrated the footage into the final cut. Without their
leadership,
an AI would just be spinning its wheels. The company took pains to credit the human crew behind the scenes. In
essence, Netflix says AI gave them a faster engine, but the drivers were still people.
In the end, human creativity remains irreplaceable. Netflix’s slide deck quoted Sarandos calling
generative AI “endlessly exciting,” but what really shines is how people use it. The coming months will likely see
more studios unveiling AI experiments: imagine animated storyboards, digital doubles done in a day, or even
personalized storylines. But one thing is certain: the writers, actors and effects artists who design our stories
will
still be the ones who decide which stories are told. As Netflix’s top execs emphasize, AI is best seen as a
new kind of brush – a powerful one, but a brush nonetheless, wielded by human hands.
Sources: Netflix and media company statements and news reports from July 2025, including interviews with Netflix executives and industry figures.