How Runway revolutionized film production with AI
Cristóbal Valenzuela envisioned a new kind of creative suite, where artificial intelligence would serve as a collaborative partner, enabling users to bring any imaginable concept to life.
Cristóbal Valenzuela, Co-Founder and CEO
A new bag of (storytelling) tricks
“Everything you need to make anything you want.”
It’s a bold statement, but that’s exactly what you get with Runway, a full-stack, applied AI research company that trains and builds generative AI models for content creation. Runway is building tools that allow everyone to tell their stories, regardless of their skill level, background, or resources.
The company’s suite of products includes Gen-2, the first publicly available text-to-video model, and Gen-1, a model that allows users to generate video content from an input video plus an image or text prompt. These releases are paired with more than 30 AI Magic Tools that let users generate and edit content across every aspect of the creative process, including Green Screen, which instantly removes the background from any video with just a few clicks, and Inpainting, which removes any object from a video with a few simple brush strokes. Most recently, Runway released a mobile app, a first of its kind, that brings its video-to-video AI model to your phone and lets you transform your videos into anything you can imagine.
“It doesn't matter if a user is a professional, award-winning creator and filmmaker or a small-scale YouTuber, Runway is a tool for storytelling,” says Runway co-founder and CEO Cristóbal Valenzuela. “The creatives with the best ideas will be able to execute them really fast. And our constant goal is to find ways to reduce that time.”
The Lumières light the way
For Valenzuela, Runway’s story started in Chile, where he studied economics, film, and design at Santiago’s Universidad Adolfo Ibáñez. While most of his cohort went on to work in finance, Valenzuela’s first job was at one of Chile’s biggest film production houses—and he took the job without pay. “I wanted to work with the best filmmakers and learn as much as I could from them,” he says. “I was so drawn to it and interested in it, and it gave me a good perspective on how the industry works.”
In this role, Valenzuela had the opportunity to see the industry’s pain points firsthand. Despite the emergence of new tools, people continued to use outdated technology simply because they were used to it, which resulted in costly and frustrating production delays and limited creativity.
Inspired by the Lumière brothers, the trailblazing inventors who created the cinématographe in the 1890s, Valenzuela began exploring new ways to use technology in filmmaking through research into AI applied to computational creativity.
“Filmmaking is a magic trick,” he says. “You stitch images together to create an illusion, and that illusion helps immerse you in the story. All because of the invention of the camera two hundred years ago. AI is a new camera, perhaps the most powerful one we’ve ever created, and what we need is a new generation of creatives doing new magic tricks.”
Valenzuela had found his path.
Valenzuela moved to the US in 2016, where he enrolled in New York University’s two-year interactive telecommunications program. That’s where he met Anastasis Germanidis and Alejandro Matamala, fellow grad students who shared Valenzuela’s interest in neural networks and filmmaking.
AI technology advanced rapidly while they were at NYU, and Valenzuela, Germanidis, and Matamala threw themselves into the research that would eventually become Runway. After graduating, all three received offers from top companies, but they turned them down to build Runway together.
“I'm going to listen to my gut, and I'm going to try it. If it doesn't work, great, I'll figure out something else,” says Valenzuela.
Runway was founded in December of 2018, just seven months after Valenzuela, Germanidis, and Matamala graduated from NYU.
Going all in on AI
From “no” to “go”
Startups always come with risk, but for Runway’s co-founders, the stakes were especially high. Their ability to stay in the US hinged on navigating the immigration process while building a company, hiring a team, and developing a product.
“We put a lot of pressure on ourselves because there was no plan B. We had to figure something out,” says Valenzuela. “Now that has become central in how Runway operates and is one of the most important lessons I’ve learned: the ability to figure out things is fundamental when building a company. Everything else follows that. Just figure it out.”
Despite the team’s determination, Runway faced an uphill climb once they started pitching potential investors and clients on AI-first products and Generative AI. “Everyone kept saying, ‘There's no market for this. This is a toy,’” says Valenzuela. “It was challenging, but we were determined. You’re going to encounter a sea of nos, and you need to keep telling yourself, ‘It's going to happen.’”
To support their vision, Valenzuela, Germanidis, and Matamala assembled the best research team they could find. Over the past few years, Runway has collaborated with a range of institutions, including Carnegie Mellon University and the University of Washington. With strong research to support the work, the team was able to release a beta version of Runway just six months after founding the company.
They raised their first round of funding in December 2018 at NeurIPS, a top AI conference. “Our seed round was led by Amplify, Lux, and Compound,” says Valenzuela. “We were very fortunate to work with investors that really supported our vision.”
The tipping point
Runway continued to release research-backed tools, and in December 2020 they raised an $8.5M Series A round led by Amplify, followed by an additional $35 million in Series B funding in 2021.
True to form, Runway continued its research alongside its product development efforts. In 2022, in collaboration with LMU Munich, they published “High-Resolution Image Synthesis with Latent Diffusion Models.”
“Latent Diffusion was an important milestone for image synthesis research,” says Valenzuela, referring to a class of AI models that generate images from text prompts. That research led to Stable Diffusion, a latent diffusion model trained at much larger scale, which made it considerably more capable than earlier versions.
“Stable Diffusion has taken on a life of its own,” says Valenzuela. “We open-sourced it a year ago, and it grew. A lot. Stable Diffusion is now used to create all sorts of things. But the origin story is tied to our open source work and the continuous scientific research we've done over the last years.”
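For readers who want a concrete sense of what that open source release enables, here is a minimal sketch that generates an image from a text prompt using the community-maintained Hugging Face diffusers library and the publicly released runwayml/stable-diffusion-v1-5 checkpoint. The library, checkpoint name, and hardware setup are assumptions about the reader’s environment rather than anything described in this story, and both the library and checkpoint may have moved or changed since publication.

```python
# Minimal sketch: text-to-image with a Stable Diffusion checkpoint via the
# Hugging Face diffusers library. Assumes `pip install diffusers transformers torch`
# and a CUDA-capable GPU; the checkpoint name reflects Runway's public release
# and may have been moved or deprecated since this story was written.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # publicly released checkpoint
    torch_dtype=torch.float16,         # half precision to fit consumer GPUs
)
pipe = pipe.to("cuda")

# A single text prompt is enough to produce a novel image.
prompt = "a film still of a lighthouse in a storm, 35mm, dramatic lighting"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("lighthouse.png")
```

The few lines above are the point: what once required a research lab is now a short script, which is why the model “has taken on a life of its own.”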
With Stable Diffusion taking off, AI reached a tipping point in terms of mainstream awareness, making it much easier for creatives to understand the power of generative AI.
Even Runway’s immigration challenges have paid off. Thanks to that experience, the company has been able to create a specialized immigration team that enables Runway to attract top-notch employees from all over the world. As Valenzuela says, “I study art, but I have a PhD in visas.”
Putting users first
As AI-generated content gained mainstream awareness in 2022, the Runway team announced a $50M Series C funding round aimed at catapulting its next phase of growth and accelerating its strategic business initiatives. Since that announcement, the team has remained focused on training, inference, model improvements, and, of course, a final product designed to provide the best possible user experience.
Always committed to pushing research forward, Runway also consistently finds ways to improve the quality of its models.
“We always need to think about what can be possible in research, and push the boundaries,” says Valenzuela. “After that, we need to develop products so we can get that research into the hands of users in the shortest amount of time possible.”
Gen-2, a groundbreaking video generation system that can generate novel videos from text, images, or video clips, is the latest of those leaps. “We’re very excited about the reception the model has gotten and the amazing videos people have already been creating,” says Valenzuela. “We’ve released it to a small, early cohort of alpha testers, but the content they’ve created and shared online has already blown my mind.”
What’s most remarkable about Runway is that it isn’t simply making small improvements to existing products. It’s approaching filmmaking in an entirely new way, with entirely new technology. The result? Massive technological leaps that ensure their users can execute their ideas while spending a fraction of the time and money.
Success stories
Valenzuela has finally achieved his dream of providing today’s filmmaking magicians with new tricks—while also tackling one of the film industry’s toughest pain points: costly production delays. And the success stories are stacking up.
Valenzuela discovered that The Late Show with Stephen Colbert was using Runway to reduce production time when he saw a few consecutive sign-ups come through from CBS. When he reached out to see if he could help with anything, he found out that a Late Show editor had decided to try out Runway after seeing a rotoscoping competition on Corridor Crew’s YouTube channel. One of the competitors used Runway to edit a video—a task that the editor was used to doing one frame at a time—in about three minutes.
“He bought a license and started to use the software—and finished things that used to take him a couple of days in a couple of minutes,” says Valenzuela. “But he didn't tell the rest of his team what he was using. So people would ask him to help them with something, expecting it to take him two days. And he was back in 20 minutes.”
In 2022, the creators of Everything Everywhere All At Once also used Runway to reduce the time required to produce the award-winning film—an impressive feat for such a visually intense film.
But Runway hasn’t just set out to help the big production houses. The company wants to make creation easier for everyone so that more people can tell more stories. “We’re not only able to make storytelling more accessible and cost-efficient for the professionals who've been in the space for the last 20 years, we’re also providing the ability to create high-quality content to people who never thought they’d have access to those kinds of resources,” says Valenzuela. “It allows a huge, new market to emerge.”
One example of this is The People’s Joker, an indie feature film that was written, directed, acted in, and edited by Vera Drew. Her original concept was a mixed media/found footage project that would have required such a significant amount of rotoscoping that it would have been impossible to do manually. She began looking into compositing software and discovered Runway—which enabled her to create a feature film.
Both markets continue to grow, just as Valenzuela predicted, thanks to word-of-mouth and top-notch customer support. The product has also spread virally, as a result of people sharing their own Runway-driven projects online. “Early, early on, we earned our first customers just by working very closely with every user we thought would be interesting to learn from,” says Valenzuela. “Now, that network-based approach and product-led growth has led to millions of users.”
(Warp)speed to market
In addition to its technical ingenuity and its creative vision, Runway is recognized for shipping products incredibly fast—especially for a small company with only 44 people on staff and millions of customers.
“People ask me about our production cycles a lot,” says Valenzuela. “They want to know which processes we run, and which frameworks we use to organize our team. They want to know what the secret sauce is. But that’s like asking an Olympic runner, ‘How do you run so fast? What did you eat that morning?’ Well, they’ve been training for 10 years. They’re able to run so fast because they’ve prepared for it for a long time. And so, for us, it’s very similar. We’re able to run so fast because we’ve been training for it now for almost five years.”
Of course, it’s also about skill, not just endurance. Runway has done a lot of solid research and made some great decisions—including their choice to center users in everything they do.
According to Valenzuela, when you’re doing research, it’s easy to make assumptions about what you think will work for people. But those assumptions aren’t always right—and when they’re wrong, they can lead to costly mistakes in terms of shipping time.
The key, says Valenzuela, is “being very pragmatic and listening to customers as early as possible, even if you’re doing fundamental research.”
“That’s how we’ve always approached research. We have research scientists with years of experience in AI sitting next to people who have been working in video for 20 years. The cross-pollination that happens at that table is unique because they’re leveraging both worlds to make something that actually works.”
Runway’s ability to ship products quickly is also driven by the same sheer determination that enabled the co-founders to launch a groundbreaking business while juggling immigration challenges. When something needs to be done, Runway takes care of it. For example, when Runway decided to launch its AI Film Festival, they handled everything in-house, with no agency in sight. Why? Because they wanted the kind of control they needed to achieve their vision.
“We tend to focus less on processes and more on outcomes,” says Valenzuela. “The idea for the film festival was brewing for years, but we didn’t decide to get started until late October of 2022, and we wanted to hold it in February 2023. We had a few months to do it, to figure it out. We just needed to figure it out. And that drives a lot of our decisions.”
Transforming the future of film
When Valenzuela talks about the future, you can hear the excitement in his voice. “It's going to be transformative in ways that just weren't even possible before.”
As he considers Runway’s primary role in the future of AI, his plan is to remain committed to making storytelling tools more accessible for creatives while allowing them to express their ideas as deeply as they can envision them.
“I’m really excited about everything that we still need to build,” says Valenzuela. “We've been working on this for years, and I'm really excited about this line we've just crossed, where more people are finally understanding how impactful and powerful the technology is. I can’t wait to finally start delivering this to billions of people.”