Runway Launches Ultra-Realistic New AI Video Model Gen-3 Alpha

New York City-based Runway ML, also known simply as Runway, was one of the first startups to focus on realistic, high-quality generative AI video creation models.

However, since debuting its Gen-1 model in February 2023 and Gen-2 in June 2023, the company has seen its star eclipsed by other highly realistic AI video generators, including OpenAI’s unreleased Sora model and Luma AI’s Dream Machine model, released last week.

Today, though, things changed: Runway launched a major counterattack in the AI video generation wars with Gen-3 Alpha. In a blog post, the company describes it as “the first in a series of upcoming models trained by Runway on our new infrastructure built for large-scale multimodal training” and “a step toward building general world models,” AI models that “can represent and simulate a wide variety of situations and interactions that we might encounter in the real world.” Check out example videos created with Runway’s Gen-3 Alpha at the bottom of this post.

Gen-3 Alpha allows users to generate high-quality, detailed, and highly realistic 10-second video clips featuring a wide range of emotional expressions and precisely controlled camera movements.


An exact release date for the model has yet to be announced, with Runway so far only releasing demo videos on its website and X account, and it’s unclear whether the model will be available on Runway’s free plan or whether a paid subscription (starting at $15 per month or $144 per year) will be required to access it.

On X, Anastasis Germanidis, co-founder and CTO of Runway, wrote that Gen-3 Alpha “will soon be available in Runway products, bringing all the existing modes you’re used to using (text-to-video, image-to-video, video-to-video) plus some new modes that were previously only possible on the higher-performance base model.”

Germanidis also wrote that since releasing Gen-2 in 2023, Runway has learned that “video diffusion models are nowhere near saturating the performance gains that come with scaling, and these models build very powerful representations of the visual world by learning the task of predicting videos.”

What is diffusion? It is a training process in which an AI model learns concepts from annotated image/video and text pairs, and is trained to reconstruct visuals (stills or video) from pixelated “noise,” guided by those learned concepts.
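To illustrate the idea at a very high level, here is a minimal toy sketch of that objective: a tiny network learns to predict the noise that was mixed into placeholder data, which is the core of how diffusion-style models learn to turn noise back into imagery. Everything here is an illustrative assumption (the toy data, the tiny network, and the simple linear noising schedule), not Runway’s actual architecture or training code.

```python
# Illustrative-only sketch of diffusion-style training: predict the noise added
# to data. This is NOT Runway's model; the data, network, and noise schedule
# are made-up stand-ins for explanation purposes.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy "frames": 256 flattened 8x8 grayscale images (hypothetical stand-in data).
clean_images = torch.rand(256, 64)

# A deliberately tiny denoiser; real video models are vastly larger and also
# condition on text embeddings.
denoiser = nn.Sequential(nn.Linear(64 + 1, 128), nn.ReLU(), nn.Linear(128, 64))
optimizer = torch.optim.Adam(denoiser.parameters(), lr=1e-3)

for step in range(200):
    # Sample a random noise level t in [0, 1] per example.
    t = torch.rand(clean_images.size(0), 1)
    noise = torch.randn_like(clean_images)
    # Corrupt the clean data: larger t means more noise (simplified schedule).
    noisy = (1 - t) * clean_images + t * noise
    # The model sees the noisy input plus its noise level and predicts the noise.
    pred_noise = denoiser(torch.cat([noisy, t], dim=1))
    loss = ((pred_noise - noise) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final denoising loss: {loss.item():.4f}")
```

At generation time, a trained model of this kind is run in reverse: starting from pure noise, it repeatedly removes the predicted noise, gradually revealing an image or video frame that matches the text prompt.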

Runway said in the blog post that Gen-3 Alpha was “jointly trained on video and imagery” and is “a collaborative effort from an interdisciplinary team of researchers, engineers, and artists,” but the specific dataset has not yet been made public. This follows a trend among most other major AI media generators of not disclosing exactly what data their models were trained on, let alone whether the data was sourced through paid licensing agreements or simply scraped from the web.

Critics argue that AI model makers should pay the original creators of the training data through licensing agreements, and some have even filed copyright infringement lawsuits on the matter, but AI model companies generally take the position that they are legally allowed to train on any publicly available data.

Interestingly, Runway also says that it is already “working with major entertainment and media organizations to create custom versions of Gen-3,” which “allows for more stylistically controlled and consistent characters, targeting specific artistic and narrative requirements, and more.”

Although no specific organizations have been named, the filmmakers behind Everything Everywhere All at Once and The People’s Joker have previously revealed that Runway was used to create effects for parts of their films.

Runway included a form in its Gen-3 Alpha announcement through which other organizations interested in obtaining custom versions of the new model can apply. The cost of training a custom model has not been disclosed.

We’ve reached out to Runway for more information on some of the points and questions above and will update you when we hear back.

Meanwhile, it’s clear that Runway isn’t giving up the fight to remain a leading player in the rapidly evolving field of AI video generation.