AI Video Looks Incredible. It's Still Dead on Arrival.

ByteDance's AI video generator Seedance 2.0 is reportedly on hold after Disney and Paramount Skydance took issue with its potential for copyright infringement, raising serious questions about the future of AI video generation.

Jaron Chong
March 18, 2026 · 5 min read

ByteDance just blinked.

A month after launching Seedance 2.0 in China, the company has reportedly suspended its global rollout after cease-and-desist letters from Disney and Paramount Skydance. The tool had gone viral almost immediately, driven most notably by a convincing Brad Pitt vs. Tom Cruise fight video that had people declaring Hollywood dead. ByteDance told the BBC it is "taking steps to strengthen current safeguards." That's corporate-speak for "we got caught."

Social media ran with the hype. New Seinfeld episodes. Star Wars sequels. Cinematic action sequences from a text prompt. "This changes everything," people said.

It doesn't change much. Not yet. Because unless AI video companies solve the copyright problem — actually solve it, not dodge it — the technology's quality is beside the point. The most photorealistic AI clip on earth is worthless if it can't be legally distributed. Impressive and viable are not the same thing.

The Legal War Is Already Underway

Seedance isn't a one-off controversy. It's the latest flashpoint in a full-scale legal conflict that has been building for years.

In June 2025, Disney and Universal filed a landmark copyright lawsuit against Midjourney, the first time major studios sued a generative AI company directly. The 110-page complaint accused Midjourney of running a "virtual vending machine" producing unauthorized copies of Darth Vader, Homer Simpson, and Shrek. Separately, Disney sent cease-and-desist letters to both Google and Character.AI over similar infringement claims. According to the Copyright Alliance, there are now over 70 active infringement suits filed by copyright owners against AI companies in U.S. courts alone.

The "fair use" defense the AI industry has leaned on is not holding up. A federal judge in 2025 found that AI companies can use copyrighted material for training if acquired legally, but that Anthropic had obtained some training data through pirated sources, which crossed into infringement. The settlement: $1.5 billion. Meanwhile, the U.S. Copyright Office clarified in 2025 that content produced from a prompt alone is not eligible for copyright protection. You can generate a clip. You don't own it. You can't license it. You can't build a business on it.

The Licensing Playbook Costs a Billion Dollars

There is a legal path forward. OpenAI has walked it, and the price tag is instructive.

In December 2025, Disney and OpenAI announced a three-year licensing deal giving Sora users access to over 200 characters from Disney, Marvel, Pixar, and Star Wars. The deal required a $1 billion equity investment from Disney, a joint steering committee to monitor generated content, and a detailed brand-use appendix. No actor likenesses. No voices. No long-form video. Curated short clips only.

The first year is exclusive to OpenAI. Every other AI video company is legally locked out of Disney IP for at least 12 months. As one analyst put it: "Use IP without permission to train AI models, get rewarded with $1 billion equity and licensing deals." The original sin, it seems, can be forgiven, but only if the check is big enough.

For the dozens of AI video startups that raised seed rounds on the promise of democratized video creation, this is a death sentence. Studios are not looking to cut deals with companies that can't pay; they're looking to set nine-figure legal precedents. It's the same playbook the music industry ran against Napster and LimeWire before eventually licensing to Spotify. The studios watched how that fight played out, and they aren't repeating its mistakes.

The Chinese Enforcement Problem

The Seedance situation has an extra wrinkle: ByteDance is headquartered in Beijing.

Serving a copyright complaint on a Chinese company under the Hague Convention can take 18 to 24 months just to reach the opening stages of litigation. Disney, NBCUniversal, and Warner Bros. filed against Chinese AI company MiniMax back in September 2025 and are still working out how to serve the complaint. Some Chinese AI companies haven't even established formal legal entities recognizable under U.S. law.

Some users celebrated this as a feature: a tool that "doesn't give a shit about copyright laws." But operating outside legal reach also means operating outside commercial reach. No legitimate platform will host it. No brand will license it. No enterprise will touch it. The user base, by definition, is people doing things they shouldn't. That is not a market with a future.

What Actually Survives

AI video generation does not disappear. It retreats to the use cases that were always its most defensible ground: corporate training videos, product explainers, stock b-roll, architectural renderings. Applications that operate within pre-approved asset libraries, under enterprise contracts, with legal teams reviewing the output.

These use cases don't require recognizable actors, copyrighted characters, or cinematic styles pulled from existing films. They are unglamorous. They are commercially sound. They will scale.

The content people most want to generate is precisely the content most legally compromised: clips that feel like real movies, populated with familiar characters, styled like their favorite franchises. The gap between what the technology can produce and what it can legally distribute is not closing. It's widening with every new lawsuit filed.

The technology keeps getting better. The legal framework stays firmly in place. For mass-market consumer video, one of those things matters more than the other, and it is not the one the demos are showing you.


Sources: Engadget · Copyright Alliance · Copyright Lately · NPR · OpenAI · Axios · TechCrunch · CNBC · Marketing AI Institute · Make Use Of