Imagine a world where your favorite cartoon characters come to life in AI-generated videos. Exciting, perhaps, but also controversial: OpenAI's new app, Sora 2, has sparked a heated debate after users flooded the platform with videos featuring copyrighted characters from shows like SpongeBob SquarePants, South Park, and Rick and Morty. Now OpenAI is scrambling to give copyright owners more control over how their creations are used, or misused.
Launched last week on an invite-only basis, Sora 2 is an AI-powered video generator that turns text prompts into short videos. While the technology is undeniably impressive, it is also a legal and ethical minefield. The Guardian's review of the platform revealed a deluge of videos featuring copyrighted characters, raising questions about intellectual property rights in the age of AI. Notably, before releasing Sora 2, OpenAI reportedly told talent agencies and studios they would have to opt out if they didn't want their characters replicated, a move that has drawn sharp criticism.
OpenAI has since promised to give copyright holders "more granular control" over character generation. According to Varun Shetty, the company's head of media partnerships, rights holders can flag infringements using a dedicated form, and OpenAI will block characters or respond to takedown requests. The catch: individual artists and studios can't opt out entirely. This has left many creators uneasy, since their work could still be used without explicit permission.
In a recent blog post, OpenAI CEO Sam Altman acknowledged the backlash, stating the company has been "taking feedback" and will implement changes. He compared the new system to the way users can opt in to sharing their likeness, but with "additional controls" for rights holders. Altman also hinted at a potential revenue-sharing model in which creators could be paid for allowing their characters to be generated. Whether that is enough to balance innovation with creators' rights remains an open question.
Altman admitted there will be "edge cases" where problematic content slips through Sora 2's guardrails. He also noted the platform's unexpected popularity, with users generating far more videos than anticipated. "The exact model will take some trial and error," he said, emphasizing the need to make the platform both profitable and fair. Interestingly, some rights holders are excited about this "interactive fan fiction," seeing it as a new way to engage audiences, but only if they can control how their characters are used.
As OpenAI navigates this complex landscape, one thing is clear: the line between innovation and infringement is blurrier than ever. What do you think? Should AI platforms like Sora 2 prioritize technological advancement, or err on the side of protecting creators' rights? Share your take in the comments; this is one debate that's just getting started.