Japan’s Content Overseas Distribution Association (CODA) has asked OpenAI to stop using members’ works without permission in the operation and training of Sora 2. The group represents major rights holders across anime, film, music, and games, including Studio Ghibli, Bandai Namco, Square Enix, Toei, and Aniplex. CODA says many Sora 2 outputs closely resemble Japanese content, and argues that copying during training can itself violate copyright under Japanese law. The request arrived October 27, 2025, after weeks of industry concern about Sora 2’s rapid adoption in fan and commercial contexts.

CODA’s letter makes two core asks. First, that OpenAI ensure member content is not used for machine learning or Sora 2 outputs without prior permission. Second, that OpenAI “respond sincerely” when companies raise infringement claims about Sora 2 results. The group also rejects an opt-out approach, noting that Japan generally requires permission in advance rather than after the fact.

This push follows pressure from Japan’s government and trade bodies over the cultural and economic stakes around anime and manga. Officials have framed the issue as protecting creators and the country’s soft-power industries while allowing room for responsible AI development.

OpenAI has floated changes since early October, signaling plans for more granular controls for rights holders and a revenue-sharing model for those who allow their characters to be generated. Reporting in late September also described an opt-out mechanism under consideration. CODA’s position suggests those steps will not satisfy Japanese rights holders without a clear opt-in standard and a formal process that respects domestic law.

The outcome matters far beyond Japan. If CODA’s view influences policy or litigation, it could shape how AI companies obtain training data, the permissions they seek, and the controls they offer creators worldwide. For now, the ball is in OpenAI’s court to address the request and clarify how Sora 2 will operate in jurisdictions that demand prior consent.