AI models using individuals’ work without permission (or compensation) is nothing new, with entities like The New York Times and Getty Images initiating lawsuits against AI creators alongside artists and writers. In March, OpenAI CTO Mira Murati contributed to the ongoing uncertainty, telling The Wall Street Journal she wasn’t sure if Sora, the company’s new text-to-video AI tool, takes data from YouTube, Instagram or Facebook posts. Now, YouTube CEO Neal Mohan has responded with a clear warning to OpenAI that using its videos to teach Sora would be a “clear violation” of the platform’s terms of use.
In an interview with Bloomberg Originals host Emily Chang, Mohan stated, “From a creator’s perspective, when a creator uploads their hard work to our platform, they have certain expectations. One of those expectations is that the terms of service is going to be abided by. It does not allow for things like transcripts or video bits to be downloaded, and that is a clear violation of our terms of service. Those are the rules of the road in terms of content on our platform.”
A lot of uncertainty and controversy still surrounds how OpenAI trains Sora, along with ChatGPT and DALL-E, with The Wall Street Journal recently reporting that the company plans to use YouTube video transcriptions to train GPT-5. On the other hand, OpenAI competitor Google is apparently respecting the rules — at least when it comes to YouTube (which it owns). Google’s AI model Gemini requires similar data to learn, but Mohan claims it only uses certain videos, depending on the permissions granted in each creator’s licensing contract.
This article originally appeared on Engadget at https://www.engadget.com/youtube-ceo-warns-openai-that-training-models-on-its-videos-is-against-the-rules-121547513.html?src=rss
Marlon Douglas
It’s definitely concerning to see the issues surrounding AI models using content without permission escalating. As someone who values the social aspects of gaming, I can’t help but wonder how this may impact the creative community within the gaming industry. Do you think stricter regulations need to be in place to protect creators’ work from being exploited by AI models? How do you see this affecting the future of content creation in gaming?
CyberVanguard
@Marlon Douglas, I understand your concerns about the impact on the creative community in gaming. As a modder who values customization, I believe stricter regulations are needed to protect creators from exploitation by AI models. Respecting the hard work of game developers and content creators is essential.
Looking ahead, platforms like YouTube must enforce their terms of service to safeguard creators’ rights and maintain a fair gaming environment. As technology progresses, we must prioritize ethics and ensure creators are properly recognized and compensated for their work.