AI

YouTube wants you to flag low-quality AI-generated videos: here’s how its new tool works

The platform is beginning to show surveys to some users to identify content that is “AI slop,” as it tightens its controls.


YouTube has started showing some users a new survey asking them to rate whether a video “looks like AI slop”—that is, low-quality content generated by artificial intelligence. According to screenshots shared in recent days, the platform displays the video, its title, and its thumbnail, and asks directly whether the content feels like “AI slop” or “low quality,” with five possible responses ranging from “not at all” to “extremely.” For now, it’s unclear exactly how these ratings will affect the video, the channel, or recommendations, but they make it quite clear where YouTube wants to go.

In his annual letter, CEO Neal Mohan openly acknowledged the rise of so-called “AI slop” and stated that the company is strengthening its systems against spam and clickbait to reduce the spread of repetitive, low-quality content generated by AI. This is not a blanket ban on artificial intelligence, but rather an effort to distinguish between the creative use of automation and industrial-scale junk video.


Yes to AI, but not just any kind

That distinction is important, because YouTube is also rolling out several AI-powered tools within its own ecosystem. The company highlights features like Ask and automatic dubbing, which by December was already attracting more than 6 million daily viewers who watched for at least ten minutes, while also implementing new safeguards against impersonation. This month, it has also expanded to journalists, public officials, and candidates a pilot program that detects AI-generated videos mimicking their appearance and allows them to request removal if the videos violate the platform’s rules.


In July 2025, YouTube renamed its “repetitive content” policy to “inauthentic content” and clarified that repetitive or mass-produced material cannot be monetized. Its guidelines explicitly state that content must be original, not mass-produced, and not repeated on a large scale using nearly identical templates. All of this addresses a growing problem. A study cited by The Guardian estimated in late 2025 that more than 20% of the videos the algorithm showed to new users were “AI slop,” while YouTube has already removed several very large channels for violating its policies on spam, deceptive practices, and scams.

