You’ve probably seen images in your social media feeds that look like a cross between photographs and computer-generated graphics. Some are fantastical (think Shrimp Jesus) and some are plausible at a quick glance (remember the little girl clutching a puppy in a boat during a flood?).
These are examples of AI slop: low- to mid-quality content (video, images, audio, text, or a mix) created with AI tools, often with little regard for accuracy. It’s fast, easy, and cheap to make this content. AI slop producers typically place it on social media to exploit the economics of attention on the internet, displacing higher-quality material that might be more helpful.
AI slop has been proliferating over the past few years. As the term “slop” suggests, that’s generally not good for people using the internet.
AI slop’s many forms
The Guardian published an analysis in July 2025 examining how AI slop is taking over YouTube’s fastest-growing channels. The journalists found that nine of the top 100 fastest-growing channels feature AI-generated content like zombie football and cat soap operas.
The song “Let it Burn,” allegedly recorded by a band called The Velvet Sundown, was AI-generated.
Listening to Spotify? Be skeptical of that new band, The Velvet Sundown, that appeared on the streaming service with a creative backstory and derivative tracks. It’s AI-generated.
In many cases, people post AI slop that’s just good enough to attract and hold users’ attention, allowing the poster to profit from platforms that monetize streaming and view-based content.
The ease of generating content with AI enables people to submit low-quality articles to publications. Clarkesworld, an online science fiction magazine that accepts user submissions and pays contributors, stopped taking new submissions in 2024 because of the flood of AI-generated writing it was receiving.
These aren’t the only places where this happens. Even Wikipedia is contending with AI-generated low-quality content that strains its entire community moderation system. If the community isn’t successful in removing it, a key information resource people depend on is at risk.
Last Week Tonight with John Oliver delves into AI slop.
Harms of AI slop
AI slop is making its way upstream into people’s media diets as well. During Hurricane Helene, opponents of President Joe Biden cited AI-generated images of a displaced child clutching a puppy as evidence of the administration’s purported mishandling of the disaster response. Even when it’s apparent that content is AI-generated, it can still be used to spread misinformation by fooling some of the people who glance at it only briefly.
AI slop also harms artists by causing job and financial losses and by crowding out content made by real creators. This lower-quality AI-generated content is often not flagged or deprioritized by the algorithms that drive social media consumption, and it displaces entire classes of creators who previously made their livelihood from online content.
Wherever the feature is available, you can flag content that’s harmful or problematic. On some platforms, you can add community notes to the content to provide context. For harmful content, you can try reporting it.
Along with forcing us to be on guard for deepfakes and “inauthentic” social media accounts, AI is now producing piles of dreck that degrade our media environment. At least there’s a catchy name for it.
Adam Nemeroff is an assistant provost for innovations in learning, teaching, and technology at Quinnipiac University.
This article is republished from The Conversation under a Creative Commons license. Read the original article.