Nearly half of all new music hitting streaming platforms is AI-generated slop, and most of the streams those songs rack up are fake. This isn’t a distant threat creeping toward the music industry — it’s already here, already bleeding real artists dry, and the platforms collecting subscription fees are doing almost nothing about it.
Deezer dropped a number that should make every music fan’s stomach turn. According to reporting from Ars Technica, 44% of new music uploads to the platform are AI-generated. Not remixes. Not AI-assisted production. Fully synthetic tracks manufactured at industrial scale and dumped into the catalog like counterfeit bills into circulation. And the streams those tracks generate? Largely fraudulent. Bots playing bot music, with real money flowing out the other end to people who made nothing.
The Machine Is Eating Music Alive
Here’s how the scam works. Bad actors use AI tools to generate thousands of tracks in hours. Generic ambient music. Fake lo-fi beats. Counterfeit classical. They upload them under fabricated artist names, then deploy stream-farming bots to artificially inflate play counts. Streaming platforms pay out royalties based on streams. The fraudsters collect. Real artists get a smaller slice of an already paper-thin pie.
This isn’t some fringe exploit. This is systematic. And it scales in a way human fraud never could. A person can fake a few thousand streams manually. An automated pipeline can fake millions. The math is brutal and it runs 24/7.
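To see why the math is brutal, here's a back-of-envelope sketch of the economics described above. Every number in it is an illustrative assumption (the per-stream payout, the farm sizes), not a figure from any platform:

```python
# Back-of-envelope economics of stream farming.
# PAYOUT_PER_STREAM is an assumed average royalty, not any platform's real rate.
PAYOUT_PER_STREAM = 0.003  # USD per stream (illustrative)

def daily_revenue(tracks: int, streams_per_track: int) -> float:
    """Royalties harvested per day by a farm of `tracks` fake songs,
    each fed `streams_per_track` bot plays."""
    return tracks * streams_per_track * PAYOUT_PER_STREAM

# A person faking a few thousand streams by hand on one track:
manual = daily_revenue(tracks=1, streams_per_track=3_000)          # roughly $9/day

# An automated pipeline: 10,000 AI tracks, 1,000 bot streams each:
automated = daily_revenue(tracks=10_000, streams_per_track=1_000)  # roughly $30,000/day
```

Under these assumed numbers, automation multiplies the take by over three thousand times, which is exactly why the fraud only became industrial once track generation itself became free.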
Streaming Economics Were Already Broken
Let’s be honest — the royalty model was never built to protect musicians. Spotify, Apple Music, Deezer, and the rest built fortunes on a system where even legitimate streams pay fractions of a cent. An independent artist needs millions of plays just to cover rent. That was already an ugly reality before AI entered the picture.
Now you have a two-front attack. AI-generated content floods the catalog, diluting discovery for human artists. Fraudulent streams siphon royalty pools that were already stretched thin. The platforms? They’re still collecting their subscription fees regardless. Their financial model doesn’t care whether you stream a real guitarist from Detroit or a bot-generated drone track made in thirty seconds by a server rack somewhere.
This is the same kind of systemic rot we see in other tech-adjacent markets. Just as the chip rally driving South Korean and Taiwanese stocks to record highs papers over real structural tensions in global supply chains, the streaming industry's surface-level growth numbers hide a catalog that's turning into landfill.
The Hot Take
Streaming platforms should be legally liable for royalty fraud enabled by AI-generated content on their own servers. Right now they operate like landlords who collect rent while the building burns. If Deezer knows 44% of new uploads are synthetic and the majority of streams are fake — and they clearly do know, because they just published that statistic — then continuing to pay out those fraudulent royalties makes them complicit. Full stop. The argument that they’re just a neutral platform died the moment they had the data and kept cashing checks.
Detection Exists. The Will Doesn’t.
The technology to catch this isn’t some unsolved mystery. AI-generated audio has detectable fingerprints. Stream farms have behavioral patterns — abnormal geographic clustering, inhuman listening consistency, suspicious account ages. Platforms already use algorithmic detection for copyright violations when major labels demand it. They can absolutely build the same muscle for fraud detection. They just haven’t been forced to yet.
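One of the behavioral signals named above, inhuman listening consistency, is trivial to check in code. Below is a minimal sketch of that single heuristic; the `Stream` record, the thresholds, and the rule itself are all assumptions for illustration, not any platform's actual detection logic:

```python
from dataclasses import dataclass

@dataclass
class Stream:
    """One play event, as a platform's logs might record it (assumed schema)."""
    account_id: str
    country: str
    duration_s: int  # how long the track was actually played

def flag_suspicious_accounts(streams, min_streams: int = 50) -> set:
    """Flag accounts whose play history looks machine-generated.

    Illustrative rule: a high-volume account whose play durations are
    nearly identical (bots replay with machine-like timing; humans don't).
    """
    by_account: dict[str, list[Stream]] = {}
    for s in streams:
        by_account.setdefault(s.account_id, []).append(s)

    flagged = set()
    for acct, plays in by_account.items():
        if len(plays) < min_streams:
            continue  # too little data to judge
        durations = [p.duration_s for p in plays]
        if max(durations) - min(durations) <= 2:  # near-zero variation in seconds
            flagged.add(acct)
    return flagged
```

A real system would combine many such signals (geographic clustering, account age, play-time-of-day patterns) and score them together, but each one is this kind of cheap aggregate query — which is the point: the barrier is incentive, not capability.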
This connects to a broader problem with how we treat data integrity in tech. Whether the job is resolving ambiguities in a genuinely complex dataset or cleaning up a poisoned music catalog, the answer is always the same: garbage in, garbage out. Platforms that refuse to enforce data quality aren't neutral — they're choosing the garbage.
What Happens to Real Artists
Ask any independent musician what this means in practice and the answer is bleak. Their songs compete for playlist placement against tracks that cost nothing to produce and cost even less to promote through bots. They lose algorithmic real estate to synthetic content. Their royalty payouts shrink as fraudulent streams absorb a bigger share of the pool. And when they try to report it, they get automated responses and dead ends.
The cultural cost matters too. Streaming killed the album economy and replaced it with the attention economy. Now the attention economy is being gamed into meaninglessness. Discovery is already hard. When the catalog is half synthetic and the charts are half fake, listeners lose trust — and that trust, once gone, doesn't come back easily. Other industries learned that lesson the hard way: they moved fast and assumed the fundamentals were solid, right up until they weren't.
The music streaming industry is standing at the edge of a legitimacy crisis it created through laziness and greed. Platforms need hard upload verification, aggressive fraud detection with real consequences, and transparent royalty auditing that artists can actually access. Not promises. Not whitepapers. Systems. Either the platforms build them voluntarily, or regulators will eventually stop asking nicely — and the platforms will have earned every bit of what follows.
