Music streaming is rotting from the inside out — and the industry knows it. Nearly half of all new music being uploaded to platforms is AI-generated slop, and most of the streams those tracks rack up are fake. If that doesn’t alarm you, you’re not paying attention.
Deezer just dropped numbers that should shake every artist, label, and playlist curator out of their comfort zone. According to a report covered by Ars Technica, 44% of new music uploads to the platform are AI-generated, and the majority of streams tied to this content are fraudulent. We’re not talking about a niche problem or an edge case. We’re talking about nearly half the fire hose of new music being synthetic garbage designed to game algorithms and drain royalty pools.
The Flood Nobody Wanted
Here’s how the scam works. Someone — or something — generates thousands of tracks using an AI music tool. These tracks get uploaded under fake artist names. Bots stream them repeatedly. Royalty payments, which are distributed based on stream share, flow toward these fake accounts and away from real artists. Real musicians making real music get a smaller slice of an already embarrassingly thin pie.
This isn’t theoretical. This is happening at industrial scale right now.
Deezer claims it’s catching and removing a significant portion of this content, but let’s be honest — if 44% of uploads are AI-generated and most associated streams are fraudulent, the platform is playing whack-a-mole with a sledgehammer and the moles have a server farm. Detection tools are reactive. The content generators are proactive. That’s a losing race.
What This Does to Real Artists
Streaming royalties were already a joke before this. Most independent artists earn fractions of a cent per stream. The major labels negotiated better deals, but the average working musician has always been at the bottom of the streaming food chain. Now add millions of AI tracks soaking up stream counts and the math gets even uglier.
Think about a bedroom producer spending six months crafting an album. They upload it. They promote it. They beg their followers to listen. Meanwhile, an AI tool spits out 10,000 “lo-fi ambient focus” tracks in a weekend, bots stream them into the billions, and the royalty pool tilts further away from the human making actual art. It’s not just unfair. It’s structural theft dressed up as a technology problem.
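The pro-rata math behind that scenario is easy to sketch. This is a minimal illustration of how a fixed royalty pool gets diluted by fraudulent streams; every number here is hypothetical, chosen only to show the shape of the problem, not taken from any platform's actual payout figures.

```python
# Sketch of pro-rata royalty distribution, the model most streaming
# services use: a fixed pool is split in proportion to stream share.
# All stream counts and pool sizes below are hypothetical.

def royalty_shares(stream_counts, pool):
    """Split a fixed royalty pool proportionally to stream counts."""
    total = sum(stream_counts.values())
    return {artist: pool * count / total
            for artist, count in stream_counts.items()}

# A real artist with 1,000,000 streams, alongside 99,000,000 other
# legitimate streams, in a $1,000,000 monthly pool...
before = royalty_shares(
    {"real_artist": 1_000_000, "other_real": 99_000_000},
    pool=1_000_000,
)

# ...versus the same catalog after bot farms add 50,000,000
# fraudulent streams. The pool does not grow; the shares shrink.
after = royalty_shares(
    {"real_artist": 1_000_000, "other_real": 99_000_000,
     "bot_tracks": 50_000_000},
    pool=1_000_000,
)

print(f"before fraud: ${before['real_artist']:,.2f}")  # $10,000.00
print(f"after fraud:  ${after['real_artist']:,.2f}")   # $6,666.67
```

Nothing about the artist's music or audience changed, yet their payout dropped by a third, because the pool is zero-sum: every fraudulent stream is subtracted from someone real.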
The platforms have known this was coming. There were warnings years ago, when AI music tools first got good enough to produce passable content at scale. Nobody built gatekeeping serious enough before opening the floodgates. Now everyone is scrambling.
The Platform Problem
Spotify, Apple Music, Tidal, and yes, Deezer — they all benefit from having enormous catalogs. More content means more perceived value to subscribers. More uploads mean the platform looks active, alive, growing. There was never a strong financial incentive to aggressively filter junk music because junk music still padded the catalog numbers. That calculus has now completely backfired.
Fraud streams don’t just hurt artists. They corrupt the data that labels, managers, and booking agents use to make decisions. If a fake artist has 50 million streams, that poisons the well for how streaming data gets interpreted across the board. The signal-to-noise ratio of the entire industry is getting worse fast. And it’s not unlike the broader AI content crisis hitting other digital spaces — just as the Trump administration vows crackdown on Chinese companies ‘exploiting’ AI models made in the US, domestic platforms are realizing they’ve built ecosystems that AI abuse thrives in.
The Hot Take
The streaming platforms should be financially liable for fraudulent royalty payouts. Full stop. If Deezer, Spotify, or anyone else allowed fake streams to drain the royalty pool because their verification systems were lax or nonexistent, they should compensate real artists for the difference. Right now all the risk sits with musicians and all the profit sits with platforms. That’s upside-down. Make the platforms eat the fraud losses and watch how fast detection technology improves.
Where Does Music Go From Here?
The irony is that music streaming was supposed to be the salvation of an industry gutted by piracy. And in some ways it was — it brought people back to paying for music. But the model always had a fragility baked in. It rewarded volume over quality, streams over sales, algorithms over taste. AI and fraud bots are just the logical endpoint of a system optimized for numbers rather than art.
This doesn’t mean streaming dies. The streaming wars have already shown that people are deeply attached to these services and aren’t going anywhere. But the industry needs a hard reset on how uploads are verified, how streams are authenticated, and how royalties are calculated. Blockchain-based verification, stricter upload requirements, and human review thresholds for new artists aren’t perfect solutions — but doing nothing is a faster path to collapse. Music culture is too important to let it get strip-mined by bots and bad actors while platforms count their subscription revenue. Something has to give, and it better give soon.
Want more from the intersection of tech and culture? Check out how Red Bull is hosting an esports tournament and what it means for live entertainment.
