
The world’s biggest audio streamer is bringing out the proverbial mop and bucket to clean up all its AI slop.
What happened: After several high-profile incidents of artists being outed as AI (we’re looking at you, Velvet Sundown), Spotify said it will add labels to AI-generated songs on its platform and adopt a filter to root out spam songs attempting to game the algorithm.
- The labels, a joint effort with music industry standards group DDEX, will specify which parts of the song are AI-generated (vocals, instrumentation, etc.).
- The filter will look out for scammer tactics, like uploading tracks just over 30 seconds long to accrue royalty-bearing streams and uploading the same songs multiple times.
Why it matters: The advent of genAI has led to a surge of mass-produced spam songs, with uploaders then using bots to inflate streams and generate easy royalties. These no-effort, low-quality tracks have been blamed for degrading the user experience.
Plus: Since royalty payments on Spotify come out of a finite “royalty pool,” scam tracks quite literally siphon money from real artists. Some smaller acts have also been caught in the crossfire, with tracks falsely flagged as spam under the current system.
Bottom line: Spotify is hardly the only platform dealing with a deluge of AI slop from scammers, and its efforts to tackle the issue will likely be copied elsewhere.—QH