Earlier this year, a band called The Velvet Sundown racked up hundreds of thousands of streams on Spotify with its retro-pop tracks, eventually reaching a million monthly listeners on the platform.
But the band wasn’t real. Every song, every image, and even its backstory had been generated by someone using generative AI.
For some, it was a clever experiment. For others, it revealed a troubling lack of transparency in music creation, even though the band’s Spotify descriptor was later updated to acknowledge that its music is composed with AI.
In September 2025, Spotify announced it is “helping develop and will support the new industry standard for AI disclosures in music credits developed through DDEX.” DDEX is a not-for-profit membership organization focused on the creation of digital music value chain standards.
The company also says it is focusing on improved enforcement of impersonation violations and a new spam-filtering system, and that these updates are “the latest in a series of changes we’re making to support a more trustworthy music ecosystem for artists, for rights-holders and for listeners.”
As AI becomes more embedded in music creation, the challenge is balancing its legitimate creative use with the ethical and economic pressures it introduces. Disclosure is essential not just for accountability, but to give listeners transparent and user-friendly choices in the artists they support.
The worst-case scenario resembles the world of “Carole and Tuesday,” where all music is composed by AI and owned by big corporations, making it nearly impossible for a small artist to enter the market. And that show did not even explore the copyright angle: if every chord progression pleasant to human ears can be catalogued and claimed, those corporations will know exactly which songs to target and how to sue every independent artist for infringement, legally if not ethically. There would be no “legal” songs outside of one or two corporations.