In the blog, Spotify acknowledged that “the pace of recent advances in generative AI technology has felt quick and at times unsettling, especially for creatives. At its best, AI is unlocking incredible new ways for artists to create music and for listeners to discover it. At its worst, AI can be used by bad actors and content farms to confuse or deceive listeners, push ‘slop’ into the ecosystem, and interfere with authentic artists working to build their careers. That kind of harmful AI content degrades the user experience for listeners and often attempts to divert royalties to bad actors.”
The new impersonation policy requires artists to submit a claim through a legal form if they’ve identified a track impersonating their voice (something we imagine could get quite tedious; picture Taylor Swift having to file a separate claim every time someone on the web uses her likeness). Spotify is also testing “prevention tactics” with artists to stop unauthorized tracks from showing up on their artist pages, and it is rolling out a new spam filter this fall to catch “mass uploads, duplicates, SEO hacks, artificially short track abuse, and other forms of slop.”

Finally, “We’re helping develop and will support the new industry standard for AI disclosures in music credits, developed through DDEX,” Spotify says. The company has already partnered with more than a dozen labels and music distributors to clearly demarcate how and where AI has been used on a track. “This change is about strengthening trust across the platform. It’s not about punishing artists who use AI responsibly or down-ranking tracks for disclosing information about how they were made,” the blog states.
The way AI has infiltrated Spotify has caused discontent among artists and listeners alike. In a statement to The Verge, Sam Duboff, Spotify’s head of marketing and policy, said that “there is no truth to the conspiracy theories that we’re adding AI music to playlists or promoting AI music in any way for any financial benefit.” The blog post echoes that denial: “While AI is changing how some music is made, our priorities are constant,” it insists. “We’re investing in tools to protect artist identity, enhance the platform, and provide listeners with more transparency. We support artists’ freedom to use AI creatively while actively combating its misuse by content farms and bad actors. Spotify does not create or own music; this is a platform for licensed music where royalties are paid based on listener engagement, and all music is treated equally, regardless of the tools used to make it.”