Image credit: Mohamed Nohassi

As AI music content proliferates on music streaming services, a line has been crossed: AI tracks imitating dead artists are being uploaded to their official pages.

AI music has reared its ugly head higher than perhaps ever before. The news broke recently that AI-generated songs had appeared on the official, verified Spotify pages of deceased artists. Of course, they were uploaded without consent from the estates or rightsholders of those artists.

The most jarring example involves the late Texan singer-songwriter Blaze Foley, who died in 1989. Over the weekend, fans were surprised (perhaps pleasantly) to see a new track titled “Together” appear on Foley’s official artist page. The song seems to mimic Foley’s style but with obviously uncanny AI-generated vocals.

On closer inspection, “Together” features AI-generated cover art and is credited to an artist named “Syntax Error”. Foley’s catalogue manager, Craig McDonald, said: “I can clearly tell you that this song is not Blaze, not anywhere near Blaze’s style, at all. It’s kind of an AI schlock bot, if you will.”

Confirming to fans that the new release is not lost, archived, or unreleased music by Foley, McDonald said: “It has nothing to do with the Blaze you know, the whole posting has the authenticity of an algorithm.” Neither Spotify nor the track’s distributor contacted Foley’s catalogue managing company Lost Arts Records before the song was uploaded.

A similar incident occurred on the page of songwriter Guy Clark, who died in 2016. Another track credited to Syntax Error, titled “Happened To You”, appeared on his profile, replicating his style without permission from Clark’s estate.

Who has the onus to prevent fraudulent AI music?

The tracks were discovered in an investigation by 404 Media, whose reporting raised the alarm and led Spotify to remove them.

Spotify placed the responsibility of the issue on music distributor SoundOn, saying in a statement: “We’ve flagged the issue to SoundOn, the distributor of the content in question, and it has been removed. This violates Spotify’s deceptive content policies, which prohibit impersonation intended to mislead, such as replicating another creator’s name, image, or description, or posing as a person, brand, or organization in a deceptive manner.

“This is not allowed. We take action against licensors and distributors who fail to police for this kind of fraud and those who commit repeated or egregious violations can and have been permanently removed from Spotify.”

The incident does, however, highlight the growing tension between AI, real artists, and streaming services. Following the recent, runaway success of AI-generated band The Velvet Sundown, the music industry has called for stricter policing of AI music on streaming services, which has the potential to divert streaming revenues from real artists to AI uploaders.

Whilst Spotify is right that distributors should ensure any music they deliver is legitimate and free of copyright issues, Craig McDonald argues that the responsibility should also fall on Spotify itself. As things stand, the rightsholders involved feel they are being told it is up to them to monitor DSPs for any shady activity.

Deezer revealed at the start of the year that roughly 10% of music newly uploaded to its platform is AI content. This figure may well have increased with the rapid rise in AI usage and the advancement of AI tools and generators.

Cases like this hammer home the need for strong AI enforcement and policy across the music industry. Artists and rightsholders have made their concerns about the rise of AI content clear from the start, and streaming services and distributors need to listen and react if they are to maintain the trust and livelihoods of artists.