Image credits: Tara Winstead

A US court ruling on AI and copyright might seem like a win for AI companies, but it could come at a cost that actually gives creative rightsholders the upper hand.

Anthropic’s legal victory, with a catch

Last month, AI company Anthropic walked away from a lawsuit with what seemed like a major win, according to Complete Music Update. A US judge ruled that it was fair use to train a generative AI model using legitimately sourced copyrighted books.

This was one of the first major rulings around AI training data and copyright. It’s a decision that may have far-reaching implications across creative industries, including music. While Anthropic celebrated the win, the case also opened the door to potential damages in excess of a trillion dollars.

Fair use – but with an exception

Anthropic trained its Claude AI model on a massive data set that included millions of pirated books. That’s where things get messy.

In the landmark case, US judge William Alsup clarified that while using copyrighted material to train an AI model can qualify as fair use, that material must be obtained legitimately rather than through piracy. He likened the lawsuit to the music industry’s struggle with Napster in the early days of file-sharing. In his words, Anthropic “violated the Copyright Act by doing Napster-style downloading of millions of works”.

Crucially, Alsup also said that the fair use defence could apply in this case partly because the AI-generated outputs were “spectacularly transformative”. Transformativeness is a key factor in US copyright law when weighing a fair use defence. So not only must the training data be legally sourced, the resulting content must also differ significantly from the original to qualify as fair use.

Because of the scale of the pirated content involved, the lawsuit has now been certified as a class action. This means that Anthropic could face claims from up to seven million authors, with US law allowing statutory damages of up to $150,000 per infringed work. That adds up quickly, and could mean damages in excess of one trillion dollars.

So, while AI companies can claim the fair use defence when training models on copyrighted content, that content must be obtained legally and the resulting outputs must be significantly transformative.

Relevance to the music industry

While it remains unclear whether this ruling will apply universally across the creative industries, it could have strong implications in the music world. After all, the music industry has also accused AI companies of using artists’ songs to train generative music tools without proper licensing or rightsholder permission.

In fact, Anthropic is already being sued by music publishers. Major record labels are going after AI music generators Udio and Suno, with the latter also facing lawsuits from the collecting society GEMA and from a musician fighting on behalf of indie artists.

The judge’s ruling is specifically in the context of books, but the broader principles could very well apply to music. If courts require AI companies to prove that their training data is both legally sourced and that the generated content is “spectacularly transformative”, it could set a precedent that at least slightly favours music rightsholders in the US. That could mean AI-generated tracks would need to sound noticeably distinct from their source material to avoid infringement claims.

Meanwhile, lawmakers in the UK have been slow to act. Despite increasing concerns from artists and major labels, the UK government has yet to commit to new legislation around AI and copyright. After concluding a 12-week consultation back in February, there has been little movement, and the government chose not to include AI transparency measures in the Data (Use and Access) bill. 

Could this push AI companies to the negotiating table?

While the fair use exception might benefit AI companies, the huge risks when they get it wrong may actually work in favour of creators.

If AI companies know they could pay damages in the billions for using unlicensed content or generating non-transformative outputs, it suddenly becomes a lot more appealing to cut deals with rightsholders.

That could be good news for artists, publishers, and labels who’ve been calling for transparency, licensing, and fair compensation from AI platforms. Part of the solution could also lie with more ethical AI music generators, like Musical AI, that address these problems.

A step toward fairer AI?

Caution is still needed going forward. The judge’s ruling doesn’t automatically extend to music, and the legal landscape around AI is still developing.

This case does show us that “fair use” isn’t a free pass. AI companies still need to respect copyright law and rightsholders, otherwise the consequences could be huge.

For the music industry, the ruling hints at a possible future where rightsholders gain the upper hand. With no guarantee that the fair use defence will hold up, many AI companies may decide the risk isn’t worth taking, an outcome that would ultimately favour those in the music industry.


Did you know that RouteNote offers distribution to major streaming platforms around the world? Start for free today and get your music shared across the globe.