Music industry calls on Trump to regulate AI training
AI is learning from your music. Find out how the music industry is pushing back and what it means for artists, producers, and songwriters.
Some of the biggest names in the U.S. music and creative industries just made their voices heard in Washington, and it’s all about how AI is using your work.
In a recent joint filing addressed to President Trump’s administration, a coalition including the Recording Academy, the RIAA, A2IM, SAG-AFTRA, the National Music Publishers’ Association, and others came together to say: it’s time to draw the line on how AI trains on human-created music and art.
AI is developing fast, and while it can be a helpful tool in music production, songwriting, and mixing, it also comes with a major red flag: many of these models are being trained on your music, without permission, payment, or even acknowledgment.
Tracks are being mimicked. Artist voices are being cloned. And some AI companies are using copyrighted material (music, lyrics, vocals) to train their systems, often without any kind of licensing or compensation to the people who made it.
What the music industry is asking for
If an AI company wants to train its models on copyrighted music or use someone’s voice or likeness, it should have to get permission and pay for it.
The coalition is pushing for a “free market licensing” model, meaning deals negotiated directly between creators and companies, just like you’d license a song for a film or an ad. The government shouldn’t create loopholes that let AI developers scrape and copy massive amounts of copyrighted material just because it’s technically possible.
They’re also pushing back hard against the idea that AI-generated content should get the same protections as music made by humans.
What this means for independent musicians and producers
If you’re an artist putting your music out into the world, whether on Spotify, Bandcamp, TikTok, or YouTube, this does affect you.
Without some guardrails, AI companies can, and in many cases already do, use your tracks to teach their systems how to recreate music “in your style.” That puts your voice, your sound, and your creative fingerprint at risk of being copied and commodified without your input.
This filing is a sign that the industry is waking up to these risks, and demanding action to protect the people who actually create the music, not just the tech that learns from it.
The industry isn’t trying to stop AI. In fact, many musicians already use some form of it in their creative process, whether that’s AI-assisted mixing, mastering plugins, or algorithmic inspiration, and it’s here to stay. But there’s a big difference between using AI as a tool and having AI use you as training data.
This collective push is about making sure creators stay in control, get credited, and get paid.