Is SoundCloud training AI with your music?
SoundCloud’s AI policy updates sparked concerns across the industry. Here’s what its terms really say about AI training and your content.
SoundCloud and AI: What changed?
In February 2024, SoundCloud quietly updated its Terms of Use. The new clause stated that user content could be used to “inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services”. The update was spotted by tech ethicist Ed Newton-Rex and quickly raised eyebrows across the music industry.
SoundCloud seems to claim the right to train on people's uploaded music in their terms. I think they have major questions to answer over this.
— Ed Newton-Rex (@ednewtonrex) May 9, 2025
I checked the wayback machine – it seems to have been added to their terms on 12th Feb 2024. I'm a SoundCloud user and I can't see any… pic.twitter.com/NIk7TP7K3C
SoundCloud has leaned into its AI offerings over the past year with AI-powered tools for remixing, vocal generation, and sample creation. With much of the industry beginning to embrace AI, including Spotify’s rumored AI remixing tool as part of its ‘Music Pro’ tier, it’s understandable why this update has caused alarm among artists.
Is SoundCloud using your music to train AI?
Despite the wording of the new policy, SoundCloud responded to these comments by emphasizing that it “has never used artist content to train AI models”. It also highlighted that it does not allow third parties to use content uploaded to the platform for AI training.
According to SoundCloud, the new terms were meant to clarify how user content interacts with AI within the platform, such as powering personalized recommendations, fraud detection, and playlist generation. In fact, SoundCloud’s current terms explicitly prohibit the use of licensed content for AI model training. However, the policy is less clear when it comes to unlicensed content, potentially leaving scope for AI use of uploads that fall outside those protections.
Looking ahead
While SoundCloud has maintained its current position on AI, it has left the door open to using user content for AI training in the future. In a statement, the company said:
“No such use has taken place, and SoundCloud will introduce robust internal permissioning controls to govern any potential future use. Should we ever consider using user content to train generative AI models, we would introduce clear opt-out mechanisms in advance – at a minimum – and remain committed to transparency with our creator community.”
SoundCloud via The Verge
This would place SoundCloud on a growing list of platforms that have altered their AI policies. X, LinkedIn, and YouTube have all recently updated their policies to allow third parties to train AI on user posts.
A huge impact on the music industry
The concern within the industry isn’t just about copyright. It also covers consent, control, and compensation for creators’ work. Critics argue that users should be able to opt in to AI use of their work, rather than having to opt out.
Many creatives also believe they should be credited and compensated fairly if their content is used to train AI models or to produce generative content. Around the world, musicians, rights organizations, and industry leaders are pushing back against the idea of AI models freely using creative content without consent.
The American Society of Composers, Authors, and Publishers (ASCAP) has called for a ‘Humans First’ approach to AI governance, noting that copyrighted content contributes billions of dollars to the US economy. In the UK, major labels have also voiced strong opposition to government proposals that would allow AI companies to use copyrighted content freely.
SoundCloud’s future direction on AI remains to be seen. For now, its policies rule out using artist content to train AI models, but the company has left that possibility open for the future.