Google’s Music AI Sandbox wants to empower creators with artificial intelligence, and they’ve brought in top artists to show it off.

Google have used their powerful DeepMind AI technology to create a new suite of music tools. Their new Music AI Sandbox is designed to give artists unique new ways to create with artificial intelligence.

The core of Music AI Sandbox is its AI-powered music generation. Users type in text descriptions and the software generates music from them. Descriptions can include the mood, genre, instrumentation, and more abstract inputs such as the feeling the music should evoke. From these prompts, Music AI Sandbox generates unique instrumental sections or entire pieces.
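Google have not published a public API for the Sandbox, but the prompt-building idea is easy to picture. The short Python sketch below is purely illustrative: the MusicPrompt class and its fields are hypothetical stand-ins, not Google’s actual interface, and it simply shows how mood, genre, instrumentation, and a desired feeling might be folded into a single text prompt.

```python
from dataclasses import dataclass, field

@dataclass
class MusicPrompt:
    """Hypothetical structure for a Sandbox-style text prompt (not Google's API)."""
    mood: str
    genre: str
    instrumentation: list[str] = field(default_factory=list)
    feeling: str = ""  # more abstract input, e.g. the emotion the piece should evoke

    def to_text(self) -> str:
        # Combine the structured fields into one plain-text description.
        parts = [f"a {self.mood} {self.genre} piece"]
        if self.instrumentation:
            parts.append("featuring " + ", ".join(self.instrumentation))
        if self.feeling:
            parts.append(f"that evokes {self.feeling}")
        return " ".join(parts)

prompt = MusicPrompt(
    mood="melancholy",
    genre="lo-fi hip hop",
    instrumentation=["Rhodes piano", "vinyl crackle", "soft drums"],
    feeling="late-night nostalgia",
)
print(prompt.to_text())
# a melancholy lo-fi hip hop piece featuring Rhodes piano, vinyl crackle, soft drums that evokes late-night nostalgia
```

In the real tool, a description like this would be handed to the generation model rather than printed.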

Once something has been generated, users can manipulate the resulting audio clips on the intelligent canvas. The tools let them cut out sections, rearrange them, and even use them as inspiration to generate entirely new sections from fresh prompts.
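Again, nothing below is Google’s actual tooling; it is a minimal, self-contained Python sketch of the kind of cut-and-rearrange operations the canvas offers, using an invented Clip object with start and end times in seconds.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    """Hypothetical audio clip on the canvas (times in seconds)."""
    label: str
    start: float
    end: float

def cut(clip: Clip, at: float) -> tuple[Clip, Clip]:
    """Split one clip into two pieces at the given point."""
    return (Clip(f"{clip.label} (part 1)", clip.start, at),
            Clip(f"{clip.label} (part 2)", at, clip.end))

# Take a generated 16-second section, cut it in half and swap the halves.
verse = Clip("generated verse", 0.0, 16.0)
first, second = cut(verse, 8.0)
arrangement = [second, first]   # rearranged order on the canvas
print([c.label for c in arrangement])
# ['generated verse (part 2)', 'generated verse (part 1)']
```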

To show off its tools, Google recruited Wyclef Jean, Justin Tranter, and Marc Rebillet. These renowned musicians demonstrate how they use Music AI Sandbox in their own work, revealing its potential as a source of inspiration at a professional level and showing that the tools go beyond a simple piece of fun jamming software.

By showcasing these musicians, Google want to prove that their music generation tools are there to enhance music-making, not to replace the work of human creatives. Marc Rebillet describes it as “like having this weird friend that’s like ‘try this, try that’ and then you’re like ‘oh okay, yeah, no that’s pretty dope’”.

Wyclef Jean said: “The tools are capable of speeding up the process of what’s in my head [and] getting it out. You’re able to move lightspeed with your creativity.”

Google revealed the advancements in their AI music tech at their I/O event this week. Music AI Sandbox is still under development and will likely expand. Google also showed off other AI music-making tools, including a new beat-making DJ upgrade to their MusicFX tool.