YouTube’s AI likeness detection tool launches following pilot
Eligible YouTube creators can now protect their face and voice, as the likeness-detection tool begins its official rollout.
YouTube has officially begun rolling out its new likeness-detection technology to more creators. The feature allows eligible creators in the YouTube Partner Programme to request the removal of AI-generated content that mimics their face or voice.
The technology aims to protect creators from having their likeness used without consent, whether to promote a product, spread misinformation, or impersonate their identity.
Protecting creators from AI misuse
Yesterday, a YouTube spokesperson told TechCrunch that “this is the first wave of the rollout,” confirming that eligible creators received emails about the new tool on the morning of its launch.
The system detects and manages AI-generated videos that use a person’s face or voice. It’s designed to prevent the misuse of a creator’s likeness in ways they haven’t authorised – an issue that’s become more common in recent years. This move builds on YouTube’s ongoing efforts to handle AI responsibly. The company had already launched a pilot programme earlier in 2025 with select creators, following its initial partnership with the Creative Artists Agency (CAA) announced in late 2024.
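YouTube hasn’t published the technical details of how the matching works. As a rough illustration only, likeness-detection systems of this kind typically compare an embedding (a numerical fingerprint) of a verified reference face or voice against embeddings extracted from uploaded videos, flagging anything above a similarity threshold. The sketch below is a hypothetical, simplified version of that idea, not YouTube’s implementation; the function names, threshold, and random vectors standing in for real embeddings are all assumptions.

```python
# Illustrative sketch only: embedding-based likeness matching.
# This is NOT YouTube's implementation; the names, threshold and
# stand-in embeddings below are hypothetical assumptions.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_likely_matches(reference: np.ndarray,
                        video_embeddings: dict[str, np.ndarray],
                        threshold: float = 0.85) -> list[str]:
    """Return IDs of videos whose face/voice embedding sits close to
    the creator's verified reference embedding."""
    return [video_id for video_id, emb in video_embeddings.items()
            if cosine_similarity(reference, emb) >= threshold]

# Toy usage: random vectors stand in for real face/voice embeddings.
rng = np.random.default_rng(0)
reference = rng.normal(size=512)
candidates = {
    "video_a": reference + rng.normal(scale=0.1, size=512),  # near-duplicate
    "video_b": rng.normal(size=512),                         # unrelated
}
print(flag_likely_matches(reference, candidates))  # expected: ['video_a']
```

In practice, a production system would generate these embeddings with dedicated face- and voice-recognition models and combine the score with other signals, but the threshold comparison above captures the basic shape of the approach.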
How to set up likeness detection
If you’re part of the YouTube Partner Programme, you may soon notice a new “Likeness” tab appearing in your Studio dashboard. That’s where YouTube’s latest AI protection tool lives.
Getting started is fairly simple. After agreeing to share your data for verification, you’ll be asked to scan a QR code using your phone. This opens a secure page where you can confirm your identity with a short selfie video and photo ID. It’s all part of YouTube’s process to make sure the system correctly recognises who you are before it begins scanning for your likeness across the platform.
Once verified, you’ll have access to a dashboard showing any AI-generated videos that appear to feature your face or voice. From there, YouTube gives you a few choices depending on how you want to handle each case. You can:
- Request that the video be taken down under YouTube’s privacy rules
- File a copyright claim if the video infringes your work
- Archive the video for your records
If you ever decide you no longer want to use the tool, opting out is straightforward. YouTube says that once you do, it will stop scanning for your likeness within 24 hours.
Why this rollout matters
YouTube’s wider release of its likeness-detection tool comes as AI use in content creation continues to grow – often faster than regulation can keep up. While AI can help creators experiment and innovate, it’s also made it easier for bad actors to copy voices or faces without consent, spreading false endorsements or misleading content.
To address this, YouTube has publicly supported the NO FAKES Act, a proposed US law aimed at preventing AI-generated replicas from being used to imitate or exploit real people. The platform’s existing partnership with CAA also underlines its commitment to protecting digital identity across entertainment and creator communities.
YouTube has been open about its strategies for combating AI misuse, and further expansion of these detection tools and features seems likely.