YouTube promise to have over 10,000 content moderators next year
As YouTube’s landscape gets messier, with advertisers pulling out and creators at a loss, they are looking to crack down massively on screening content.
YouTube has had a rocky year that has cast uncertainty over their place at the top of video streaming sites. After the media got their jimmies rustled by misunderstanding a series of events on YouTube, the video giant got scared that its cash cows *ahem* I mean advertisers were going to leave. So in response they cracked down on content and created a giant rift in the site between its dedicated creators and brands.
That’s not to say there isn’t toxic, dangerous, harmful and sometimes disgusting content on YouTube that should be controlled. It just seems that they are incapable of controlling it in a manner that doesn’t tear the platform apart. On Monday, YouTube CEO Susan Wojcicki wrote a blog post outlining the action they’re going to take moving into the new year to deal with “offensive content”. Is it good? You’ll have to be the judge.
After a paragraph about how much Wojcicki claims to love her site and its assets, she turns around to say: “But I’ve also seen up-close that there can be another, more troubling, side of YouTube’s openness. I’ve seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm. In the last year, we took actions to protect our community against violent or extremist content, testing new systems to combat emerging and evolving threats.
“We tightened our policies on what content can appear on our platform, or earn revenue for creators. We increased our enforcement teams. And we invested in powerful new machine learning technology to scale the efforts of our human moderators to take down videos and comments that violate our policies. Now, we are applying the lessons we’ve learned from our work fighting violent extremist content, testing new systems to combat emerging and evolving threats.”
How are they going to safeguard YouTube? By adding s**tloads of new moderators to scan content and flag anything that doesn’t fit their policies. Since June this year, YouTube’s teams have reviewed almost 2 million videos for violent-extremist content alone. Whilst that work is helping to train machine-learning tech, YouTube also want more manpower making sure content is friendly. Google say they want to grow the team to over 10,000 in 2018.
Beyond pushing moderation of videos, YouTube are also going to change how they handle advertising, both to push fairness for creators and to satisfy their advertisers (or so they say, for now). Wojcicki said:
“This requires a new approach to advertising on YouTube, carefully considering which channels and videos are eligible for advertising. We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should. This will also help vetted creators see more stability around their revenue. It’s important we get this right for both advertisers and creators, and over the next few weeks, we’ll be speaking with both to hone this approach.”
Whilst it seems like YouTube are saying they will improve the current situation for creators being demonetised unfairly, we will have to see it in action. So far they have acted under the guise of working for both creators and advertisers, but only certain big-name creators have been immune to YouTube’s harsh new content restrictions.
Wojcicki ended:
“As challenges to our platform evolve and change, our enforcement methods must and will evolve to respond to them. But no matter what challenges emerge, our commitment to combat them will be sustained and unwavering. We will take the steps necessary to protect our community and ensure that YouTube continues to be a place where creators, advertisers, and viewers can thrive.”
We’ll find out soon enough.