YouTube boss to counter extremist and violent content with 10,000 staff

YouTube will boast more than 10,000 staff whose job is to track down extremist, violent and predatory content on the site, Google has announced.

Susan Wojcicki, chief executive of Google-owned YouTube, said the company would significantly expand its number of moderators, admitting that "bad actors" were "exploiting our openness to mislead, manipulate, harass or even harm."

The video-sharing platform has been criticised in recent weeks for failing to prevent predatory accounts and commenters from targeting children, as well as for the ease with which terrorist propaganda is uploaded to the site.

In a blog post, Wojcicki said the company was already taking "aggressive action" on comments, and was testing new systems that combine human and automated checks to counter such threats.

"Human reviewers remain essential to both removing content and training machine learning systems because human judgement is critical to making contextualised decisions on content,” she said.

"Since June, our trust and safety teams have manually reviewed nearly two million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.

"We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018."

More than 150,000 videos of violent extremism have been removed from the site in the past six months, she added.

Last month, separate investigations by BBC News and The Times found paedophiles were posting indecent comments on videos of youngsters, evading discovery through flaws in YouTube's reporting system.

The reports led several big brands including Mars and Adidas to pull advertising from the site.