Facebook and YouTube crack down on hate speech as top firms halt advertising
Facebook and YouTube have pledged to step up their efforts to tackle hate speech and hateful content on their platforms as a growing number of firms have stopped advertising on the social media networks.
Starbucks has suspended advertising on Facebook, following the likes of Coca-Cola and Unilever, amid concerns the tech giant is failing to address the issue.
Facebook boss Mark Zuckerberg has come under fire for not taking action, including his refusal to remove a post in which Donald Trump said “when the looting starts, the shooting starts” during protests across the US over the death of George Floyd.
Steve Hatch, Facebook’s vice president for northern Europe, said there is “no tolerance on our platform for hate speech” but that the debates around such issues are “extremely challenging”.
“There is no profit to be had in content that is hateful,” Mr Hatch told BBC Radio 4’s Today programme.
“The debates that we see around these topics are extremely challenging and can be very, very wide-ranging.”
He said the majority of users have a positive experience on the social networking app, but acknowledged misuse by a “small minority”, saying that “when there’s hate in the world there will also be hate on Facebook”.
It came as Facebook launched an advertising campaign to improve people’s awareness of fake news shared online, encouraging users to question what they see.
Meanwhile, YouTube has also stepped up its efforts against hate speech, banning several high-profile white supremacist channels.
Channels belonging to Canadian white nationalist activist Stefan Molyneux, US white supremacist Richard Spencer and former Ku Klux Klan leader David Duke were among the six taken down.
Google-owned YouTube said the channels were banned for repeatedly claiming that members of protected groups are inferior to others, among other violations.
The video sharing platform began cracking down on supremacist channels in June last year, explicitly prohibiting content that alleges “a group is superior in order to justify discrimination, segregation or exclusion”.
“We have strict policies prohibiting hate speech on YouTube, and terminate any channel that repeatedly or egregiously violates those policies,” a YouTube spokeswoman said.
“After updating our guidelines to better address supremacist content, we saw a 5x spike in video removals and have terminated over 25,000 channels for violating our hate speech policies.”