Websites to be fined or blocked for 'online harms' under new proposals
Video report by ITV News Correspondent Ivor Bennett
Internet sites could be fined or blocked if they fail to tackle "online harms" such as terrorist propaganda and child abuse, under new government plans.
The proposals, which include an independent watchdog and a code of practice that tech companies would have to follow, could see the UK introduce “world first” internet safety laws designed to make the country the safest place on the globe to be online.
Home Secretary Sajid Javid said "we cannot ignore the crimes being committed on social networks" at the launch of a Government white paper on online harms.
In a speech on Monday afternoon, the home secretary issued a stark warning to online enterprises: "I warned you and you did not do enough. So it's no longer a matter of choice. It's time for you to protect the users and give them the protection they deserve, and I will accept nothing else."
What are the proposals and what could happen?
A white paper on online harms, published jointly by the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office, proposes strict new rules be introduced that require firms to take responsibility for their users and their safety, as well as the content that appears on their services.
Overseen by an independent regulator, internet companies which break these rules could even see senior management held personally liable for the failings.
A newly introduced duty of care would require firms to take more responsibility for the safety of users and more actively tackle the harm caused by content or activity on their platforms.
It also calls for powers to be given to a regulator to force internet firms to publish annual transparency reports on the harmful content on their platforms and how they are addressing it.
The regulator would also have the power to issue “substantial fines, block access to sites and potentially impose liability on individual members of senior management”, if rules are broken.
A 12-week consultation will now take place before ministers publish draft legislation.
Who would the rules apply to?
The proposed new laws will apply to any company that allows users to share or discover user-generated content or interact with each other online, and so would be applicable to companies of all sizes from social media platforms to file-hosting sites, forums, messaging services and search engines.
Why is this happening?
The proposed measures are part of a Government pledge to make the UK one of the safest places in the world to be online, and come in response to concerns over the growth of violent content, material encouraging suicide, disinformation and the exposure of children to cyberbullying and other inappropriate material online.
A number of charities and campaigners have called for greater regulation to be introduced, while several reports from MPs and other groups published this year have also supported the calls for a duty of care to be implemented.
What has the Government said?
Speaking on Monday, Home Secretary Sajid Javid warned social media companies they were not doing enough to prevent harm coming to their users.
"For some reason, some tech companies have long got away with the claim that they cannot possibly be expected to take any more responsibility for the safety of their customers," Mr Javid said at an event in central London.
Prime Minister Theresa May previously said the proposals were a sign the age of self-regulation for internet companies was over.
“The internet can be brilliant at connecting people across the world – but for too long these companies have not done enough to protect users, especially children and young people, from harmful content,” she said.
“Online companies must start taking responsibility for their platforms, and help restore public trust in this technology.”
What have others said?
The mother of 16-year-old Lani Clarke, who took her own life in February 2018, has described the new changes as "wonderful progress".
Bonnie Mellor discovered her daughter had logged into a suicide chatroom and looked at self-harm images online.
Ms Mellor said: "As a mum who's lost a daughter, I wouldn't want anyone else to go through that and if this helps protect other kids that are struggling, obviously it's wonderful progress."
She told ITV News she hopes the Government will "take a hard line with this" and implement the new laws in a timely fashion.
The father of Molly Russell, the teenager who took her own life after viewing disturbing material online, has also welcomed the plans.
Ian Russell, who now runs the Molly Rose Foundation in memory of his daughter, said he was pleased to see the Government’s white paper include a focus on content that promotes self-harm and suicide, which he called a “hidden and harmful” part of the internet.
Mr Russell has previously stated his “absolute certainty” that the content viewed by Molly on social media linked to anxiety, depression, self-harm and suicide had played a part in his daughter’s death in 2017, saying Instagram "helped kill my daughter".
Speaking about the new proposals, Mr Russell said: “The era of self-regulation has allowed harmful content to become all too easily available online, with tragic consequences and so I welcome the White Paper published today and I am pleased to see the Government finally putting into action its promises to hold tech companies and social media platforms to account by introducing an independent regulator.
“I am glad that content promoting self-harm and suicide, which is prevalent on the internet and can have a detrimental impact on mental health, is being considered alongside the well-established online dangers such as terrorism and child sexual exploitation.
“A light has been shone on this hidden and harmful part of the internet and it is important that the Government works with the tech companies and relevant organisations to remove harmful content and to continue to raise awareness among young people through better education of the dangers online and to promote the support available.”
What have internet companies said?
Companies including Facebook and Twitter already publish transparency reports of the kind the white paper proposes.
Responding to the proposals, Facebook’s UK head of public policy Rebecca Stimson said: “New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech.
“These are complex issues to get right and we look forward to working with the Government and Parliament to ensure new regulations are effective.”
What have campaigners said about the proposals?
Peter Wanless, chief executive of children’s charity the NSPCC – which has campaigned for regulation for the past two years – said the proposals would make the UK a “world pioneer” in protecting children online.
“For too long social networks have failed to prioritise children’s safety and left them exposed to grooming, abuse, and harmful content,” he said.
“So it’s high time they were forced to act through this legally binding duty to protect children, backed up with hefty punishments if they fail to do so.”
Does everyone back the plans?
No. The proposals have prompted warnings that oversight should not amount to state censorship.
“These things are always justified as being for good, kind and worthy objectives, but ultimately it’s giving power to a state regulator to decide what can and cannot be shown on the internet,” Victoria Hewson of the Institute of Economic Affairs think tank told the BBC.
“Maybe the authorities should be trying to stop these things at source.”
Former culture secretary John Whittingdale warned ministers risked dragging people into a “draconian censorship regime” in their attempts to regulate internet firms.
Writing in the Mail On Sunday, he said he feared the plans could also “give succour to Britain’s enemies”, giving them an excuse to further censor their own people.
What questions remain?
Ahead of the consultation, the proposals are not set in stone and a number of questions remain, such as whether a newly created watchdog would be given the job of regulating the internet, or whether that task would fall to the current media regulator, Ofcom.
It is also not known whether sanctions would apply equally to giant social networks and to small organisations such as parents' message boards.