More than 200,000 secondary school children may have been groomed online, NSPCC research suggests

Research estimates that more than 200,000 teenagers in the UK have sent, received or been asked to send explicit messages or images online. Credit: Victoria Jones/PA

More than 200,000 secondary school children may have been groomed online, research suggests.

A survey carried out on behalf of children’s charity the NSPCC found that around four percent of young people aged 11 to 17 who were questioned had sent sexual content to, received it from, or been asked to send it to an adult when using various sites and apps.

The charity said that one in 25 children had done so using Snapchat, Facebook or Facebook Messenger; one in 33 using Twitch or Twitter; and one in 50 using Instagram or WhatsApp.

The study found children have sent sexual content on platforms such as Snapchat, Facebook, Facebook Messenger, Twitch, Twitter and WhatsApp. Credit: PA

A total of 2,004 young people aged 11 to 17 were asked whether they had ever sent messages with sexual content, been sent a naked picture or video, or been asked to send such images, and the age of the person with whom they had been interacting.

NSPCC chief executive Peter Wanless said: “The scale of risk that children face on social networks revealed in this research cannot be ignored and tackling it with robust and comprehensive legislation needs to be a priority for this Government.

“Tech firms need to be forced to get a grip of the abuse taking place on their sites by using technology to identify suspicious behaviour and designing young people’s accounts with built-in protections.”

The NSPCC said there are an estimated 5,182,045 young people aged 11 to 17 in the UK, and on that basis calculated that around 201,696 had sent, received or been asked to send explicit messages or images.
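For readers wanting to see how the headline figure follows from the survey, the implied arithmetic, back-calculated here from the two published numbers since the exact survey proportion is not given in the article, is roughly:

\[
\frac{201{,}696}{5{,}182{,}045} \approx 0.0389 \approx 3.9\%,
\]

which is consistent with the "around four percent" of respondents reported above. The NSPCC presumably applied its unrounded survey proportion to the population estimate, which is why multiplying by a rounded four percent does not reproduce the figure exactly (four percent of 5,182,045 would be about 207,000).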

The charity has warned that paedophiles contact large numbers of children on social media and then encourage the ones who respond to move over to encrypted messaging or live streaming.

Those who are tricked into sending images, often after being threatened, can then be blackmailed into sending more.

Facebook’s head of global safety, Antigone Davis, said: “Keeping young people safe on our platforms is a top priority for us.

“In addition to using technology to proactively detect grooming and prevent child sexual exploitation on our platform, we work with child protection experts, including specialist law enforcement teams like CEOP (Child Exploitation and Online Protection Command) in the UK, to keep young people safe.

“Ninety-nine percent of child nudity content is removed from our platform automatically.”

In April, the Government published its online harms white paper, which proposes a statutory duty of care on technology companies, enforced by an independent regulator.

A Home Office spokesperson said: “Online child sexual exploitation is an abhorrent crime and one which the Government is committed to stamping out.

“That is why the Home Secretary is this week hosting technology firms and our international security allies to work on how we can make online platforms safer.”