Facebook says guaranteeing children are safe on its platform is a ‘very difficult promise to make’


Facebook has said it cannot guarantee children will not be groomed on its site, stating it is "a very difficult promise to make" that they will not be targeted by predators seeking to abuse and exploit them.

The company's senior global operations manager, Julie de Bailliencourt, made the admission as she gave evidence to the Independent Inquiry into Child Sexual Abuse (IICSA) on Tuesday.

Internet giants have been accused of failing to adequately protect children and to stem the spread of child abuse material, the inquiry has heard.

Facebook has faced criticism for not knowing the true age of its users. Credit: PA

Lawyer William Chapman, representing abused children, said some tech companies are even considering features which will “make it easier, not harder, for child abuse to take place online”, such as introducing more end-to-end encryption.

But Ms de Bailliencourt defended Facebook, which has about 40 million UK users and employs 15,000 people globally to review and moderate content.

She told the hearing: “Through our policies, our detection mechanisms, our partnership and our work with law enforcement in the UK and abroad, I think we have demonstrated that we are serious in being very aggressive and making our platform as inhospitable as possible to this type of behaviour.

“This is a battle that’s never fully won and we need to stay one step ahead.

“I think we’re facing people who are trying to evade our systems, trying to evade detection mechanisms we have put in place, and therefore we need to shift and adapt our approach constantly.

“We’re really trying to reassure parents first and foremost that we take our responsibility very seriously and that we have a number of things in place, this is not the Wild, Wild West.”

Although accounts found to belong to under-13s or sex offenders are immediately banned, Facebook does not try to verify if a given date of birth is accurate or if someone attempting to create an account is on the sex offenders register in the UK, the inquiry heard.

If content is flagged indicating a child may be in danger, it is reported to the US National Center for Missing and Exploited Children (NCMEC) and then passed on to law enforcement, Ms de Bailliencourt added.

NCMEC is based in the US but shares information with agencies in other countries, such as the National Crime Agency in the UK.

Around 250,000 accounts on the Facebook-owned WhatsApp messaging service suspected of sharing child sex abuse content are removed every month, she added.

Ivor Frank, a member of the IICSA panel, suggested to Ms de Bailliencourt that she could not promise parents their children would be safe on the platform.

She replied: “As to guaranteeing to parents that people may not attempt to groom or contact a child, it’s a very difficult promise to make, but our promise is that we will put the manpower and the technology that we have at our fingertips to make this as difficult as possible.”

The IICSA is conducting its second investigation phase, into how the internet is used to facilitate child sexual abuse in England and Wales through acts like grooming, sharing indecent images and live-streaming abuse.


The mother of a brother and sister who were sexually groomed and abused online earlier told the inquiry that tech companies should be made to pay compensation to victims.

The siblings are not eligible for compensation from the Criminal Injuries Compensation Authority because the abuse they suffered took place wholly online, the hearing was told.

Mr Frank asked whether Facebook could set aside, for example, £1 billion per year in compensation for victims.

Ms de Bailliencourt replied: “I don’t think I’m the right person to comment on this … I’m taking notes and we’ll relay this to the appropriate teams.”