Google shows ITV News new tech to combat online child abuse - but is it enough to stop paedophiles?
Allegra Stratton
Former National Editor
It’s not often, as a reporter, that you are told to look away from your subject, and you do as you’re told. But that’s pretty much what has to happen when investigating Britain’s startling rise in people viewing child abuse images online.
We looked away from the images of children on the desktops of the people whose job it is to monitor these criminals and catch them. The number of images is growing, the abuse they show is getting more severe, and the children in them are younger and younger. You don’t have to have children of your own to dread seeing what these people see hundreds of times daily, and to fear never ‘unseeing’ them.
People inside the Home Office believe the rapid growth in those viewing child abuse online is one of the most worrying trends in the UK right now.
So, we met the police team in Warrington who are alerted every time child sexual abuse is viewed in this country - a drop of at least 100 alerts a day that, they told us, is increasing in severity.
And, in a world exclusive, we were the first journalists allowed into the buildings at Google’s Zurich base where we met engineers who have written the algorithms to try to stop these images the moment they are created.
Google's Child Safety Executive Claire Lilley speaks to Allegra Stratton
Right now at the Independent Inquiry into Child Sexual Abuse taking place in London, the big tech firms stand accused of not doing enough. It’s clear from our visit that they are trying.
But the first question for me, as for so many people, is: how can you spend your day investigating this sort of crime?
On the outskirts of Warrington, staff at the National Crime Agency's unit have A4 printouts of inspirational quotes and historical figures up on the so-called welfare wall and, like many an open-plan office across the country, banter about their chipped nails and what’s for lunch.
Theirs surely must count as one of the most gruelling jobs in the country.
Every day, around 100 alerts of child sexual abuse being viewed online come in. The day we were there, we saw the Daily Download: it was 116.
A long list of child sexual abuse on Twitter, Facebook, Google, Snapchat and lesser-known platforms where, we’re told, the worst content is hidden.
Each alert is a Word document containing the image and the activity around it. Known images of online child sexual abuse have a hash - a bit like a barcode.
When a known image is detected, it is flagged to an organisation in the States and, if it was viewed in the UK, the alert then comes to this room at the NCA.
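For readers curious how that ‘barcode’ matching works in practice, here is a minimal sketch in Python, under stated assumptions: real systems use perceptual fingerprints (Microsoft's PhotoDNA is the best-known) that survive resizing and re-compression, whereas the plain cryptographic hash used here only catches exact copies. The hash list and function names are invented for the example.

```python
import hashlib

# Hypothetical set of hashes of known abuse imagery, populated from an
# industry-shared list. Real deployments use perceptual hashes rather
# than SHA-256, so that cropped or re-compressed copies still match.
KNOWN_HASHES: set[str] = set()

def hash_image(image_bytes: bytes) -> str:
    """Compute the image's 'barcode': a fixed-length digest of its bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_abuse_image(image_bytes: bytes) -> bool:
    """True if the upload matches a known image and should be flagged."""
    return hash_image(image_bytes) in KNOWN_HASHES
```

Because the lookup is a simple set membership test, platforms can run it on every upload at negligible cost - which is why known images can be flagged the moment they appear.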
It will then go to Shelley Abbott, a former primary school teacher who is motivated by catching people who abuse children.
She now leads the team in prioritising the tip offs which come in every day. The most urgent cases involve offenders with access to children. They will be targeted first so that any children at risk of being abused can be safeguarded.
The most urgent cases include the live-streaming of children being abused.
Google Engineer Abhi Chaudhuri explains why he moved from the US to Zurich to work on the new technology
New staff are eased in as gently as possible, gradually exposed to ever-worsening images rather than being thrown in at the deep end.
Once in post, staff are discouraged from having family photos up around their workstation and from listening to their own music while processing a case, to keep a clear separation between work and home.
There is, of course, regular therapy and counselling. Even so, each of them has a case that was too much.
For Shelley, it was a baby being abused and the fact the video had audio. Hearing the hurt made it all the worse.
When we were there, they looked in depth at one conversation between an adult male and a young teenager.
Because the young girl had given her age and address so early on in the conversation, the team thought this might be an adult impersonating an underage girl - a paedophile hunter.
Except the conversation continued at great length, something Shelley and her colleague said a paedophile hunter wouldn’t bother with. We watched as they rang up the relevant police force and handed them all the info they needed to perform an arrest.
Forces like Cheshire Police. At 6am on Tuesday morning, we watched that force act on intelligence that a 35-year-old man had downloaded child sexual abuse images. Because he and his wife have a six-month-old baby, the arrest became a priority.
This force will make three arrests this week; two is the average. But they told me there was now such an increase that they could make an arrest every day.
Detective Constable Dave Walton told us he believed there had been - what he called - a “boom” in the amount of child sexual abuse online.
Some 116 cases to process on one day is an awful lot for the National Crime Agency’s team which is why they, and a police force like Cheshire, are so desperate for the tech companies to do more too.
Google insists it is taking action. It invited us to one of its engineering bases in Zurich.
It has hired a new head of safeguarding, Claire Lilley, from the NSPCC, and put in place some of the best machine-learning engineers from across the company and, indeed, across the world.
Right now, a known image of child sexual abuse has a fingerprint, which means it can be matched and taken down. Crucially, current software only deals with known images, not new ones - and new images will often show new and ongoing abuse of minors.
So, Google decided to develop software that would identify new images of abuse. We met Abhi Chaudhuri, the man Google put in charge of developing the machine learning to combat child sexual abuse images being shared on its platform.
Like staff at the NCA’s offices in Warrington, but for different reasons, Abhi has had to look at many images of child abuse.
He needs to look at the images because he is writing the algorithms that teach machines to tell when a photo of a child is innocent - a child in the bath, say - rather than an image of abuse.
Once the machine finds an image it is concerned by, it is sent to moderators in the US and then, once it has been checked by human eyes, it is sent on to the NCA.
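To make that pipeline concrete, here is a minimal sketch, again in Python and again with invented names: a trained classifier scores each new image, and anything above a threshold is queued for human review before any report is made. Google has not published its implementation, so treat this purely as an illustration of the flow.

```python
from dataclasses import dataclass
from typing import Callable, Optional

REVIEW_THRESHOLD = 0.9  # assumed value; tuned carefully in practice

@dataclass
class Flagged:
    image_id: str
    score: float  # the classifier's estimate that the image shows abuse

def triage(image_id: str,
           image_bytes: bytes,
           classify: Callable[[bytes], float]) -> Optional[Flagged]:
    """Score a new image and queue it for human review if it looks like abuse.

    Nothing is reported on the model's say-so alone: a Flagged record
    goes to human moderators first, and only confirmed material is
    passed on to law enforcement (in the UK, the NCA).
    """
    score = classify(image_bytes)  # hypothetical trained model
    if score >= REVIEW_THRESHOLD:
        return Flagged(image_id, score)
    return None
```

The human-review step is the crucial design choice: the classifier's job is to narrow millions of uploads down to a reviewable queue, not to make the final call.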
Google’s head of child protection policy Ms Lilley likened it to a game of "whack-a-mole". And they believe their new technology - which they will now make available for free to anyone who wants it - "allows them to do whack-a-mole at scale".
Cheshire police didn’t believe Google was the social media platform that caused them the most strife - others more readily allow these images to be shared.
Social media companies are also reluctant to provide the police with the evidence they need for court prosecutions, making their investigations not impossible but harder.
Experts agree that there is something enabling about the internet. Where once child abuse was carried out by a solitary figure in the local community, now the internet allows paedophiles to find like-minded individuals and - in effect - normalise each other.
The hidden corners of the internet - search engines that are not household names, obscure chat rooms - give paedophiles a multitude of places to hide their pictures of sexually abused children.
But the overriding frustration remains: why can't the best brains in the business, with the deepest pockets, prevent this content from reappearing? Why are known images being found, again and again?
We're told about the cunning and determination of the most prolific offenders to re-upload this abhorrent content, and tech companies like Google explain that they cannot reach into private computers and hard drives to delete it. They say they can, and do, act as soon as they find it.
The development everyone in law enforcement we spoke to wants to see is the prevention of these illegal images ever being uploaded.
It's something I'm still grappling with, and I hope the technological innovators out there are too. It's an innovation we are yet to see, but one which would perhaps make the greatest impact in stopping this repeated abuse.
Rob Jones at the National Crime Agency thinks the problem is worse now than ever.
The number of people estimated to be interested in this kind of material has leapt by 50% in the last year alone, from 100,000 to 150,000.
The agency believes the time is approaching where everyone may know someone who, unbeknownst to them, looks at child sexual abuse online.
Worried about your online behaviour, or somebody else's? You can visit the www.stopitnow.org.uk website or call the helpline on 0808 1000 900