TikTok: Videos of dead baby viewed 10 million times, raising concerns about graphic content on app
By Digital Presenter and Producer Mojo Abidi
Graphic and disturbing videos of what appears to be a dead baby are circulating on TikTok - and were viewed 10 million times before the social media platform took action.
The videos seem to have been posted by a grieving mother and have upset many of the site’s younger users.
TikTok has an age limit of 13, but many pre-teens still use the app.
The videos show a woman holding and kissing a deceased child, and lip-syncing to songs with the child in her arms.
A 16-year-old girl on the app said her heart “literally dropped” when she watched one of the videos.
One young user said: “I saw it and I feel so sick. It’s just there, in the back of my head.”
Another commented: “I’m having nightmares because of this.”
Many reported the videos for ‘violent and graphic content’ but TikTok initially said it would not be taking them down because they don’t “violate our community guidelines”.
After ITV News approached TikTok for a comment, the Chinese-owned app deleted the videos.
A spokesperson for TikTok says: “These videos are clearly heart-breaking and our deepest sympathies are with the mother and her family.
“Our concern is also the well-being of our users, including the woman who shared her experience and those who may have viewed the content, and after careful review, we have removed the videos in question.
"While it can be difficult for platforms like ours to balance the individual’s need for healing expression with the community’s expectation for a safe viewing experience, our own Community Guidelines determined this as the appropriate action to take.”
But the original user has since reposted many of the videos, and they’ve already racked up tens of thousands of views.
Most users said they saw the clip by accident.
On the site’s ‘For You’ explore page, users scroll through videos which play automatically.
This means viewers are unaware of a video’s content until they have already watched it, and often it’s hidden amongst other popular videos, like dance trends.
Andy Burrows, Head of Child Safety Online Policy at the children’s charity NSPCC, says: “Social media platforms need to do much more to stop children and young people stumbling across such harmful and disturbing content.
"It can cause emotional distress and have a long-term impact on a child's mental health.”
He says TikTok needs to be better at upholding its terms and conditions, responding quickly to inappropriate content and signposting users towards support.
“The NSPCC is campaigning for the government to introduce stricter laws on tech firms to protect children,” he says.
“This will ensure that young people do not face avoidable harm on these sites.”
It's not the first time the popular app has been linked to violent or upsetting content.
An ITV News investigation discovered dozens of videos on the app that violated its terms of use, including drug misuse, overly sexual content and general law breaking.
In September, a video of a US veteran's live-streamed suicide went viral on the site.
And even ISIS beheading videos have found their way onto the app.
Mr Burrows says: “This will obviously be really concerning for parents.
“But I think the worst thing a parent could do is to take away a device, ban the child from using the app or anything drastic.
“If your child comes across a distressing video, talk to them calmly and offer support, and work with them to put the highest security and privacy settings on their account.”
The NSPCC encourages any young people who saw the videos and need support to speak to someone they trust, or to contact their helpline on 0800 1111.