Molly Russell inquest: Meta says Instagram posts ‘promoting’ suicide viewed by schoolgirl were safe
Instagram content viewed by Molly Russell that her family argued “encourages” suicide and self-harm was safe, social media firm Meta has said at an inquest into the 14-year-old's death.
Meta, which owns Facebook, WhatsApp and Instagram, was represented at North London Coroner’s Court on Monday by head of health and wellbeing, Elizabeth Lagone, who had flown from America to give evidence.
Ms Lagone, who was taken through several posts the schoolgirl engaged with on the platform in the last six months of her life, described them as “by and large, admissive”.
The senior executive said she thought it was “safe for people to be able to express themselves”, but conceded two of the posts shown to the inquest would have violated Instagram’s policies.
She repeatedly declined to distinguish between adults and children in court. Anyone aged 13 or over is allowed on Instagram.
Coroner Andrew Walker asked Ms Lagone on Monday “what gives you the right” to make decisions on what material was safe for children to view, but the witness said the site worked “closely with experts”, adding that decisions were not “made in a vacuum”.
Molly, from Harrow in north-west London, died in November 2017, prompting her family to campaign for better internet safety.
During the day’s proceedings, videos the teenager accessed on Instagram were played to the court with the coroner once again warning the material had the “potential to cause great harm”.
He said the content “seeks to romanticise and in some way validate the act of harm to young people,” before urging anyone who wanted to leave the room to do so, with one person leaving.
The teenager had been exposed to pictures and videos featuring suicide, drugs, alcohol, depression and self-harm while on social media.
The Russell family’s lawyer, Oliver Sanders KC, spent around an hour taking Ms Lagone through Instagram posts liked or saved by the 14-year-old, and asked if she believed each post “promoted or encouraged” suicide or self-harm.
Meta executive: 'It's safe for people to express themselves'
Referring to one post seen in May 2017, Mr Sanders asked: “Do you think it helped Molly to see this?”
Ms Lagone said: “I can’t speak to this.”
“Six months after seeing this, she was dead,” Mr Sanders continued.
“I can’t speak to the different factors that led to her tragic loss,” Ms Lagone responded.
The witness told the court she thought the content was “nuanced and complicated”, adding that it was “important to give people that voice” if they were expressing suicidal thoughts.
The inquest was told out of the 16,300 posts Molly saved, shared or liked on Instagram in the six-month period before her death, 2,100 were depression, self-harm or suicide-related.
Mr Sanders directed the witness to a note on Molly’s phone which used the words “I just want to be pretty”, saying the language was identical to a post the teenager had viewed on Instagram two days before.
“Do you see the connection there?” Mr Sanders asked.
Ms Lagone said: “I see that is similar language.”
“It’s identical language… this is Instagram literally giving Molly ideas that she needed to be concerned about her weight, correct?” Mr Sanders asked.
Ms Lagone said: “I can’t speak about what Molly may have been thinking.”
Addressing Ms Lagone as she sat in the witness box, Mr Sanders asked: “Do you agree with us that this type of material is not safe for children?”
Ms Lagone said policies were in place for all users and described the posts viewed by the court as a “cry for help”.
“Do you think this type of material is safe for children?” Mr Sanders continued.
“I think it is safe for people to be able to express themselves,” Ms Lagone said.
After Mr Sanders asked the same question again, Ms Lagone said: “Respectfully, I don’t find it a binary question.”
The coroner interjected and asked: “So you are saying yes, it is safe or no, it isn’t safe?”
“Yes, it is safe,” Ms Lagone replied.
The coroner continued: “Surely it is important to know the effect of the material that children are viewing.”
Ms Lagone said: “Our understanding is that there is no clear research into that.
“We do know from research that people have reported a mixed experience.”
Coroner: 'Who has given you the permission to do this? You run a business'
Questioning why Instagram felt it could choose which material was safe for children to view, the coroner then asked: “So why are you given the entitlement to assist children in this way?
“Who has given you the permission to do this? You run a business.
“There are a great many people who are … trained medical professionals. What gives you the right to make the decisions about the material to put before children?”
Ms Lagone responded: “That’s why we work closely with experts.
“These aren’t decisions we make in a vacuum.”
Last week, Pinterest apologised after admitting the platform was “not safe” when the 14-year-old used it.
The site’s head of community operations, Judson Hoffman, said he “deeply regrets” posts viewed by Molly on Pinterest before her death, saying it was material he would “not show to my children”.
The inquest, due to last up to two weeks, continues.