U.K. coroner finds "negative effect" of Instagram, Pinterest content contributed to teen Molly Russell's suicide death

Molly Russell is shown in this photo shared by the Molly Rose Foundation. The Molly Rose Foundation

London — A coroner in London concluded Friday that social media was a factor in the death of 14-year-old Molly Russell, who took her own life in November 2017 after viewing large amounts of online content about self-harm and suicide on platforms including Instagram and Pinterest.

"It's likely the material viewed by Molly… affected her mental health in a negative way and contributed to her death in a more than minimal way," senior coroner Andrew Walker said Friday according to British media outlets. "It would not be safe to leave suicide as a conclusion. She died from an act of self-harm while suffering from depression and the negative effects of online content."

Walker said he would prepare a "prevention of future deaths" report and write to Pinterest and Meta (the parent company of Instagram) as well as the British government and Ofcom, the U.K.'s communications regulator.

"The ruling should send shockwaves through Silicon Valley," Peter Wanless, the chief executive of the British child protection charity NSPCC, said in a statement. "Tech companies must expect to be held to account when they put the safety of children second to commercial decisions. The magnitude of this moment for children everywhere cannot be understated."

The conclusion came days after a senior executive at Meta apologized before the coroner's inquest for the company having enabled Russell to view graphic Instagram posts on suicide and self-harm that should have been removed under its own policies. But the executive also said she considered some of the content Russell had viewed to be safe.

Elizabeth Lagone, Meta's head of health and well-being, arrives at Barnet Coroner's Court, north London, to give evidence in the inquest into the death of Molly Russell, September 23, 2022. Beresford Hodge/PA Images/Getty

Elizabeth Lagone, Meta's head of health and well-being policy, told the inquest on Monday that Russell had "viewed some content that violated our policies and we regret that." 

When asked if she was sorry, Lagone said: "We are sorry that Molly saw content that violated our policies and we don't want that on the platform."

But when asked by the lawyer for Russell's family whether material related to depression and self-harm was safe for children to see, Lagone replied: "Respectfully, I don't find it a binary question," adding that "some people might find solace" in knowing they're not alone.

She said Instagram had consulted with experts who advised the company to "not seek to remove [types of content connected to self-harm and depression] because of the further stigma and shame it can cause people who are struggling."


In a statement issued Friday, Pinterest said it was "committed to making ongoing improvements to help ensure that the platform is safe for everyone and the coroner's report will be considered with care."

"Over the past few years, we've continued to strengthen our policies around self-harm content, we've provided routes to compassionate support for those in need and we've invested heavily in building new technologies that automatically identify and take action on self-harm content," the company said, adding that the British teen's case had "reinforced our commitment to creating a safe and positive space for our Pinners."

Meta said it was "committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers, and we will carefully consider the coroner's full report when he provides it. We'll continue our work with the world's leading independent experts to help ensure that the changes we make offer the best possible protection and support for teens."

The inquest heard that 2,100 of the 16,000 pieces of online content Russell viewed during the last six months of her life were related to depression, self-harm, and suicide. It also heard that Molly had made a Pinterest board with 469 images of related subjects.

On Thursday, ahead of the inquest's conclusion, Walker, the senior coroner, said the inquest should serve as a catalyst for protecting children from online risks.

"It used to be the case when a child came through the front door of their home, it was to a place of safety," Walker said. "With the internet, we brought into our homes a source of risk, and we did so without appreciating the extent of that risk. And if there is one benefit that can come from this inquest, it must be to recognize that risk and to take action to make sure that risk we have embraced in our home is kept away from children completely. This is an opportunity to make this part of the internet safe, and we must not let it slip away. We must do it."


In a press conference after the conclusion of the inquest, Molly Russell's father, Ian, said social media "products are misused by people and their products aren't safe. That's the monster that has been created, but it's a monster we must do something about to make it safe for our children in the future."

When asked if he had a message for Meta CEO Mark Zuckerberg, he said: "Listen to the people that use his platform, listen to the conclusions the coroner gave at this inquest, and then do something about it."


If you or someone you know is in emotional distress or suicidal crisis, call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255) or dial 988.

For more information about mental health care resources and support, the National Alliance on Mental Illness (NAMI) HelpLine can be reached Monday through Friday, 10 a.m.–6 p.m. ET, at 1-800-950-NAMI (6264) or by email at info@nami.org.


