Molly Russell: Instagram posts seen by teen were safe, Meta says


Instagram posts about suicide and depression viewed by a 14-year-old girl before she took her own life “were safe”, an inquest has heard.

Molly Russell, from Harrow, engaged with thousands of such posts in the months before her death in 2017.

Her family claim the content encouraged suicide and self-harm.

Elizabeth Lagone, an executive at Meta, which owns Instagram, said she believed it was “safe for people to be able to express themselves” online.

She added the posts were “complex” and often a “cry for help”.

The inquest at North London Coroner’s Court was told that of the 16,300 posts Molly saved, shared or liked on Instagram in the six months before her death, 2,100 were related to depression, self-harm or suicide.

‘Nuanced and complicated’

Ms Lagone, the social media giant’s head of health and wellbeing, was shown a number of them.

The Russell family’s lawyer, Oliver Sanders KC, asked of each post shown to her whether she believed it promoted or encouraged suicide or self-harm.

She said she thought it “safe for people to be able to express themselves”, but conceded two posts would have violated Instagram’s policies.

Instagram’s guidelines at the time said users were allowed to post content about suicide and self-harm to “facilitate the coming together to support” other users but not if it “encouraged or promoted” this.

Ms Lagone told the court she thought the content Molly saw was “nuanced and complicated”, adding it was important to give people a voice if they were experiencing suicidal thoughts.


In a heated exchange in which Mr Sanders shouted, he asked why Instagram allowed children on the platform when it was “allowing people to put potentially harmful content on it”, and suggested Meta “could just restrict it to adults”.

Ms Lagone replied that the topic of harm was an “evolving field” and that Instagram’s policies were designed with consideration for users aged 13 and over.

When pressed by coroner Andrew Walker to clarify whether she thought the posts were safe, she replied: “Yes, it is safe.”

She said Meta’s understanding was that there was no clear research into the effect such posts have on children, but that its own research reported a “mixed experience”.

Questioning why Instagram felt it could choose which material was safe for children to view, the coroner then asked: “So why are you given the entitlement to assist children in this way?

“Who has given you the permission to do this? You run a business.

“There are a great many people who are … trained medical professionals. What gives you the right to make the decisions about the material to put before children?”

Ms Lagone responded: “That’s why we work closely with experts.”

She added decisions were not “made in a vacuum”.

During the day’s proceedings, videos the teenager accessed on Instagram were played to the court, with the coroner once again warning that the material had the “potential to cause great harm”.

He said the content “seeks to romanticise and in some way validate the act of harm to young people”, before urging anyone who wanted to leave the room to do so. One person left.

When shown a note on Molly’s phone which used the words “I just want to be pretty”, Mr Sanders said the language was identical to a post the teenager had viewed on Instagram two days before.

“It’s identical language… this is Instagram literally giving Molly ideas that she needs to be concerned about her weight, correct?” Mr Sanders asked.


Ms Lagone replied: “I can’t speak about what Molly may have been thinking.”

Referring to all the material viewed by the teenager that the family considered to be “encouraging” suicide or self-harm, Mr Sanders continued: “Do you agree with us that this type of material is not safe for children?”

Ms Lagone said policies were in place for all users and described the posts viewed by the court as a “cry for help”.

Soon after the inquest began, Molly’s father Ian Russell said he had been shocked by the “dark, graphic, harmful material” available for children to view online.

Mr Russell told the inquest much of the content seemed to “normalise” self-harm and suicide.

On Thursday, Pinterest’s head of community operations, Judson Hoffman, apologised after admitting the platform was “not safe” when the 14-year-old used it.

The inquest continues.

If you’ve been affected by self-harm or emotional distress, help and support is available via the BBC Action Line.
