Pinterest “sorry” Molly Russell was able to view self-harm content online, inquest hears

Pinterest’s head of community operations has said he is “sorry” that schoolgirl Molly Russell was able to view self-harm content on the platform.

Two streams of content the 14-year-old saw were shown at North London Coroner’s Court on Thursday, comparing material she had viewed earlier in her use of the platform with what she saw in the months closest to her death.

While the earlier stream included a wide variety of material, the later one focused on depression, self-harm and suicide.

Molly, from Harrow, North West London, was found dead in her bedroom in November 2017 after seeing online content promoting self-harm.

She had been an active Pinterest user, with more than 15,000 interactions, including 3,000 saves, in the last six months of her life.

Since her death, Molly’s family has campaigned for better online safety.

Judson Hoffman, Pinterest’s head of community operations, was asked by the barrister representing Molly’s family at her inquest whether he agreed that the type of content had changed.

Mr. Hoffman said: “I do, and it is important to note, and I am deeply sorry that she was able to access some of the content shown.”

Oliver Sanders KC asked: “You said you are sorry. Are you sorry that this happened?”

Mr. Hoffman replied: “I’m sorry this happened.”

The court heard that the social media giant sent emails to the teenager with subject lines such as “10 depression pins you might like” and “depression recovery, depressed girl and other pins trending on Pinterest”.

The emails also contained images, and the family’s barrister asked Mr. Hoffman whether he believed they were “safe for children”.

Mr. Hoffman replied: “So, I want to be careful here because of the guidance we have been given.

“I’ll say this is the kind of content we don’t want anyone to spend a lot of time with.”

Mr. Sanders said “especially children” would find it “very difficult … to make sense” of the material, to which Mr. Hoffman replied: “Yes”.

Mr. Hoffman said he was “unable to answer” how children could consent to potentially being shown content that was inappropriate for a child.

The platform’s terms of service, shown at the hearing, told users to report “bad things” if they saw them on the site.

The November 2016 Terms of Service stated that users may be exposed to material “inappropriate for children”.

Mr. Sanders asked: “Bearing in mind that children may be opening accounts … when a user opens an account they must agree that there may be content inappropriate for a child.

“If the user is a child, how can they consent?”

“I’m sorry, I’m unable to answer,” said Mr. Hoffman.

People aged 13 and over can use the platform, and coroner Andrew Walker asked whether the company distinguished between children and adults when accounts are created.

“No, we don’t,” Mr. Hoffman replied.

On Wednesday, Molly’s father, Ian Russell, called for action from the inquest to “prevent such a young life being wasted again”.

“No one is immune from such tragedy, it’s closer to all of us than we’d like to think, and breaking the stigma surrounding mental health, self-harm and suicide is literally vital,” he said.

The investigation continues.

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org. Alternatively, letters can be sent to: Freepost SAMARITANS LETTERS.
