UK Coroner Urges Separate Platforms for Adults and Kids After Ruling Online Content Contributed to Death of Teen
Undated photo of Molly Russell, taken outside her home in Harrow, northwest London shortly before her death in November 2017. Russell family via PA
Alexander Zhang
A coroner has issued recommendations including separate social media platforms for adults and children, following an inquest into the death of a London schoolgirl.

Molly Russell, a 14-year-old girl from Harrow, northwest London, who her father said showed no obvious signs of mental illness, took her own life on Nov. 21, 2017. She had viewed online content related to depression and self-harm for months.

Coroner Andrew Walker, who concluded last month that the “negative effects of online content” were contributing factors in her death, sent a Prevention of Future Deaths (PFD) report to social media giants and the UK government on Oct. 13, calling for a review of the algorithms the sites use to serve content.

Undated photo of Molly Russell, who took her own life in November 2017. (Family handout/PA Media)

In his report, Walker identified six areas of concern that arose during the inquest into Molly’s death, including the separation of content for adults and children.

The coroner also voiced concerns over age verification when signing up to the platforms, content not being controlled so as to be age-specific, the lack of access or control for parents and guardians, and the absence of any capability to link a child’s account to a parent’s or guardian’s account.

He recommended that the government review the use of advertising and of parental, guardian, or carer controls, including access to, and retention of, material viewed by a child.

‘Effective Regulation’ Called For

The coroner said in the report: “I recommend that consideration is given to the setting up of an independent regulatory body to monitor online platform content with particular regard to the above.

“I recommend that consideration is given to enacting such legislation as may be necessary to ensure the protection of children from the effects of harmful online content and the effective regulation of harmful online content.”

The report, which was sent to businesses such as Meta, Pinterest, Twitter, and Snapchat as well as the UK government, also suggested that the platforms themselves may wish to give consideration to “self-regulation taking into account the matters raised above.”

Walker said he believes action should be taken in order to prevent future deaths, adding, “I believe you and/or your organisation have the power to take such action.”

Timetable of Action Demanded

Molly Russell’s family welcomed the recommendations and urged social media companies not to “drag their feet” in implementing them.

Molly’s father Ian Russell also urged the government to “act urgently to put in place its robust regulation of social media platforms to ensure that children are protected from the effects of harmful online content.”

In its response to the coroner’s report, Instagram’s parent company Meta said it agreed that “regulation is needed.”

The social media giant said it was “reviewing” the report, adding: “We don’t allow content that promotes suicide or self-harm, and we find 98 percent of the content we take action on before it’s reported to us.

“We’ll continue working hard, in collaboration with experts, teens, and parents, so we can keep improving.”

Pinterest said it is “committed to making ongoing improvements to help ensure that the platform is safe for everyone and the coroner’s report will be considered with care.”

The social media firms have 56 days to respond with a timetable of action they propose to take or explain why no action is proposed.

Lily Zhou and PA Media contributed to this report.