A coroner has issued recommendations including separate social media platforms for adults and children, following an inquest into the death of a London schoolgirl.
Molly Russell, a 14-year-old girl from Harrow, northwest London, who her father said showed no obvious signs of mental illness, took her own life on Nov. 21, 2017. She had viewed online content related to depression and self-harm for months.
Coroner Andrew Walker, who concluded last month that “negative effects of online content” were contributing factors in her death, sent a Prevention of Future Deaths report (PFD) to social media giants and the UK government on Oct. 13, calling for a review of the algorithms used by the sites to provide content.
In his report, Walker identified six areas of concern that arose during the inquest into Molly’s death, including the separation of content for adults and children.
The coroner also voiced concerns over age verification when signing up to the platforms, content not being controlled so as to be age-specific, the lack of access or control for parents and guardians, and the absence of capability to link a child’s account to a parent or guardian’s account.
‘Effective Regulation’ Called For
The coroner said in the report: “I recommend that consideration is given to the setting up of an independent regulatory body to monitor online platform content with particular regard to the above.

“I recommend that consideration is given to enacting such legislation as may be necessary to ensure the protection of children from the effects of harmful online content and the effective regulation of harmful online content.”
The report, which was sent to businesses such as Meta, Pinterest, Twitter, and Snapchat as well as the UK government, also suggested that the platforms themselves may wish to give consideration to “self-regulation taking into account the matters raised above.”
Timetable of Action Demanded
Molly Russell’s family welcomed the recommendations and urged social media companies not to “drag their feet” in implementing them.

Molly’s father Ian Russell also urged the government to “act urgently to put in place its robust regulation of social media platforms to ensure that children are protected from the effects of harmful online content.”
In its response to the coroner’s report, Instagram’s parent company Meta said it agreed that “regulation is needed.”
The social media giant said it was “reviewing” the report, adding: “We don’t allow content that promotes suicide or self-harm, and we find 98 percent of the content we take action on before it’s reported to us.
“We’ll continue working hard, in collaboration with experts, teens, and parents, so we can keep improving.”
Pinterest said it is “committed to making ongoing improvements to help ensure that the platform is safe for everyone,” adding that the coroner’s report “will be considered with care.”
The social media firms have 56 days to respond with a timetable of the action they propose to take, or to explain why no action is proposed.