Social media giant Meta has been accused of being “tone deaf” after it lowered the minimum age for using WhatsApp in Britain and the European Union from 16 to 13.
It comes as campaigners are trying to persuade the government to ban the use of smartphones by those under the age of 16 because of fears of cyberbullying and other threats.
In March 2021, Mia Janin, 14, a pupil at the Jewish Free School in north London, committed suicide after enduring cyber-bullying.
At an inquest in January this year, coroner Tony Murphy concluded she "took her life while still a child and while still in the process of maturing into adulthood."
The inquest heard from Rabbi Howard Cohen, a former deputy head teacher at the school, who said there was a culture of “boys-only bravado groups” sharing images of girls.
MP Says Meta ‘Highly Irresponsible’
Conservative MP Vicky Ford, who sits on the education select committee, said it was "highly irresponsible" of Meta to reduce the age recommendation without consulting parents.

"Reducing their age of use to 13 years old is completely tone deaf and ignores the increasingly loud alarm bells being rung by scientists, doctors, teachers, child safety experts, parents and mental health experts alike," she added.
Ms. Greenwell said parents often think of WhatsApp as harmless, but warned, "WhatsApp is far from risk-free. It's often the first platform where children are exposed to extreme content, bullying is rife and it's the messaging app of choice for sexual predators due to its end-to-end encryption."
Prime Minister Rishi Sunak told the BBC the Online Safety Act would give the regulator, Ofcom, powers to ensure social media companies like Meta protect children from harmful material.
He said, "They shouldn't be seeing it, particularly things like self-harm, and if they don't comply with the guidelines that the regulator puts down they will be in for very significant fines, because like any parent we want our kids to be growing up safely, out playing in fields or online."
Meta Testing Nudity Filter on Instagram
Meta this week unveiled new safety features designed to protect users from "sextortion" and "cyber-flashing".

It said it would begin testing a nudity protection filter in Instagram direct messages (DMs), which will be the default setting for those under 18 and will automatically blur indecent images.
Ofcom's director of online safety strategy, Mark Bunting, told BBC Radio 4's "Today" programme that the watchdog was currently writing codes of practice for enforcing online safety, but its powers to regulate social media will only come into effect from 2025.
He said, “When our powers come into force next year, we’ll be able to hold them to account for the effectiveness of what they’re doing.”
“If they’re not taking those steps at that point, and they can’t demonstrate to us that they’re taking alternative steps which are effective at keeping children safe, then we will be able to investigate,” added Mr. Bunting.
He said of social media services, “We’ve made recommendations that services shouldn’t prompt children to expand their network of friends, not recommend children to other users, and crucially, not allow people to send direct messages to children that they’re not already connected with.”