Kids love social media—and frighteningly, so do sexual predators. An estimated 500,000 of these dangerous criminals are active on these platforms every day, and roughly one in nine young people has been approached online by one of them. Social platforms let the two interact while the child believes he or she is connecting with a peer, with devastating consequences: millions of children end up sexually exploited and featured in child sexual abuse material (CSAM).
While the term “child pornography” is still commonly used by the public, it’s more accurate to call it what it is: child sexual abuse. Child sexual abuse material, or CSAM, is any visual, textual, or audio depiction of explicit or implied sexual assault or exploitation of a child. Searching for, viewing, creating, or sharing this content is illegal and places minors in extreme danger. Reports of online CSAM have increased by an alarming 15,000 percent over the past 15 years.
Parents and other advocates for the safeguarding of children must call on social media and content-storage companies to do their part in ridding the internet of this content, which poses an enormous threat to our kids. We’re fighting for our children, our grandchildren, and future generations. Studies have linked social media use to depression, the sexualization of children, eating disorders, anxiety, and suicidal ideation. If we take action now, everything could be different in less than a decade.
Tech companies are legally required to report CSAM when they discover it, but they’re not required to proactively look for it. In August 2021, Apple announced a plan to scan photos that users stored in iCloud for CSAM. The tool was meant to be privacy-preserving, allowing the company to flag potentially abusive content without revealing anything else about a user’s library. The initiative quickly proved controversial, drawing widespread criticism from privacy and security researchers and digital rights groups who were concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world.
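For readers curious what such scanning looks like in practice, the sketch below shows the general idea of hash-based matching, the family of techniques behind tools like Apple’s proposed system and Microsoft’s PhotoDNA: photos are compared against a database of hashes of already-identified abusive images, and only matches are flagged. This is a simplified illustration under stated assumptions, not Apple’s actual implementation; it uses an ordinary cryptographic hash, which catches only byte-identical copies, whereas real systems use perceptual hashes that also match visually similar images. The hash value, folder name, and function names are hypothetical.

```python
# Illustrative sketch of hash-based CSAM detection (not Apple's NeuralHash).
# Real systems use perceptual hashes that survive resizing and re-encoding;
# a cryptographic hash stands in here only to show the matching flow.
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abusive images, as supplied by a
# clearinghouse such as NCMEC (only hashes are shared, never the images).
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(path: Path) -> str:
    """Hash an image file's raw bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Return photos whose hash matches the known-CSAM list.

    Only matches are flagged for human review; nothing else about the
    user's library is revealed, which is the privacy property Apple
    said it was aiming for.
    """
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if image_hash(photo) in KNOWN_CSAM_HASHES
    ]

if __name__ == "__main__":
    flagged = scan_library(Path("photos"))  # hypothetical folder
    print(f"{len(flagged)} photo(s) flagged for review")
```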
In September 2021, Apple said it would pause the rollout to “collect input and make improvements before releasing these critically important child safety features.” Since then, the company has indicated that, in response to the feedback and guidance it received, the CSAM-detection tool for iCloud Photos is effectively dead.
This isn’t acceptable. The quickest way to derail a child’s destiny is to rob them of their identity, innocence, and capacity to love and be loved. Teens and young children who are exposed to this content early become desensitized to it and confused about what’s appropriate. It dismantles their capacity for intimacy and, eventually, for a healthy family life.
When today’s parents were kids, access to porn meant a Playboy magazine found hidden in the garage. Today, children carry digital devices that put obscene sexual content just a few taps away. It’s anything but intimate. Their neural pathways are being rewired by the imprint of an onslaught of hardcore, often violent, and inherently abusive pornography.
Are we going to stand by and let this happen? We must boldly stand up to the companies that control these online platforms and tell them that this is unacceptable. There’s nothing more important than protecting our children.