Australia’s eSafety Commissioner, Julie Inman Grant, the country’s independent regulator for online safety, has taken aim at tech giants Apple and Microsoft for allegedly “turning a blind eye” to child exploitation on iCloud and OneDrive.
Firms that were sent the legal demands, which required them to report on how they tackle child sexual exploitation material, included Apple, Facebook parent company Meta, Microsoft, Skype, Snap, and others.
According to the commissioner, the responses showed that both Apple and Microsoft were failing to proactively detect child sexual exploitation and abuse material in their cloud storage services, iCloud and OneDrive.
That is despite the wide availability of PhotoDNA detection technology, according to the commissioner.
“PhotoDNA was developed by Microsoft and is now used by tech companies around the world to scan for known child sexual abuse images and videos, with a false positive rate of 1 in 50 billion,” Grant said.
In addition, the regulator said that Apple and Microsoft also reported that they do not use any technology to detect live-streaming of child sexual abuse in video chats on Skype, Microsoft Teams, or FaceTime, while noting that Skype in particular is often used to carry out such crimes. Microsoft does, however, offer in-service reporting, unlike Apple and Omegle, the report noted.
Apple’s decision to drop its plans to scan iCloud Photos for child sexual abuse material reportedly comes amid concerns over potential abuses of such surveillance.
“This report shows us that some companies are making an effort to tackle the scourge of online child sexual exploitation material, while others are doing very little,” Grant said. “It is unacceptable that tech giants with long-term knowledge of extensive child sexual exploitation, access to existing technical tools, and significant resources are not doing everything they can to stamp this out on their platforms.”
Glaring Disparities in Response Time to Reports of Child Sexual Exploitation
The report also found wide disparities in how long it takes companies to respond to user reports of child sexual exploitation and abuse: messaging app Snap took an average of four minutes to respond to such reports, while Microsoft could take up to two days, or up to 19 days if the reports needed to be reviewed again.

Elsewhere, the report noted problems with accounts banned for sharing child sexual exploitation and abuse material, pointing out that many banned users are easily able to create new accounts on certain platforms.
Singling out Meta, which owns Facebook and Instagram, the independent regulator said its report found that even if an account is banned on Facebook, the same user may still be able to set up an account on Instagram. Similarly, when an account is banned on WhatsApp, the account owner’s information is not shared with Facebook or Instagram.
“This is a significant problem because WhatsApp report they ban 300,000 accounts for child sexual exploitation and abuse material each month—that’s 3.6 million accounts every year,” Grant said.
The Epoch Times has contacted Apple for comment.