eSafety Commissioner Praises Apple’s Nude Image Reporting Feature for Children

The Commissioner urged Apple to enhance protections for children and users against online dangers like terrorist content and technology-facilitated abuse.
An Apple logo hangs at a Palo Alto, Calif., Apple store on Feb. 2, 2024. The Canadian Press/AP Photo, Noah Berger
Naziya Alvi Rahman

A new feature from Apple enabling Australian children to report unwanted nude images directly to the company has been positively received by the eSafety Commissioner.

Apple has announced it is rolling out an in-app reporting feature, which will initially be available by default to children in Australia, with adults able to opt in. The feature is expected to be introduced in other countries later.

Commissioner Julie Inman Grant praised the initiative, noting that eSafety has long advocated for accessible user reporting mechanisms. She pointed out that the first Basic Online Safety Expectations (BOSE) report in 2022 revealed Apple lacked a reporting system.

“It is no coincidence Apple is introducing this feature in Australia before rolling it out worldwide,” she stated.

Inman Grant said that this direct reporting system, alongside Apple’s existing functionality that detects nudity in messages, AirDrop, and FaceTime, represents a significant step towards safeguarding children from unsolicited images on Apple devices.

However, she urged Apple to expand its approach, suggesting additional measures to protect children and users from various online dangers, including terrorist content and technology-facilitated abuse. She also called for other platforms lacking these safety features to follow suit.

On Sept. 6, the Australian Federal Police (AFP) alerted parents and guardians about a troubling trend where young individuals are coerced into producing extreme sexual and violent content online.

The police described sadistic sextortion as a growing online threat, in which groups target children as young as 12 on social media and messaging apps. These groups manipulate minors into creating explicit material as a condition of acceptance into extreme online communities.

Data released by the AFP-led Australian Centre to Counter Child Exploitation (ACCCE) on Sept. 4 revealed that 560 reports of sextortion were recorded in the first half of 2024, averaging 93 reports per month, a decrease from 300 monthly reports in 2023.

Helen Schneider, AFP Commander of Human Exploitation and the ACCCE, stressed the importance of open communication between parents and children.

She advised parents to discuss any concerns regarding their child’s online activity to provide appropriate support.

“Warning signs of harmful online behaviour in children may include increased screen time, isolation from friends and family, or secrecy about online interactions,” she noted.

The AFP recommends several steps to mitigate further harm: cease communication, take screenshots of conversations and profiles, block offending accounts, and report incidents to the platform.

Families should seek support from trusted friends, relatives, or professional services and consider mental health support through resources like Kids Helpline, which offers free, confidential counselling.

In emergencies, individuals are advised to call triple zero (000) or contact local police. Reporting crimes to the ACCCE is also crucial.

Naziya Alvi Rahman is a Canberra-based journalist who covers political issues in Australia. She can be reached at [email protected].