Apple accused of underreporting suspected CSAM cases on its platforms

Apple has been accused of underreporting the prevalence of child sexual abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity in the UK, says Apple reported only 267 cases of suspected CSAM worldwide to the National Center for Missing & Exploited Children (NCMEC) last year.

That number pales in comparison to the 1.47 million potential cases reported by Google and the 30.6 million reported by Meta. Other platforms that reported more potential CSAM cases than Apple in 2023 include TikTok (590,376), X (597,087), Snapchat (713,055), Xbox (1,537), and PlayStation/Sony Interactive Entertainment (3,974). Every U.S.-based tech company is required to report any potential CSAM cases detected on its platforms to NCMEC, which forwards the cases to relevant law enforcement agencies around the world.

The NSPCC also said Apple was involved in more CSAM cases (337) in England and Wales between April 2022 and March 2023 than it reported worldwide in a year. The charity used Freedom of Information requests to collect this data from police forces.

The Guardian, which first reported the NSPCC’s complaint, points out that Apple services such as iMessage, FaceTime and iCloud all use end-to-end encryption, which prevents the company from viewing the content users share on them. However, WhatsApp also uses end-to-end encryption, and that service reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.

“There is a worrying gap between the number of child sexual abuse offences committed on Apple’s services in the UK and the almost negligible number of reports of abusive content to authorities globally,” said Richard Collard, NSPCC’s head of child online safety policy. “Apple is clearly lagging behind many of its peers in tackling child sexual abuse, when all tech companies should be investing in security and preparing for the rollout of the UK’s Online Safety Act.”

In 2021, Apple announced a system that would scan images before they were uploaded to iCloud and compare them against a database of known CSAM images from NCMEC and other organizations. But following pushback from privacy and digital rights advocates, Apple eventually abandoned its CSAM detection plans.

Apple declined to comment on the NSPCC’s accusation, instead pointing The Guardian to the statement it made when it abandoned the CSAM scanning plan. At the time, Apple said it had opted for a different strategy that “prioritizes the security and privacy of [its] users.” The company said in August 2022 that “children can be protected without companies combing through personal data.”