Apple accused of underreporting suspected CSAM on its platforms


Apple has been accused of underreporting the prevalence of child sexual abuse material (CSAM) on its platforms. The National Society for the Prevention of Cruelty to Children (NSPCC), a UK child protection charity, says Apple reported only 267 suspected CSAM cases worldwide to the National Center for Missing and Exploited Children (NCMEC) last year.

That pales in comparison to the 1.47 million potential cases reported by Google and the 30.6 million reports from Meta. Other platforms that reported more potential CSAM incidents than Apple in 2023 include TikTok (590,376), X (597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony Interactive Entertainment (3,974). Every US-based tech company is required to pass along any possible CSAM cases detected on its platforms to NCMEC, which then refers cases to relevant law enforcement agencies around the world.

The NSPCC also said Apple was implicated in more CSAM cases (337) in England and Wales between April 2022 and March 2023 than it reported worldwide in an entire year. The charity used freedom of information requests to gather that data from police forces.

The Guardian, which first reported on the NSPCC's claim, notes that Apple services such as iMessage, FaceTime and iCloud all have end-to-end encryption (E2EE), which stops the company from viewing the content users share on them. However, WhatsApp has E2EE as well, and that service reported nearly 1.4 million suspected CSAM cases to NCMEC in 2023.

Richard Collard, the NSPCC's head of child safety online policy, said: "There is a concerning discrepancy between the number of child abuse offences taking place on Apple's services in the UK and the almost negligible number of global reports of abusive content the company makes to authorities. Apple is clearly behind many of its peers in tackling child sexual abuse, at a time when all tech firms should be investing in safety and preparing for the rollout of the UK's Online Safety Act."

In 2021, Apple announced plans to deploy a system that would scan images before they were uploaded to iCloud and compare them against databases of known CSAM images from NCMEC and other organizations. But following backlash from privacy and digital rights advocates, Apple delayed and ultimately abandoned its CSAM detection tools.

Apple declined to comment on the NSPCC's accusation, instead pointing The Guardian to the statement it made when it scrapped the CSAM scanning plan. Apple said it had chosen a different strategy that "prioritises the security and privacy of [its] users." The company said in August 2022 that "children can be protected without companies combing through personal data."
