Apple explains in more detail why it gave up on its plan to find CSAM in iCloud Photos.


On Thursday, Apple gave its most thorough justification to date for abandoning its contentious plan, dropped last year, to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos.

Apple issued the statement in response to the child safety group Heat Initiative, which had demanded that the company "detect, report, and remove" CSAM from iCloud and give consumers more ways to report such content. Apple shared its response with Wired, and it is reprinted below.

"Child sexual abuse material is abhorrent, and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," said Erik Neuenschwander, Apple's director of user privacy and child safety, in the company's response to Heat Initiative. However, he added, after consulting a range of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with developing a CSAM-scanning mechanism, even one built specifically to protect privacy.

"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."


In August 2021, Apple announced plans for three new child safety features: a method to detect known CSAM images stored in iCloud Photos, a Communication Safety feature that blurs sexually explicit images in the Messages app, and child exploitation resources for Siri. CSAM detection was never implemented, but Communication Safety launched in the United States with iOS 15.2 in December 2021 and has since expanded to the United Kingdom, Canada, Australia, and New Zealand. The Siri resources are also available.

Apple had initially said that CSAM detection would arrive in an update to iOS 15 and iPadOS 15 by the end of 2021, but the feature was delayed based on "feedback from customers, advocacy groups, researchers, and others." The plans drew opposition from a wide range of individuals and organizations, including lawmakers, policy groups, security researchers, the Electronic Frontier Foundation (EFF), university researchers, and even some Apple employees.

Apple's latest response comes as the UK government revives the encryption debate: it is planning amendments to surveillance law that would force tech companies to disable security features such as end-to-end encryption without notifying the public.

Apple has said that if the bill is passed in its current form, it will pull services such as FaceTime and iMessage from the United Kingdom.
