February 2, 2023

Apple is ending its plan to scan your photos for CSAM. Here’s what’s next

In August 2021, Apple announced a plan to scan photos that users store in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving, allowing the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups, who worried that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. In early September 2021, Apple said it would pause the feature’s rollout to “gather input and make improvements before releasing these critically important child safety features.” In other words, a launch was still coming. Now the company says that, in response to the feedback and guidance it received, the CSAM detection tool for iCloud Photos is dead.

Instead, Apple told WIRED this week that it is focusing its anti-CSAM efforts and investments on its “Communication Safety” features, which the company originally announced in August 2021 and launched last December. Parents and caregivers can opt into the protections through family iCloud accounts. The features work in Siri, Apple’s Spotlight search, and Safari search to warn anyone viewing or searching for child sexual abuse material and to provide resources on the spot to report the content and seek help. At the heart of the protection is Communication Safety for Messages, which caregivers can enable to show children a warning and resources when they receive or attempt to send photos that contain nudity. The goal is to stop child exploitation before it happens or becomes entrenched, and to reduce the creation of new CSAM.

“Following extensive consultations with experts to gather feedback on the child safety initiatives we proposed over the past year, we are deepening our investment in the Communication Safety feature, which we first made available in December 2021,” the company said in a statement to WIRED. “We have further decided not to proceed with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies sifting through personal data, and we will continue to work with governments, child advocates, and other companies to protect young people, uphold their right to privacy, and make the internet a safer place for children and for all of us.”

Apple’s CSAM update comes alongside its announcement today that the company is significantly expanding its end-to-end encryption offerings for iCloud, including adding protection for backups and photos stored on the cloud service. Child safety experts and technologists working to combat CSAM have often opposed wider deployment of end-to-end encryption because it makes user data inaccessible to technology companies, which makes it more difficult for them to scan for and flag CSAM. Law enforcement agencies around the world have similarly cited the dire problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile to end-to-end encryption in general because it can make some investigations more difficult. Research has consistently shown, however, that end-to-end encryption is a vital safety tool for protecting human rights, and that the drawbacks of its implementation do not outweigh the benefits.

Communication Safety for Messages is opt-in and analyzes image attachments that users send and receive on their devices to determine if a photo contains nudity. The feature is designed so that Apple never gets access to the messages, the end-to-end encryption that Messages offers is never broken, and Apple never even knows that a device has detected nudity.
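To make that on-device design concrete, here is a minimal, hypothetical sketch of how such a check could be structured on Apple platforms using the Vision and Core ML frameworks. Apple has not published the actual implementation of Communication Safety, and the “NudityClassifier” model named below is an assumption for illustration only. The point the sketch shows is the architectural one from the paragraph above: the classification runs entirely on the device, so the message content itself never has to leave end-to-end encryption.

import Vision
import CoreML

// Illustrative sketch only: "NudityClassifier" is a hypothetical Core ML model
// bundled with the app, not Apple's actual Communication Safety model.
// The key property demonstrated here is that the image is analyzed locally,
// with no network calls, so only the sending or receiving device ever sees
// the result and the end-to-end encryption of the message is never broken.
func imageLikelyContainsNudity(_ image: CGImage) throws -> Bool {
    // Load the hypothetical on-device classifier and wrap it for Vision.
    let mlModel = try NudityClassifier(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: mlModel)
    let request = VNCoreMLRequest(model: visionModel)

    // Run the classification entirely on the device.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    guard let results = request.results as? [VNClassificationObservation],
          let top = results.first else {
        return false
    }

    // Threshold is arbitrary here; a real system would tune this carefully.
    return top.identifier == "nudity" && top.confidence > 0.9
}

In a design like this, the app would call the check at the moment an attachment is about to be displayed or sent and, if it returns true, blur the image and show the child a warning and resources, which matches the behavior Apple describes without requiring any server to see the photo.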

The company told WIRED that while it is not ready to announce a specific timeline for expanding its Communication Safety features, it is working on adding the ability to detect nudity in videos sent through Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication applications. Ultimately, the goal is to make it possible for third-party developers to incorporate the Communication Safety tools into their own applications. The more the features can proliferate, Apple says, the more likely it is that children will get the information and support they need before they are exploited.