Instead, Apple told WIRED this week, it is focusing its anti-CSAM efforts and investments on its “Communication Safety” features, which the company initially announced in August 2021 and launched last December. Parents and caregivers can opt into the protections through family iCloud accounts. The features work in Siri, Apple’s Spotlight search, and Safari Search to warn if someone is looking at or searching for child sexual abuse materials and to provide resources on the spot to report the content and seek help. The core of the protection, though, is Communication Safety for Messages, which caregivers can set up to provide a warning and resources to children if they receive or attempt to send photos that contain nudity. The goal is to stop child exploitation before it happens or becomes entrenched, and to reduce the creation of new CSAM.

“After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company told WIRED in a statement. “We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

Apple’s CSAM update comes alongside its announcement today that the company is vastly expanding its end-to-end encryption offerings for iCloud, including adding the protection for backups and photos stored on the cloud service. Child safety experts and technologists working to combat CSAM have often opposed broader deployment of end-to-end encryption because it renders user data inaccessible to tech companies, making it more difficult for them to scan and flag CSAM. Law enforcement agencies around the world have similarly cited the dire problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile toward end-to-end encryption in general because it can make some investigations more challenging. Research has consistently shown, though, that end-to-end encryption is a vital safety tool for protecting human rights and that the downsides of its implementation do not outweigh the benefits.

Communication Safety for Messages is opt-in and analyzes image attachments users send and receive on their devices to determine whether a photo contains nudity. The feature is designed so Apple never gets access to the messages, the end-to-end encryption that Messages offers is never broken, and Apple doesn’t even learn that a device has detected nudity.

Apple told WIRED that while it is not ready to announce a specific timeline for expanding its Communication Safety features, the company is working on adding the ability to detect nudity in videos sent through Messages when the protection is enabled. It also plans to expand the offering beyond Messages to its other communication applications. Ultimately, the goal is to make it possible for third-party developers to incorporate the Communication Safety tools into their own applications. The more the features can proliferate, Apple says, the more likely it is that children will get the information and support they need before they are exploited.
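Apple has not published a developer API for this analysis, so the details of its implementation are not public. Conceptually, though, the flow the company describes keeps everything local: a decrypted attachment is handed to an on-device classifier, and the result only changes what the child sees on screen. The sketch below illustrates that idea in Swift; `SensitiveImageClassifier` and `IncomingAttachmentHandler` are hypothetical names used for illustration, not Apple’s actual framework.

```swift
import Foundation
import CoreGraphics

// Hypothetical stand-in for an on-device model; not a real Apple API.
protocol SensitiveImageClassifier {
    // Runs entirely on the device and returns true if the image likely contains nudity.
    func containsNudity(_ image: CGImage) async -> Bool
}

// Sketch of how a messaging client could gate an incoming attachment when the
// opt-in protection is enabled for a child account.
struct IncomingAttachmentHandler {
    let classifier: SensitiveImageClassifier
    let protectionEnabled: Bool   // set by a parent or caregiver via the family account

    enum Presentation {
        case showNormally
        case blurWithWarningAndResources   // child sees a warning plus help resources
    }

    func presentation(for image: CGImage) async -> Presentation {
        // The decrypted image never leaves the device, and no verdict is reported
        // to a server; the check only decides how the attachment is displayed.
        guard protectionEnabled else { return .showNormally }
        return await classifier.containsNudity(image)
            ? .blurWithWarningAndResources
            : .showNormally
    }
}
```

The property the article emphasizes is captured in where the decision is made: because both the classifier and the resulting choice live on the device, end-to-end encryption is preserved and Apple never learns whether nudity was detected.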
Similar to other companies that have grappled publicly with how to address CSAM, including Meta, Apple told WIRED that it also plans to continue working with child safety experts to make it as easy as possible for its users to report exploitative content and situations to advocacy organizations and law enforcement.

“Technology that detects CSAM before it is sent from a child’s device can prevent that child from being a victim of sextortion or other sexual abuse, and can help identify children who are currently being exploited,” says Erin Earp, interim vice president of public policy at the anti-sexual violence organization RAINN. “Additionally, because the minor is typically sending newly or recently created images, it is unlikely that such images would be detected by other technology, such as PhotoDNA. While the vast majority of online CSAM is created by someone in the victim’s circle of trust, which may not be captured by the type of scanning mentioned, combatting the online sexual abuse and exploitation of children requires technology companies to innovate and create new tools. Scanning for CSAM before the material is sent by a child’s device is one of these such tools and can help limit the scope of the problem.”

Countering CSAM is a complicated and nuanced endeavor with extremely high stakes for kids around the world, and it’s still unknown how much traction Apple’s bet on proactive intervention will get. But tech giants are walking a fine line as they work to balance CSAM detection and user privacy.

Updated 5:20pm ET, Wednesday, December 7, 2022 to include commentary from RAINN.