Apple Backs Down on Its Controversial Photo-Scanning Plans

In August, Apple detailed several new features intended to stop the dissemination of child sexual abuse materials. The backlash from cryptographers to privacy advocates to Edward Snowden himself was near-instantaneous, largely tied to Apple’s decision not only to scan iCloud photos for CSAM, but to also check for matches on your iPhone or iPad. After weeks of sustained outcry, Apple is standing down. At least for now.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said in a statement Friday. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple didn’t give any more guidance on what form those improvements might take, or how that input process might work. But privacy advocates and security researchers are cautiously optimistic about the pause.

“I think this is a smart move by Apple,” says Alex Stamos, former chief security officer at Facebook and cofounder of cybersecurity consulting firm Krebs Stamos Group. “There is an incredibly complicated set of trade-offs involved in this problem and it was highly unlikely that Apple was going to figure out an optimal solution without listening to a wide variety of equities.”

CSAM scanners work by generating cryptographic “hashes” of known abusive images—a sort of digital signature—and then combing through huge quantities of data for matches. Lots of companies already do some form of this, including Apple for iCloud Mail. But in its plans to extend that scanning to iCloud photos, the company proposed taking the additional step of checking those hashes on your device, as well, if you have an iCloud account.
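The matching step can be sketched in a few lines of Python. This is an illustration only: it uses an ordinary cryptographic hash as a stand-in, whereas deployed scanners use perceptual hashes (like Apple's NeuralHash) that also match resized or re-encoded copies, and the byte strings and database here are invented placeholders.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in: an exact cryptographic hash of the raw bytes. Real
    # scanners use perceptual hashes, which tolerate re-encoding.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known abusive images
# (placeholder bytes, purely for illustration)
known_hashes = {image_hash(b"known-image-a"), image_hash(b"known-image-b")}

def scan(library: list[bytes]) -> list[int]:
    # Comb through a photo library and return the indices of any
    # photos whose hashes appear in the known set.
    return [i for i, photo in enumerate(library)
            if image_hash(photo) in known_hashes]

print(scan([b"beach.jpg", b"known-image-b", b"cat.jpg"]))  # [1]
```

The point of contention was never this lookup itself, which is routine server-side, but where it runs: Apple proposed performing it on the customer's own device.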

The introduction of that ability to compare images on your phone against a set of known CSAM hashes—provided by the National Center for Missing and Exploited Children—immediately raised concerns that the tool could someday be put to other use. “Apple would have deployed to everyone’s phone a CSAM-scanning feature that governments could, and would, subvert into a surveillance tool to make Apple search people’s phones for other material as well,” says Riana Pfefferkorn, research scholar at the Stanford Internet Observatory.

Apple has resisted multiple United States government requests to build a tool that would allow law enforcement to unlock and decrypt iOS devices in the past. But the company has also made concessions to countries like China, where customer data lives on state-owned servers. At a time when legislators around the world have ramped up efforts to undermine encryption more broadly, the introduction of the CSAM tool felt especially fraught.

“They clearly feel this is politically challenging, which I think shows how untenable their ‘Apple will always refuse government pressure’ position is,” says Johns Hopkins University cryptographer Matthew Green. “If they feel they must scan, they should scan unencrypted files on their servers,” which is standard practice at companies like Facebook, which regularly scans not only for CSAM but also for terrorist and other disallowed content. Green also suggests that Apple should make iCloud storage end-to-end encrypted, so that it can’t view those images even if it wanted to.

The controversy around Apple’s plans was technical, as well. Hashing algorithms can generate false positives, mistakenly identifying two images as matches even when they’re not. Called “collisions,” those errors are especially concerning in the context of CSAM. Not long after Apple’s announcement, researchers began finding collisions in the iOS “NeuralHash” algorithm Apple intended to use. Apple said at the time that the version of NeuralHash that was available to study was not exactly the same as the one that would be used in the scheme, and that the system was accurate. Collisions may also not have a material impact in practice, says Paul Walsh, founder and CEO of the security firm MetaCert, given that Apple’s system requires 30 matching hashes before sounding any alarms, after which human reviewers would be able to tell what’s CSAM and what’s a false positive.
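Walsh's point about the threshold can be made concrete: if each photo has some small independent chance of a spurious collision, the odds of one account crossing a 30-match bar by accident fall off binomially. A sketch, using an invented and deliberately high per-image collision rate (Apple has not published its real figure):

```python
from math import comb

def prob_at_least_k(n: int, p: float, k: int) -> float:
    # P(X >= k) for X ~ Binomial(n, p): the chance that a library of
    # n photos produces at least k spurious matches, assuming each
    # photo collides independently with probability p.
    return 1.0 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

# 10,000 photos, an unrealistically high 0.1% per-image collision
# rate, and a 30-match threshold: an accidental flag is still rare
# (on the order of one account in a few million).
print(prob_at_least_k(10_000, 0.001, 30))
```

The independence assumption is a simplification, and adversarially crafted collisions (as the NeuralHash researchers demonstrated) don't behave like random ones, which is why the threshold alone didn't settle the debate.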

It’s unclear at this point what specific changes Apple could make to satisfy its critics. Green and Pfefferkorn both suggest that the company could limit its scanning to shared iCloud albums rather than involving its customers’ devices. And Stamos says the NeuralHash issues reinforce the importance of incorporating the research community more fully from the start, especially for an untested technology.

Others remain steadfast that the company should make its pause permanent. “Apple’s plan to conduct on-device scanning of photos and messages is the most dangerous proposal from any tech company in modern history,” says Evan Greer, deputy director of digital rights nonprofit Fight for the Future. “It’s encouraging that the backlash has forced Apple to delay this reckless and dangerous surveillance plan, but the reality is that there is no safe way to do what they are proposing. They need to abandon this plan entirely.”

That Apple is holding off on its plans at all, though, is a major concession from a company not typically inclined to give them. “I’m stunned, frankly,” says Pfefferkorn. “It would be hard for them to announce they’re dropping these plans altogether, but ‘hitting pause’ is still a huge deal.”

Additional reporting by Andy Greenberg.

