As the old saying goes: If you aren’t doing anything illegal, then you have nothing to fear from surveillance.
Smartphones already act like tracking devices, broadcasting the whereabouts of their owners. But Apple is about to open the door to far more advanced forms of voluntary, smartphone-based surveillance by launching a new program designed to detect and report iPhone users found to possess child pornography – known in academic-speak by the acronym CSAM, short for Child Sexual Abuse Material. That's according to a handful of academics who were offered a sneak preview of the company’s plans, then promptly spilled the beans on Twitter and in interviews with the press.
The new system, called “neuralMatch”, is expected to be unveiled by Apple later this week and installed on American iPhones via a software update. According to the Financial Times (FT), the automated system can proactively alert a team of human reviewers if it believes CSAM is present on a user’s iPhone. If the reviewers verify the material, law enforcement will be contacted.
This is how “neuralMatch” will work, per the FT:
Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as “hashing”, will be compared with those on a database of known images of child sexual abuse.
[…]
The system has been trained on 200,000 sex abuse images collected by the US non-profit National Center for Missing and Exploited Children.
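For a sense of what this kind of hash-based matching looks like mechanically, here is a minimal, illustrative sketch in Python. It uses a simple 8x8 “average hash” and a Hamming-distance threshold purely as stand-ins: Apple’s actual NeuralHash algorithm is proprietary, and every name and number below is an assumption for illustration, not Apple’s implementation.

```python
# Illustrative sketch of perceptual-hash matching against a database of
# known hashes. This is NOT Apple's NeuralHash (which is proprietary);
# it uses a simple 8x8 "average hash" purely to show the general idea.
from PIL import Image  # pip install Pillow


def average_hash(path: str) -> int:
    """Reduce an image to a 64-bit fingerprint that survives resizing and
    small edits: shrink to 8x8 grayscale, then set one bit per pixel
    depending on whether it is brighter than the image's mean brightness."""
    img = Image.open(path).convert("L").resize((8, 8), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bits in which two 64-bit fingerprints differ."""
    return bin(a ^ b).count("1")


# Hypothetical database of fingerprints of known abuse imagery. In the real
# system the hashes would come from NCMEC and use Apple's own algorithm.
KNOWN_HASHES: set[int] = set()


def matches_known_material(path: str, max_distance: int = 5) -> bool:
    """Flag a photo if its fingerprint is 'close enough' to any known hash.
    The threshold is an assumption: a looser threshold catches more edited
    copies of a known image, but also raises the risk of false positives."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= max_distance for known in KNOWN_HASHES)
```

Unlike a cryptographic hash, a perceptual hash is designed so that visually similar images produce similar fingerprints. That is what lets such a system catch re-compressed or lightly edited copies of known images – and also what makes collisions between unrelated images a live concern, as the researchers quoted below point out.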
One academic who was offered a preview of the software explained why this could create serious privacy risks. Apple has gotten a lot of positive press for its commitment to user privacy – remember when it refused to crack an iPhone belonging to one of the San Bernardino shooters? But that same encryption has become a perennial headache for law enforcement: last January, Apple quietly abandoned plans to let users fully encrypt their iCloud backups after complaints from law enforcement agencies.
Now, Apple has found a middle ground: it will assume responsibility for policing iPhones – well, at least to a degree. To accomplish this, the company is rolling out a new machine-learning tool that will scan iPhones for images matching certain “perceptual hashes” known to represent child pornography. But, as academics have complained, that matching could potentially be misled.
What’s more, the tool that’s today being used to unearth child pornography could one day be abused by authoritarian governments (like the CCP). And once Apple has committed to using this type of surveillance, governments will demand it from everyone.
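To make the mechanics concrete, here is a minimal, hedged sketch of what such a client-side scanning and reporting loop could look like, based on the description in the tweets below: photos queued for iCloud upload are matched on-device, and a report is only triggered once “too many” matches accumulate. The function names, the threshold value, and the reporting step are all assumptions for illustration, not Apple’s actual implementation.

```python
# Hedged sketch of a client-side scanning/reporting loop of the kind the
# tweets below describe: scan on-device photos, count matches against a
# hash database, and only escalate once "too many" matches appear.
# Function names, the threshold, and the reporting step are hypothetical.
from dataclasses import dataclass

MATCH_THRESHOLD = 30  # assumption: number of matches before escalation


@dataclass
class ScanResult:
    matched_paths: list


def scan_photo_library(photo_paths, matcher) -> ScanResult:
    """Run a perceptual-hash matcher (see the earlier sketch) over every
    photo queued for cloud upload and collect the paths that match."""
    matched = [p for p in photo_paths if matcher(p)]
    return ScanResult(matched_paths=matched)


def maybe_report(result: ScanResult) -> bool:
    """Escalate to human review only if the match count crosses the
    threshold. A collision-prone hash or a poisoned hash database would
    feed false positives into exactly this step."""
    if len(result.matched_paths) >= MATCH_THRESHOLD:
        # In a real deployment this would send match metadata to a server
        # for human review; here we only signal that it would happen.
        return True
    return False
```

The design point that matters here is that everything up to the report happens on the user’s own device, which is exactly why researchers describe the scheme as client-side scanning.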
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
— Matthew Green (@matthew_d_green) August 4, 2021
These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.
— Matthew Green (@matthew_d_green) August 5, 2021
Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.
— Matthew Green (@matthew_d_green) August 5, 2021
The ability to add scanning systems like this to E2E messaging systems has been a major “ask” by law enforcement the world over. Here’s an open letter signed by former AG William Barr and other western governments. https://t.co/mKdAlaDSts
— Matthew Green (@matthew_d_green) August 5, 2021
This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government? https://t.co/nB8S6hmLE3
— Matthew Green (@matthew_d_green) August 5, 2021
The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t “hurt” anyone’s privacy.
But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal.
— Matthew Green (@matthew_d_green) August 5, 2021
But even if you believe Apple won’t allow these tools to be misused 🤞there’s still a lot to be concerned about. These systems rely on a database of “problematic media hashes” that you, as a consumer, can’t review.
— Matthew Green (@matthew_d_green) August 5, 2021
Hashes using a new and proprietary neural hashing algorithm Apple has developed, and gotten NCMEC to agree to use.
We don’t know much about this algorithm. What if someone can make collisions?
— Matthew Green (@matthew_d_green) August 5, 2021
Imagine someone sends you a perfectly harmless political media file that you share with a friend. But that file shares a hash with some known child porn file? pic.twitter.com/YdNVB0xfCA
— Matthew Green (@matthew_d_green) August 5, 2021
The idea that Apple is a “privacy” company has bought them a lot of good press. But it’s important to remember that this is the same company that won’t encrypt your iCloud backups because the FBI put pressure on them. https://t.co/tylofPfV13
— Matthew Green (@matthew_d_green) August 5, 2021
Green isn’t the only ‘expert’ who objects to the idea. “It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of…our phones and laptops,” said Ross Anderson, professor of security engineering at the University of Cambridge.
The FT did, however, manage to find at least one academic willing to defend Apple’s approach.
Apple’s system is less invasive in that the screening is done on the phone, and “only if there is a match is notification sent back to those searching,” said Alan Woodward, a computer security professor at the University of Surrey. “This decentralised approach is about the best approach you could adopt if you do go down this route.”
Still, others warned that the system is only a few steps removed from ‘1984’-style surveillance. Alec Muffett, a security researcher and privacy campaigner who formerly worked at Facebook and Deliveroo, said Apple’s move was “tectonic” and a “huge and regressive step for individual privacy”, adding: “Apple are walking back privacy to enable 1984.”
Via Zero Hedge