Tech experts are continuing to sound the alarm about Apple’s controversial new iPhone-scanning feature.
The tool is intended to detect child sexual abuse material by scanning users’ messages and photos. If it finds such images, it will alert authorities.
Apple has said that all of this is done in a way that protects privacy, by performing the analysis on a user’s phone and not allowing Apple to see photos unless the iOS software determines they are sufficiently similar to an image in a database of known child abuse imagery.
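The matching step described above — comparing a hash of each photo on the device against a database of hashes of known images, and flagging only "sufficiently similar" matches — can be sketched in miniature. The snippet below is a toy illustration only: the 16-bit hashes, the Hamming-distance comparison, and the threshold are all invented for the sketch, and are not Apple's actual NeuralHash system.

```python
# Toy sketch of threshold-based hash matching (NOT Apple's NeuralHash;
# hash values, bit length and threshold are invented for illustration).

def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two integer hashes."""
    return bin(a ^ b).count("1")

def matches_database(photo_hash: int, known_hashes: set[int],
                     max_distance: int = 4) -> bool:
    """Flag a photo only if its hash is 'sufficiently similar'
    (within max_distance bits) to any hash in the known database."""
    return any(hamming_distance(photo_hash, h) <= max_distance
               for h in known_hashes)

known = {0b1011_0110_1100_0011, 0b0001_1111_0000_1010}

# A near-duplicate (one bit different from a known hash) is flagged:
print(matches_database(0b1011_0110_1100_0001, known))  # True
# An unrelated image is not:
print(matches_database(0b0100_1001_0011_0100, known))  # False
```

The similarity threshold is the crux of critics' concerns: the same mechanism works identically whatever hashes are loaded into the database, which is why the EFF and others argue the system's scope is limited only by policy, not by its design.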
But critics say the feature undermines the security of those photos and contradicts Apple’s longstanding commitment to privacy.
Many of those critics argue that while the aim of finding and preventing child sexual abuse material is noble, the system could weaken security more broadly.
The Electronic Frontier Foundation (EFF), for instance, said that the system could be used for “broader abuses”.
“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” the organisation said in a statement. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.
“That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”
The EFF was one of a number of organisations highlighted in an open letter, published online, which at the time of going to press had been signed by 6,000 people.
That letter calls for Apple to halt the deployment of the technology immediately, and also requests that Apple issue “a statement reaffirming their commitment to end-to-end encryption and to user privacy”.
“Apple’s current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases,” the letter reads.
“We ask that Apple reconsider its technology rollout, lest it undo that important work.”
The same sentiment was voiced by Will Cathcart, the head of WhatsApp, who also suggested the system could be abused to scan for other kinds of content.
“I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world,” Mr Cathcart wrote in a series of tweets.
“People have asked if we’ll adopt this system for WhatsApp. The answer is no.”
He went on to detail the work WhatsApp had done to try to find instances of child abuse, saying it had reported 400,000 cases last year without breaking its encryption. But he said that Apple was taking a different approach, and one that did in fact put privacy and security in peril.
“Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world,” he wrote.
“Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy.
“We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It’s not how technology built in free countries works.
“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable.
“Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?
“Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out. Why not? How will we know how often mistakes are violating people’s privacy?
“What will happen when spyware companies find a way to exploit this software? Recent reporting showed the cost of vulnerabilities in iOS software as is. What happens if someone figures out how to exploit this new system?
“There are so many problems with this approach, and it’s troubling to see them act without engaging experts that have long documented their technical and broader concerns with this.”
He concluded by linking to an open letter published by Apple in 2016, when it was engaged in a fight with the US government over whether it should weaken security so that law enforcement could see the contents of phones. Mr Cathcart noted that Apple had said then that such an approach would “undermine the very freedoms and liberty our government is meant to protect”, and urged it to consider that statement in light of the new features.