A senior Apple executive has admitted that the announcement of the company's new anti-child-abuse features was "jumbled pretty badly".
Last week, the company announced two major new tools that would be added to the iPhone. One would scan photos sent or received by children in Messages, looking for signs of sexual imagery, while another would check iPhone owners' photo libraries for known child sexual abuse material.
The tools were immediately met with a loud outcry from security experts and privacy activists, who argued that Apple was dangerously weakening its commitment to ensuring that everything on an iPhone stays private.
While Apple stressed that it had developed the features with privacy in mind, and that no humans would see any photos unless a specific, very high threshold had been reached, many argued that the new tools amounted to a significant weakening of protections that could be expanded further in the future.
In a new interview, Apple’s software chief Craig Federighi has accepted that the way the features were announced was a “recipe for this kind of confusion” and that “a lot of messages got jumbled pretty badly in terms of how things were understood”.
Mr Federighi said that Apple continues to “feel very positive and strongly about what we’re doing” even in the face of protests from both outside and inside the company. But he told the Wall Street Journal that it had not done enough to explain the features.
He suggested that the issue had been largely driven by the fact that the company had not done enough to distinguish between the two features, leaving people with the impression that Apple was scanning through all photos sent through messages, or with other beliefs that were technically incorrect.
Instead, he stressed that the most controversial feature – which checks the photo library on a phone against an existing database of known child abuse imagery, and alerts Apple only if a library reaches a high enough threshold of matches – runs on the phone itself, so that for the most part Apple will not see any images.
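The mechanism Mr Federighi describes – matching images against a known database on the device, and only surfacing anything for review once a match count crosses a threshold – can be sketched roughly as follows. This is an illustrative stand-in only: Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic techniques so that even Apple cannot read below-threshold matches, whereas this sketch uses plain hash strings and an in-memory set, and the threshold value is hypothetical.

```python
# Illustrative sketch of threshold-based matching against a known database.
# NOT Apple's implementation: real perceptual hashing and the cryptographic
# "threshold secret sharing" layer are replaced here with plain set lookups.

MATCH_THRESHOLD = 30  # hypothetical value for illustration

def count_matches(library_hashes, known_database):
    """Count how many of the device's image hashes appear in the
    known-imagery database (the on-device part of the scheme)."""
    return sum(1 for h in library_hashes if h in known_database)

def should_flag(library_hashes, known_database, threshold=MATCH_THRESHOLD):
    """Only once the match count reaches the threshold is an account
    surfaced for human review; below it, nothing is reported."""
    return count_matches(library_hashes, known_database) >= threshold
```

The point of the threshold, on this account, is that isolated or accidental matches never trigger review; only a library with many matches against the database does.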
Mr Federighi also said that Apple would allow an independent auditor to check the database of images, which is provided by a third-party organisation.
He also said that Apple would introduce it to the phone in such a way that security researchers can “inspect what’s happening” in the software. “So if any changes were made that were to expand the scope of this in some way – in a way that we had committed to not doing – there’s verifiability, they can spot that that’s happening,” he said.
That comes in response to warnings from some critics, who have said that Apple could, for example, be forced to add other kinds of content to its database, such as political imagery.