Apple chief executive Tim Cook has said the company will challenge a court order to help FBI investigators build a “master key” to access encrypted data.
The specific phone that the FBI want to access belongs to San Bernardino gunman Syed Rizwan Farook. But in a message to Apple customers, Cook stated that he believes the FBI’s current demands would represent only the beginning of their encroachment on privacy and would set a “dangerous precedent”.
The Apple CEO also criticised the FBI’s unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority, rather than asking for legislative action through Congress.
“The implications of the government’s demands are chilling,” said Cook. “If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data.
“The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”
Gaining access through the backdoor
The FBI has requested Apple build a version of iOS – iPhone’s operating system – that could be installed and used to circumvent current security features.
At present Apple says it has complied with valid subpoenas and search warrants, but has taken exception to what it sees as an “overreach by the US government”.
“The FBI may use different words to describe this tool, but make no mistake: building a version of iOS that bypasses security in this way would undeniably create a backdoor,” says Cook. “While the government may argue that its use would be limited to this case, there is no way to guarantee such control.”
Apple has said that it objects to the FBI’s request “to expose its customers to a greater risk of attack”, and that the court-ordered mandate amounts to asking the engineers who built strong encryption into the iPhone “to weaken those protections and make our users less safe”.
Some commentators have suggested Apple’s motives might not be quite as noble as they seem at first glance. “I’m not in a position to guess whether Apple can break the encryption on its devices – that’s one of those things where you need highly skilled cryptanalysts to bang on them for some years and not find holes,” says Open Rights Group advisory council member, Wendy Grossman.
“What we do know is that Apple promised its customers that it could not access their data. So either it’s infeasible, as they say, or they would be breaking their word to customers. Neither is a desirable state for a public company, so I’m not surprised they’ve gone to court.”
Whatever the tech giant’s rationale for refusing the FBI’s request, Grossman agrees with Apple’s argument that once a backdoor has been established, innocent people’s data will be exposed.
“There are always hard cases with respect to law enforcement’s desire for more information. However, Apple’s decision to provide encryption it can’t crack for its customers is a rational one because opening the gunman’s phone, for example, doesn’t just expose the gunman’s data but also data relating to innocent family members and friends and other contacts,” says Grossman.
Battling on multiple fronts
The FBI’s request to bypass iPhone’s encryption follows proposals made by policymakers in California and New York to ban the sale of encrypted phones. In its message to customers, Apple points out that such a policy would “hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data,” while criminals would still be able to encrypt data using tools already available to them.
“Banning the use of encryption where law enforcement can’t gain access, as has been alluded to by both the US and the UK, is a really bad idea, for several reasons. One, you cannot make a hole that only good guys can use, so a law like that opens all of us up to much worse and more pervasive criminal attack than we’ve seen before,” says Grossman.
“Democratic societies have long imposed limits on what law enforcement can access in an effort to balance the right to privacy of ordinary people and their right to protection from crime. Criminals plan in houses, but we don’t require that every householder deposit a copy of their house key in the local police station – this is a close analogy.”