Truth can be stranger than fiction…or at least more surprising. Apple is the current champion of privacy against U.S. government attempts to expand its spying on us. The company, a frequent NSA and FBI collaborator in the past, finds itself in the strange position of confronting a federal court order to disable its iPhone security protections, an action Apple insists will cripple encryption as a privacy-protection measure.
Last week, federal judge Sheri Pym ordered Apple to help the government de-crypt data on the iPhone of Syed Farook. Farook and his wife, Tashfeen Malik, shot up a government employees’ Christmas party last year in San Bernardino, California, killing 14 people and injuring many more. The couple are suspected Islamist terrorists and the government wants to know about Farook’s calls and text messages.
Apple’s CEO Tim Cook says that complying with the order would deeply damage all encryption on the iPhone, and Apple has appealed it. The government counters that it’s doing no such thing: the order, it insists, covers one phone, and Apple can easily help de-crypt it without affecting everyone else’s.
The case has drawn enormous attention. Apple is…well…Apple and what it does is big news even before it does it. Encryption has become a major issue after whistle-blower revelations about how the government, particularly the National Security Agency, vacuums up massive amounts of information from people in this and other countries every second of the day. There has been a predictable flurry of speculation about how the government might get that information without Apple’s help, and analysis of the potential impact has been voluminous, ranging from accepting Apple’s argument to calling it absurd.
In the roar, however, some truth is being drowned out. On the one hand, with ever increasing consciousness about privacy and security, Apple may be acting not out of commitment to privacy but because taking the stand may be good for business. It’s not clear how long it will resist. On the other hand, the government’s “only one case” argument is an outrageous and cynical lie. If Apple complies with the court order, the use of encryption on cell-phones will suffer profound and lasting damage.
That, many believe, is the real reason why the government wants Apple’s cooperation. “The government’s goal in this case has little to do with unlocking a single iPhone,” wrote surveillance expert Joshua Kopstein, “and everything to do with establishing a legal precedent that guarantees them the ability to achieve this access on any device.”
To put it mildly, there is a lot at stake.
There are legal and philosophical arguments in the case, of course. The primary one is over privacy: our right to keep the government out of our personal interactions, meetings, conversations, consultations and whatever else we do. Privacy is guaranteed in our constitution without reservation because a government that can collect any information it wants on its citizens is a repressive government in the making…or in fact.
Technology activists are becoming increasingly concerned about that issue because the government’s arguments in the case seem disingenuous. The FBI openly admits it has no idea what was on the phone. Its investigation is what used to be called, in the days when subpoenas were the norm, “a fishing expedition”. In fact, of the several phones Farook had, this one (a work phone from the San Bernardino health department) was the only one he didn’t destroy. It was also active and turned on when investigators found it; they let its battery drain. Finally, the phone was found with the “Find My iPhone” feature activated, practically assisting investigators in finding it. Doesn’t sound like a terrorist’s primary communications device.
Doubts about its intentions deepened when the government based its arguments on the All Writs Act, a 227-year-old law (enacted in 1789) that lets a federal court compel people and companies to do anything that aids the execution of a court order, as long as it doesn’t impose an unreasonable burden. In using an outdated and seldom-cited law for a case about cell-phones and encryption, the government has raised questions about whether it believes in its own case.
The court arguments, however, haven’t dwelt on the constitutional or factual issues. Instead the legal fight has centered on the technology of encryption, and this has served to cloud the public debate since, as popular as encryption is, most people in this country have no idea how it works or what it really does. To most people the case must seem like a screaming match in a language used on another planet.
When the facts are examined, however, there is really no debate at all.
Simply put, an encryption program replaces readable text with incomprehensible strings of letters, numbers and symbols. You sign your email with your name, and an encryption program turns it into an indecipherable jumble of characters. It does the same to your email content or anything else you encrypt on your computer (or cell-phone).
The only way you can make sense of it is with a key: a passcode that, when entered, triggers the “de-cryption” of your gobbledygook and renders intelligible the information you wrote in the first place.
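That round trip can be sketched with a toy cipher. This is purely illustrative: real phone encryption uses serious ciphers like AES with hardware-bound keys, not the simple XOR below, and the message and key here are made up.

```python
def xor_crypt(data: bytes, key: bytes) -> bytes:
    """XOR every byte of data with the key, repeating the key as needed.
    Applying the same operation twice restores the original bytes."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"Meet me at noon"    # readable plaintext (made up)
key = b"passcode"               # the key (made up)

ciphertext = xor_crypt(message, key)     # scrambled, unreadable bytes
restored = xor_crypt(ciphertext, key)    # the same key reverses it

print(restored)                          # b'Meet me at noon'
```

Without the key, the ciphertext is just noise; with the key, the original text comes straight back. That, scaled up to industrial-strength mathematics, is all encryption is.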
Sitting in storage on Farook’s iPhone may be text messages that are unintelligible without this key. That’s what the government says it needs to explore and read.
In her order, Judge Pym didn’t order Apple to actually de-crypt the data on Farook’s phone. Apple doesn’t keep people’s keys and so can’t give Farook’s to the government. But the government doesn’t need that from Apple. Its technologists can break the code and acquire that magic “key” through “brute force programs” that generate and submit millions of possible passcodes every minute. After a while, such a program will guess the correct passcode and everything is de-crypted. Since most people use four or five characters in their phone passcodes, any good brute force program will crack the code in a few minutes.
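A brute force attack of the kind just described can be sketched in a few lines. The stored hash and the SHA-256 scheme below are assumptions for illustration only; a real iPhone derives its keys in a far more elaborate (and deliberately slower) way.

```python
import hashlib

def crack_pin(target_hash):
    """Try every 4-digit passcode until one hashes to the stored value."""
    for guess in range(10_000):
        pin = f"{guess:04d}"                 # "0000" through "9999"
        if hashlib.sha256(pin.encode()).hexdigest() == target_hash:
            return pin
    return None

# Hypothetical stored verifier for the (made-up) passcode "4951"
stored = hashlib.sha256(b"4951").hexdigest()
print(crack_pin(stored))                     # prints 4951
```

With only 10,000 possibilities, an unthrottled attack like this finishes in well under a second on an ordinary laptop.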
What Judge Pym wants from Apple is to make it possible for the government to apply a brute force program because, right now, that’s almost impossible. When Apple built encryption into its phones in 2014, it took steps to foil brute force de-cryption. iPhones can be set to erase all their data after ten incorrect passcode entries; a brute force attack would wipe the phone. Newer phones (as of 2015) make de-cryption even more difficult. Each time you enter a passcode incorrectly, the phone extends the time it takes to process the next attempt. When you consider that brute force programs can make millions of attempts in a minute, a successful de-cryption could take years. The government doesn’t have that kind of time.
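Back-of-envelope arithmetic shows why those delays matter. The guess rate and the one-hour average delay below are assumed round numbers for illustration, not Apple’s actual figures.

```python
ATTEMPTS = 10_000            # every possible 4-digit passcode

# Unthrottled: a brute force program making 1,000,000 guesses a minute
# exhausts the whole space in a fraction of a minute.
unthrottled_minutes = ATTEMPTS / 1_000_000

# Throttled: suppose escalating lockouts average out to one hour per
# guess. The same 10,000 guesses now take over a year.
throttled_years = ATTEMPTS * 1 / (24 * 365)   # hours -> years

print(f"Unthrottled: {unthrottled_minutes * 60:.1f} seconds")  # 0.6 seconds
print(f"Throttled:   {throttled_years:.1f} years")             # 1.1 years
```

The security doesn’t come from making each guess hard; it comes from making each guess slow.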
Finally, there is an even more vexing problem. The iPhone’s passcode has to be keyed in by hand. You can’t just hook up another computer and funnel the passwords in as is done with the average brute force adventure. Nobody on earth can type millions of passwords in a minute.
Judge Pym ordered Apple to lift those obstacles, essentially telling it to rewrite its Operating System, the program that actually runs the phone’s computer. That’s what Apple is refusing to do.
“Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation,” Apple’s Cook wrote in a letter to customers. “In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession. The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor.”
A “backdoor”, computer code that lets someone sneak into a system or program by circumventing its security measures, is almost an obscenity in techie language. Most hacks, vulnerabilities and pieces of destructive code are introduced through a backdoor that hackers manage to install. Getting “back-doored” is, to computer administrators, a very special kind of hell.
Apple argues that, once developed, this backdoor code could fall into the wrong hands: criminals, other terrorists, disruptive hackers. But, good citizen that the corporate giant is, it leaves out the main threat: the U.S. government and its spy agencies. This vulnerable operating system doesn’t have to “fall into” anyone’s hands; it’s already in the most dangerous hands on earth.
Recent history is cluttered with revelations about how the NSA substitutes vulnerable operating systems for secure ones in computers all over the world, either with physical tampering (as its Tailored Access Operations unit reportedly does by intercepting hardware shipments) or remotely through the Internet itself. The NSA’s technology collects data from computers that are on-line, stores it in huge data-banks and then, based on lists of code-words, analyzes it for use in investigations, inquiries or reports. They spy on us every second of the day. If they get this weakened Operating System, they could spy on our iPhones.
That’s more than a threat to your personal privacy. The iPhone is ubiquitous, a mini-computer that is now used extensively by activists and movement organizers world-wide. It doesn’t take much speculation to see the threat.
Nor does it take much to understand why the government is doing this.
Many tech experts believe the FBI and NSA can crack this iPhone without Apple’s help. By copying the contents of the phone’s flash memory chip and restoring them after each round of guesses (a technique known as NAND mirroring), the government could use its brute force program to de-crypt the data, by some estimates in a matter of hours. Why bother with a huge, public court case?
The answer may be the future: if it has an Operating System that defeats encryption, the FBI (and the NSA it collaborates with constantly) can start confiscating iPhones from anybody it wants. In fact, it can remotely install a weakened Operating System without the phone owner knowing; that technology already exists. That way it doesn’t have to let spying targets know their data is being captured. The result would be no effective encryption on the iPhone and no privacy for countless activists world-wide.
As Kopstein points out, however, the threat is not limited to the iPhone. The legal standard of using that Writ to force decryption can be applied to any company, on any phone, anyplace. The government’s goal has always been to end effective encryption. This is a major step towards that goal.
One thing that is harder to see, however, is why Apple is doing this. In his letter, Cook points out how hard Apple has been working with the government on the San Bernardino case. “Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.”
While that statement raises a bunch of questions — what exactly has Apple been doing with the government on the San Bernardino case? — it’s clear that Apple’s collaboration with and support of government spying continues unabated. So why oppose a simple hack? Why now? The answer may be “money”.
The iPhone has saved Apple. The company’s computer divisions (with their highly specialized and expensive products) have been shrinking for years and, with very few exceptions, it has little standalone software business left. But it has phones, and it makes and sells one of the best hand-held devices available. With its own iOS software, Apple has turned its phone into a mobile computer whose capabilities surpass most expectations. It maintains its profitability by selling that phone all over the world, and its plans call for expanding that market.
To keep its edge in the highly competitive cell-phone industry (where companies go from success to bankruptcy in a couple of years), Apple has kept responding to perceived needs and concerns. The revelations about government spying highlighted a disturbing fact: while there were protections in place for computers, none existed for cell-phones. Your phone is a government data collector.
So Apple installed the encryption and began claiming that its phones are the most secure. Once those protections are cracked, Apple would no longer be able to make that claim. While it’s not clear this would affect sales in the U.S., where privacy is still not a sensitive issue among most people, it would certainly have an impact in many countries overseas where the population is more security-conscious and the government more openly repressive.
That motivation, if true, makes the firmness of Apple’s position more suspect and the company more amenable to a damaging “deal” of some kind with the government in this case. There’s no indication Apple is considering that yet but, given what Apple is and has done, such possibilities are always worth remembering.
For the moment, however, the case is most important not because of Apple’s motivations or even what happens to Farook’s cellphone but what will happen to the phones of countless activists and organizers all over the world. Would the U.S. actually use the program to spy on our movements? We can’t be sure. But we can be sure that, given this government’s history and practice, nobody in their right mind should take that chance.