There’s no question that Apple’s refusal to help the FBI gain access to data in one of the iPhones used during the San Bernardino massacre has been getting scads of coverage in the news and business press.
Apple’s concerns, eloquently stated by CEO Tim Cook, are understandable. From the company’s point of view, it is at risk of giving up a significant selling feature of the iPhone by enabling “back door” access to encrypted data. Apple’s contention is that many people have purchased the latest models of iPhones for precisely the purpose of protecting their data from prying eyes.
On the other hand, the U.S. government’s duty is to protect the American public from terrorist activities.
Passions are strong — and they’re lining up along some predictable social and political fault lines. After having read more than a dozen news articles in the various news and business media over the past week or so, I decided to check in with my brother, Nelson Nones, for an outsider’s perspective.
As someone who has lived and worked outside the United States for decades, Nelson’s perspectives are invariably interesting because they’re formed from the vantage point of “distance.”
Furthermore, Nelson has held very strongly negative views about the efforts of the NSA and other government entities to monitor computer and cellphone records. I’ve given voice to his perspectives on this topic on the Nones Notes blog several times, such as here and here.
So when I asked Nelson to share his perspectives on the Apple/FBI standoff, I was prepared for him to weigh in on the side of Apple.
Well … not so fast. Shown below is what he wrote to me:
This may come as a surprise, but I’m siding with the government on this one. Why? Three reasons:
Point #1: The device in question is (and was) owned by San Bernardino County, a government entity.
The Fourth Amendment of the U.S. Constitution provides, “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated …”
The investigation that the FBI wants to conduct could either be thought of as a seizure of property (the iPhone), or as a search (accessing the iPhone’s contents). Either way, Fourth Amendment protections do not apply in this case.
Within the context of the Fourth Amendment, seizure of property means interfering with an individual’s possessory interests in the property. In this case, the property isn’t (and never was) owned by an individual; it is public property. Because Farook, an individual, never had a possessory interest in the property, no “unreasonable seizure” can possibly occur.
Also, within the meaning of the Fourth Amendment, an “unreasonable search” occurs when the government violates an individual’s reasonable expectation of privacy. In this case the iPhone was issued to Farook by his employer. It is well known and understood through legal precedent that employees have no reasonable expectation of privacy when using employer-furnished equipment. For example, employers can and do routinely monitor the contents of the email accounts they establish for their employees.
Point #2: The person who is the subject of the investigation (Syed Farook) is deceased.
According to Paul J. Stablein, a U.S. criminal defense attorney, “Unlike the concept of privilege (like communications between doctor and patient or lawyer and client), the privacy expectations afforded persons under the Fourth Amendment do not extend past the death of the person who possessed the privacy right.”
So, even if the iPhone belonged to Farook, no reasonable expectation of privacy exists today because Farook is no longer alive.
Point #3: An abundance of probable cause exists to issue a warrant.
In addition to protecting people against unreasonable searches and seizures, the Fourth Amendment also states, “… no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”
I strongly believe the U.S. National Security Agency’s mass surveillance was unconstitutional and therefore illegal, due to the impossibility of establishing probable cause for indiscriminately searching the records of any U.S. citizen who might have placed or received a telephone call, sent or received an email message or logged on to their Facebook account.
That’s because these acts do not, in and of themselves, provide any reasonable basis for believing that evidence of a crime exists.
I also strongly believe that U.S. citizens have the right to encrypt their communications. No law exists preventing them from doing so for legal purposes. Conducting indiscriminate searches through warrantless “back door” decryption would be just as unconstitutional and illegal as mass surveillance.
In this case, however, multiple witnesses watched Farook and his wife, Tashfeen Malik, open fire on a holiday party, killing 14 people, and then flee after leaving behind three pipe bombs apparently meant to detonate remotely when first responders arrived on the scene.
Additional witnesses include the 23 police officers involved in the shootout in which Farook and Malik eventually were killed.
These witnesses have surely given sworn statements attesting to the perpetrators’ crimes.
It is eminently reasonable to believe that evidence of these crimes exists in the iPhone issued to Farook. So, in this case there can be no doubt that all the requirements for issuing a warrant have been met.
For these three reasons, unlike mass surveillance or the possibility of warrantless “back door” decryption, the law of the land sits squarely and undeniably on the FBI’s side.
Apple’s objections, seconded by Edward Snowden, rest on the notion that it’s “too dangerous” to assist the FBI in this case, because the technology Apple would be forced to develop cannot be kept secret.
“Once [this] information is known, or a way to bypass the code is revealed, [iPhone] encryption can be defeated by anyone with that knowledge,” says Tim Cook, Apple’s CEO. Presumably this could include overreaching government agencies, like the National Security Agency, or criminals and repressive foreign regimes.
It is important to note that Apple has not been ordered to invent a “back door” that decrypts the iPhone’s contents. Instead, the FBI wants to unlock the phone quickly by brute force; that is, by automating the entry of different passcode guesses until they discover the passcode that works.
To do this successfully, it’s necessary to bypass two specific iPhone security features. The first renders brute force automation impractical by progressively increasing the minimum time allowed between entries. The second automatically destroys all of the iPhone’s contents after the maximum allowable number of consecutive incorrect guesses is reached.
Because the iPhone’s operating system must be digitally signed by Apple, only Apple can install the modifications needed to defeat these features.
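To see why those two features matter, consider a rough back-of-envelope simulation. The code below is purely illustrative — it is not Apple’s implementation, and the delay schedule is a hypothetical stand-in for iOS’s actual lockout behavior — but it shows how escalating delays and an auto-erase limit turn a quick brute-force attack into an impractical or impossible one for a four-digit passcode:

```python
# Hypothetical simulation (not Apple's actual code) of how the two iOS
# security features frustrate a brute-force attack on a 4-digit passcode.

def time_to_brute_force(num_codes=10_000, base_delay=0.08,
                        escalating_delays=None, wipe_after=None):
    """Return (seconds_needed, device_wiped) for trying every passcode.

    escalating_delays maps an attempt count to the enforced wait (seconds)
    before each further guess -- modeled loosely on iOS's lockouts.
    wipe_after, if set, erases the device after that many wrong guesses.
    """
    escalating_delays = escalating_delays or {}
    total = 0.0
    for attempt in range(1, num_codes + 1):
        if wipe_after is not None and attempt > wipe_after:
            return total, True          # contents destroyed; attack fails
        delay = base_delay
        for threshold, wait in sorted(escalating_delays.items()):
            if attempt >= threshold:
                delay = wait
        total += delay
    return total, False

# Unprotected: about 13 minutes to try all 10,000 four-digit codes.
fast, wiped = time_to_brute_force()

# With escalating delays (1 minute after 5 tries, 1 hour after 8),
# the worst case balloons to roughly 10,000 hours -- over a year.
slow, _ = time_to_brute_force(escalating_delays={6: 60, 9: 3600})

# With a 10-guess auto-erase enabled, the attack simply fails.
_, destroyed = time_to_brute_force(wipe_after=10)
```

Disabling both features — which is what the FBI has asked for — collapses the problem back to the “unprotected” case at the top.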
It’s also important to note that Magistrate Judge Sheri Pym’s order says Apple’s modifications for Farook’s iPhone should have a “unique identifier” so the technology can’t be used to unlock other iPhones.
This past week, Apple filed a motion to vacate Magistrate Judge Pym’s order. In its motion, the company offers a number of interesting arguments, three of which stand out:
Contention #1: The “unreasonable burden” argument.
Apple argues that complying with Magistrate Judge Pym’s order is unreasonably burdensome because the company would have to allocate between six and ten of its employees, nearly full-time over a two- to four-week period, together with additional quality assurance, testing and documentation effort. Apple also argues that being forced to comply in this case sets a precedent for similar orders in the future, which would become an “enormously intrusive burden.”
Contention #2: Contesting the compelled-assistance requirement.
Apple isn’t contesting whether or not the FBI can lawfully seize and search the iPhone. Instead it is contesting Magistrate Judge Pym’s order compelling Apple to assist the FBI in performing the search. As such, Apple is an “innocent third party.” According to Apple, the FBI is relying on a case, United States v. New York Telephone, that went all the way to the Supreme Court in 1977. Ultimately, New York Telephone was ordered to assist the government by installing a “pen register,” a simple device for monitoring the numbers dialed from a specific phone line.
The government argued that it needed the phone company’s assistance to execute a lawful warrant without tipping off the suspects. The Supreme Court found that complying with this order was not overly burdensome because the phone company routinely used pen registers in its own internal operations, and because it is a highly regulated public utility with a duty to serve the public. In essence, Apple is arguing that United States v. New York Telephone does not apply, because (unlike the phone company’s prior use of pen registers) it is being compelled to do something it has never undertaken before, and also because it is not a public utility with a duty to serve.
Contention #3: The requirement to write new software.
Lastly, Apple argues that it will have to write new software in order to comply with Magistrate Judge Pym’s order. However, according to Apple, “Under well-settled law, computer code is treated as speech within the meaning of the First Amendment,” so complying with the order amounts to “compelled speech” that the Constitution prohibits.
What do I think of Apple’s arguments?
Regarding the first of them, based on Apple’s own estimates of the effort involved, I’m guessing that the company wouldn’t incur more than half a million dollars of direct expense to comply with this order. How burdensome is that to a company that just reported annual revenues of nearly $234 billion, and over $53 billion of profit?
Answer: To Apple, half a million dollars over a four-week period is equivalent to 0.01% of last year’s profitability over an equivalent time span. If the government compensates Apple for its trouble, I don’t see how Apple can win this argument.
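That 0.01% figure is easy to check. The quick calculation below uses Apple’s reported FY2015 net income; the $500,000 compliance cost is the author’s own estimate, not a figure from Apple’s filing:

```python
# Rough check of the comparison above. The $500K compliance cost is an
# estimate; the profit figure is Apple's reported FY2015 net income.
annual_profit = 53.4e9          # ~$53.4 billion per year
compliance_cost = 0.5e6         # ~$500,000 of estimated direct expense

four_week_profit = annual_profit * 4 / 52   # roughly $4.1 billion
share = compliance_cost / four_week_profit  # fraction of 4-week profit

print(f"{share:.2%}")           # about 0.01% of profit over the same span
```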
Regarding the other two arguments above, as Orin Kerr states in his Washington Post blog, “I don’t know which side would win … the scope of authority under the [All Writs Act] is very unclear as applied to the Apple case. This case is like a crazy-hard law school exam hypothetical in which a professor gives students an unanswerable problem just to see how they do.”
My take: There’s no way a magistrate judge can decide this. If Apple loses, and appeals, this case will eventually end up at the Supreme Court.
What if the back door is forced open?
The concerns of privacy advocates are understandable. Even though I’m convinced the FBI’s legal position is solid, I also believe there is a very real risk that Apple’s modifications, once made, could leak into the wrong hands. But what happens if they do?
First, unlike warrantless “back door” decryption, this technique would work only for iPhones — and it also requires physical possession of a specifically targeted iPhone.
In other words, government agencies and criminals would have to lawfully seize or unlawfully steal an iPhone before they could use such techniques to break in. This is a far cry from past mass surveillance practices conducted in secret.
Moreover, if an iPhone is ever seized or stolen, it is possible to destroy its contents remotely, as soon as its owner realizes it’s gone, before anyone has the time to break in.
Second, Apple might actually find a market for the technology it is being compelled to create. Employers who issue iPhones to their employees certainly have the right to monitor employees’ use of the equipment. Indeed, they might already have a “duty of care” to prevent their employees from using employer-issued iPhones for illegal or unethical purposes, which they cannot fulfill because of the iPhone’s security features.
Failure to exercise a duty of care creates operational as well as reputational risks, which employers could mitigate by issuing a new variety of “enterprise class” iPhones that they can readily unlock using these techniques.
So that’s one person’s considered opinion … but we’d be foolish to expect universal agreement on the Apple/FBI tussle. If you have particular views pro or con Apple’s position, please join the discussion and share them with other readers here.
One thought on “Is Apple setting itself up for failure in the FBI’s Syed Farook Probe?”
I could not agree with Nelson Nones more.
As a matter of fact, I wrote something similar to a friend of mine who shares my privacy concerns. In the 1990s, Motorola was the first cellphone company to comply with building in a back door for law enforcement. I do not know the company’s current stance but, at the time, it got my fur up.
This is different, as Nelson stated, for the three primary reasons he stated.