Monday, October 24, 2016

Apple’s iPhone and the FBI: Recalibrating the Right to Privacy

On February 16, 2016, a federal magistrate judge ordered Apple to help the FBI unlock the work-issued iPhone 5c of Syed Rizwan Farook, who with his wife killed 14 people at a 2015 holiday gathering of county workers in San Bernardino, California. The government cited the All Writs Act, a law passed in 1789 that authorizes federal courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”[1] The U.S. Justice Department was demanding that “Apple create software to bypass security features on the phone.”[2] In other words, Apple was to “write code that overrides the device’s auto-delete security function.”[3] In response, Apple’s lawyers argued that the statute does not give the court the right to “conscript and commandeer” the company into defeating its own encryption, thus making its customers’ “most confidential and personal information vulnerable to hackers, identity thieves, hostile foreign agents and unwarranted government surveillance.”[4] Tim Cook, Apple’s CEO, said the FBI “was asking his company to create a ‘back door’ that could be used to unlock other phones, exposing customer data. Agreeing to the FBI’s demand would set a dangerous precedent that could lead to other calls for Apple’s help to obtain private information, Cook said.”[5] Only weeks later, the FBI abruptly dropped the case because the bureau had found an outside company with technology that could serve as a master key, one the FBI could use to unlock the phone without Apple’s help.
This left customers fearful that their data was now less than private, even though Apple had promoted the iPhone as not having a “back door.” In the end, “(t)he iPhone fight exposed a rift between the FBI and Silicon Valley technology companies over encryption, and sparked a debate about the right balance between privacy and national security.”[6] I suspect that although a trade-off, or tension, between the right to privacy and the national-security interests of the United States existed at the time, electronic privacy would become harder and harder to protect as a result of the FBI’s tactics.

No doubt focused entirely on national security, the U.S. government cannot be expected to protect an individual’s right to privacy when that right stands in the way. The FBI “sometimes loses sight of what is important to corporations . . . and privacy is incredibly important,” Jack Bennett, a key figure in the FBI’s iPhone hack, said after the fact.[7] Even so, he was unapologetic about the FBI gaining access to the phone: “We were trying to get on one phone because we had 14 murdered people.”[8] As it turned out, investigators did not find anything of significant value on the phone.[9] Nevertheless, as far as privacy is concerned, the damage was done.

Even though Bennett “disputed that the FBI was asking Apple for a tool that could access other iPhones, calling it a ‘one-shot deal,’”[10] the bureau could be expected to extend that one-shot deal the next time phone data might serve a useful purpose in preventing or prosecuting a terrorist attack, or even a lesser crime. “What’s comfortable for a private corporation that will still provide an investigator the ability to stop or prevent a terrorist attack, a missing child or a national security incident?” Bennett asked.[11] The list could easily be extended; hence, some legal limitation on the FBI’s access would be necessary lest the bureau resort to clandestine data-swooping on a massive scale, not limited to particular crimes and the people related to them in some way.

Apple’s iPhone was supposed to be hack-proof; the company promoted the product as not having a “back door.” Yet this turned out not to be the case. It may be, therefore, that there is no such thing as a completely secure system; privacy may simply be an illusion marketed by the company and valued by its customers. Customers of any smartphone could feel vulnerable, moreover, as a result of the FBI successfully getting into the iPhone.[12]

Furthermore, a court can put a gag order on a tech company, such that customers may be oblivious to any personal data being extracted. This only exacerbates customers’ insecurity regarding the privacy of their information. Microsoft had sued the Justice Department over the gag-order practice in April 2016, “arguing that law enforcement was relying on these orders too often. Specifically, the software giant said the gag orders violate the Fourth Amendment right of its customers to know if the government searches or seizes their property and also the company’s First Amendment right to speak to its customers.”[13] Yet the gag orders could continue. To be sure, a company’s First Amendment right seems a bit of a stretch here.

Lastly, the FBI could be expected to continue to go wherever private data useful in uncovering a crime exists. For example, Open Whisper Systems, maker of a widely used encryption app called Signal, received a subpoena in the first half of 2016 “for subscriber information, including web browsing histories, telephone numbers, methods of payment, internet providers, and data stored in the tracking ‘cookies’ of the web browsers associated with two phone numbers that came up in a federal grand jury investigation in Virginia.”[14] Interestingly, “one of Signal’s biggest draws is that it does not collect most of that information.”[15] Civil liberties lawyers argued nevertheless that “the Justice Department request fell well outside the bounds of what is typically covered by a subpoena, including basic subscriber information.”[16] Particularly upsetting, the subpoena arrived with a court order that said Open Whisper Systems was not allowed to tell anyone about the information request for one year. “Technology companies contend that court-imposed gag orders are being used too often by law enforcement and that they violate the Bill of Rights. The companies also complain that law enforcement officials are casting a wide net over online communications — often too wide — in their investigations. Justice Department officials, for their part, argue that these gag orders are necessary to protect developing cases and to avoid tipping off potential targets. The officials say that they are simply following leads where they take them.”[17]

In conclusion, even wealthy companies like Apple are no match for the FBI and the courts when it comes to protecting customer privacy. Just as companies often pursue profits single-mindedly, the FBI can be expected to chase down any lead. Furthermore, the electronic means of storing personal information may simply be too susceptible—too easily accessed by a government (or by hackers)—for privacy to be at all realistic. Smartphone technology, as well as social media such as Facebook pages, is forcing us all to come to terms with a recalibrated acceptance of privacy risk, and even privacy loss. It is asking too much, I submit, for a company to be tasked with defending an increasingly antiquated expectation of privacy. As a result, we might expect people to recalibrate what personal information they are willing to put on a phone (or social-media page). The Apple case can be interpreted as one of the triggers of this societal recalibration, rather than as having settled the matter.




1. Jim Stavridis and Dave Weinstein, “Apple vs. FBI Is Not About Privacy vs. Security—It’s About How to Achieve Both,” The World Post, March 8, 2016.
2. The Associated Press, “New FBI Head in San Francisco Was Key Figure in iPhone Hack,” The New York Times, October 5, 2016.
3. Jim Stavridis and Dave Weinstein, “Apple vs. FBI Is Not About Privacy vs. Security—It’s About How to Achieve Both,” The World Post, March 8, 2016.
4. Jim Stavridis and Dave Weinstein, “Apple vs. FBI Is Not About Privacy vs. Security—It’s About How to Achieve Both,” The World Post, March 8, 2016.
5. The Associated Press, “New FBI Head in San Francisco Was Key Figure in iPhone Hack,” The New York Times, October 5, 2016.
6. The Associated Press, “New FBI Head in San Francisco Was Key Figure in iPhone Hack,” The New York Times, October 5, 2016.
7. The Associated Press, “New FBI Head in San Francisco Was Key Figure in iPhone Hack,” The New York Times, October 5, 2016.
8. The Associated Press, “New FBI Head in San Francisco Was Key Figure in iPhone Hack,” The New York Times, October 5, 2016.
9. The Associated Press, “New FBI Head in San Francisco Was Key Figure in iPhone Hack,” The New York Times, October 5, 2016.
10. The Associated Press, “New FBI Head in San Francisco Was Key Figure in iPhone Hack,” The New York Times, October 5, 2016.
11. The Associated Press, “New FBI Head in San Francisco Was Key Figure in iPhone Hack,” The New York Times, October 5, 2016.
12. Arjun Kharpal, “Apple vs FBI: All You Need to Know,” CNBC.com, March 29, 2016.
13. Nicole Perlroth and Katie Benner, “Subpoenas and Gag Orders Show Government Overreach, Tech Companies Argue,” The New York Times, October 4, 2016.
14. Nicole Perlroth and Katie Benner, “Subpoenas and Gag Orders Show Government Overreach, Tech Companies Argue,” The New York Times, October 4, 2016.
15. Nicole Perlroth and Katie Benner, “Subpoenas and Gag Orders Show Government Overreach, Tech Companies Argue,” The New York Times, October 4, 2016.
16. Nicole Perlroth and Katie Benner, “Subpoenas and Gag Orders Show Government Overreach, Tech Companies Argue,” The New York Times, October 4, 2016.
17. Nicole Perlroth and Katie Benner, “Subpoenas and Gag Orders Show Government Overreach, Tech Companies Argue,” The New York Times, October 4, 2016.