Why the FBI vs. Apple Matters

To most Americans, Apple’s refusal to unlock the San Bernardino shooter’s iPhone seems an untenable position. After all, Farook is a known terrorist who committed a horrific crime. That a legal warrant should be issued to search his every sock drawer and hard drive to uncover links to other terrorists or plots is patently obvious. Clearly Apple should just give the FBI what it wants. So why is Apple balking? And why does most of the tech community side with Apple?

The facts are a bit confusing to those outside the tech community. The average American doesn’t (and probably doesn’t want to) understand the intricacies of data encryption and security. With that in mind, I’m going to try a real-world analogy that everyone can relate to but that still illustrates the problem at hand. Let’s assume it’s the 1960s. “Ivan” has just committed an act of terror in the name of the USSR. He was killed in the event, but police suspect he may have had microfilmed plans and lists of other Soviet agents inside the US.

The FBI discovers Ivan has a safety deposit box at the local bank. They go to the court and get a warrant, present it to the bank, and the bank manager opens the box inside the vault and surrenders its contents to the authorities.

This situation is similar to requests Apple has responded to many times before: a request for something Apple actually possesses (e.g., iCloud backups stored on its servers), which it turns over willingly with the proper legal authorization. This is how most people seem to be thinking of the Farook iPhone case, but the current case is nothing like it. For that, let’s move on to the next scenario.

Police then discover that Ivan has an ACME Self-Destructing Safe in his basement. The feds know this safe is equipped with an acid-release failsafe: if the wrong combination is tried too many times, or if anyone attempts to force the safe open, the acid is released and everything inside is destroyed.
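For readers who want the real-world parallel: the acid failsafe corresponds roughly to the iPhone option that erases the device after ten failed passcode attempts. Below is a minimal sketch of that idea in Python; the names (ATTEMPT_LIMIT, check_passcode, wipe_storage) are purely illustrative and not Apple’s actual implementation.

```python
# Minimal sketch of an auto-erase failsafe, analogous to the acid release.
# These names are illustrative only, not Apple's code.

ATTEMPT_LIMIT = 10
failed_attempts = 0

def wipe_storage() -> None:
    # The "acid": destroy the encryption keys so the data is unrecoverable.
    print("Too many failed attempts -- wiping device.")

def check_passcode(entered: str, correct: str) -> bool:
    """Return True on the right passcode; trigger the wipe after too many misses."""
    global failed_attempts
    if entered == correct:
        failed_attempts = 0
        return True
    failed_attempts += 1
    if failed_attempts >= ATTEMPT_LIMIT:
        wipe_storage()
    return False
```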

The FBI then goes to the ACME company and asks them to open the safe. But ACME explains that even they don’t have the combination. Only Ivan did, and he’s gone. ACME doesn’t own the safe or any of its contents; it just designed and built it. So the FBI comes back to ACME with a new plan and a court order compelling ACME to implement it: build a device that neutralizes the acid failsafe so that the police can then crack the safe themselves.
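In the actual case, the court order asked Apple for much the same thing: custom software that switches off the auto-erase feature and the delays between passcode attempts so the passcode can be guessed electronically. The sketch below shows why that is all an attacker needs; try_unlock is a hypothetical stand-in for the phone’s passcode check, and a 4-digit code is assumed for simplicity.

```python
# With no wipe and no forced delays, a short numeric passcode falls to brute
# force almost instantly: a 4-digit code has only 10,000 possibilities.
from itertools import product

def brute_force(try_unlock):
    # try_unlock(guess) is assumed to return True for the correct code.
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if try_unlock(guess):
            return guess
    return None

secret = "7295"
print(brute_force(lambda guess: guess == secret))  # -> 7295
```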

However, ACME is aware that this acid-neutralizing device will work on any of their safes, not just Ivan’s. Further, they know their safes are the bane of the FBI, and that police have hundreds of these legally confiscated safes from other crimes stored in evidence lockers across the country. The FBI would love to open them all.

ACME is worried that eventually one of the neutralizing devices, or the plans for one, will get out into the public or onto the black market, and once that horse is out of the barn, there’s no putting it back. They realize that what the FBI is asking them to do is effectively remove the acid failsafe as a security feature from everyone’s safe, not just Ivan’s. That compromises the safety of ACME’s many legitimate customers, who have trusted the company to secure their belongings.

Further, the security industry as a whole is worried that if ACME yields, it sets the precedent that no one can build and sell uncrackable safes or unbreakable locks: every security system must be penetrable by the police without the owner’s cooperation. But such a built-in weakness is also exploitable for nefarious purposes, by corrupt government agents as well as by thieves and spies.

This is the situation Apple finds itself in with the locked iPhone. Once it builds the crack tool, there is no reality in which it would be used just once and destroyed. Even if that tool were safely destroyed, the FBI would be back next week with another warrant for another iPhone, and Apple would be forced to build it again. Eventually it becomes impractical to destroy and rebuild the tool each time, so the issue becomes one of controlling access to the tool.

Therein lies the weakness. In a world where horses don’t exist, no one has to worry about watching the barn door. But once you create a horse, then the door becomes a liability. And because horses are useful, eventually you have multiples… then multiple barns… and multiple doors. It’s only a matter of time before one gets loose. After all, no security system is perfect.

6 thoughts on “Why the FBI vs. Apple Matters”

  1. Good piece, Tim. I guess I keep going back to the “no security is perfect” declaration. And I have to think that is nothing new. Technology changes every day. Might the engineers at Apple build the next-generation software gizmo thingy that defies the current technology to crack the safe? Or am I missing something?

  2. If I understand your question, you’re asking whether we might someday be able to invent security that lets in only the user, plus the government when authorized. I suspect the answer is no. It’s not a tech limitation but a human problem. “The gov’t” is an organization, not a person. Allowing access to an organization requires some sort of key that is not tied to one person. That key has to be shareable. Moreover, the sharing can’t be controlled by the user for it to be useful, so the gov’t (even if a different branch or agency) has to control the sharing. That organizational control and sharing of keys is where the core weakness lies. Tech doesn’t fix that.

  3. I know what I mean, but I’m not articulating it very well. The line of thought on this seems to follow the paradigm of how to make stuff “secure”. I’m wondering if there is a concept brewing that could fix that, not necessarily thinking along traditional pathways.
