So surely by now everyone is aware that Apple has been ordered to help the FBI unlock the company-owned phone of one of the San Bernardino terrorists. Apple is refusing to do so. The FBI is trying to paint Apple as “the bad guy” for refusing, claiming that the phone might have information on where the terrorists were during the now-infamous eighteen-minute window for which the FBI has no other information.
I’m not going to go over the facts of the case — there are plenty of other websites out there that have done so. Instead, I wanted to offer my thoughts on the matter. Perhaps they will be no different from others’, but it is always worth having the conversation.
The FBI wants assistance with unlocking this particular phone. To do so, they want to try every possible 4-digit passcode (all 10,000 of them) until they discover the one that works. This wouldn’t actually take very long to accomplish (which is why your passcodes should always be longer!), except for two reasons:
- iOS adds a delay between failed attempts, and that delay grows with each successive failure. This means that running every combination is going to take a very long time indeed.
- iOS has a feature that wipes the phone after ten failed login attempts. Obviously the FBI would trigger this pretty quickly, assuming the setting is enabled. Given that this is a company-owned phone, I suspect that it is. Regardless, the FBI has to operate as if it is enabled, whatever the setting’s true value.
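To get a feel for the first obstacle, here is a rough back-of-the-envelope sketch. The delay schedule below is an assumption modeled on the escalating lockout behavior iOS is reported to use (Apple’s exact figures may differ), and it ignores the ten-attempt wipe entirely:

```python
# Worst-case time to try all 10,000 four-digit passcodes, given an
# escalating delay after each consecutive failure. The schedule below is
# an assumption modeled on reported iOS behavior, not Apple's exact numbers.

def delay_after_failure(n):
    """Minutes of lockout after the nth consecutive failed attempt (assumed)."""
    if n < 5:
        return 0    # first few attempts: no delay
    if n == 5:
        return 1
    if n == 6:
        return 5
    if n in (7, 8):
        return 15
    return 60       # every failure from the ninth on: a full hour

def worst_case_minutes(keyspace=10_000):
    # Worst case: every passcode but the last one fails,
    # so there are keyspace - 1 failures (and delays) to sit through.
    return sum(delay_after_failure(n) for n in range(1, keyspace))

minutes = worst_case_minutes()
print(f"~{minutes / 60 / 24:.0f} days of lockout delays alone")
```

Under these assumed numbers, the delays alone stretch the search past a year, which is why the FBI wants both features removed, not just the wipe.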
The FBI wants Apple to create a specialized version of iOS with these two particular features disabled. That would enable the FBI to quickly unlock the phone without risking wiping the device. Unfortunately, there are several problems with this:
- The mere existence of such software (even if keyed to a single device) drastically weakens the security of iOS. At minimum, it signals to Apple’s customers that Apple is not only capable of reducing the security of their devices but also willing to do so. This is a red flag in the minds of many customers, and it should be a red flag for government use as well. Furthermore, there’s no guarantee that such software wouldn’t leak to the outside world, in which case the single-device lock could easily be removed and the software exploited by any number of people and organizations with malicious intent.
- What the FBI is requesting works only against phones protected by a four-digit passcode. Newer phones insist on a six-digit passcode, which is much more difficult to crack (though still pretty easy). iOS has also long allowed longer alphanumeric passwords, and these would be very difficult to crack indeed. What happens when the FBI comes across another terrorist’s phone that’s locked with a long password? You can imagine they’re going to ask Apple to come up with a way to circumvent that login as well.
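To put rough numbers on that, here is a small keyspace comparison. The ten-guesses-per-second rate is purely an assumption for illustration; the real rate would depend on the hardware:

```python
# Exhaustive-search time for different passcode styles, assuming the
# delay/wipe protections are disabled and a (made-up) rate of
# 10 guesses per second.

GUESSES_PER_SECOND = 10  # assumed for illustration only

def worst_case_days(keyspace):
    """Days to try every code in the keyspace at the assumed rate."""
    return keyspace / GUESSES_PER_SECOND / 86_400

for label, keyspace in [
    ("4-digit passcode", 10**4),
    ("6-digit passcode", 10**6),
    ("8-char lowercase+digit password", 36**8),
]:
    print(f"{label}: {keyspace:,} codes, ~{worst_case_days(keyspace):,.1f} days max")
```

The jump from minutes (four digits) to about a day (six digits) to millennia (a short alphanumeric password) is why the request only makes sense against the weakest passcode style.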
Already you can see that this sets a dangerous precedent: Apple would be overwhelmed with law enforcement requests for software that circumvents device security. At some point the process would become automated. And it wouldn’t be long before some master switch appeared in the publicly facing iOS that law enforcement could use to unlock any iOS device, simply because Apple couldn’t justify throwing so many resources and person-hours at the problem. And boom: we have devices that are no longer secure.
Of course, none of this would stop at Apple; Google, Microsoft, BlackBerry, etc., would all be drawn into the mess. The precedent set against Apple would easily apply to other companies as well.
At some point we, as a society, have to ask ourselves where the line is drawn between bringing criminals to justice and the right to privacy. Most people, of course, have very little to hide (as Facebook posts will attest), but the world is a dangerous place, and so we keep things private that could be used to cause us harm. I’m not going to publicize my social security number, for example, because I know some malicious person will take it and use it to their benefit and my detriment. And yet if we eliminate the right to privacy, we lose what little control we have over the parties who may become privy to the information we hold dear. Oh sure, you might say, “the Government can be trusted, especially if you have nothing to hide”, but this is far from the truth, and places an absurd amount of trust in “the system”. People are people, and we know that people can be corrupt — even in the government. Furthermore, the government itself is far from secure. How many hackers have gained unauthorized access to governmental networks? Far too many to risk our private information being in the hands of an inscrutable entity.
Personally, I’m willing to allow the benefits of the right to privacy to outweigh the possible benefits of figuring out what happened during eighteen minutes in the life of a terrorist. Could the phone indicate that the terrorists were working with a group that was planning a much larger attack that could kill hundreds, even thousands of people? Absolutely. And yet are we willing to surrender the right to privacy of billions of individuals on the off-chance that those eighteen minutes would lead to such information?
Face it — there are always going to be bad people out there who wish to injure and even kill other people. Although they should indeed be brought to justice and rehabilitated, we shouldn’t blindly trample the rights of the remaining citizenry. To do so inevitably pushes us towards a totalitarian society where privacy is dead, the economy is in shambles, and no one has an ounce of faith in anyone else. I really don’t want to live in that kind of society.
In that vein, if you agree, I urge you to sign this petition. Yes, it’s attached to the U.S. government, but they already have your name and the like. If you’d rather sign a different petition, sign this one from change.org. Or sign both (I did).