The issue: When Syed Farook and his wife, Tashfeen Malik, killed 14 co-workers in San Bernardino, California in December, he left behind an iPhone that authorities think might contain clues to the couple's motives and activities leading up to the massacre. Unfortunately for the FBI, the iPhone is locked by a four-digit PIN, and after ten failed guesses a security feature can erase the phone's data, rendering it unrecoverable. The FBI has obtained a court order requiring Apple to provide a way in - a so-called "back door".
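To see why that ten-attempt limit is such an effective defence, here's a rough, back-of-the-envelope sketch of the arithmetic (the numbers assume a simple four-digit PIN with the erase-after-ten-failures setting enabled, as described above):

```python
# Why the attempt limit matters: a 4-digit PIN has only 10,000
# possible values (0000-9999), which a computer could try in seconds -
# but the phone allows just 10 guesses before it can erase its data.
total_pins = 10 ** 4        # 10,000 possible four-digit PINs
allowed_attempts = 10       # guesses permitted before auto-erase

# Chance of stumbling onto the right PIN within the allowed guesses,
# assuming each guess is distinct and every PIN is equally likely:
p_success = allowed_attempts / total_pins
print(f"{p_success:.1%}")   # prints 0.1%
```

In other words, a brute-force attacker has roughly a one-in-a-thousand chance before the data is gone - which is exactly why the FBI wants Apple to disable that limit rather than guess.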
Apple says such a back door does not currently exist. The company could, with enough engineering effort, build one (think of it as a master key). But handing the resulting technology over to the FBI would put end-user security at great risk should it ever fall into the wrong hands. Given how frequently USB drives and laptops loaded with private information are stolen from the trunks of parked cars, it's reasonable to question the FBI's assurances that the tool would never be compromised. History suggests it would be.
Apple's response: CEO Tim Cook, in an open letter posted to Apple's website, says that complying with the court order to help the FBI break into the San Bernardino shooters' iPhone would set a precedent for future cases like this. He calls the request "dangerous" and "chilling". While company engineers complied with FBI requests in the days after the December attack, Cook says the current order - build software to bypass the protections on a locked iPhone - goes too far.
The FBI's defence: The bureau promises the tools - think of them as something like a master key for encrypted devices like the iPhone and iPad - won’t fall into the wrong hands. History suggests otherwise: hackers, crooks and thieves eventually get their hands on tools like this, and the never-ending cops-and-robbers battle goes on.
The context: All this comes just a couple of years after the Edward Snowden revelations, in which the NSA was shown to have been wantonly snooping on a wide range of devices and networks owned and used by innocent civilians and companies. Tech companies were widely criticized for "rolling over" and simply handing over the keys to the kingdom. They denied being complicit in citizen-spying, but the perception has stuck, and Tim Cook's stand in this case could be at least partially designed to show that Big Tech is not going to blindly follow government orders.
Apple is already receiving support from other major tech companies, including Google, Facebook, Twitter and Microsoft. None of them wants to be seen as being "soft" in the eyes of consumers. In many respects, this has as much to do with PR as it does security.
As you can imagine, this has been the leading tech story of the week, and it has implications that extend well beyond the tech space, and well beyond America's borders. At issue: Big Government's Big Brother-like ability to track our every online move, and Big Technology's role in either allowing it to happen, or drawing a line in the sand.
We all want to be protected, but I'm guessing not at the expense of our personal freedoms. We all want to prevent the next terrorist attack, but not if it means sacrificing our privacy. Clearly, balance is needed.
Your turn: Where do you stand? Who's right? Wrong? Why? How does this get solved?
Update: I discussed this with CTV News Channel. Link here.