Strong cryptography is clearly required to protect sensitive government, business, and personal information. But it also sometimes makes it difficult for law enforcement and intelligence agencies to obtain information they need. The United States government has been publicly struggling with this issue since the mid-1990s, when the painful sinking of the Clipper Chip initiative demonstrated that the American public would not tolerate government-mandated key escrow, and that forcing it on vendors had the potential to cripple tech-sector exports.

Legislators have essentially refused to address this thorny issue, leaving law enforcement agencies to pursue whatever avenues are available to them. Recent litigation between the FBI and Apple has brought the issue to a head, and the case is being watched closely worldwide, from the watercooler to the boardroom table.

The FBI is investigating an act of terrorism. Instead of seeking Apple’s advice from the beginning, they apparently made a serious error that denied them access to data stored on an iPhone via the iCloud backup mechanism. It is obvious why the FBI wants to view all data on the phone, and why they are willing to use any legal means at their disposal, including a catch-all law from 1789 (the All Writs Act), in an attempt to force Apple to create a special tool to hack it.

There is a far more troubling aspect to this story. According to the Guardian, “For months, the FBI searched for a compelling case that would force Apple to weaken iPhone security – and then the San Bernardino shooting happened…this carefully planned legal battle has been months in the making, US officials and tech executives told the Guardian, as the government and Apple try to settle whether national security can dictate how Silicon Valley writes computer code.”

It would be naive to believe that this case is about a single iPhone; it is about whether the government can force a vendor to create a backdoor into its product. According to the Guardian, Manhattan District Attorney Cyrus Vance said he wants similar judicial orders to access 175 locked iPhones. If the FBI succeeds with this case, a long line will form at Apple’s door. If the United States can compel Apple to create this tool, so can every other country in which Apple has a physical presence.

Creating a tool to bypass critical security features would not only open the door to a stream of law enforcement and intelligence agency requests, but also, more generally, demonstrate that the attack is technically possible. Both would be bad news for Apple and its customers.

Most coverage of the FBI vs. Apple case has focused on the legal and public opinion aspects, and many observers have correctly pointed out that Apple is acting in the best interests of itself, its customers, and the general public. Missing from many of those discussions is the point that the tool the FBI has requested should not even be usable in the first place.

Apple has made significant security improvements in the iPhone, including mandatory data encryption. Users have the option of enabling a feature to erase all data on the iPhone after 10 failed passcode attempts. This is clearly effective given that the FBI’s request is for a custom version of the operating system that disables the erase feature and allows the passcodes to be entered by computer instead of by hand.
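To make the mechanics concrete, here is a minimal sketch, in Python and emphatically not Apple’s actual implementation, of the kind of lockout policy described above: a persistent failure counter, escalating delays between attempts, and an optional wipe after 10 failures. The class name and the delay schedule are assumptions chosen purely for illustration.

```python
import time

# Illustrative sketch only; not Apple's implementation. The delay schedule
# below is an assumption chosen to show the shape of the policy.
ERASE_AFTER_FAILURES = 10                            # the user-enabled "erase data" option
DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}    # assumed delays, in seconds

class PasscodeGate:
    def __init__(self, passcode: str, erase_enabled: bool = True):
        self._passcode = passcode
        self._failures = 0
        self._erase_enabled = erase_enabled

    def try_unlock(self, guess: str) -> bool:
        if guess == self._passcode:
            self._failures = 0               # a correct guess resets the counter
            return True
        self._failures += 1
        if self._erase_enabled and self._failures >= ERASE_AFTER_FAILURES:
            self._wipe()                     # the feature the FBI wants disabled
        time.sleep(DELAYS.get(self._failures, 0))   # slow down repeated guessing
        return False

    def _wipe(self) -> None:
        # Destroying the encryption keys renders the stored data unrecoverable.
        raise RuntimeError("Device erased: encryption keys destroyed")
```

Viewed this way, what the FBI is asking for amounts to a modified system in which the wipe check (and any artificial delays) no longer applies and guesses can be submitted electronically rather than typed by hand.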

In other words, the FBI believes that the only way in is to brute-force the passcode, and that is good news for iPhone owners. But a properly designed product should not allow an operating system update (or any other security-related software change) while the device is locked. Perhaps what the FBI wants is indeed not possible, and Apple is fighting the order to demonstrate commitment to its customers and prevent a dangerous precedent.
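Some back-of-the-envelope arithmetic shows why computer-driven entry matters. Assuming roughly 80 milliseconds per attempt, a figure in the ballpark of the per-guess key-derivation cost Apple has described in its iOS security documentation, a short numeric passcode does not survive long once the delays and erase limit are out of the way:

```python
# Worst-case brute-force time for numeric passcodes, assuming ~80 ms per
# attempt and no escalating delays or erase-after-10 limit. The per-attempt
# cost is an assumption; it varies by device and key-derivation settings.
ATTEMPT_SECONDS = 0.08

for digits in (4, 6, 8):
    attempts = 10 ** digits
    hours = attempts * ATTEMPT_SECONDS / 3600
    print(f"{digits}-digit passcode: {attempts:>11,} attempts, ~{hours:,.1f} hours worst case")
```

A four-digit passcode falls in minutes and a six-digit one in about a day; a long alphanumeric passphrase, by contrast, would remain out of reach even with the tool the FBI is requesting.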

In any event, the outcome of this case will have significant consequences for the entire US technology sector and Canadian vendors with a US presence. Apple has no choice but to strenuously fight the FBI’s demands.