Tim Cook: Let's Talk About Privacy, Not Strip It Away

The debate over whether Apple should be forced to strip security features out of iOS so the FBI can hack into an iPhone rages on, with CEO Tim Cook calling for a commission to discuss technology, security, privacy, and law enforcement. Mr. Cook's proposal is Apple's latest salvo following a Federal court order compelling Apple to create the tools the FBI needs to hack into an iPhone used by one of the shooters in last December's San Bernardino terrorist attack, and it sounds far more level-headed than rushing down a one-way path that strips away our privacy and national security.

In an email to employees (thanks, TechCrunch), Mr. Cook said:

Our country has always been strongest when we come together. We feel the best way forward would be for the government to withdraw its demands under the All Writs Act and, as some in Congress have proposed, form a commission or other panel of experts on intelligence, technology and civil liberties to discuss the implications for law enforcement, national security, privacy and personal freedoms. Apple would gladly participate in such an effort.

The FBI obtained a court order last week compelling Apple to create a special version of iOS that removes the ten-try limit for passcodes, after which all the data is rendered permanently inaccessible. The order also said Apple must remove the time delays between failed login attempts and give the FBI a way to automate passcode entry.

Apple wants a discussion on technology and privacy

Apple has no interest in complying despite the FBI saying this will be a one-off deal and that the iPhone can stay in the company's own labs during the process. "The government suggests this tool could only be used once, on one phone. But that's simply not true," Mr. Cook said in an open letter. "Once created, the technique could be used over and over again, on any number of devices."

Apple had been helping the FBI recover as much data as possible from the iPhone in question back in January. Unfortunately, the FBI shot itself in the foot (figuratively speaking) by ordering San Bernardino County to reset the iCloud password associated with the phone, which the county had issued to the shooter because he was a county employee.

Once the password was reset, there wasn't any way to get the iPhone to back up to iCloud, and the device was essentially cut off from the FBI. Apple had already provided the FBI with the most recent iCloud backups available, which were from about three months before the shootings.

According to Apple, all of the data it's capable of recovering has already been recovered, and it's all in the hands of the FBI. To do anything more, Apple would have to create the hackable iOS version the FBI wants, and that's a Pandora's box Mr. Cook isn't comfortable opening.

Creating the special iOS version for the FBI would set a dangerous precedent where companies could be expected to develop forensic tools intended to let governments bypass security and privacy features. Once governments know those tools exist, they—along with hackers—would be lining up to get their hands on them, which would only erode our privacy and security even more.

Mr. Cook confirmed as much, saying, "Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case."

Should Apple have to comply with the court order, it wouldn't simply be providing a service where it hands over data to the FBI; it would be creating—at the behest of the U.S. Government—forensic tools that any defense team would demand access to so it could verify data wasn't altered during the passcode hacking and data retrieval process.

Once Apple is compelled to hand over its code, keeping it out of the hands of people who would be more than happy to reverse engineer it will be nearly impossible. Instead of heading down that path, the public would be better served by the FBI rescinding its request for a hackable iOS and letting a commission composed of technology leaders and companies, security and privacy experts, and government officials work through a reasonable approach to dealing with the needs of law enforcement in a world where data encryption in our pockets is commonplace.

As former CIA and NSA director Michael Hayden said, weakening security and encryption is bad for everyone, including the government. Forcing Apple—and later other companies—to do that is a bad idea, and it's time for FBI Director James Comey to realize that before he irreparably damages privacy and security for us, for law enforcement, and for the government.