Senator Dianne Feinstein (D-CA) is dusting off her bill aimed at forcing technology companies to give the U.S. government access to the encrypted data on our smartphones, tablets, and computers. FBI Director James Comey is on board with her plan, saying the inability to access our encrypted data is a major security threat to the country.
Director Comey told the Judiciary Committee that more than 3,000, or nearly half, of the smartphones the FBI tried to access in the first half of the year were impenetrable. The encrypted data they hold can’t be viewed and is vital to FBI investigations, he said.
The proposed law Senator Feinstein is introducing started life last year as the Burr-Feinstein bill following the San Bernardino mass shooting and the subsequent fight between the FBI and Apple over unlocking an iPhone 5c recovered from one of the suspects. The phone was encrypted and the FBI didn’t have the passcode to unlock it, so agents asked Apple to give them access to the phone’s contents.
Apple doesn’t have any way to bypass the built-in security features on an iPhone or iPad, so the FBI got a court order to force Apple to make a hackable version of iOS it could use to unlock the phone. Apple refused, saying intentionally weakening iPhone privacy protections posed a major security risk to all iPhone owners. The company claimed the hackable iOS could be used by criminals, hackers, and rogue governments to access personal information, private conversations, credit card and bank account information, and more.
Apple also said there wouldn’t be any way to keep the hackable iOS from eventually leaking and falling into the wrong hands. The FBI and Department of Justice said that wouldn’t happen, and it would be used just for the San Bernardino case to unlock the shooter’s iPhone.
Other law enforcement agencies, and even other FBI cases, however, were already looking to use the hackable iOS in investigations. During Congressional hearings on whether Apple should comply with the court order, Director Comey even said that doing so would “set a precedent” and could lead other investigations to seek the same hack.
The FBI dropped its fight with Apple only hours before a scheduled court hearing when Cellebrite managed to hack into the iPhone. That wasn’t the end of the fight, however, because weeks later Senators Richard Burr (R-NC) and Feinstein introduced their bill to force technology companies to include what amount to back doors into our personal encrypted information.
The bill fizzled out fairly quickly when it couldn’t get the support needed to move forward.
Welcome Back, Encryption Back Door Bill
Now Senator Feinstein is ready to try again, and Director Comey is giving it his thumbs up. She said, “We had looked at legislation that would take into consideration events of national security and provide that devices — there must be some way of even going before a judge and getting a court order to be able to open a device.”
Director Comey replied, saying, “What nobody wants to have happen is something terrible happen in the United States and it be connected to our inability to access information with lawful authority. We ought to have the conversations before that happens.”
He says companies need to find ways to comply with court orders granting law enforcement access to our encrypted data. Since that’s typically not possible today, companies would have to build back doors into their products allowing court-mandated access.
Director Comey says he isn’t asking for a back door, but instead simply wants tech companies to find a way to let law enforcement see the data encrypted on our devices. He said,
We all love privacy, we all care about public safety and none of us want backdoors—we don’t want access to devices built in in some way. What we want to work with the manufacturers on is to figure out how can we accommodate both interests in a sensible way.
An Encryption Back Door By Any Other Name
The problem with Director Comey’s argument is that, in essence, he is looking for a different word for “back door.” Giving law enforcement agencies a way to see data that’s otherwise encrypted, without using the owner’s passcode, requires an alternate way to access the data, and that alternate path is the very definition of a back door.
As Apple and many security experts have noted, a back door law enforcement uses is a weakness others can exploit, too. Knowing there’s a built-in security weakness will attract hackers, and eventually someone will find the crack—or someone will leak it.
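One common proposal for this kind of lawful-access mechanism is key escrow, where a copy of each device’s encryption key is also locked under a master key held by a third party. A minimal toy sketch below illustrates why that second decryption path is exactly the shared weakness Apple and security experts describe. This is illustrative only, not real cryptography; the XOR “cipher” and all names here are hypothetical stand-ins, not any vendor’s actual design.

```python
import secrets

KEY_LEN = 16  # toy key length; the sample message below is exactly 16 bytes

def xor_bytes(a: bytes, b: bytes) -> bytes:
    # Toy XOR "cipher" standing in for real encryption.
    return bytes(x ^ y for x, y in zip(a, b))

# One escrow master key shared across ALL devices: the "back door".
escrow_master_key = secrets.token_bytes(KEY_LEN)

def encrypt_device(data: bytes, owner_key: bytes):
    device_key = secrets.token_bytes(KEY_LEN)  # per-device data key
    ciphertext = xor_bytes(data, device_key)
    # Normal path: device key wrapped under the owner's passcode key.
    wrapped_for_owner = xor_bytes(device_key, owner_key)
    # Back-door path: the SAME device key wrapped under the escrow key.
    wrapped_for_escrow = xor_bytes(device_key, escrow_master_key)
    return ciphertext, wrapped_for_owner, wrapped_for_escrow

def decrypt(ciphertext: bytes, wrapped_key: bytes, unlocking_key: bytes) -> bytes:
    device_key = xor_bytes(wrapped_key, unlocking_key)
    return xor_bytes(ciphertext, device_key)

owner_key = secrets.token_bytes(KEY_LEN)
ct, w_owner, w_escrow = encrypt_device(b"private messages", owner_key)

# The owner unlocks with their own key, as designed...
assert decrypt(ct, w_owner, owner_key) == b"private messages"
# ...but whoever holds (or leaks) the escrow master key unlocks it too,
# with no passcode needed, and that one key opens every device.
assert decrypt(ct, w_escrow, escrow_master_key) == b"private messages"
```

The design point the sketch makes is that the escrow key is a single secret whose compromise exposes every device at once, which is why a leak or a successful attack on the key holder defeats everyone’s encryption, not just one suspect’s.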
Changing the words used to describe what Senator Feinstein and Director Comey want doesn’t change what it really is: an intentional weakness in our personal encryption, our private communication, and our online credit card transactions.
The push to erode privacy and online security hasn’t gone away, and the fight to protect encryption is ongoing. Sadly, it seems our legislators and law enforcement have lost sight of the inevitable outcome should this bill become law: the people they aim to protect will be even more vulnerable, and the people they want to target will simply find other ways to encrypt their data and communication.
[Thanks to TechCrunch for the heads up]