Apple’s Craig Federighi: Don’t Let the FBI Turn Back the Clock on Security


Apple isn't letting up on its public campaign to raise awareness in the fight to avoid complying with a court order for software the FBI can use to unlock an iPhone. The latest volley comes from Apple's senior vice president of Software Engineering Craig Federighi in a Washington Post op-ed claiming the FBI is trying to push the digital security clock back in time.

Apple's software boss speaks out against FBI iPhone unlocking order

The fight between Apple and the FBI started when the agency obtained a federal court order compelling the company to develop a version of iOS that bypasses the security features preventing brute force attacks on lock screen passcodes. Agents requested the altered iPhone operating system so they can hack into the phone Syed Farook had when he went on a shooting rampage with his wife Tashfeen Malik, killing 14 of their San Bernardino County coworkers and injuring 22 others.
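To see why those safeguards matter, here is a rough back-of-the-envelope sketch (not Apple's implementation; the per-try and lockout times are illustrative assumptions): a four-digit passcode has only 10,000 combinations, so without escalating delays or an attempt limit, brute force takes minutes rather than years.

```python
# Hypothetical sketch, not Apple's code: worst-case time to brute-force
# a passcode, with and without a per-attempt lockout delay.

def worst_case_hours(num_codes: int, per_try_s: float, lockout_s: float) -> float:
    """Worst-case time (hours) to try every passcode in the space."""
    return num_codes * (per_try_s + lockout_s) / 3600

# 10,000 four-digit codes, ~80 ms per guess:
fast = worst_case_hours(10_000, per_try_s=0.08, lockout_s=0)     # no safeguards
slow = worst_case_hours(10_000, per_try_s=0.08, lockout_s=3600)  # 1-hour lockout

print(round(fast, 2))  # ~0.22 hours: all codes fall in minutes
print(round(slow))     # ~10000 hours: over a year with a 1-hour lockout
```

The court order asked Apple to remove exactly this kind of throttling (along with the auto-erase limit), which is what collapses the attack from infeasible to trivial.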

Apple had been working with the FBI to get as much information from the iPhone as it could, but stopped short of creating a hackable version of iOS. The company said the court order was an overreach of government authority and a serious threat to privacy and security.

FBI Director James Comey, along with Apple senior vice president and general counsel Bruce Sewell, appeared before the House Judiciary Committee last week to share their thoughts on encryption, security, and law enforcement's place in our digital world. Apple also filed a formal objection to the court order and a motion to vacate.

Apple CEO Tim Cook condemned the court order, the company posted a public FAQ, Director Comey responded with his own open letter defending the FBI, and now Mr. Federighi is sharing his thoughts. His take:

It's so disappointing that the FBI, Justice Department and others in law enforcement are pressing us to turn back the clock to a less-secure time and less-secure technologies. They have suggested that the safeguards of iOS 7 were good enough and that we should simply go back to the security standards of 2013.

To be fair, we had good security in 2013, although that's a lot like saying we had good health care in 1970. We did, but what we have now is so much better. For iOS, that's a big thing because hackers are more sophisticated now and in many cases are capable of working around what used to be state of the art security.

Forcing Apple to create tools to bypass iOS security sets a dangerous precedent where law enforcement will push for their use in more and more cases, and at some point the code will slip free from the company's control into the wild. Once there, it'll be a free-for-all, with hackers and other governments using it as they please.

"Great software has seemingly limitless potential to solve human problems — and it can spread around the world in the blink of an eye," Mr. Federighi said. "Malicious code moves just as quickly, and when software is created for the wrong reason, it has a huge and growing capacity to harm millions of people."

Apple plans to fight the court order as long as it can and other tech companies are coming to the company's defense. The American Civil Liberties Union, the Electronic Frontier Foundation, and major tech companies all filed amicus briefs on Apple's behalf—and rightly so because the outcome of this case could have a major impact on every company encrypting customer data.

A world where our data is safe in name only doesn't sound like a very friendly place, nor does it seem like a place where privacy is protected. It sounds more like a world where governments and criminals run roughshod over our personal lives and data, and that's the kind of world the FBI is pushing for.

The Mac Observer Spin

Herein lies the problem: We shouldn't be in a position where a major company has to take a stand to protect us from the government and fight to preserve our privacy.



Lee Dronick

Too bad for the FBI that Apple doesn’t have the same level of security as some of the military units.


’... It sounds more like a world where governments and criminals run roughshod over our personal lives and data, and that’s the kind of world the FBI is pushing for.’

Bravo, Jeff! Well said!

Hitoshi Anatomi

Something crucial is apparently overlooked in the discussions over the backdoor. iPhones and many other smart devices already have valid backdoors, namely, a fingerprint scanner or a camera and software for capturing faces, irises, and other body features, which can be collected from unyielding, sleeping, unconscious, or dead people.

It is now known that authentication by biometrics usually comes with poorer security than PIN/password-only authentication. If Apple wants to claim that it is conscious of privacy and security, it could tell consumers to turn off the biometric functions. If the authorities want to have those backdoors open, they could tell consumers to keep them turned on all the time. And security-conscious consumers could certainly refrain from turning them on.


Is it actually the case that creating this key, to be used only by Apple to unlock a phone as directed by the courts, will make us all, or even millions of us, fair game to hackers and foreign governments? The way this debate is being framed, once this key is created it will somehow escape from Apple into the open and all our phones will become open books.

Isn’t it true that in order to use such a key, the evil hackers and government agencies need to have possession of my phone? If my phone is seized under a warrant following a crime, that is something the courts have been involved with for centuries. The only other scenario that comes to mind is my phone being lost or stolen. It would then have to come into the hands of evil people who also have this out-in-the-open key. Unless it is possible to use this key remotely, I just don’t see the security doomsday that is predicted.

If that is what will happen, then why didn’t it happen before Apple locked down the iPhone?

Old UNIX Guy


The problem is that if the U.S. government wins then Russia, China, North Korea, etc., are all going to demand the same back door and Apple will have no legal standing to refuse their requests.

And once that happens (not if, if the U.S. govt wins), do you honestly think the backdoor won’t escape into the wild?




Its existence will certainly increase the market for stolen iPhones. It will then make identity theft much easier. It may be the case that the initial software will only work with the phone present, but that doesn’t mean that talented hackers can’t modify it.

You should also remember that it isn’t Apple that’s writing the code; it’s Apple employees. How is it that they are connected to the crime, which is required for the application of this law? They can also be coerced and influenced by outside entities to write similar code.


Yes. I am not happy this is an issue at all, and I understand the potential for trouble. I only want to caution against the extremes. If Apple is legally made to provide this unlocking code but is the only party that sees and uses it, the code should be as safe from hackers as iOS itself. Apple has been very capable of keeping its goods secret.

As far as Apple and its employees providing information in response to warrants is concerned, they have done this in the past and will continue to do so. The employees of telecommunications companies have been doing this as well. Regardless of the outcome of this case, my iPhone will still be secure if I use good protection practices. No one will see what is on it unless I allow it or am careless.


I’m not sure that the writ applies to Apple, itself, and really doubt that it applies to Apple’s employees.  The writ applies to those with a connection to the crime.  That they owned an iPhone is a pretty tenuous connection.  That’s like compelling a car manufacturer to do something because the accused owns one of their cars.  Compelling the manufacturer’s engineers is another matter entirely.

The FBI really has no idea what’s on the phone. The terrorists wiped their computers, so there is really no way to know if it contains anything at all. This is just fishing by the FBI. Note that a warrant is for providing information, not for building a tool.


skipaq said: “If Apple is legally made to provide this unlocking code but is the only party that sees and uses it, the code should be as safe from hackers as iOS itself. Apple has been very capable of keeping its goods secret.”

As I understand the law in the United States, it is utterly NOT reasonable to assume that it would be true that “...the code should be as safe from hackers as iOS itself”.

Instead, it seems most reasonable to assume this sequence of events:
- Suppose a prosecution or civil suit ever involved evidence obtained through another court-mandated use of this same “FBIOS” (the evidence-gathering instrument: code operated in private only by Apple, but with an external data link to the government agency subpoenaing that iPhone’s contents for a data-safe, high-speed brute-force passcode guess). On the basis of that legally established precedent, all competent US defense lawyers would be entitled to subpoena the “FBIOS” code itself for examination by their experts, and any lawyer who failed to subpoena that code might generally be considered incompetent.
- At that point, the code would no longer be only in Apple’s hands. It would be in the hands of any such lawyer and his or her assistants and agents (who might be facing bribes in the millions of dollars for the sale of that code), as well as any law-enforcement office. And of course, as we all know, whatever is in the hands of the DOJ/FBI should reasonably be expected to be in the hands of Chinese, Russian, and crime-syndicate hackers in short order.
