Apple is Making iPhone Hacking A Lot More Difficult for Law Enforcement with iOS 11.4

iPhone X with Lightning port to USB blocked

Apple’s quest to keep the data on our iPhones safe and secure is taking an interesting turn that no doubt won’t please law enforcement. When iOS 11.4 ships it’ll include a security feature that disables the Lightning port if the iPhone hasn’t been unlocked for seven days.

The feature is called USB Restricted Mode. Apple describes it like this:

To improve security, for a locked iOS device to communicate with USB accessories you must connect an accessory via Lightning connector to the device while unlocked, or enter your device passcode while connected, at least once a week.

The Lightning port will still work for charging after the seven-day window closes, but it won't sync or pass any data. To regain full Lightning port functionality, you need to enter the device passcode.
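The policy described above can be modeled in a few lines. This is a hedged, illustrative sketch only; Apple's actual implementation lives inside iOS and is not public. The function name and the fixed seven-day constant are assumptions based on the behavior the article describes:

```python
from datetime import datetime, timedelta

# Illustrative model of the USB Restricted Mode policy described above.
# Not Apple's actual code; the constant reflects the reported seven-day window.
RESTRICTED_AFTER = timedelta(days=7)

def usb_data_allowed(last_unlock: datetime, now: datetime) -> bool:
    """Return True if data over the Lightning port would still be allowed.

    Data transfer is permitted only within seven days of the last unlock
    (or last passcode entry while connected). Charging is unaffected
    either way, so this models only the data side of the port.
    """
    return now - last_unlock < RESTRICTED_AFTER

# Example: a phone unlocked three days ago still passes data;
# one untouched for ten days would need the passcode first.
now = datetime(2018, 5, 15)
print(usb_data_allowed(now - timedelta(days=3), now))
print(usb_data_allowed(now - timedelta(days=10), now))
```

The key design point is that the clock resets on every unlock, so a phone in regular use never hits the restriction; only a device sitting locked for a week, such as one in an evidence locker, loses data access over the port.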

The iPhone Hacking Time Limit

That’s bad news for law enforcement agencies relying on companies like Cellebrite and Grayshift to unlock iPhones. These companies typically connect a device physically to the iPhone’s Lightning port to break the passcode and access data.

Government agencies typically have a backlog of devices they want to crack. In some cases, that backlog is counted in years. Now Apple is about to limit the government’s window to seven days.

iOS 11.4 can disable USB connections through the Lightning port after seven days

Apple’s new time restriction means law enforcement agencies will have to decide quickly which iPhones they want to hack for evidence. Even so, the odds of forensic specialists keeping up with the workload are pretty slim.

That doesn’t, however, mean Apple has found a foolproof way to keep governments and hackers from breaking through iPhone security. Attackers can still desolder chips from the phone’s circuit board and work to extract data from those directly, although that requires far more skill than simply plugging in a cable and waiting.

Apple’s Fight Against Government iPhone Hacking

Apple’s efforts to lock down our iPhones to protect them from hacking took a very public turn in early 2016 when the FBI tried to force the company to create a hackable version of iOS. The FBI wanted the special operating system so it could hack into an iPhone recovered from a mass shooting suspect.

[The Government’s Bad Move: Ordering Apple to Hack iPhone Security]

[FBI Hacks into Syed Farook’s iPhone, will Withdraw Apple’s Unlock Order]

The suspect was killed in a shootout with police and no one else knew the iPhone’s passcode. Apple recovered what data they could from the phone, but didn’t have a way to work around the passcode to access its encrypted data.

The FBI went so far as to get a court order demanding Apple make the hackable iOS version. Apple resisted, saying the FBI and U.S. government were overstepping their authority, and added that a hackable operating system would set a dangerous precedent.

Apple never had to go to court because the FBI dropped its case when a hacking company, presumably Cellebrite, gained access to the iPhone’s encrypted content. Fast forward to spring 2018, and Grayshift was making the news with its GrayKey iPhone hacking device for law enforcement.

[GrayKey Underscores Why We Need Strong iPhone Passcodes]

Those hacking devices are about to become far less useful thanks to USB Restricted Mode. Once iPhone and iPad owners can install the update, law enforcement agencies will have to work much faster to unlock iPhones for evidence.

That’s not going to sit well with government agencies looking to hack into encrypted devices. It’s a cat-and-mouse game where Apple needs to stay ahead, despite government desires for easy hackability; otherwise our data is susceptible to anyone with the right skills and equipment, both good guys and bad guys.

6 thoughts on “Apple is Making iPhone Hacking A Lot More Difficult for Law Enforcement with iOS 11.4”


  • As I’m not a constitutional lawyer, I don’t understand how the 4th and 5th Amendments in the U.S. don’t apply to such searches without a judge-issued warrant. I also don’t understand how law enforcement folks are not subject to criminal charges and loss of job for violating these laws and for their non-performance in upholding them, as they swore they would do.

    If you or I did not do a key part of our jobs, and undermined the organization we work for, we would likely lose our jobs.

    Why are law enforcement folks not held accountable?

    1. The justice system is just 10 years behind understanding that these devices are an extension of our brains/memories. It will get really messy when they start doing actual implants that do some of the same things.

      Not to mention the next wave of fMRI technologies and the ability to read thoughts without consent…

  • That’s not going to sit well with government agencies looking to hack into encrypted devices.

    No, Jeff, it’s not.

    Two things.

    First, we need to see how governments worldwide will respond to this. That includes not simply the overt, direct responses from agencies decrying this move, and from competitors seeking to curry favour (and contracts) at Apple’s expense who will attempt to portray this as Apple defending criminals, terrorists, child sexual predators, and any other miscreant one cares to add to the list. In authoritarian and nationalist environments, we may also see this portrayed as anti-law enforcement, as disloyalty to (insert nation of choice) or even treason, and as a national security threat, particularly as elections get close in those countries that even bother holding them (which is most), or after some major crime or terrorist act. That could have an impact not simply on market access, but could bring retaliatory legislation rendering such technology illegal. I suspect, at least in most Western countries, this may be hard to pass.

    Second, and forgive my repeating this, Apple’s superior security measures, such as end-to-end encryption, the Secure Enclave, and now USB Restricted Mode, might lie at the heart of why Apple are not disseminating these technologies to other platforms, like Android. In order to be free to compete in certain controlled markets, Apple may have had to agree not to make such technologies available to the wider prevalence of less secure devices on the market, like Android handhelds. This may be a quiet agreement that allows Apple to be able to sell their devices in authoritarian environments. Admittedly, this is speculation; however, if we continue not to see these technologies extend beyond iOS, it remains one of many plausible explanations.
