Apple can comply with the FBI court order

Earlier today, a federal judge ordered Apple to comply with the FBI’s request for technical assistance in the recovery of the San Bernardino gunman’s iPhone 5C. Since then, many have debated whether the FBI’s requests are technically feasible given the strong encryption on iOS devices. Based on my initial reading of the request and my knowledge of the iOS platform, I believe all of the FBI’s requests are technically feasible.

The FBI’s Request

In a search after the shooting, the FBI discovered an iPhone belonging to one of the attackers. The iPhone is the property of the San Bernardino County Department of Public Health, where the attacker worked, and the FBI has permission to search it. However, the FBI has been unable, so far, to guess the passcode to unlock it. On iOS devices, nearly all important files are encrypted with a combination of the phone passcode and a hardware key embedded in the device at manufacture time. If the FBI cannot guess the phone passcode, then it cannot recover any of the messages or photos from the phone.
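To make the dependence on both secrets concrete, here is a rough sketch of the scheme described above. This is illustrative only — the choice of PBKDF2, the iteration count, and all names are my assumptions, not Apple’s actual key derivation — but it shows why passcode guesses must run on the device that holds the hardware key.

```python
# Illustrative sketch only -- NOT Apple's actual key derivation.
import hashlib
import os

HARDWARE_UID = os.urandom(32)  # stand-in for the key fused into the chip at manufacture

def derive_file_encryption_key(passcode: str, iterations: int = 50_000) -> bytes:
    # "Tangle" the passcode with the device-unique hardware key. Without the
    # hardware key, an attacker cannot test guesses against a copied filesystem.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), HARDWARE_UID, iterations)

key = derive_file_encryption_key("1234")  # same passcode on another device => different key
```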

There are a number of obstacles that stand in the way of guessing the passcode to an iPhone:

  • iOS may completely wipe the user’s data after too many incorrect PIN entries
  • PINs must be entered by hand on the physical device, one at a time
  • iOS introduces a delay after every incorrect PIN entry

As a result, the FBI has made a request for technical assistance through a court order to Apple. As one might guess, their requests target each one of the above pain points. In their request, they have asked for the following:

  1. [Apple] will bypass or disable the auto-erase function whether or not it has been enabled;
  2. [Apple] will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and
  3. [Apple] will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

In plain English, the FBI wants to ensure that it can make an unlimited number of PIN guesses, that it can make them as fast as the hardware will allow, and that it won’t have to pay an intern to hunch over the phone and type PIN codes one at a time for the next 20 years — it wants to guess passcodes from an external device like a laptop or other peripheral.

As a remedy, the FBI has asked for Apple to perform the following actions on their behalf:

[Provide] the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (“SIF”) that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory (“RAM”) and will not modify the iOS on the actual phone, the user data partition or system partition on the device’s flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade (“DFU”) mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.

Again in plain English, the FBI wants Apple to create a special version of iOS that only works on the one iPhone they have recovered. This customized version of iOS (ahem FBiOS) will ignore passcode entry delays, will not erase the device after any number of incorrect attempts, and will allow the FBI to hook up an external device to facilitate guessing the passcode. The FBI will send Apple the recovered iPhone so that this customized version of iOS never physically leaves the Apple campus.

As many jailbreakers know, firmware can be loaded via Device Firmware Upgrade (DFU) mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on its own — the FBI does not have the secret keys that Apple uses to sign firmware.
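As a rough illustration of this kind of boot-time gate, the sketch below accepts a firmware image only if its signature verifies against the vendor’s public key. The RSA/PKCS#1 construction and every name here are illustrative assumptions; Apple’s real image formats and chain of trust differ.

```python
# Conceptual sketch of a firmware signature check -- not Apple's actual format.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

def verify_firmware_image(public_key, image: bytes, signature: bytes) -> bool:
    """Accept a firmware image only if the vendor's signature checks out."""
    try:
        public_key.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

# Demo with a throwaway key pair standing in for Apple's signing key.
signing_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
image = b"example firmware image"
signature = signing_key.sign(image, padding.PKCS1v15(), hashes.SHA256())

assert verify_firmware_image(signing_key.public_key(), image, signature)
assert not verify_firmware_image(signing_key.public_key(), image + b"tampered", signature)
```

Without the private half of the signing key, no modified image — the FBI’s or anyone else’s — will pass a check like this.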

Enter the Secure Enclave

Even with a customized version of iOS, the FBI has another obstacle in their path: the Secure Enclave (SE). The Secure Enclave is a separate computer inside the iPhone that brokers access to encryption keys for services like the Data Protection API (aka file encryption), Apple Pay, Keychain Services, and our Tidas authentication product. All devices with TouchID (or any devices with A7 or later A-series processors) have a Secure Enclave.

When you enter a passcode on your iOS device, this passcode is “tangled” with a key embedded in the SE to unlock the phone. Think of this like the 2-key system used to launch a nuclear weapon: the passcode alone gets you nowhere. Therefore, you must cooperate with the SE to break the encryption. The SE keeps its own counter of incorrect passcode attempts and gets slower and slower at responding with each failed attempt, all the way up to 1 hour between requests. There is nothing that iOS can do about the SE: it is a separate computer outside of the iOS operating system that shares the same hardware enclosure as your phone.
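A toy model may help make this concrete. Everything below — the delay schedule, the KDF, the class name — is an illustrative assumption rather than Apple’s implementation; the point is only that the failure counter and the delay live inside the SE, beyond the reach of any iOS update.

```python
# Toy model of the Secure Enclave's gatekeeping -- illustrative, not Apple's design.
import hashlib
import hmac
import os
import time

class ToySecureEnclave:
    # Escalating delays (seconds) after repeated failures, capped at one hour.
    DELAYS = [0, 0, 0, 0, 0, 1, 5, 15, 60, 3600]

    def __init__(self, passcode: str):
        self._uid = os.urandom(32)          # device-unique key; never leaves the enclave
        self._expected = self._tangle(passcode)
        self._failures = 0

    def _tangle(self, passcode: str) -> bytes:
        # Stand-in for the real entangling KDF; only the result is ever visible.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), self._uid, 10_000)

    def try_passcode(self, guess: str) -> bool:
        # The delay is enforced here, inside the enclave, where iOS cannot skip it.
        time.sleep(self.DELAYS[min(self._failures, len(self.DELAYS) - 1)])
        if hmac.compare_digest(self._tangle(guess), self._expected):
            self._failures = 0
            return True
        self._failures += 1
        return False

se = ToySecureEnclave("1234")
assert not se.try_passcode("0000")  # wrong guess bumps the counter
assert se.try_passcode("1234")      # correct guess resets it
```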

The Hardware Key is stored in the Secure Enclave in A7 and newer devices

As a result, even a customized version of iOS cannot influence the behavior of the Secure Enclave. It will delay passcode attempts whether or not that feature is turned on in iOS. Private keys cannot be read out of the Secure Enclave, ever, so the only choice you have is to play by its rules.

Passcode delays are enforced by the Secure Enclave in A7 and newer devices

Apple has gone to great lengths to ensure the Secure Enclave remains safe. Many consumers became familiar with these efforts when “Error 53” messages appeared after third-party replacement of, or tampering with, the TouchID sensor. iPhones are restricted to work with only a single TouchID sensor via device-level pairing. This security measure ensures that attackers cannot build a fraudulent TouchID sensor that brute-forces fingerprint authentication to gain access to the Secure Enclave.

For more information about the Secure Enclave and Passcodes, see pages 7 and 12 of the iOS Security Guide.

The Devil is in the Details

“Why not simply update the firmware of the Secure Enclave too?” I initially speculated that the private data stored within the SE was erased on updates, but I now believe this is not true. Apple can update the SE firmware; doing so does not require the phone passcode and does not wipe user data. Apple can therefore disable the passcode delay and disable auto-erase with a firmware update to the SE. After all, Apple has updated the SE with increased delays between passcode attempts, and no phones were wiped.

If the device lacks a Secure Enclave, then a single firmware update to iOS will be sufficient to disable passcode delays and auto erase. If the device does contain a Secure Enclave, then two firmware updates, one to iOS and one to the Secure Enclave, are required to disable these security features. The end result in either case is the same. After modification, the device is able to guess passcodes at the fastest speed the hardware supports.

The recovered iPhone is a model 5C. The iPhone 5C lacks TouchID and, therefore, lacks a Secure Enclave. The Secure Enclave is not a concern. Nearly all of the passcode protections are implemented in software by the iOS operating system and are replaceable by a single firmware update.

The End Result

There are still caveats in these older devices and a customized version of iOS will not immediately yield access to the phone passcode. Devices with A6 processors, such as the iPhone 5C, also contain a hardware key that cannot ever be read. This key is also “tangled” with the phone passcode to create the encryption key. However, there is nothing that stops iOS from querying this hardware key as fast as it can. Without the Secure Enclave to play gatekeeper, this means iOS can guess one passcode every 80ms.

Passcodes can only be guessed once every 80ms with or without the Secure Enclave

Even though this 80ms limit is not ideal, it is a massive improvement over guessing only one passcode per hour with unmodified software. After the elimination of passcode delays, it will take a half hour to recover a 4-digit PIN, hours to recover a 6-digit PIN, or years to recover a 6-character alphanumeric password. It has not been reported whether the recovered iPhone uses a 4-digit PIN or a longer, more complicated alphanumeric passcode.
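For readers who want the arithmetic behind these estimates, the sketch below computes worst-case sweep times under a single assumption: exactly one guess per 80ms. These are raw floors; real-world overhead only makes them longer.

```python
# Back-of-the-envelope crack times at one guess per 80 ms (worst case: full keyspace).
GUESS_SECONDS = 0.080

def worst_case(keyspace: int) -> str:
    seconds = keyspace * GUESS_SECONDS
    for unit, size in (("years", 31_557_600), ("days", 86_400),
                       ("hours", 3_600), ("minutes", 60)):
        if seconds >= size:
            return f"{seconds / size:,.1f} {unit}"
    return f"{seconds:,.1f} seconds"

print(worst_case(10**4))   # 4-digit PIN:         ~13.3 minutes
print(worst_case(10**6))   # 6-digit PIN:         ~22.2 hours
print(worst_case(62**6))   # 6-char alphanumeric: ~144 years
```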

Festina Lente

Apple has allegedly cooperated with law enforcement in the past by using a custom firmware image that bypassed the passcode lock screen. This simple UI hack was sufficient in earlier versions of iOS since most files were unencrypted. However, since iOS 8, it has become the default for nearly all applications to encrypt their data with a combination of the phone passcode and the hardware key. This change necessitates guessing the passcode and has led directly to this request for technical assistance from the FBI.

I believe it is technically feasible for Apple to comply with all of the FBI’s requests in this case. On the iPhone 5C, the passcode delay and device erasure are implemented in software and Apple can add support for peripheral devices that facilitate PIN code entry. In order to limit the risk of abuse, Apple can lock the customized version of iOS to only work on the specific recovered iPhone and perform all recovery on their own, without sharing the firmware image with the FBI.


For more information, please listen to my interview with the Risky Business podcast.

  • Update 1: Apple has issued a public response to the court order.
  • Update 2: Software updates to the Secure Enclave are unlikely to erase user data. Please see the Secure Enclave section for further details.
  • Update 3: Reframed “The Devil is in the Details” section and noted that Apple can equally subvert the security measures of the iPhone 5C and later devices that include the Secure Enclave via software updates.

271 thoughts on “Apple can comply with the FBI court order”

  3. Hi Dan, can you cite any references for this?

    “Although this feature is not described in Apple iOS Security Guide, it is widely believed that updates to the Secure Enclave wipe all existing keys stored within it. An update to the Secure Enclave is therefore equivalent to erasing the device.”

    FWIW the iOS Security Guide is as comprehensive as it gets on the subject.

      • Correct. The Secure Enclave was introduced with the A7. If your phone supports fingerprint unlock, you should have the Secure Enclave. The 5s with iOS 8+ falls into this category.

    • “I initially speculated that the private data stored within the SE was erased on update but I now believe this is not true. After all, Apple has updated the SE with increased delays between passcode attempts and no phones were wiped. In all honesty, only Apple knows the exact details.”

      Could be that it allows preserving keys only if it’s unlocked with the passcode; normally with a FW upgrade you’d need to unlock the device anyway.

    • The key for file encryption is made of both the user passcode and the hardware key. All passcode cracking must therefore occur on the physical device. There is no way around it unless you can extract the hardware key (very difficult).

      • On the 4S/5/5C (<= iOS 8.4.1) there is a user-code disable and wipe disable available in secret corners of the forensic community. *cough* Cellebrite CAIS *cough* Not sure if the phone in question was on iOS 9 or not. Also, it seems you should be able to mill down the chip and use a PCB x-ray to read the fused key directly from hardware. Dmitry Nedospasov did a similar talk on it at 30c3.

      • “…mill down the chip and use a PCB x-ray to read the fused key directly from hardware.” Now that sounds like a promising approach. Software that allows decryption ONLY if the HW key is known would be a far safer thing to create.

      • I doubt it. Techniques for physically obscuring a masked key have been in use on credit card chips for more than 10 years. Assuming Apple used these common techniques for the SE, milling isn’t going to get you anywhere, as the process of milling would need intimate knowledge of the chip design to ascertain where the key is located and how the mask is scrambled. Electron microscopes are available for a lot less than $1M, but looking at such a chip reveals a mask pattern that looks like white noise. Similar masking techniques prevent current- or voltage-sensing approaches. Life with the A6 might be different, but I’m willing to bet that there are sufficient hardware secrets in the chip design that Apple has used some of these techniques from the outset.

      • The milling down and PCB x-ray effort is a daily event at an Ottawa company that investigates devices for patent infringement. BTW: that activity is forbidden by the DMCA, which is why it happens in Canada.

    • The short answer is that there is no remotely plausible chance to brute force the key. Billions of computers, each doing billions of encryptions per second, give an expected crack time of billions of years.

  16. Sounds like Apple have created a very secure environment for their users, at least, on newer iPhones. It would be very interesting to compare how this FBI scenario would work on an Android device.

      • Android can be made secure by using full-disk encryption with a strong passphrase. The downside, of course, is that you actually have to enter this passphrase on boot.

      • With the lack of a hardware key, the filesystem can be extracted off an Android phone and cracked in the cloud by a virtual supercomputer. This will yield the passcode or password far faster than a comparable iOS phone.

      • I think you are assuming a nontrivial password when using FDE for it to be cracked easily off of an extracted filesystem. Still, you are right — it’s still loads faster than having to deal with a hardware key.

  18. Doesn’t the use of DFU mode wipe the device completely as the new or replacement OS is installed, leaving nothing of evidential value? Only asking — I don’t have a crate load of spare iPhones to experiment with….

  21. My problem with all of this is that even if it is technically feasible for Apple to crack the 5C, the precedent it will set will dangerously undermine our protections in the future — what’s to stop the FBI from coming back and demanding a permanent backdoor?

    • Because they are not allowed by law. They need a court order for a specific phone. An open back door would be similar to allowing them to listen in on all of our conversations, and that has never been done or even attempted.

    • “what’s to stop the FBI …?”

      1. The FBI and US Attorneys, who know that warrants require at least the appearance of probable cause and a specific description of their target.
      2. Judges, who will be far slower to approve a general capability than a specific instance like this one.
      3. Appellate judges, who will be far quicker to overturn an order for a general capability than a particular instance.

      Apple’s stance on this certainly will generate pushback from legislators who disagree with them, but the outcome of that is uncertain because the matter is complex and has numerous ramifications.

      • No judge will ever require a “general” capability. Each will only require decryption of a specific phone. The engineers at Apple will eventually be the ones who demand a “general” capability to assist them in dealing with the blizzard of court orders that will follow. The prosecutors’ lame promise that this is a one-time thing ignores the fact that there are state attorneys general and prosecutors in other countries where Apple does business that are not bound by any DoJ promise. If Apple is forced to comply, it won’t be a one-time thing…

      • If this is allowed, and Apple is forced to comply … expect a “general” request from the FISA court. Since their requests are [mostly] classified, you’ll never hear about it. But it will happen because someone in one of the myriad government agencies that make requests of the FISA court will determine it’s a matter of “national security”. Done.

        Apple is right to try to stop this right here, right now. We go over a privacy cliff if they can’t.

  25. An interesting detail: where does the SE get the time from? How is the delay measured? What happens to the clock if you leave the device completely unpowered? I think it’s possible to design this “securely”, e.g. demand that the circuitry stays powered for N minutes without interruption. Still, meddling with the clock could be an interesting attack vector…
    It should also be possible to design the SE to destroy the master key after N unsuccessful unlock attempts.

    • I don’t see any reason why the SE would *not* have its own isolated timer.
      “..SE to destroy the master key after N unsuccessful unlock attempts” — isn’t that what it does already provided the setting is enabled in OS?

  26. I disagree with your article’s conclusions. There is no easy way to make a version of the software that only works on one phone. To do so would require hard-coding the software to work via some serial number or other unique identifier. The problem with that is if/when that code got leaked, a savvy hacker could possibly remove the safeguard, and then you have a true public backdoor….. They also mention that they would even let Apple do it themselves. The issue is that would require Apple to create this less secure software, and would make them a huge target for hackers that want a copy for their own use…. This is definitely not a black/white issue…. This case has the chance to set a major legal precedent, and I’m glad Apple is standing up and fighting it….

      • Iseltzer, please explain how that will work. How will Apple limit it to ONLY THIS PHONE? Phones are mass-produced to be nearly identical; only a few identifying numbers are different. Like I originally posted, Apple would have to hard-code that in, which a savvy hacker could easily remove or replace to breach another phone…..

      • It’s easy: Apple makes the build, the FBI brings the specific phone over to Apple, they install the special iOS build on it (and only it, over a cable, not over the Internet) and crack the password. No other copies of the software are used. It’s not distributed to anyone.

      • “It’s not that the special version is built to work only on this phone. It’s that the special version will only be run on this phone.”

        You can’t guarantee that. Once created, somebody will do anything they can to obtain a copy of the “special version”, after which all phones become vulnerable.

      • @Wisco24 So I think there are two points, if I’ve read the above correctly.
        1) Apple will limit it by not distributing the firmware: “The FBI will send Apple the recovered iPhone so that this customized version of iOS never physically leaves the Apple campus.”
        2) The firmware is signed by Apple. If it were produced to match specific and unique features of the phone, like serial numbers etc., I would have thought the substitution you are talking about would mean the image would no longer verify.

      • No, developers create builds for specific iOS devices all the time during testing. They use the device’s UUID and sign the code/app based on that. However, I believe it’s fairly trivial to re-sign an app for another device. Whether Apple can more securely sign or otherwise lock a specific iOS image for a particular UUID I don’t know, but I’d be surprised if they couldn’t.

      • Is it really that easy? Now the software that contains these backdoors exists….. How can you be 100% sure the software would never be lost, misplaced, stolen, etc.? If Apple creates this, then they are going to be a prime hacking target for anyone who wants that software version. The government isn’t going to give them more money to help them thwart the new waves of attacks they would be hit with over this software…… This is in no way “easy”.

      • Not that easy, Iseltzer. The FBI would be keeping the phone with that modified version of iOS on it and they would do whatever they could to copy the OS once the phone is unlocked. As long as the phone is locked, they can’t do a thing to it. Once unlocked, it would pretty much be wide open.

      • Assume Apple flashes stock FW back onto the device, deletes their customized iOS before giving the phone back to the FBI, and never lets the phone with the custom firmware on it leave their premises or connect to any network / Bluetooth / Wi-Fi / whatever. It still creates a horrible precedent.

    • I thought Apple had to sign any authorized firmware update for a specific device; it isn’t just signed by Apple and then installable on any device. In addition, it could have a specific check in the FW for that specific device (based on IMEI, MAC, device ID, etc.)

      If “any savvy hacker” could modify the firmware and install it on another device, they could also modify the stock firmware and install it on any device. They can’t because they don’t have Apple’s signing keys.

      • It all depends on how Apple is signing it and limiting it. I’m not arguing that it is not possible from the technical aspect, and Apple hasn’t said that either. They are looking at this from the precedent side of it, and how it could affect them, and everyone, long term. This also opens the door for any other law enforcement agency to request it if Apple creates it. Arguing this is “a special case” is crap. If the FBI actually believes that, they should offer Apple a document stating it is a one-time thing, and they will never ask for it again. I would bet pretty much anything that the FBI wouldn’t agree to those terms. It sets a precedent that Apple is willing to comply. Apple doesn’t want to dedicate resources just to unlock phones every time a different agency wants access.

    • Why/how would that code get leaked if all of this is being performed in an Apple facility, under their supervision? That seems to be a big point that people are missing, even from the language specifically in the order. The FBI wants the data, by any means necessary. If Apple said that they could produce this and hand it to them, they’d take it. The technical jargon within the order is nothing more than a “hey, we know that you know exactly what we’re talking about, and you know how to do it, so ‘please’ do it for us.”

      • If the S/N and IMEI are hard-coded in the software, and the software includes a check of the hard-coded values against those in the device, there are two possibilities.

        If the ID values in the executable software are changed, the digital signature will no longer be valid and the modified software will not load and run on any device.

        If the ID values are changed in the source code, the compiled version will not have Apple’s digital signature and will not load and run on any device.

      • Fascinating article. I now believe that Apple could comply. But here’s a thought experiment:

        Apple develops this firmware update in its most secure development environment; maybe even a new environment created for this one purpose, never to be used again. They test the entire process using another iPhone 5c configured the way the subject phone is configured. The FBI brings the target phone into the secure facility, Apple determines the UUID and other codes they will use to sign and lock the firmware to the phone, they sign, deploy, brute-force into the phone on their own, extract all data, overwrite the firmware update and leave the phone unlocked with the pass code set to the FBI Director’s birthday. They destroy their development environment, set off a little EMP, burn some incense, bill the FBI, and life is good.

        Great, here’s the experiment. Same situation, but someone has bribed a few Apple employees with a few million self-funded dollars, or else has kidnapped and started killing family members.

        Do the signing keys, software development environment, etc. remain secrets in Apple’s secure, Faraday-cage-enclosed environment?

        Is there a check large enough, or a threat great enough, to cause this to be stolen?

        Just checking.

      • @johnwsaunders3: Apple must guard this key as carefully as Coca-Cola guards their secret formula. Anyone who has it can produce and sign software that Apple devices will accept as genuine Apple products, potentially destroying the security of the device and exposing all the data on it. The key is an extremely valuable target now, and it would not become noticeably more valuable if combined with the software the order asks Apple to develop.

      • Thomas, I believe the signing certificates will become even more valuable since there will be a purpose-built application to use it with. Suddenly, the barrier to entry is only to steal the software and certificates, and not to also have the expertise necessary to write the actual hack.

        I suspect that the organizations which might steal it are likely better at theft than at programming firmware.

  28. Yes, Apple is not disputing that they could technically achieve this. They’re arguing (quite reasonably) that it’s terrible public policy for the country and weakens the safety of her citizens to do so.

    • If your security is dependent on what Apple will or will not do, it is already not secure. That’s why later phones have the Secure Enclave.

      The 5c is simply not as secure as the later phones. Apple’s signing keys are already a backdoor to them, allowing significant weakening of the security of the phone.

      • The article speculates that Apple can re-flash the SE to disable the auto-wipe and delays. Not exactly a “secure” enclave if this can actually be done.
        IMHO Apple would be wise to hard-code this in newer phones, as I cannot fathom why this would ever need to be changed in any situation other than the government forcing them to do so.

      • @srgmac
        The Secure Enclave can be flashed/updated, but not downgraded. To flash it you still need the user passcode. You can’t flash an iOS device unless you do it from a trusted computer, and trusting a computer has to be done from an unlocked iOS device. Otherwise the iOS device won’t accept the connection.

      • @Andy you indeed can load a signed ramdisk in DFU mode. This doesn’t require the phone to be unlocked.

    • Neither does this article claim that Apple says it cannot do it. The claim made by the article is that doing this one-off task would not actually weaken the safety of anyone else.

    • Not to mention forcing Apple to work on this — the FBI, DHS, NSA, etc. already have more than enough funding, engineers, time, and resources to work on this themselves. The DHS already supposedly has the ability to disable the time delay and data wipe on iOS 8.4.2 and below.

    • @Andy: The SE can apparently be flashed by loading a signed ramdisk whilst in DFU mode (the phone can be locked and it can still be flashed; that’s no problem here). Not only is this horrible on its own, but the fact that the delay timers and the limit-to-10-tries-then-wipe values are even modifiable by SE firmware is also ridiculous. I just don’t see why someone would EVER need to modify those values unless it’s for a government request to brute force the phone, which Apple obviously doesn’t want.
      The SE has its own internal counter and clock also, just FYI to anyone who cares. I believe someone was talking about cutting power to the chip at exactly the right time after trying a password; apparently in some cases it would not increase the counter but still run a check to see if the password is valid and notify you of such? But I have a hard time believing this… Maybe they were talking about iOS 8 and non-SE devices.

      • What the FBI and other security agencies want, and have made very clear in the past and even with this case now, is a back door that they can use any time they want on any iPhone. They’re using the excuse that this is a one-time special case, but if they retain possession of the phone after the task, they retain a copy of the back door for their own use that would be (comparatively) easy to study, duplicate and modify to crack the later models.

        In short, the US government is blatantly stating that Apple’s encryption is the best in the world and they can’t crack it on their own.

      • The FBI’s specific request was for a version of iOS with the counter removed; to be used only once on the phone in question, true… but still in their possession and capable of being copied and modified to work on any iPhone using the same encryption technology.

      • How can that patch NOT remain in the phone unless they turn right around and re-install the un-patched version, doubling their work and not even guaranteeing that the data doesn’t simply gain a different password through mis-handling as it appears has occurred here?

      • My understanding is that the patch is meant to disable the auto-wipe function, remove the delay between pass code retries and to facilitate the rapid entry of pass codes through something faster than the touch screen.

        After the patch has performed all of its functions and the pass code is known, why could Apple not remove the patch or else upload another patch which would overwrite the bits of the first?

      • Reinstalling the original iOS code again would re-activate the timer, the wiper AND the counter. The FBI has made it quite clear they want all three aspects of iOS’ security removed from this phone for however long it takes the FBI itself to crack the passcode. This means the FBI/NSA would be in physical possession of a known and valid cracked code and they would be stupid (albeit ethical) if they did not copy that cracked code.

        People are making too many assumptions about this, totally ignoring the many very probable reverberations that would come from this. The iPhone, despite all the claimed vulnerabilities touted by Apple haters, is clearly the most secure commercially available communications device in the world by the FBI’s own off-handed admission. ANY individual or government currently using iPhones in otherwise secure areas would never again be able to trust that spies from another government couldn’t use that code to access their secure data. The US is already considered a ‘questionable’ ally considering Snowden’s reported revelations about the NSA itself and how the US already spies on everyone… with or without cause.

      • It was my understanding that the FBI would accept having the pass code cracked at Apple’s campus. Once they know the pass code for the phone, why do they need the timer, wiper and counter changed? Won’t the FBI be able to get into the phone with only the pass code after that?

      • “How can that patch NOT remain in the phone unless they turn right around and re-install the un-patched version, doubling their work and not even guaranteeing that the data doesn’t simply gain a different password through mis-handling as it appears has occurred here?”

        The request specifically allows for this all to occur at an Apple facility, where the commercial software will be reinstalled on the phone before the phone leaves the facility. This doesn’t double the workload; producing a custom version of the software is the predominant workload here; installing it and then reinstalling the commercial software is trivial.

  34. The answer is still the same: it isn’t that Apple “won’t” hack, it’s that they “can’t”. The fact of the matter is, even with a “hacked” OS that would have to be built by Apple to bypass the auto-wipe feature, the data is still encrypted, even to the OS. While it might be possible that at the moment this is “all” the FBI is asking for, it won’t stop there. How many years of them punching in PIN codes until this starts again with a new order to “force Apple to comply” with decrypting the data? That order would be simply impossible to satisfy, since they do not have access to the keys.

    If Apple demonstrates that this fail-safe wipe can be bypassed with a custom OS installed, without even a PIN/password/key protection, what stops anyone else in the world who isn’t an “approved” unlocker like the FBI or Apple from doing this to your devices? Apple will be on the liability end of a very big security hole. Companies will stop approving these devices and the issues abroad will be even worse. China, for example, is not going to be happy that the US can surreptitiously change settings without their authorization and infiltrate data.

    How long after this supposed hacked OS is built by Apple will one suddenly show up in the wild and take advantage of this workaround? You think the FBI is going to use this only once? You can bet that this hacked OS will be kept by them “for future needs” and won’t be controllable. If any such workarounds exist, Apple is sure to plug them up immediately after. This is exactly how malware works; it’s by design that no one but you can change these settings.

    • Note also that if Apple caves here, what possible justification would they have to refuse to unlock the phone of a Chinese dissident at the behest of the Chinese government? Or anyone else?
      These phones are sold all over the world, and are subject to many governments.

      • @John: Apple has not said they cannot; they have said they will not (at least until exhausting all other legal options). The general understanding, discussed in some detail in the article we all are commenting upon, is that they can do it, and probably without a great deal of effort.

        @c.curtis: If Apple wants to operate in China, they will have to do so in compliance with Chinese law, as modified by bilateral and multilateral treaties to which the US is a signatory. That is true now and will be true later regardless of the final outcome of this US case (with the possible, but unlikely exception where there is a treaty in which the PRC committed to follow US law in these matters). Exactly the same statements are true for every other sovereign nation. That might be unfortunate and we might not like it, but not liking unfortunate facts is pretty much a waste of time.

    • I don’t know iOS, but I would have thought that unlocking the phone with the pass code would provide enough information to create a full decryption key. In other words, doesn’t logging in with the pass code provide complete access to the encrypted files?

  35. And how many more custom programming jobs will be required for how many more confiscated iPhones? Who decides what crime, real or alleged, rises to the level that compels Apple to perform custom programming for the government? In the law, precedent is very, very powerful. No excess of caution is possible for the issue of encryption.

  44. Could someone clarify for me why it is impossible for a trillion-dollar three-letter agency to reverse engineer the hardware and hack it at that layer?

    It sounds like just getting a RAM dump might suffice in this case, unless the device was turned off since the last time the passcode was entered. Even if the key is not available in RAM, could it be that hard to extract the hardware key from the underlying nanotechnology used? And at that point, take the storage flash out of the iPhone, plug the storage into something else, then do the bruteforce elsewhere by mixing the hw key with various passcodes?

    Obviously really hard to do for the average hacker, but I can even imagine someone at xda having done something of the sort heh

      • The method proposed by Steve does not require custom software for the phone and therefore does not require Apple’s private key. You need to be able to extract the hardware key(s) used (if Apple cannot supply them) and know the method of deriving the encryption key from the passcode and hardware key. The extraction seems to be the only real obstacle… and the order allows Apple to suggest an alternate method like this.

  46. “Those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety.” Benjamin Franklin

  48. It occurs to me that Apple, whether they cooperated in the past or not, sees the potential slippery slope of government intrusion into our lives. In applying pressure, it’s easy to imagine the FBI saying to Apple, “well, you helped us before”.

  57. Am I missing something? If Apple signs the particular version of the code it uses for this iPhone (required for this iPhone to be willing to use this code), and that code includes checks to ensure that it only runs on that specific iPhone, but doesn’t divulge the code signing key (which the FBI isn’t asking for), then any attempt to modify the object code to work on another iPhone won’t work (the signature would no longer be valid). While in general I agree with Apple that producing a version of iOS with a backdoor is a terrible precedent, a version that is protected from running on any iPhone other than the one in the FBI’s possession seems somewhat more benign.

    The fact that the source code has been modified to allow for this doesn’t open that big a can of worms: the real protection is that the resultant object code will only work if signed by Apple, and no one is requesting that Apple divulge the code signing key.

    If simply having a version of the source code with this feature was dangerous, then reverse engineering the object code to produce a version of the source code which also had this feature added would be equally dangerous. Apple’s real protection here is the code signing key, which remains safe (or at least as safe as Apple’s current procedures for protecting it).

    What am I missing?

    • One point that I think a lot of people are missing is that the FBI need never take possession of the custom code, source or binary. The procedure occurs at Apple HQ with the FBI present (to ensure chain of custody). Once the system is cracked and all the data recovered, Apple installs the current commercial version of the OS on the subject phone and hands it back to the FBI. If you read the order (section 18, I think), it’s clear that what they want is the contents; they don’t need to have the means themselves to crack the system.

      • Larry, the point you are missing is that this sets a legal precedent, which is much bigger than this one case. What’s to stop a sheriff in a small town from obtaining a court order for Apple to do this for them on a drug dealer’s phone, etc.? Apple does not want to have to do all that work for every phone that would get sent in to them. If Apple had to do this, phone prices would have to go up significantly so that Apple could add more headcount just for this…..

      • And at some point in all the hundreds of occasions where Apple is working on their new business of breaking into their own products, someone will steal the technology and all keys or certificates necessary to make it work. There IS a check large enough.

      • The court order requires that the installed software be retained unmodified and that the modifications be in a system image file to be loaded into memory. It allows for, but does not require, delivery of the system image to the FBI, and permits it to be installed and used at an Apple facility. For the purpose of this order, once the FBI has the pass code, they no longer need Apple’s custom software or other assistance, and will be satisfied if the phone is powered off and returned to their exclusive control. Apple could destroy all copies of the modified program, but would be unwise to do that because hundreds (certainly) or thousands of similar demands would soon follow.

        Because of the code signing requirement to load the system image, theft of the modified (or unmodified) source code is not a huge risk to anyone. Theft of a private signing key, however, would be a large problem, depending on how many software products the stolen key was used to sign. Apple might well have a number of different keys to limit the impact of one of them being compromised, and they undoubtedly have extensive and carefully planned safeguards to prevent theft as far as possible and to trigger alarms if it should happen.

      • @johnwsaunders3: Mine came from theregister.co.uk, but it’s the same document. Apple may, but is not required to deliver the product to the FBI, and may not alter the software presently installed, a condition somewhat stronger than one that would require restoring it. The order suggests, in paragraph 3, a memory image containing the modifications required in paragraph 2, but in paragraph 4 allows Apple to provide a different solution that meets the fundamental requirement of permitting the FBI to efficiently attempt a brute force attack

  62. Everyone keeps saying “but it’s just this ONE phone”. But as some have mentioned, it’s the PRECEDENCE it sets. Soon, judges who sign daily subpoenas will be issuing court orders to Apple to create this same custom OS, but tied to a different phone in question every day. If Apple is forced to do this, the lines are blurred, and there is no putting the genie back in the bottle.

    • The correct word is “precedent,” and if the government wins the case it certainly will set one. It is necessary to be clear, though, about what the precedent is and what it is not. The precedent, if set, will be that in circumstances similar to this one the government will be able to require reasonable and possible assistance in executing lawful search warrants that target devices similar to the iPhone. The state of New York has over 100 cases pending, none of them reported to involve terrorism, that would follow such a precedent. What the precedent would not do is increase or decrease the number of similar cases in which a search warrant might lawfully be issued. It also would not, contrary to Apple’s claim, expose any other Apple devices to a risk that does not now exist.

      It is worth mentioning also that the solution the government requested is, because of its specific terms and Apple’s code distribution and installation procedures, not usable on any but the device specified in the court order and the underlying search warrant. The court order, in fact, requires that the code developed be usable only on that iPhone, and does not require it to be delivered to the FBI. Apple, and only Apple, could repurpose it easily, however, for use on other identical or substantially similar devices.

      • IANAL and I don’t know iOS (though this discussion is teaching me). Question:

        Are there any other cases currently working their way through the courts which would require Apple to create software to disable security on an iPhone 5c or later, and to assist a law enforcement agency in performing a brute-force attack on a phone?

        Are the 100 cases in New York that you mention pending because they require Apple to use “all means at their disposal”, and because they do not currently have at their disposal a tool to disable security on an iPhone 5c?

        If the answers to both questions are “yes”, then there are at least 100 cases which will be able to proceed as soon as Apple successfully gets this “one phone” unlocked.

      • The latter part of your first paragraph, “It also would not, contrary to Apple’s claim, expose any other Apple devices to a risk that does not now exist,” is what everyone is arguing and what most disagree with. The type of ‘hack’ the FBI and Justice Dept. are demanding would put ALL iPhones at risk because the FBI, NSA, whomever could simply plug that modified version into another phone and be free to crack it at their own leisure. There is almost no way that the modification could be absolutely set to work on one device and one device only. The simple fact that the coding required would single out the specific MAC address (or whatever other form of ID it uses) means the identifier would be plainly discernible and re-coded to match any other iPhone in their possession, even if that possession is only for one hour or so during an interrogation or covert activity.

      • “The simple fact that the coding required would single out the specific MAC address (or whatever other form of ID it uses) means the identifier would be plainly discernible and re-coded to match any other iPhone in their possession, ”

        Nope. That would mung the signature, and the code wouldn’t run on the new target.

      • @johnwsaunders3: I also am not a lawyer or familiar with iOS internals. Information about the NY cases is from published news reports, which give numbers from 117 to 175, if I remember correctly. The published reports suggest that the circumstances are quite similar to this one. One case, started in September 2015, is very similar to this one and was being litigated in the Eastern District of New York. The criminal case (having to do with guns and illegal drugs) ended with a guilty plea, but both the government and Apple requested the judge to continue the related case over use of the All Writs Act.

        So yes, a win for the government (after Apple exhausted its appeals) probably would bring several hundred equivalent orders rather quickly.

        @saukrhiann: 1gandydancer has it exactly right. The security and integrity of Apple software depends on the digital signature, which depends, in turn, on every character in the object file. If the FBI (or anyone else) received a copy of the file, which the order does not require, they could not alter it without invalidating the digital signature, and because they do not have and cannot use Apple’s secret signing key, they could not create a new signature block. As a result, the device would refuse to load and run the software. It would, in fact, be safe, although unwise, for Apple to release the source code they develop for the FBI; safe, that is, until and unless someone else acquires the ability to sign software using their private signing key.

  64. This issue is not about what is technically possible for Apple to do, but rather what is technically possible for the government or anyone else to do once Apple opens the door. After the Snowden revelations, who in their right mind thinks this will be a one-time thing, or that the government could keep the FBiOS protected from bad actors? I say not only NO but HELL NO to the FBI.

    • No “door” is being opened that isn’t open already. Look, I think that Snowden is a hero and that a lot of what the NSA has done is criminal and unethical. If the FBI were in any way requesting a permanent backdoor to all phones, or access to the code signing key so the FBI could turn this on for other phones, I’d be on the front line of trying to stop them. My initial bias was in favor of Apple’s position and against the FBI. But reading what the FBI actually asked for changed my mind.

      Your so-called FBiOS doesn’t exist and won’t exist, because the fact that any modified code needs to be signed before an iPhone will run it is very good protection. In fact, those who claim that the mere existence of a modified version of iOS that works on this phone makes it a target for bad actors miss the point: if that’s the case, then the code signing key is already a target for bad actors (which I’m sure it is), since that is all you really need (along with an engineer good at reverse engineering object code).

      As one who really doesn’t trust the government’s long-term intentions here either, I still think one needs to look objectively and specifically at what is being asked for. To me, this is really no different than a court order demanding a bank open a single safe deposit box, which doesn’t put all safe deposit boxes at any greater risk than exists already (when I get a safe deposit box, I know that it’s pretty safe but is still subject to a possible court order to open it). And the fact that a court can order a bank to open a safe deposit box hasn’t proven so financially onerous that prices of safe deposit boxes have skyrocketed. So I’m not bothered by a process protected by court orders that would require a vendor to unlock a specific phone, as long as it’s a court order process that has appropriate legal protections and is open. That rules out orders issued by FISA courts, but that’s a whole other subject…

      One can still firmly believe (as I do) that building a backdoor into all phones is a terrible idea that should be fought on every front, as it is ultimately unprotectable, while supporting the very limited specifics of this particular court request. Or you could alternately argue that the backdoor is already in the phone (via Apple’s ability to use DFU mode to load a new version of iOS) and is protected solely by the fact that only Apple (hopefully) knows the code signing key required to get a phone to accept this alternative iOS.

      • @Randy Frank:

        I suggest doing a wee bit of research where it comes to the FBI and other US agencies asking Apple and Google both to install back doors. Yes, even the NSA has made the request and Apple has refused for over a year.

        I do understand the FBI’s position here and it does sound reasonable. But you can’t honestly believe that they’ve given up on their desire to have a “universal back door” into iOS, can you?

      • “To me, this is really no different than a court order demanding a bank to open a single safe deposit box”

        Problem with your lock box metaphor is that this is more like a court ordering the maker of the lock box — who gave the key to the lock box to the consumer/bank when they sold it to them — to hire a locksmith to break into the lock box. It’s simply not the lock box maker’s problem, and the government should not have the right to compel them to hire a locksmith to break into their own products, especially when those products are sold specifically to keep things locked up. Additionally, this would then set the precedent that the lock box manufacturer will always be responsible for breaking into their own products whenever the government needs them to.

      • @Randy Frank

        Not at all comparable. If the bank’s safe deposit box key is copied or stolen, does that compromise the security of all their customers? You still need valid forms of ID and a valid PIN, and the vault is still in the bank proper under lock and key. Does a copied or stolen safe deposit box key undermine the bank’s business model and reputation to the point where the bank will be significantly impacted? Safe deposit boxes are, for most banks, loss-leaders. Banks don’t make any money from the service. In truth they would prefer to do away with it, but customers want them. The iPhone, on the other hand, is responsible for 2/3 of Apple’s profit. So safe deposit boxes are in no way comparable to iPhones in terms of the risks involved in facilitating access. Is the FBI going to legally accept all risk (financial or otherwise) associated with this request?

      • Quite a good post overall. The tl;dr version is that the order does not increase or decrease the risk to Apple’s (or other manufacturers’) devices, and in the US the real protections for citizens are rooted in the Constitution, the law and the courts, as well as the law enforcement agencies that follow them to a very great degree.

        An observation, though, about the Foreign Intelligence Surveillance Court. The FISC differs from other federal courts in three (going on two) essential respects. The first is that it operates in secret because most of its proceedings involve classified activities and material. Secondly, it has in the past heard from only one party, the government. That makes some sense, sometimes, since many of the activities over which it has authority are intended to concern mainly foreigners who are not lawfully in the US and do not have standing and affect “US persons” only peripherally or accidentally. The law recently has changed to include a public advocate, which still is not quite the same as the adversarial procedure used in other courts. The third difference is that the FISC has its own separate appeals court, which also operates in secret, and which accepts appeals only from the government when the FISC denies a government request. The Chief Justice of the Supreme Court appoints judges from other federal courts or appeals courts (as additional duty) to the Foreign Intelligence Surveillance Court and the Foreign Intelligence Surveillance Court of Appeals.

  65. Question. Is there any way in iOS to create a “dead man’s switch”? An app such that if I do not log in to my phone for “X” amount of time (user definable), the phone acts as if it had received 10 consecutive bad password attempts. I understand that people are using Faraday bags and cages to prevent remote wipes, so I just wonder if it would be possible to have the phone do it by itself. If not standard iOS, what about a jailbroken device?

    • Jailbroken, absolutely — but why would you want to do this with an iPhone? I don’t see the point. If you want this level of customization, there are other phones and operating systems out there better suited to it.
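
      A rough sketch of what such a tweak would have to do (purely illustrative Python; last_unlock_time() and secure_wipe() are hypothetical hooks a jailbreak tweak would have to provide, nothing like them exists in stock iOS):

          import time

          DEADLINE_S = 7 * 24 * 3600            # user-definable "X": one week here

          _last_unlock = time.time()            # stand-in state; a real tweak would
                                                # read this from the lock screen

          def last_unlock_time() -> float:      # hypothetical hook
              return _last_unlock

          def secure_wipe() -> None:            # hypothetical hook; a real tweak
              print("wiping: discarding keys")  # would discard the encryption keys,
                                                # exactly as 10 bad PIN entries do

          def deadman_check() -> None:
              if time.time() - last_unlock_time() > DEADLINE_S:
                  secure_wipe()

          deadman_check()                       # a daemon would run this every minute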

      • I want to do it because I work in health care and I take my patients’ privacy very seriously. I do my best to delete protected health information as soon as I can, but when other people send me information, I have no control if I don’t know it’s there and it’s just sitting in my mail. My phone rarely leaves my person and is locked with the same 10-failure wipe protocol, but still, I would feel safer knowing that even if I drop dead, or the phone is stolen or confiscated, the data will be destroyed. Destroyed is better than stolen.

  66. Firstly thank you for the article! Secondly, is it not possible to clone the iPhone, make several images and try to brute force the images?

    • If it doesn’t have the SE, then you could back up the phone, try out your passcodes, reflash the backup, and then try some more on that. But if they want to brute force the images directly, they would need to brute force a 256-bit key, which is computationally infeasible (see the back-of-envelope below). Or use an electron microscope and grind down the hardware to read the key off it, which would still take considerable time and even more considerable money.
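
      To put numbers on the 256-bit claim (my own back-of-envelope, assuming an absurdly generous 10^12 guesses per second):

          # Back-of-envelope: expected time to brute force a 256-bit AES key.
          GUESSES_PER_SECOND = 10**12        # absurdly generous assumption
          SECONDS_PER_YEAR = 60 * 60 * 24 * 365

          keyspace = 2**256                  # number of possible 256-bit keys
          avg_tries = keyspace // 2          # on average the key is found halfway
          years = avg_tries / GUESSES_PER_SECOND / SECONDS_PER_YEAR
          print(f"{years:.1e} years")        # ~1.8e57 years

      “Centuries” doesn’t begin to cover it; the key has to be read out of the hardware, not guessed.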

  67. So apparently DHS can crack iOS 8.1.2 successfully. I don’t know how that changes the bigger picture…if at all — just saying if Apple wanted to they could sign an iOS 8.1.2 fw, give it to the FBI, they could flash it and crack it themselves without Apple creating a compromised iOS 9 firmware bundle. The problems Apple talks about though are not negated here — this would obviously not be a one time thing, they would ask them to do this over and over again. Meh.

  68. Not to sound a bit paranoid, but does anybody see the risk this software might create inside Apple as well? I mean, if Apple caved in to this, won’t the programmers/software engineers who participate in creating such a backdoor eventually become high-risk targets? Even if the software is erased, the people who made it exist; they could technically sell their knowledge to independent parties. Or am I just paranoid?

    • The FBI almost certainly knows how to do this, in general. The text of the court order is quite specific and puts this on display. Together with the NSA, the government may well know exactly what has to be done. There very likely are others with similar or equivalent knowledge, probably some foreign and others criminal. The problem all of them have is that there is at least one necessary thing they cannot do: they cannot apply Apple’s digital signature, which is required to load and run the required software. Only Apple can do that (an assumption, but one that, if false, means Apple has lost control of iOS updates, with all that entails).

  69. Pingback: Does the FBI Have a Strong Legal Case Against Apple? Here's an Analysis - Fortune

  70. It’s a 13th Amendment issue, analogous to the FBI forcing a locksmith to open a door. It’s OK if the locksmith does it voluntarily, or if the FBI uses its own locksmith. Otherwise, it’s involuntary servitude. Good on Apple for standing up to the Justice Dept. on this one!

  71. Hi – If the USA enforcement agencies (FBI, CIA, Local Cops, etc) succeed in obtaining this sort of help in the USA, how does that help them in a globalized society? Wouldn’t this make them less competent since other countries might not be willing to help (or can’t) and these authorities now lack the skills to hack phones? After all, the rest of the world would not stop making encrypted communications only because the USA stops.

    • IMO, this court order is treasonous. If adhered to by Apple, it will lead to the end of U.S. leadership in technology. All people who want security in their devices will have to purchase from companies doing business in countries that allow secure technology to exist and do not demand open government access to all devices, which makes all our devices insecure. Our FBI and courts are anti-American in their efforts to make all our devices insecure and destroy U.S. leadership in technology.

      The media, courts, FBI, and politicians are clueless on this issue. They do not remotely understand its technical implications.

  72. If you think that the FBI, the NSA, or any other law enforcement agency is going to stop with this one phone, you are sadly naïve. As a nation, our civil rights have been ground into the dirt under the pretense of national security.
    What is going to stop this technology, which does not even exist yet, from ending up on the open market? After all, this current administration has released more personal information on citizens than anyone, effectively doing the hackers’ job for them.
    Get a reality check. The FBI has billions in their budget. They should figure it out on their own.

  73. Pingback: The Contrarian Response To Apple’s Need For Encryption | Hackaday

  74. The bigger issue here is what kind of precedent this would set if Apple complies. What fail-safes would be put in place, if any, to ensure “the hack” or new specialized iOS update wouldn’t fall into the wrong hands? I think Apple should comply, but in a way that will 100% not compromise the general Apple user community.

  75. Pingback: Apple Technically Able to Help FBI Crack Shooter's iPhone | Threatpost | The first stop for security news

  76. The US government has a horrible record of abuse of power. This would open the floodgates for them to require/demand backdoors in all internet-connected devices. And the race would be on for all black hat hackers to find and exploit all the backdoors. At that point the only option to make your cell phone secure in any way would be a sledgehammer to it. Not to mention that Apple’s phone sales would be limited to the US at best. No foreign government in their right mind would allow sales of Apple devices once a known back door has been implemented. Mobile security researchers would have no choice but to smash the phone of anyone who asked them to make it secure, as that would be the only honest way they could do it.

  77. Pingback: Quora

  78. Everyone seems to be focusing on Apple. The county owned the device. Can a mobile device manager, like Air Watch, unlock a phone?

    • Quentin, that would require the MDM to be preinstalled. Mental health/special needs is very poorly funded in many US states and cities, so it’s a relatively safe bet the county doesn’t have that type of solution in place. Many of them do allow you to do a password reset, though, but that’s provided it’s configured properly beforehand and working correctly.

      • Actually, Quentin is close to a possibility. Remember, these phones tend to be tied to a desktop/laptop machine for their backups/updates, and if they’re corporate or agency-owned devices, it would be the agency that holds the control. Why can’t the FBI discover which computer the phone is tied to and see if they can crack the backup there? Even if it is encrypted, they would be able to copy the file and work on cracking that copy.

      • I’m responding to Saukrhiann — most people do not do local backup with iTunes anymore. They use iCloud. All iCloud data is encrypted as far as I know. Even if he did do a local backup, chances are that would be encrypted as well (iTunes presents that option to you when doing local backups).

      • OK it looks like iCloud data is encrypted but in that case Apple has the key to decrypt it. Wow, I did not know that. So they most likely got all that data already.

    • @srgmac: The part that refutes your argument is the fact that the phone belonged to the agency the perpetrator worked for, so there’s at least the strong probability that the passcode is a corporate one, not a private one. Additionally, it would have very probably been activated on a corporate computer, not a private one. What we may be seeing here is that the FBI (and the Judge) are trying to do an end-run around the owning company which itself has refused to give the data.

      • Hmmm… OK, the owners (a utility company within the county) gave their approval. However, this opens a whole new can of worms, because it becomes very clear that the county itself has a pitiful IT structure with no centralized control of the devices it issues to its employees.

        Maybe, just maybe that agency itself is at fault from the outset. After all, by the original reports of the incident, it seemed more a violent reaction to an argument during a division picnic rather than a ‘planned’ terrorism event. The farther this goes and the more arguments that get presented, the more I get the idea that someone is looking for a scapegoat.

        Now just as an idea… What if Apple does ‘crack’ the phone’s security and there’s simply nothing to be found? What if this whole episode now is nothing but a highly distracted effort to force Apple to create a “back door” so that everyone else’s privacy is compromised? I’m not a Libertarian but even I understand the arguments and complaints about giving too much power to a government, no matter what government that may be.

      • Exactly. They already have this guy’s data. They have his entire iCloud virtual filesystem — DECRYPTED, since Apple holds the keys. They have his Gmail, Facebook, etc., all that stuff.
        This case is not about this one phone… It’s not about terrorism, it’s not about victims, and it’s not about ISIS. It’s about the government forcing a private corporation to hack their customers’ data.
        Let’s assume down the line Apple was forced to put in a backdoor for JUST the US government — how do we know the Chinese government won’t figure out how to access it? Or the Russians? Or anyone else in the world with a near-unlimited budget and state-sponsored hackers on payroll? What if they were able to get into a US senator’s phone because of this? What if they were able to steal an American CEO’s information that eventually led to intellectual property or research data theft?
        Sad day for us all when Apple is the one defending our rights and the FBI is on the opposite side — isn’t it SUPPOSED to be the other way around!?

      • It does not appear that anyone is trying to circumvent the phone’s owner (San Bernardino County Department of Public Health). The warrant application states that the owner consented to the search but does not know the pass code. It further states that the phone’s last iCloud backup occurred several weeks before the shootings on December 2, 2015, explaining why the FBI wants the data on the phone.

  79. Very limited knowledge here, so this may sound stupid.
    Can we build the code to break this, however Apple achieves it, and in parallel build a security fix that automatically uploads to all the rest of the world’s Apple devices (just like a software update) and does not allow the break to be used on the phones that are updated? This would allow the approach to be used once, but then you vaccinate the rest of the population’s phones, so that if it does get out, it can’t be used against our phones.

      • I don’t quite understand what Sandy is getting at by “coding break,” but I really hope you are wrong about them modifying the encryption.
        IMHO they should hard-code the timer values into the SE (I mean really, what possible situation would arise where those values would need to be changed in the future anyway, aside from a government requesting it?) and make it so they can *not* be modified at all by a change in firmware, and put this issue to bed once and for all. I cannot fathom a law actually passing in any democratic country that would penalize Apple for doing this, and it would show their customers once and for all that they take security seriously.

      • Seems like all that would be involved is requiring passcode entry for DFU. If repeated passcode failures wipe the phone during DFU, then even a signed DFU image won’t help access the encrypted data. I think my iPhone 6 requires passcode entry to boot after an update, so maybe this is already in place on newer phones.

        Thanks for the excellent post!!

      • I am replying to Morton — Morton, that would lead to a device that can’t be used at all (i.e., a brick) if someone forgot the password and didn’t have the wipe-on-10-incorrect-tries option enabled. IMHO hard-coding the SE timer delay values would do the trick; but that can’t be applied retroactively — so it would have to be newer phones only (i.e., iPhone 7 and later).

  80. If the other 2 phones were totally destroyed and the hard drive removed and disposed of, why wouldn’t the terrorists just punch in 10 wrong security codes on this phone to erase it? That’s why they left it — it was just easier. I think the FBI is going to find NOTHING if they get it unlocked.

    • Either there was nothing on that phone to begin with, or the two terrorists were too busy being killed to wipe the phone.

      The holiday party cannot have been their prime target, or they would have killed everyone there. They were certainly on their way to other targets and maybe planned to wipe the phone later.

      Or, there is nothing but restaurant inspection reports on the phone.

  81. Pingback: Nach Gerichtsurteil gegen Apple: Die neuen Crypto Wars schwelen nicht mehr | netzpolitik.org

  82. Pingback: Troy Hunt: Everything you need to know about the Apple versus FBI case

  83. Pingback: Apple-vs FBI - im Klartext - Neue Technologien Nachrichten

  84. Pingback: Apple could break into Farook's phone if it wanted to - The Financial Press

  85. Pingback: Why the FBI's request to Apple will affect civil rights for a generation - Macworld Australia - Macworld Australia

  86. Pingback: Google CEO backs Apple in resisting court order to create iOS backdoor for San Bernardino investigation

  87. Good post. Thanks for the in-depth explanation. Question: why bother with the remote PIN entry? Presumably, it’d be faster and easier to implement the bruteforcing directly on the custom iOS software (the lockscreen could just loop through possible passcodes on its own). I don’t have particularly strong convictions on whether Apple should do this, but I’m really confused as to why the FBI is asking for remote entry functionality just to write a trivial script to iterate over 10,000 or 1,000,000 pin codes. Is it so they can feel productive?

    • Apologies if this comes across as too crass, but if you take a few minutes to read the post on which we’re all commenting, you’ll come across some pretty thought-provoking stuff. It also specifically addresses and answers your questions.
      One of the primary issues at hand for the FBI is being able to actually implement the brute force you mentioned as a highly probable means to unlock the device. Unfortunately for the FBI, they’re not able to go this route on their own, without Apple’s support. One reason is the existence of an auto-erase function on the device, which may have been enabled. There’s no way of knowing either way without first unlocking it, but if it were enabled, iOS would completely wipe the phone’s data after too many incorrect PIN entries. Another restriction: for each incorrect passcode entered, iOS imposes a series of delays that compound with every further incorrect entry. Lastly, each PIN would need to be keyed by hand on the actual subject device itself, while with Apple’s support any of the faster physical device ports could be used to expedite things. Additionally, Apple would be capable of disabling any existing (and currently prohibitive) auto-erase features and allowing unlimited PIN guesses without negative consequence.

      • I cannot imagine what in my comment made you think I did not fully grasp all of that. I will re-ask my question a slightly different way, I suppose. IF Apple is creating a custom iOS software installation (which disables auto-erase functionality and eliminates the timeout on unsuccessful PIN entry), then why wouldn’t the iOS software itself perform the bruteforcing operation as well, rather than implement remote PIN entry? It seems like remote PIN entry is an odd request if all it would be used to do is trivially cycle through 4- or 6-digit PINs; Apple could loop through those PINs on the device itself rather than open up the device to remote PIN entry.

        It seems like an odd request by the FBI if the phone uses a 4- or 6-digit numeric PIN. Of course, maybe there’s more sophistication required in bruteforcing the passcode because Farook used the indefinite-length numeric PIN or alphanumeric PIN options. This article was unsure about that, and perhaps the request for remote entry is a clue that the passcode is of higher sophistication.

      • The FBI has the ability to do this already on non-TouchID devices running iOS 8.4.2 and below. They have trillions of dollars in budget and resources/engineers — they don’t need Apple’s help to unlock the phone… It’s just more convenient for them at this point, and it’s also about future policy.

      • @srgmac: The FBI’s FY2014 budget was 8.4 billion, so it’d take a lot of years to get to a trillion, let alone multiple trillions. But that’s beside the point. In order for remote entry to occur, Apple has to implement it, and then the FBI’s engineers will have to write code to interface with it, all for the sake of implementing what would be, if the device uses fixed-size 4- or 6-digit numeric PINs, a trivial for loop. This wouldn’t be a matter of relieving Apple of effort; implementing remote PIN entry seems harder than writing a simple for loop, doesn’t it?
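
        For scale, the whole “attack” on a fixed-size numeric PIN is this loop (a sketch; try_pin() is a hypothetical stand-in for whatever interface the custom image would expose):

            SECRET = "003141"                   # stand-in for the real passcode

            def try_pin(pin: str) -> bool:      # hypothetical device interface
                return pin == SECRET

            def brute_force(digits: int) -> str | None:
                for n in range(10**digits):
                    pin = str(n).zfill(digits)  # "0000" through "9999", etc.
                    if try_pin(pin):
                        return pin
                return None

            print(brute_force(6))               # prints "003141"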

    • “@srgmac: The FBI’s FY2014 budget was 8.4 billion”
      What about the NSA, CIA, DHS, and DEA budgets? For the sake of argument, let’s combine them from the time the original iPhone was released.
      They can’t talk to each other and share resources?
      They do that all the time.
      Apple doesn’t need to implement remote entry.
      AFAIK iOS has supported wired keyboards for a while now.

      • There’s a lot of evidence to suggest that those agencies are actually notoriously bad at communicating and sharing resources, and even then “trillions” is a stretch (the actual numbers for many of those agencies are classified, though there are estimates). To get to trillions, you’d probably need to lump in A LOT of the federal government’s operating budget, and then all of a sudden spy engineers are dipping into Medicaid and Social Security dollars. But I digress; it doesn’t actually matter for the technical point I’m trying to make.

        I have never used a remote keyboard on an iOS device, so I don’t know if they can be used for entering passcodes. If so, then you’re right. Presumably, the FBI could spoof keyboard input and run an automated script through that input method. If that’s the case, though, why did they request it at all?

      • Additionally, the FBI is offering to have this occur on Apple premises, which means they’d have to bring their little automation-enabled keyboard device (which is probably a computer) with them. It’ll also greatly increase the chance of the FBI somehow compromising Apple’s custom software install (maybe they have a way of pulling the software image off the device?).

        So, again, I ask. Why would the FBI request this remote input functionality instead of just requesting the custom iOS image bruteforce itself?

      • “Presumably, the FBI could spoof keyboard input and run an automated script through that input method. If that’s the case, though, why did they request it at all?”

        Because it’s not about this one phone, or terrorism, or ISIS. It’s about future policy — this is the federal government of the USA using the court system to force private companies to hack their own customers. This is going to be a very important case for generations to come. You’re hung up on the trillions — it was figurative — the point I was trying to make is, they have more money, resources, and engineers than Apple does. They don’t need Apple to do their dirty work for them — they just want to send a message and try to influence public policy. Not to mention they already got these terrorists’ other information from Facebook, Google, AND their Apple iCloud data (which is encrypted, BUT in this case Apple actually *does* have the key). The only possible thing of interest to the FBI would be local application data that isn’t synced to iCloud, and iMessage data that wasn’t deleted beforehand.

      • “Because it’s not about this one phone, or terrorism, or ISIS. It’s about future policy — this is the federal government of the USA using the court system to force private companies to hack their own customers.”

        Right. This is all true. But irrelevant to my point. The FBI would be requesting this regardless of whether the custom iOS image it’s requesting bruteforced the passcode itself or if it allowed remote passcode entry.

        “You’re hung up on the trillions — it was figurative — the point I was trying to make is, they have more money, resources, and engineers than Apple does. They don’t need Apple to do their dirty work for them…”

        I’m not really hung up on trillions. I’ve said multiple times it’s irrelevant to my point. And I’ll say it again. Sure, they have trillions. And brilliant engineers of their own. They don’t have Apple’s software signature, so they need Apple’s help here; I stress that I’m saying that as a restatement of the facts of the case, not as something relevant to the original point I was trying to make, and am still hoping to at some point make to someone who actually understands me.

        And yeah, I agree that unlocking the phone isn’t even that important. Again, not relevant to my point! I stressed in my original comment that I wasn’t even making a point about whether or not Apple should do this or whether or not it was important. It didn’t even trigger my curiosity!

        My curiosity is about the non sequitur of asking for remote access! They asked for two things that make a lot of sense and a third thing that doesn’t seem relevant. I am only making a technical point! Not a policy point. Not a budgetary point. Not a philosophical point. Why do you keep engaging with me when you don’t appear to understand what I’m saying at all? Is it because you like my username?

  88. Feasibility aside, is it technically possible for the secure enclave manufacturers to record UIDs in such a way that the UIDs could later be re-associated with an iPhone? If possible, feasibility would depend on which country the secure enclave is manufactured in and how cooperative that factory is.

  89. “After the elimination of passcode delays, it will take a half hour to recover a 4-digit PIN, hours to recover a 6-digit PIN, or years to recover a 6-character alphanumeric password. It has not been reported whether the recovered iPhone uses a 4-digit PIN or a longer, more complicated alphanumeric passcode.”

    Just to be clear — even if the government were given software with the ability to test a new passcode every 80 ms, it could still take a lot longer than years, because the alphanumeric passcode can be of any length; it’s not limited to 6 characters:

    “iOS supports six-digit, four-digit, and arbitrary-length alphanumeric passcodes”
    (from p. 12 of the iOS Security Guide referenced above)
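
    For concreteness, here’s the arithmetic at one guess per 80 ms (my own quick sketch; the 36-character alphabet is just an assumption of lowercase letters plus digits):

        # Worst-case search time at one passcode guess per 80 ms.
        DELAY_S = 0.080
        SECONDS_PER_DAY = 86_400

        def worst_case_days(keyspace: int) -> float:
            return keyspace * DELAY_S / SECONDS_PER_DAY

        print(worst_case_days(10**4))          # 4-digit PIN: ~13 minutes
        print(worst_case_days(10**6))          # 6-digit PIN: ~22 hours
        print(worst_case_days(36**6) / 365)    # 6-char alphanumeric: ~5.5 years
        print(worst_case_days(36**10) / 365)   # 10 chars: ~9 million years

    Each added character multiplies the search time by the alphabet size, so length matters far more than the per-guess delay.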

  90. So what if I forget my iPhone passcode (and the touch passcode fails to work for some reason)? Does that mean my iPhone 6s is locked forever? Apple can’t reset the passcode for me after verifying my identity? That’s scary. But if Apple CAN unlock my iPhone should I forget my passcode, then since “The iPhone is the property of the San Bernardino County Department of Public Health where the attacker worked,” why can’t Apple unlock the phone in question for the San Bernardino County Department of Public Health?

    • A couple things:
      1. The touch passcode does not unlock your phone under a lot of circumstances. After a restart, for example, you must put in the passcode. My guess is this is partially because your fingerprints are stored in an encrypted fashion and need to be decrypted using the passcode, and partially because fingerprints aren’t actually as secure as passcodes (because passcodes can be changed and fingerprints can’t, and I don’t think there’s any foreseeable tech that’ll truly eliminate the ability to “fake” a fingerprint image using a silicone imprint and a real finger, for instance).
      2. No, Apple cannot unlock your phone for you in that circumstance, but your phone isn’t locked forever; you can restore it. If you don’t have a data backup, your data is lost forever. That is just the cost of security in this case.

    • If you forget your passcode and the fingerprint reader isn’t on, then the only recourse is to connect it to iTunes on a laptop or desktop and, using the iTunes controls, RESTORE it from a backup … erasing absolutely everything on it and restoring the backup stored in iTunes (which should be updated weekly); a new passcode can then be set without the old one. If there’s no iPhone backup in iTunes (and only airheads don’t have backups), then it would definitely be like starting all over with a brand new iPhone, restored to factory settings. This is as it should be, to protect one’s most private contents. [Farook’s iPhone 5c is pre-fingerprint reader … that came in with the 5s.]

  91. Pingback: How Apple could let the FBI crack your encrypted iPhone | Tech Camp

  92. Pingback: Judge orders Apple to assist in hacking a phone - Page 11 - Pelican Parts Technical BBS

  93. Pingback: A $20 tool could have prevented the FBI’s iPhone encryption fight - The Parallax

  94. Pingback: Why Apple Is Fighting Not To Unlock iPhones For The Government – nerdbynatureblog

  95. Pingback: Apple vs. the FBI: Critical Thinking in a First-Year Seminar – Agile Learning

  96. Apple needs to comply. These are terrorists who murdered our citizens. Apple has the ability to open this one phone to help the investigation without giving away its secret backdoor key. Take the phone to Apple in an armored car, give it to Apple, let them open it, have Apple give it back (unlocked), and send it back to the FBI in an armored car. Neither the FBI nor Homeland Security gets Apple’s backdoor.

    • The FBI won’t get it at first. It will still be available for Apple to use on the next subpoena. And the next. And the next.

      Eventually, it will be stolen and the FBI won’t need subpoenas anymore – they’ll be able to pay the hackers.

  97. Would it be possible to remove the limitations of requiring a passcode after 48 hours or 5 invalid fingerprint authentication attempts by patching iOS or the Secure Enclave?

    • Based on this article, it sounds like it. You’d still need the passcode after restart, though, unless you wanted to entirely eliminate encryption on the secure enclave (or trivialize it a bit by using a static UID, not “tangled” with any custom passcode, for encrypting). I’m not sure what you’d gain by doing this, though; the ability to bruteforce a fingerprint? I don’t know if anyone’s done any credible estimating of how much human fingerprints can vary, but my guess is it’s an even bigger problem space than a very long alphanumeric passcode.

  98. Pingback: Why Apple's iPhone Battle With the Government Will Likely Be a Privacy Setback

  99. Pingback: Episode 9: CryptoWars 2.0 – Two Shades of Brown

  100. Pingback: Apple's fight with the FBI could trigger a password arms race | Apple Act

  101. Pingback: Liquidmatrix Security Digest Podcast - Episode 65 | Liquidmatrix Security Digest

  102. Pingback: Gavel Down: Closing Out the Week in Congress: Feb 15-19, 2016 – The POPVOX Blog

  103. IMO, this court decision, if adhered to by Apple, will end U.S. leadership in technology. People who want secure devices will be forced to purchase from companies housed in countries that allow secure technology to exist without demanding backdoor access to break encryption on secure devices. IMO it is anti-American to make our devices insecure, because it will destroy U.S. leadership in security technology and force security software and firmware companies to move out of the U.S.

    The media, courts, FBI, and politicians seem to be clueless on this issue.

    • You are spot-on. That Senator Dianne Feinstein – of California – supports the FBI request is ASTONISHING. She sits on the Senate Committee on Intelligence, and that apparently doesn’t describe the intellect of its members. That she doesn’t see what she’d do to her own state if the government wins this is astounding. She’s CLUELESS! I live in Silicon Valley, only minutes from Apple headquarters and MANY other high tech companies. It’s a boom town now, but certainly will not be if true encryption is killed.

  104. Pingback: Apple and the FBI | In Defence of Liberty

  105. Pingback: Eurotech Episode 14 - Error 53 Podcast Not Secure - Euro Tech

  106. Pingback: Why the FBI’s request to Apple will affect civil rights for a generation | blog.L4networks.com

  107. Pingback: “Stance Not Grounded In Principle”: Apple Unlocked iPhones For The Feds 70 Times Before « mykeystrokes.com

  108. Pingback: How Apple could let the FBI crack your encrypted iPhone - netStructure Solutions

  109. Pingback: Why it’s technically possible for Apple to bypass your iPhone’s passcode (AAPL)

  110. Pingback: Технически Apple может взломать айфон для ФБР | Threatpost | Новости информационной безопасности

  111. The issue isn’t whether it’s feasible; it’s whether it causes an unacceptable problem (compromising the security of iPhones). Also, if Apple complies in a way just for that iPhone, how many other requests are they going to get from this government or other governments, and how much time and effort are they going to have to expend for each of these requests? Even if the governments pay for it, it could grind Apple to a halt, as well as other companies these governments would expect cracking from, and scare companies away from making their phones secure and make them more likely to allow backdoors just for governments. It’s a scary implication of full government control over our communication devices.

  112. If Apple refuses to write the software to exploit the backdoor in the iPhone, could the government simply issue a judicial order for Apple to provide to the FBI the digital certificate that is required to load software on the iPhone so the FBI could write the exploit themselves?

    • Could the government get a judge to order Apple to disgorge its private software signing key? Almost certainly not. The application for such a warrant would be denied on the basis that it is obviously unreasonable and disproportionate to any plausible requirement.

  113. Pingback: 苹果为何拒绝协助政府解锁 iPhone | TechCrunch 中国

  114. Great post, thanks. My question is, if the 5C lacks a Secure Enclave, then why isn’t the FBI just asking for a firmware that spits out the device key (the one that gets entangled with the passcode) so that they can crack the encryption offline?

    • I’ll answer my own question. Apparently, the SE is a new feature, but the method used to protect the device key is not. They use a hardware AES implementation, so that you can use the key but not read it. Conceptually, that looks something like the sketch below.
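
      (An illustrative model only, not Apple’s actual API, using the pyca/cryptography package as a stand-in for the silicon. The point is that software can ask the engine to encrypt or decrypt with the device key, but no operation returns the key itself:)

          import os
          from cryptography.hazmat.primitives.ciphers.aead import AESGCM

          class HardwareAESEngine:
              """Toy model of an AES engine with a fused, unreadable device key."""
              def __init__(self) -> None:
                  self._key = AESGCM.generate_key(bit_length=256)  # "fused" at manufacture

              def encrypt(self, nonce: bytes, plaintext: bytes) -> bytes:
                  return AESGCM(self._key).encrypt(nonce, plaintext, None)

              def decrypt(self, nonce: bytes, ciphertext: bytes) -> bytes:
                  return AESGCM(self._key).decrypt(nonce, ciphertext, None)

              # Deliberately no method returns self._key. (In real hardware the key
              # is physically unreadable; here that's only by convention.)

          engine = HardwareAESEngine()
          nonce = os.urandom(12)
          ct = engine.encrypt(nonce, b"per-file key material")
          assert engine.decrypt(nonce, ct) == b"per-file key material"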

  115. Pingback: Apple’s Principled Stand | Re/code

  116. Pingback: Who Sets the Rules of the Privacy and Security Game? | Just Security

  117. Pingback: Apple’s battle with the FBI: All your questions answered | Fusion

  118. Pingback: Decrypting an iPhone for the FBI - Schneier on Security

  119. Pingback: Apple, FBI, Security Enclaves, and firmware | Firmware Security

  120. If Apple can do this to any iPhone, why do you trust Apple?
    Are they unconditionally good guys?
    The government is of the people, by the people and for the people, but commercial companies are not.

  121. Pingback: A Brief Response to Cy Vance About Apple and the FBI - Windypundit

  122. Pingback: 005 - Apple, Encryption, and the FBI - The Context: a weekly podcast about why things happen.

  123. This is one of the few articles I have found that really gets the details right.

    Apple cannot provide the FBI a way to defeat AES encryption, and I see nowhere that the FBI is requiring Apple to install a compromised version of AES in all its phones.

    So all this hullabaloo is about the FBI trying to force Apple to create a ‘master unlocking software’ for all iPhones – something that is just impossible. So Apple is being completely deceptive when it says they are being forced to create the software equivalent of ‘cancer’.

    What Apple can do is disable the passcode delay and auto-wipe for this phone, and possibly for all iPhones currently out there. For sure, they could make the SE such that this becomes impossible to do for future iPhones. All this has absolutely nothing to do with building an encryption backdoor, and I wish folks who understand the difference spoke more clearly about it. Healthy skepticism of the Government is great, and we should be equally skeptical of private companies.

  124. Pingback: Quora

  125. Pingback: SANS Digital Forensics and Incident Response Blog | A Technical Autopsy of the Apple - FBI Debate using iPhone forensics | SANS Institute

  126. Pingback: 4. Compliance with Court Orders Act of 2016 – GRyan Masters Blog

  127. Thanks for such a well-written article.

    What I don’t understand, though, is why the FBI doesn’t just get the data from the iCloud backup. That contains the complete phone contents. Or from iTunes, if they use that.

    Maybe they didn’t back it up …

    • Have you been living under a rock? :)
      Apple gave them ALL the previous iCloud backups.
      The FBI didn’t care; they wanted the latest one.
      A private security company broke into the phone and got the latest data for the FBI already, so they have withdrawn *this* case against Apple.

  128. The SE keeps its own counter of incorrect passcode attempts and gets slower and slower at responding with each failed attempt, all the way up to 1 hour between requests.

    I can’t get any information on this, but wouldn’t a good avenue of attack simply be to reset this counter? What specifically prevents this?
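
    To make the mechanism I’m asking about concrete, here is a toy model of the counter (the delay schedule is my reading of the iOS Security Guide, so treat the exact values as an assumption):

        # Toy model of the SE's escalating delay after failed passcode attempts.
        DELAYS_S = {1: 0, 2: 0, 3: 0, 4: 0, 5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60}

        failed_attempts = 0

        def record_failure() -> int:
            """Bump the counter; return the enforced wait in seconds (9+: 1 hour)."""
            global failed_attempts
            failed_attempts += 1
            return DELAYS_S.get(failed_attempts, 3600)

        for _ in range(10):
            print(record_failure())   # 0, 0, 0, 0, 60, 300, 900, 900, 3600, 3600

    Since the SE keeps that counter in its own storage, resetting it from the outside is exactly the hard part.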

  129. Pingback: Apple Versus the FBI - Reason.com
