Apple can comply with the FBI court order (trailofbits.com)
328 points by admiralpumpkin on Feb 17, 2016 | 207 comments



Crucially, this is software which doesn't currently exist in the world and which Apple has no intention of voluntarily writing. There is no specific law or regulation (like CALEA) which requires Apple to provide this functionality.

What the FBI is attempting is to use the All Writs Act of 1789, which authorizes Federal courts to issue "all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law." Are there limits to what a judge can order a person, or a company, to provide?

A warrant describes "the place to be searched, and the persons or things to be seized." I would not expect a judge can draft a warrant for something which doesn't actually exist, and then force someone to create it.

This is not about providing physical access, or about producing documents which are in your possession. This is whether the government can usurp your workforce to make you create something that only you are capable of creating, against your will, not because there's actually a law which says you have to provide that capability, but simply because some investigator has probable cause that given such a tool they could use it to find evidence of a crime!

If 'All Writs' somehow does give the government the ability to enslave software developers into creating this particular backdoor, what is there to legally differentiate this request from, for example, one that would function over WiFi or LTE remotely?

There's been a lot of discussion about the 'secure enclave' and how this particular attack isn't possible on the iPhone 6. I think that's missing the point.... If 'All Writs' can force Apple to open a black-hat lab responsible for developing backdoor firmware for the 5C, then it can do the same for the 6. For example, why not force Apple to provide remote access to a suspect's device over LTE while the device is unlocked / in use? While we're at it, the iPhone has perfectly good cameras and microphones, let's force Apple to provide real-time feeds.

Think about the sheer quantity of networked devices which exist (or will exist) in an average home which could be used in the course of an investigation. If they can force Apple to create a 5C backdoor, I can't see any reason they can't apply the same logic to WiFi cameras, Xbox Kinects, or even your car's OnStar. Heck, even TV remotes come with microphones and Bluetooth now... And don't get me started on Amazon Echo!

Fundamentally, the question is can you force a device manufacturer to implement backdoors into their products to be used against their own customers? Notably, service providers have already lost that battle, they are required to architect their systems to be able to spy on their users and provide that data to law enforcement, often through specially designed real-time dashboards. At least in that case it is based on duly enacted legislation with that specific intent.

But this is something really quite shocking -- can investigators, simply through obtaining a warrant, force companies to re-design the personal devices that we own and keep with us almost every moment of the day to spy on us? I truly hope not.


> There's been a lot of discussion about the 'secure enclave' and how this particular attack isn't possible on the iPhone 6. I think that's missing the point.... If 'All Writs' can force Apple to open a black-hat lab responsible for developing backdoor firmware for the 5C, then it can do the same for the 6.

Well stated. That's the crux. The technical difficulty of any given hack is going to vary and is ultimately irrelevant. The idea that the government can commandeer a company's resources towards its ends, especially when those ends compromise the security of a larger community, is a dangerous one.


Forcing Apple to develop said backdoor aside... I have to wonder if it wouldn't be much easier, legally speaking, for the government to force Apple to hand over its firmware signing keys? Those undoubtedly do already exist.

Once in possession, they'd be free to contract an independent entity to modify the firmware and re-sign it with Apple's own key. It can't be all that hard to NOP out the code that increments the unlock attempt counter and associated delay mechanisms.


It doesn't quite work that way. The files on the device are ultimately encrypted with a dependency on both the hardware key in the Secure Enclave and the user's passcode. It's impossible to update the firmware without both pieces of information or erasing the device, and on A7 or later processors the unlock attempt delay is a direct result of the method of encryption used and tied to the hardware so it must be performed on the device itself.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf
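For a rough picture of what that dependency means in practice, here is a minimal sketch (the names and the choice of PBKDF2 are illustrative stand-ins, not Apple's actual implementation):

    import hashlib
    import os

    # Stand-in for the UID key fused into the processor at manufacture;
    # it is never readable by software and never leaves the device.
    HARDWARE_UID = os.urandom(32)

    def derive_unlock_key(passcode: str, iterations: int = 50_000) -> bytes:
        # The passcode is "tangled" with the device-unique hardware key, so the
        # derivation (and therefore any brute force) can only run on the device.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), HARDWARE_UID, iterations)

    # Reflashing firmware reveals neither HARDWARE_UID nor the passcode, so new
    # firmware alone cannot decrypt data already on the phone.
    key = derive_unlock_key("1234")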


Do we have any documentation that indicates it's impossible to do firmware updates without the user's passcode? That's what the government seems to believe that it could (at present) do, given Apple's cooperation. The iOS Security Guide doesn't seem to address that particular point.


It's possible to reflash the firmware in DFU mode, but it requires erasing the data on the phone. Or more technically, there's an encryption key in storage that needs to be regenerated to give access to the filesystem, but that encryption key (in conjunction with the hardware key in the Secure Enclave and the user's passcode, if set) is also necessary to read any user data on the phone.


Are you saying the article is wrong or did you not read it?

"As many jailbreakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own — the FBI does not have the secret keys that Apple uses to sign firmware."


Yes, you can only load new firmware onto the device by erasing it, which would make it pointless.


<spoon feed> At this point it is very important to mention that the recovered iPhone is a 5C. The 5C model iPhone lacks TouchID and, therefore, lacks the single most important security feature produced by Apple: the Secure Enclave.

In these older devices, there are still caveats and a customized version of iOS will not immediately yield access to the phone passcode. Devices with A6 processors, such as the iPhone 5C, also contain a hardware key that cannot ever be read and also “tangle” this hardware key with the phone passcode. However, there is nothing stopping iOS from querying this hardware key as fast as it can. Without the Secure Enclave to play gatekeeper, this means iOS can guess one passcode every 80ms. </spoon feed>

And even if it did have a Secure Enclave, from page 5 of your link:

When an iOS device is turned on, its application processor immediately executes code from read-only memory known as the Boot ROM. This immutable code, known as the hardware root of trust, is laid down during chip fabrication, and is implicitly trusted. The Boot ROM code contains the Apple Root CA public key, which is used to verify that the Low-Level Bootloader (LLB) is signed by Apple before allowing it to load. This is the first step in the chain of trust where each step ensures that the next is signed by Apple. When the LLB finishes its tasks, it verifies and runs the next-stage bootloader, iBoot, which in turn verifies and runs the iOS kernel.

So the LLB is run first, and could be used to contact the Secure Enclave and guess passwords. And from the article:

At this point you might ask, “Why not simply update the firmware of the Secure Enclave too? That way, Apple could disable the protections in iOS and the Secure Enclave at the same time.” (crossed out)Although it is not described in Apple iOS Security Guide, it is believed that updates to the Secure Enclave wipe all existing keys stored within it. An update to the Secure Enclave is therefore equivalent to erasing the device.(/crossed out) I initially speculated that the private data stored within the SE was erased on update but I now believe this is not true. After all, Apple has updated the SE with increased delays between passcode attempts and no phones were wiped. In all honesty, only Apple knows the exact details.

So, it may be that Apple does have a backdoor to upgrade the SE. Only Apple (and maybe other state actors) really know this. So, it certainly isn't as cut and dried as you imply, if the device did have a Secure Enclave.

But, in this case, the device does not have a SE, so it is clear that with Apple's keys the device could, with a practical amount of effort, be hacked.


Under extraordinary circumstances, compelled expert witness testimony has lots of precedents, including situations where experts are required to expend resources to develop that testimony.

People on the Internet tend to believe that technology poses confounding problems for the law, but the law has been dealing with technical challenges for centuries. See, for instance, any case involving a complicated medical issue.


Yes, you can compel expert witness testimony, but this is different. Testimony is answering questions. Here, they are seeking to compel engineering/coding work. I've never seen that compelled by a subpoena.


If you read old-ish legal journal articles about expert witness compensation, you find that the requirement to do up-front work in order to generate the knowledge required to handle questions is a dividing line for whether (or, at least, whether in the 1960s) expert testimony must be compensated. From that, I gather that this kind of request isn't unprecedented.


Under extraordinary circumstances, compelled expert witness testimony has lots of precedents,

Can you name some of them?

IANAL, but Rule 706 of the Federal Rules of Evidence explicitly states, "But the court may only appoint someone who consents to act."


Compelling expert testimony happens all the time. (You might have to pay for it, but the witness has no choice.) The Apple subpoena is different. It is an attempt to compel engineering and coding work.


I would argue that expending resources is different from creating something entirely new though.


>People on the Internet tend to believe that technology poses confounding problems for the law, but the law has been dealing with technical challenges for centuries.

The problem is different now than it was centuries ago when technology was oil, gas, and steam powered. Now it is electric, meaning information moves at the speed of light all around the world simultaneously.

Such technology is getting better, faster, and cheaper at exponential rates leading to ephemeralization. Meanwhile paper-based political processes have stalled and remain slow as ever.

Say a paper-based political system begins the process of banning a new technology. By the time they finally get around to completing their process, 5 better, faster, and cheaper technologies have already been invented making the old one irrelevant. This is an increasingly important problem to deal with considering the existential nature of paper-based political processes and the rate of technological change.

Paper-based information systems and processes are simply too slow to keep up with the speed of the electric medium. It's like trying to race a lightning bolt.


But DARPA sees 20 years ahead, and if they decide to ban a technology, I'm sure the legal process will be expedient enough.


> This is whether the government can usurp your workforce to make you create something that only you are capable of creating

In a way, this would resemble war-time confiscation of production capability. You have a car factory, for instance, and the government tells you that from now on, you'll be producing tanks, thank you very much.

I think this was not, strictly speaking, what happened in the US during the Second World War, because companies were willing enough to produce war material for the US government in exchange for considerable sums of money. But for instance, the Skoda factory in Czechoslovakia was simply confiscated and directed to making military vehicles.


If you have done nothing wrong, broken no law, can a court order you to write a letter or a book? If not, then on what basis does a court order someone (a company) to write code? This isn't just a matter of removing the anti-bruteforcing code, the FBI has also asked for a way to quickly iterate passwords without using the on-screen keypad. That's creating new functionality, not just removing existing functionality. This seems specious.

This almost certainly ends up at the Supreme Court, but due to the national security implications of the case it's plausible a certiorari petition could be filed by either Apple or the government. And yet we have a 4-4 court right now.


I think most technical people, under ordinary circumstances, if asked to make a build of some software with a security feature disabled,[1] would not consider it akin to being asked to create something new, like writing a letter or a book.

[1] Or really any small tweak. I can remember at least a couple of times being asked to provide someone with a special build of software I worked on that, e.g., logged something it didn't ordinarily log to help debug an issue a customer was having that we couldn't reproduce in-house. Can't say I ever thought of that as creating new software, even if I added some fprintf statements that weren't there before.


Adding a means for the FBI to input passcodes via the Lightning connector instead of the touch screen is certainly a nontrivial addition.

Can a court order someone to delete phrases from a letter or book?

Code is protected under copyright law. Code is a form of speech. How is either a court, or law enacted by Congress, telling someone what to write or delete what's already written, not abridging speech? Why is code different?


This would not be a "freedom of speech" issue, as it's not preventing speech (which is what the 1st amendment protects). It's compelling (arguably) a significant amount of labor / work that in the end results in compelled speech (code) -- in a way, this is more of a 5th amendment (due process) and 13th amendment (involuntary servitude) issue.

A bigger question might be if the government has the ability to require Apple to do this without compensation, and if they do, then who gets to set the rates or define the project -- Apple or the government? Possibly, a better tack/approach for Apple (discounting the setting of precedent, etc) would be to bill the government $700-$1000/hour per developer, and assign a team of 200-300 developers to this, with an expected delivery date of 3-5 years. As the government starts paying out $100 million/month for, in effect, nothing, very quickly the public would become outraged and force the question of compensation into court. At the end of the day, the people/government would have to decide if it's worth either 1) forcing Apple to do this uncompensated or 2) paying the cost to decrypt one phone -- neither, I believe, would be widely popular/supported.

Regardless of what happens, unless the FBI withdraws this request, which is highly unlikely -- they have clearly chosen this case due to the terrorism connection to be its legal Alamo -- this is going to end up in front of the Supreme Court.


I agree the compelling of work is more convincing than the court compelling speech stated or withheld. But code is a language, it's not just building, it's something of a hybrid. And the court order requires both the creation of new code, as well as the inhibition (deletion) of previous code, so it's telling Apple to change their speech as well as their reputation using their own labor to do so.

Almost certainly this expertise is billable to the government, and won't take $100 million, I'd be surprised if it took $1 million, but what do I know?

The more concerning thing is the FBI almost can't lose. If they lose the case in court, they've put pressure, and will continue to put pressure, on the public and Congress to change this in law, which is why I think it's important to establish constitutional reasoning for why this is a bad idea.


Well the question would be, who gets to dictate the size/scope of the project and the rate charged to the government? Is any person outside of Apple really qualified to make that claim? It's not open source code, so Apple could make any claim they wanted. Also, who says they would have to put their 'A' players on this? Why not back-burner it with specially hired 'E' and 'F' level developers? Charging $700-$1000/hour might seem high, but on a custom development project it may not be too far out of line.

At the end of the day, there really is no way the government will succeed here; you really can't force Apple's developers to do this, as they could just tell the government to go pound sand. The only way they could get it done is through threat of violence/incarceration or by gigantic sums of money. The government can't really argue that the time of Apple's developers, given the company's billions in profits, isn't worth a few billion.


>It's not open source code, so Apple could make any claim they wanted.

You don't get to play games like this with the judiciary. They can require you to allow an outside auditor to review the code.


The First Amendment doesn't protect written works, it protects expression. Non-written works can be expression (black arm bands worn in protest), and written works can be functional and non-expressive (e.g. a legal contract). Code is only speech when it's being used as a form of expression. If it's just used to make a commercial product go, it's not speech.


> How is either a court, or law enacted by Congress, telling someone what to write or delete what's already written, not abridging speech?

Abridge means to curtail. Telling someone to create something doesn't curtail anything.


Adding an extra log line is a bit different than custom tailoring a hardware id check that can't trivially be reverse engineered by a highly motivated party.


Can the All Writs Act not then be used to acquire Apple's private keys? After which law enforcement could create the needed software?


Nope, because everything is ultimately dependent on the hardware key in the Secure Enclave and the user's passcode, neither of which Apple has knowledge of. The FBI would be able to create the OS, but they wouldn't be able to load it on the device without erasing it. It's really a brilliant system.


>Nope, because everything is ultimately dependent on the hardware key in the Secure Enclave

The phone in question is an iPhone 5C which does not have a Secure Enclave, only a burned-in hardware ID.

It's also worth pointing out that there's no clear evidence that the Secure Enclave must clear all keys as part of a SE firmware update (just speculation that this would be reasonable).


Compelling production of Apple's private keys would (for better or worse) be a far easier legal question than what this subpoena seeks to compel. Remember the Lavabit case? The private keys were ordered produced there. The difference here is that what is sought to be compelled doesn't exist now; it will have to be created through expert engineering and software design work.


Apple is not being asked to engineer a new backdoor.

It is being asked to help exploit an existing backdoor in a device they produced; a device that was used by a person who slaughtered many people.


There is no backdoor


Why would Cook not explicitly make that case?

If the "10 false codes and the device is bricked" can be circumvented, there is a backdoor.


The encryption only has one way through, with the passkey. So the encryption works and this is good. No backdoor there.

The 10-try deletion is a completely separate system from the encryption. If you stop it from deleting, the encryption is still intact. No one will be able to stop this from happening short of Apple taking action or divine intervention, so there is no backdoor here.

What you seem to be pointing out is a philosophical point, about if in the future Apple designs software to backdoor your phone, then there is the potential for a backdoor right now, therefore, there is a backdoor right now. This is false. There is no backdoor. Apple, being god in this case, could make a backdoor. Apple, or god (whichever you prefer) didn't make a backdoor, so there is no backdoor.


That's not so much a back door as a potential vulnerability.


A distinction in phrase only.


Not true. To say something has a "back door" is to imply security was deliberately compromised by the manufacturer, whereas a vulnerability is a security weakness in the architecture or implementation.

This is not a "back door". It's just not.


I get what you're saying but it would have made more sense had Apple not added the Secure Enclave to newer iPhones. I.e. they tried to make it more secure, so I have to conclude that they had tried their best with the older versions, too.


A simple flow chart or state diagram would show the vulnerability. I'm not saying it was intentional, but I believe the backdoor label can be applied to accidental entrances as well.

The bottom line is that Apple produced a device whose security features could be circumvented.


All security features can be circumvented. Security is not absolute


I know my opinion is probably not popular, but if there was a way for Apple to physically install a firmware on a single device to allow for brute-forcing and if Apple did that in response to a direct court order, then this is probably the best compromise we can get.

I really believe that there should be a way for law-enforcement to get access to specific devices in response to a court order as long as the solution doesn't involve weakening the encryption for everybody else.

I'm absolutely against backdoors, secret* keys or similar crap. But physically accessing a single device in order to make brute-forcing it possible, that seems acceptable to me as that won't affect any other device.

That would be similar to a court order allowing law enforcement to enter your premises and take out the safe in order to pry it open at some other location where specialised equipment is available.

If this is all law enforcement wants, then maybe it's time to hand this over before law enforcement wants even more which will doubtless pave the way for mass surveillance of devices.

* until they leak. Then everybody has access.


> maybe it's time to hand this over before law enforcement wants even more which will doubtless pave the way for mass surveillance of devices.

The FBI is already paving that way with this case. They don't overly care about access to this particular iPhone. They're taking this case through the courts so that they can establish a precedent that allows them to force manufacturer cooperation to unlock any phone.

Edit: If they really cared about access to this individual phone, they wouldn't be going through the courts to get it; they'd be talking to the NSA TAO or other LEO with advanced forensic capability. As several people have pointed out, this iPhone 5C does not have a Secure Enclave and probably does not present a significant challenge to forensically analyze, to people that know what they're doing. They're going through the courts on this so they can get carte blanche to access iPhones 5S and above, which no LEO currently has capabilities to inspect.

Further edit: This is Farook's work phone. His main, personal phone was found destroyed in a dumpster near the site of the attacks. I find it incredibly unlikely the FBI really cares much about the contents of this individual phone, they just want a high-profile test case to expand their surveillance capabilities.


> They don't overly care about access to this particular iPhone. They're taking this case through the courts so that they can establish a precedent that allows them to force manufacturer cooperation to unlock any phone.

This is an analysis, not an objective and demonstrable fact.

I could just as well argue that yes, the FBI really does care a lot about this particular iPhone, and that's why the asked-for update is to be keyed to this iPhone and only this iPhone.

At the same time, even assuming that is true, we're talking about the FBI going through a legal process, reviewed by a judge, to get the data off one phone at a time. If that's how it works every time, I don't see a problem; that is how the system is supposed to work. I am kind of baffled as to why we're cheerleading the fact that Apple is refusing to perform what appears to be a perfectly reasonable request that is being made in accordance with the law. If you are operating under the presumption that the government is always a bad-faith actor, then we have much, much bigger problems.

Also, apparently this 'precedent' has already been set; according to a link in the article, Apple had previously offered custom firmware images to law enforcement after a court order that bypassed the lock screen on earlier iPhones.

http://www.cnet.com/news/how-apple-and-google-help-police-by...


US law enforcement has an impressive track record of extending their authority through unilateral reinterpretation of the US code. Once it's been established that Apple is able to extract data, what's to prevent agencies from slapping Apple with gag orders and forcing them to comply under completely opaque proceedings that may not even have a way of appeal? I think Apple is right in resisting while it's still in the open. For all we know, this might really be about forcing them to demonstrate the technical capability to cooperate, for use in one or several secret cases we do not know about.


Every comment I've read so far has said that Apple should help in this instance, so I don't see the cheerleading-- yet. Except now I may provide it. I just read Apple's letter to customers, and now I agree with them that the very creation of backdoor software -- even if it's only meant to help in specific instances -- is a dangerous thing. Applying specialized knowledge that Apple has about iOS and iPhones, plus Apple's engineers, to creating an innovative backdoor that does not exist today, means that it can never be un-designed. It will never have fewer people aware of it, unless you kill them after they create the software. The knowledge will only spread. The software can only leak. The engineers can only get conveniently hired by a competitor or foreign government or our own government. I agree, it is troubling.


> creating an innovative backdoor that does not exist today, means that it can never be un-designed

Wasn't it just yesterday that a story was published about an upcoming documentary on the Stuxnet virus, claiming that the US and Israel developed it in secret together and made very successful but very limited use of it? Only when Israel allegedly went off on their own to modify and deploy it did it spread wide and far, popping up on the radar of anti-malware companies and getting researched and publicized.

Like what you said, after the exploit/backdoor/software is designed, it can never be un-designed. It will exist as a tool that can only be mitigated, but not destroyed.


The knowledge will only spread. The software can only leak.

Then why don't we have Apple's private keys yet?

Plenty of companies keep a lot of things very secret, including things like powerful debug modes, for a long time. At least long enough that everybody forgets the details and the software has long since rotted away.


Because it's Apple who keeps them, not the FBI.

Nobody in the FBI would give a damn about leaking the patched OS image: it's Apple's reputation at stake, not the FBI's.


But the FBI doesn't want the keys in this case. They don't even want a build that works on any phone but the one in question.

There is nothing of value for the FBI to leak.

This is the huge difference between this order (which I can live with) and blanket encryption backdoors using key escrow or other crap (which I'm absolutely vehemently against and willing to fight to the teeth)


"They not even want a build that works for on any phone but the one in question."

That is completely not true. There is no way to make such a thing that can only work on one particular phone. There will be some point at which the compromised firmware image checks to see if it's that device, at which point it would be possible to change that to whatever device you want.

"This is the huge difference between this order (which I can live with) and blanket encryption backdoors using key escrow or other crap (which I'm absolutely vehemently against and willing to fight to the teeth)"

No, there is absolutely no difference between those two.


If Apple hands the FBI a signed, compiled firmware image that, say, checks the serial number of the phone, how does it make the jump to 'whatever device' they want? Why were LEOs previously filing for multiple court orders for each older iPhone requiring a backdoored image?


> That is completely not true. There is no way to make such a thing that can only work on one particular phone

The technique that makes this possible is described in Apple's iOS Security White paper, page 6 ("System Software Authorization"): https://www.apple.com/business/docs/iOS_Security_Guide.pdf

This mechanism explains why you can't take an old release of iOS off a different phone and copy it to yours.
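A sketch of that personalization idea as I understand it (field names are illustrative): the signing server binds its approval to the requesting device's unique ECID and a fresh nonce, so the resulting ticket can't be replayed onto a different phone or reused for an older build.

    import hashlib
    import hmac
    import os

    APPLE_SIGNING_KEY = os.urandom(32)  # stand-in for Apple's private key

    def personalize(firmware_hash: bytes, device_ecid: bytes, nonce: bytes) -> bytes:
        # The signature covers the firmware measurement *plus* the device's
        # unique ECID and a one-time nonce supplied by that device.
        blob = firmware_hash + device_ecid + nonce
        return hmac.new(APPLE_SIGNING_KEY, blob, hashlib.sha256).digest()

    def device_accepts(ticket: bytes, firmware_hash: bytes,
                       my_ecid: bytes, my_nonce: bytes) -> bool:
        return hmac.compare_digest(ticket, personalize(firmware_hash, my_ecid, my_nonce))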


You've missed the point: By doing this, they've shown that it's possible, and that they already have the tools. Meaning that next time, it's going to be almost impossible to say no.


Yes, there is. Firmware updates must be digitally signed using Apple's private key. That means no one except Apple can edit out the device check, or indeed modify the firmware in any way.


The original argument is that if a brute-force-friendly firmware were created, there would then be more people who have knowledge of it, and they (Apple employees) would be at great risk of exposing the capability in a real way.

Not LEOs.


If Apple rotates their keys, that means that their private keys can be unlearned, whereas a method to backdoor iPhones could not be unlearned in the same way.


If the backdooring method uses a special firmware update that needs to be signed by Apple, rotating their keys means that it could be unlearned as well.


The court order specifically requests a firmware update that can only be used with that particular device ID.


Replying to the reply: the FBI doesn't want this leaked because it would jeopardize their own agents' Apple devices.


> This is an analysis, not an objective and demonstrable fact.

No. You don't have to assume that law enforcement or intelligence agencies are bad faith actors to see they are constantly seeking to expand their powers.

http://thehill.com/policy/cybersecurity/235910-fbis-hacking-...

https://www.eff.org/issues/national-security-letters

http://www.latimes.com/nation/nationnow/la-na-nn-fbi-using-d...

https://epic.org/foia/fbi/lpr/


Amezarak is not (necessarily) assuming they are bad faith actors. Amezarak is pointing out, I think correctly, that the parent poster is inferring intent of an organization from its actions. Without specific documents from that organization spelling out that intent, I agree that such inference is analysis and not fact. It may be reasonable or even probable analysis, but that does not make it fact.


Also we don't have to assume it because we know it for a fact - they are bad faith actors.


> If you are operating under the presumption that the government is always a bad-faith actor, then we have much, much bigger problems.

Given what we know about government surveillance programs, why would one assume the government is a good faith actor when it comes to encryption?


> If you are operating under the presumption that the government is always a bad-faith actor, then we have much, much bigger problems.

It needn't be "always", just (perceived as) too often. It seems fair to say the broad sentiment is that the national security arms of the U.S. government have breached that barrier.


>At the same time, even assuming that is true, we're talking about the FBI going through a legal process, reviewed by a judge, to get the data off one phone at a time. If that's how it works every time, I don't see a problem...

And how will we ensure that's the case? Once they have the firmware they need they can install it on other phones.


Maybe that would force manufacturers to be in a position where they can truthfully say "we can't do that". I'd prefer that to the current situation where Apple basically has a backdoor they're just trying to keep for themselves.


As the original article mentions, it sounds like Apple has already engineered a situation where they can't get into their own phones. This Apple only backdoor option exists only in equipment that predates the secure enclave hardware (such as the phone in this case).


Except that there is no proof of this. It was stated as false by John Kelley, an ex Embedded Security engineer at Apple: https://twitter.com/JohnHedge/status/699882614212075520

For all we know so far, Apple could still provide a signed firmware bypassing the bruteforcing delay implemented by the Secure Enclave.


The only way to bypass the brute forcing delay would be to increase computing power, since it's a function of the encryption method used. It basically goes through a number of iterations chosen to make it take about 80 ms per attempt.
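A sketch of how a fixed per-guess cost can fall out of the key derivation itself rather than from removable delay code (PBKDF2 here is only a stand-in for whatever Apple actually uses):

    import hashlib
    import os
    import time

    def calibrate_iterations(target_seconds: float = 0.080) -> int:
        # Pick an iteration count that takes ~80 ms on this hardware; faster
        # chips simply get a higher count, so the per-guess cost stays put.
        salt = os.urandom(16)
        iterations = 10_000
        while True:
            start = time.perf_counter()
            hashlib.pbkdf2_hmac("sha256", b"000000", salt, iterations)
            if time.perf_counter() - start >= target_seconds:
                return iterations
            iterations *= 2

    print(calibrate_iterations())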


Such a thing does not currently exist, though. They would have to develop one.


They always can. Microcode updates come to mind. While Apple has the ability to keep their ecosystems locked, they also have the capability to unlock them.

Probably not easy, and not on a mass scale. It may even require looking inside the silicon itself, which is expensive and hard. But as long as the keys are on the device or in the possession of Apple, they can be extracted.


The private keys are not in the possession of Apple though. The key is in the secure enclave.


Worst case, you disassemble the die and read the bits off with an electron microscope. It's still possible, just expensive, painful, and maybe dangerous if you damage the chip.


As long as there is no law that mandates that a phone must be unlockable to be sold in the US (which the FCC probably has the authority to insist on), then if there is a technical capability for a device to be unlocked, it is the manufacturer's fault and they must comply.

Apple themselves chose to have total control over the device, the signing, and the ecosystem - now it backfires.


> That would be similar to a court order allowing law enforcement to enter your premises and take out the safe in order to pry it open at some other location where specialised equipment is available.

I would think it's more similar to the following: the government has gone to the safe manufacturer to help it open a safe it has a warrant for, has access to and can move, but can't open.

The government is asking the manufacturer to develop a method to modify the safe so that it can be opened. The safe manufacturer says that if they did so, the same method could be used on all of their safes, thereby making all of their products less secure.

I would imagine that a reasonable safe manufacturer would bring up the same objection.


The mere fact that the safe manufacturer is able to make such a modification already means the safe is not very secure. The development of the modification is almost incidental at that point.

If the safe manufacturer doesn't want to be put in this position, it should make it so there is no such modification possible. Which as far as I understand it is what Apple did with their safes^H^H^H^H^Hphones starting with the A7 CPU, but this phone is older.


The firmware for the Secure Enclave can be updated, but only with the Apple private key. Which is as secure as it's possible to get in a consumer product that gets improved upon after sale.


Merely requiring an Apple private key is insufficient. The OS as a whole requires that too, but as we see here that just puts Apple in the position of potentially being forced to sign an update which removes security.

I'm guessing that the secure enclave not only requires a private key from Apple, but that it wipes the crypto keys it contains (effectively wiping the device) if it's updated without first being unlocked with the user's passcode. That would prevent even Apple from cracking it, barring an exploit of the secure enclave's software, or some sort of highly advanced attack on the physical hardware.
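Spelled out as pseudocode, the guessed policy looks something like this (not documented by Apple; purely an illustration of the speculation above):

    class SecureEnclave:
        # Hypothetical model of the speculated update policy.
        def __init__(self):
            self.unlocked_by_user = False
            self.class_keys = {"protected": b"..."}

        def apply_update(self, image: bytes, apple_signature_valid: bool) -> None:
            if not apple_signature_valid:
                raise PermissionError("updates must be signed by Apple")
            if not self.unlocked_by_user:
                # Speculated behaviour: updating a locked enclave destroys its
                # keys, which is equivalent to erasing the device.
                self.class_keys.clear()
            self.firmware = image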


I got the impression that that isn't the case, but as they want to improve in ways that would be unhelpful to LEOs, what you described will be the setup soon.


> I really believe that there should be a way for law-enforcement to get access to specific devices in response to a court order as long as the solution doesn't involve weakening the encryption for everybody else.

This requirement is self-contradictory. The device has no way to determine whether the attacker trying to gain access is a good or bad guy, nor can it.


>there should be a way for law-enforcement to get access to specific devices in response to a court order as long as the solution doesn't involve weakening the encryption for everybody else

Which law enforcement? The FBI? Really? how about the DEA? TSA? How about the federal police in China? Venezuela? Saudi Arabia? Syria?


The FBI's fallback in this case, should Apple not agree to work with them, might just be to force Apple to disclose their signing key for iOS disk images, which could potentially be worse since it would enable the FBI, and not Apple, to control on which device(s) the image was installed. From a PR standpoint, that might be better for Apple since they could argue that they did not cooperate in creating a bypass, but from a technical perspective, it would be much worse.

Were I in Apple's position, I would probably do what Apple is doing here... but it's a harder question than just "should we cooperate?" They have to ask, "what if we don't?"


I was of the opinion that Apple is actually signing individual OS installs (using https://en.wikipedia.org/wiki/SHSH_blob), so ultimately, Apple would still be very much aware of, if not totally in control of, what firmware is installed where.


> But physically accessing a single device in order to make brute-forcing it possible, that seems acceptable to me as that won't affect any other device.

Sure, they can try brute-forcing it, or even break open the chips and try to extract keys with an electron microscope. That's all within their domain. But why should anyone be forced to assist them?


Except that it isn't possible. Once it's done for one device, it can be done for all. If Apple caves in this one time, then there's no reason they shouldn't cave in (or be compelled to cave in) the second time, and the third time, and so on.


I don't think the safe analogy holds up, maybe someone can help me understand why it would, given:

A safe is a container filled with physical objects: property. Property is subject to search and seizure with appropriate warrants, levies, writs, orders, wants, etc.

A phone is a container filled with information. The only physical property relevant to evidence consists of the electromagnetic state of the memory on the device. This would be no different from the bioelectric state of the neurons in the human brain, which coincidentally, also is a container filled with information. In both cases, there seems to be easy precedent to state that the information in those containers represents protected information, as it pertains to the possibly incriminating testimony of that information.

A safe can be physically removed and brought to a place where there are more specific tools available to access its physical contents. I stipulate to that.

A phone can also be physically removed and brought to a place where ... What? What tools exist to interrogate the electromagnetic state of the phone that aren't already accessible? Asking Apple to create some software allowing them to unlock and read the information is tantamount to asking a neuroscientist to create software allowing them to unlock and read your mind.

Not trolling, these are my sincere beliefs. Are they wrong?


Crucially, anyone with sufficient technical skill can open a safe, therefore the manufacturer offering that service isn't a violation of customer rights.

Nobody can bypass the security on an iPhone but Apple, requiring them to do so isn't requiring a company to assist in a search, it's requiring a company to use its unique position as the manufacturer to damage their product.


And if I built a physical safe that was so complex that only I could safely open it and then sold the safe to an alleged criminal, would the court have the power to force me to reveal the secret to opening this safe? Sorry, but the law is not exactly on our side here. The courts have legitimately exercised such warrants before and will do so again, as is their right and authority.

Unfortunately, I think this sort of example paints you into a corner. The court is not requiring the company to permanently damage the product, any more than a locksmith who opens a locked door to assist in the execution of a warrant is damaging the door. Once the court has what it requests it is trivial to reinstall the original firmware.

There is an old saying that hard cases make bad law. This is a hard case. The defendants are both heinous and deceased; they lack standing to resist and few will support such an effort. Apple is a third party here and is on such shaky legal ground that they are left defending themselves in the court of public opinion because they know they are going to lose in an actual court. Bad things tend to come out of situations like this.


Going with the safe analogy: given reasonable suspicion and a court order, it's clearly within the government's power to, say, drill the safe, and it MIGHT be in the government's power to force the individual to open it, depending on how they see it. It is hard to see how a third party not involved in the crime ought to be compelled to work on the government's behalf just because they happen to have made the original safe and possess the know-how to break into it.

Can you explain how that duty comes to be?


Because not stepping up when asked to perform such a task opens you to obstruction of justice charges. Is a building super somehow not compelled to open a door in a building if they have the key and the agents of the government present a valid warrant? When they have a warrant you do not get to decide if you think what they ask for is justified, that ship has sailed already. You get the choice of do as asked, or potentially go to jail while your legal team tries to have the warrant quashed or at least your involvement in same. Best of luck on that...

And since you asked, I would bet that most of the people sitting on a jury deciding if you go to jail or not for impeding the execution of the warrant probably think that your duty to act in such a situation is considered a part of the price of admission to civil society.


[deleted]


The FBI explicitly requested a firmware image that is bound to a specific device (by checking a unique ID -- read the court order please). That signed image would not be usable on other devices without going through Apple again.
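Roughly what that binding could look like (identifiers below are made up): the RAM-loaded image refuses to enable the brute-force helpers unless the hardware identifiers match the one target phone, and Apple's signature covers that check.

    # Hypothetical identifiers for the single phone named in the order.
    TARGET_DEVICE = {"ecid": "0xDEADBEEF0000", "serial": "F2LXXXXXXXXX"}

    def bruteforce_mode_allowed(device: dict) -> bool:
        # Editing TARGET_DEVICE to point at a different phone changes the
        # image, which invalidates Apple's signature over it.
        return (device["ecid"] == TARGET_DEVICE["ecid"]
                and device["serial"] == TARGET_DEVICE["serial"])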


So what? How would Apple be able to say yes in this case, but no in the next case? Developing such a system when none exists is effectively creating a back door for every iOS device out there.


> How would Apple be able to say yes in this case, but no in the next case?

Well, because an actual judge would be required to make such a decision, and in each case, the merits would be assessed separately. So Apple could say no, and a judge might agree with them "right, the government has no grounds here".

(IANAL, and IANEAA - I am not even an American).


That is extremely difficult to do after it's already been revealed that Apple has the capability to do so.


And then they know that this is a simple change to an if statement the next time they want to do this.


Presumably that change would invalidate the signature.


So? They'd compel Apple to re-sign the firmware.


Not an Apple fan so genuinely curious how this unique ID is secured.


Indeed, this is precisely why Apple is writing an open letter. If this was an iPhone 6, they would have simply told the judge "no" and that would have been the end of the story. But since it's a 5C, it is possible, and Apple doesn't want to do it to avoid setting a precedent. If they cooperate with law enforcement to backdoor this phone, then they would face much more pressure to comply with any future laws that require backdoors be built in.


> Apple doesn't want to do it to avoid setting a precedent

Unfortunately that's exactly what they're going to end up doing with this faux resistance. It seems like in this case there is a master key that only Apple has. If the device's security is broken in this manner, then this is a terrible place to make a stand, as Apple will have no choice but to eventually comply.

Next time, with an actually secure implementation, the stance will be "you protested last time and gave in, do that again". And when USG realizes Apple isn't bluffing that time, their bolstered entitlement will result in the inevitable law for Apple to go back to the backdoored nearly-just-as-secure scheme.

To the first order, USG doesn't care about the argument that foreign governments could also compel Apple, since that simply reduces to traditional physical jurisdiction. And governments seem to be more worried about protecting themselves from their own subjects than from other governments.

We can only hope that the resulting legal fallout is implemented in terms of the standard USG commercial proscriptions based on the power of default choices, leaving Free software to continue to be Free.


Apple doesn't have the master key. Apple can't create a master key. Apple can, if they are forced to, create a variant of their software (which doesn't exist today) that will allow the FBI to try all keys and hope that they find one that works.

Apple could be forced to write software that removes the rate limiter and the FBI could still be stuck without access because it's possible the user used a password with too much entropy.


If this software variant can be installed on a locked device and post-facto modify its security model, then either that device was insecure or there is a master key.


Disabling the bruteforce inhibiting code does not mean it was insecure or that there's a master key.


It certainly means the bruteforce inhibiting was insecure.


Explain.

Include in the explanation how removing code makes the prior state insecure.


If an attacker can modify the code, they simply nullify that check (as we're discussing here). The prior state was insecure because the whole point of crypto/trusted hardware is to prevent against such attacks, and the prior state should have never allowed a code update on a locked device (if we're talking about trusted hardware).

If we're not talking about trusted hardware, then naive code which calls sleep() is defective for the same reason - the security of the system cannot depend on running "friendly" code. See Linux's LUKS which has a parameter for the number of hash iterations when unlocking, which sets the work factor for brute forcing.

If this still isn't apparent, you need to try thinking adversarially - what would you require to defeat various security properties of specific systems?
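The contrast being drawn, in sketch form: a delay enforced by "friendly" code can be patched out, while a delay that is the computation cannot (PBKDF2 stands in for any tunable-cost KDF):

    import hashlib
    import time

    def check_with_friendly_delay(pin: str, is_correct) -> bool:
        time.sleep(0.08)  # enforced only by code an attacker could remove
        return is_correct(pin)

    def derive_with_work_factor(pin: str, salt: bytes, iterations: int = 200_000) -> bytes:
        # Here the delay *is* the hashing work; deleting code can't speed it up.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, iterations)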


An attacker can't modify the code. The code isn't public. Only the sole keeper of the code can modify the code, it's proprietary software. Further the code is signed by author's private key, so even if an attacker could modify compiled code (via a decompiler for example), they still can't inject that modified code into the hardware without signing.


> Only the sole keeper of the code can modify the code, it's proprietary software

LOL!

> Further the code is signed by author's private key

This is the crux - if Apple is in a privileged position to defeat security measures and you're analyzing security in terms of Apple/USG, this counts as a backdoor. It doesn't provide full access, but it does undermine purported security properties of the system.

It's quite possible to implement a system with similar properties that doesn't give Apple such a privilege. It sounds like they didn't.


> An attacker can't modify the code. The code isn't public. Only the sole keeper of the code can modify the code, it's proprietary software.

This is not correct. Reverse engineering is a thing. Proprietary software just makes it harder. People modify proprietary code all the time.

> Further the code is signed by author's private key, so even if an attacker could modify compiled code (via a decompiler for example), they still can't inject that modified code into the hardware without signing.

This is the actual point.


There is the most master of all keys - the one they sign the bootloader/OS with.


As long as anything like this exists, and can be used to flash a new system image while data remains intact, then Apple claiming they have a system secure against government is extremely negligent.

An OS signing key is never a replacement for a bona-fide user-initiated upgrade intent.

In designs with trusted hardware to prevent evil maid attacks, the boot trust chain should use a hash rather than a signature. This hash is updated only when the trusted chip is already unlocked.

To avoid creating useless bricks, said trusted hardware should allow the option to wipe everything simultaneously. But nothing more granular.
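A sketch of the design being proposed (the commenter's alternative, not how current iPhones work): the trusted chip pins a hash of the next boot stage and only accepts a new hash while the user already has it unlocked.

    import hashlib

    class TrustedChip:
        def __init__(self, initial_image: bytes):
            self.pinned_hash = hashlib.sha256(initial_image).digest()
            self.unlocked_by_user = False

        def may_boot(self, image: bytes) -> bool:
            # A vendor signature is irrelevant here; only the pinned hash counts.
            return hashlib.sha256(image).digest() == self.pinned_hash

        def accept_new_firmware(self, new_image: bytes) -> None:
            if not self.unlocked_by_user:
                raise PermissionError("unlock with the user's passcode first")
            self.pinned_hash = hashlib.sha256(new_image).digest()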


In the end, they will have no choice but to comply.

Do people think this a game? Apple doesn't run things, the federal government does, and will, in the end, use it's full power to get what it desires.


It absolutely is a game. And governments don't always succeed in winning the game against powerful coalitions. Besides "the government" is a large, fractured entity with political divisions and its own internal power struggles. If one subset of "the government" can look impressive by milking some showy support of Apple, and use this to defeat internal rivals, they absolutely will.


Sure ok...but here is a reality check.

The vast majority of people in the US agree that, when it comes to the illusion of keeping them safe, Apple should bend over and give up the info.

Now, I personally do not agree with this stance, but it's obvious to me which way the wind is blowing.



Thus the open letter. To inform the public of what everyone here already knows.


The public doesn't want to be informed.

They want the illusion of safety that the 3-letter agencies provide.


Even if they think security is more important than liberty, iPhones are more important than security, so it evens out. It's a gamble but Apple may win this.


If you make these decisions by the polls how willing are you to admit that the polls are very much in Tim Cook's favor in this matter?


Why are you saying this? There is always a choice. TC may choose to go to jail. His successor may also choose to go to jail. To the point that Apple shareholders may choose to let Apple go bust and burn the servers with the source code of iOS. You can argue that this is silly, idiotic, and what not. But there is always a choice. Some brave people have chosen to go to jail previously.


This is not about going to jail. No one is going to go to jail. If Apple loses, they will have to comply (assuming it is technically possible to do so). They are going to appeal what is (at this point) an order from a magistrate judge. They have at least three higher forums (district judge, court of appeals, supreme court). If and when they get a final, adverse order they can't appeal, they are going to comply.


Apple also has a form of resistance not available to you or me. They don't have to fall on their sword. A sword as big as theirs can be used to decapitate (or, more aptly, recapitate) the FBI. With a 12-figure bank account, you can get a lot of people elected...people who will be more than willing to replace those in charge at the FBI.

What's more, they don't even have to actually do it, they just need to make the FBI believe that they actually would do it if the FBI presses the issue.


Right, but the FBI knows they wouldn't. This whole situation is generating good PR for Apple. Getting people elected that can change things would be a waste of their money because if known, it is very bad PR (the public is generally against lobbying like this) and the good PR created by this "fight" ends immediately.


Right, I was just suggesting that Apple has better "nuclear" options than Tim Cook going to jail out of principle and Apple going out of business.


$550 billion companies do not allow themselves to "go bust and burn."

Way too many powerful and wealthy people are interested and invested.


He is clearly not implying that it is actually going to happen. This is a hypothetical to show that there is always a "choice".


The Feds will always try, but in this case there is some push-back. The fact that this case has been made public and Apple is fighting against this is a victory in and of itself.

EFF to Support Apple in Encryption Battle-

https://www.eff.org/deeplinks/2016/02/eff-support-apple-encr...


Large companies don't have to stay in one place. If the USA were to force their hand, they have the option of leaving the country.

China's behaviour made HSBC, the world's 5th largest bank, move to London. It's not an unheard-of move.


No. The people of the United States of America run things. The government is our servant, not our master.


Even if that was ever true, it hasn't been true for a very, very long time.


Lololololol

A masterful troll statement!


I like to imagine that in the end the people run things. (I know, I know.)

Apple may have to comply with this order (after appeals), but this also helps muster the troops for the battle against universal backdoors.


And if they do have to comply this time, it would be nice if Apple were very vocal about continuing to make it impossible to comply in the future with newer iPhones.


I'd like to know what would happen if Apple said "Looks like we can't guarantee the safety of our users as an American company anymore. We're moving our headquarters and signing key to Iceland. For now, we'll keep our US company as a subsidiary which will provide R&D etc. to Apple Iceland as a service since we can't just move the office, but Apple is now an Icelandic company."


What? This is literally a game. What are the feds going to do against Apple, one of the most valuable companies in the history of modern civilisation? Have them pay some fines? That's about the worst they can do. They can't shut down the company.


Sure they could, if they wanted.

Otherwise, they could start locking people up.

They do that all the time for not complying with edicts.

For those of you who have never experienced it, even 48 hours in jail is a truly miserable and ugly experience, one that no corporate titan is interested in.


The implications are huge, however. AFAIK most smartphone users use iPhones. People will be outraged -- and not only people but _MANY_ businesses depending on Apple tech, and many governmental agencies both in the USA and abroad.

If they actually try to hit that hard they'll find themselves in a very bad PR situation. They might not care for that but the consequences of such an action will bite them really hard on the ass.


Governments are ultimately bound by the people. So no, it's not supposed to do whatever it wants.


Does the federal government think this a game? Apple doesn't run things, the people do, and will, in the end, use their full power to get what they desire.

Also, "its", without an apostrophe.


If it was an iPhone 6, the FBI could just use the dead finger to unlock the phone, no?


No, because the touch-id is deactivated after 24h (or 48h, I'm not sure) of inactivity.


Assuming the attacker used Touch ID.


Not after 48 hours, when you have to use the PIN.


Or maybe a well-constructed fake using a fingerprint on file.


>Apple has allegedly cooperated with law enforcement in the past by using a custom firmware image that bypassed the passcode lock screen.

If true, precedent has already been set.


> This simple UI hack was sufficient in earlier versions of iOS since most files were unencrypted

It probably wouldn't apply as precedent as it previously had nothing to do with encryption.


This is so far the clearest and easiest to understand explanation of the Apple/FBI case I have come across. Great read.


I wonder about the relevance of the case styled "United States v. Hubbell" (530 U.S. 27)†

Specifically (emphasis mine):

> ...the Self-Incrimination Clause ... may be asserted only to resist compelled explicit or implicit disclosures of incriminating information. Historically, the privilege was intended to prevent the use of legal compulsion to extract from the accused a sworn communication of facts which would incriminate him.

-and-

> ...the act of producing documents in response to a subpoena may have a compelled testimonial aspect. We have held that “the act of production” itself may implicitly communicate “statements of fact.” By “producing documents in compliance with a subpoena, the witness would admit that the papers existed, were in his possession or control, and were authentic.”

-and-

> Compelled testimony that communicates information that may “lead to incriminating evidence” is privileged even if the information itself is not inculpatory.

https://www.law.cornell.edu/supct/html/99-166.ZO.html

EDIT: Wikipedia summary of the case here: https://en.wikipedia.org/wiki/United_States_v._Hubbell


This is all well and good, but the Fifth Amendment doesn't apply to corporations, so United States v. Hubbell is irrelevant.


Makes you wonder why the FBI bothered with the bit about submitting PIN guesses electronically, and didn't just ask for a firmware that looped over all 10,000 PINs until it found the right one.


Perhaps the phone in question has a 6-digit PIN, so it's useful to be able to try candidates non-sequentially.
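Back-of-the-envelope numbers (my own assumptions, not anything from the order or the article): if the per-guess cost is dominated by the ~80 ms of key derivation Apple describes, the difference between a 4-digit and a 6-digit PIN is minutes versus a day, which is why removing the delays and the auto-wipe is the whole ballgame. A rough sketch:

    # Assumed: ~80 ms of key derivation per guess, delays and auto-wipe disabled.
    GUESS_SECONDS = 0.08

    for digits in (4, 6):
        keyspace = 10 ** digits
        hours = keyspace * GUESS_SECONDS / 3600
        print(f"{digits}-digit PIN: {keyspace:,} guesses, worst case ~{hours:.1f} hours")

    # Roughly 13 minutes for 4 digits, ~22 hours for 6 -- trivial either way.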


Great piece. Get thee to a "Secure Enclave" supported device, everyone.


John Kelley (@johnhedge), former Apple security engineer, says that Secure Enclave isn't protected against that kind of tampering, so that's not a solution, either. Until manufacturers start going to embedded HSMs, anyway.


Or any rooted android. Good luck in defeating LUKS. No custom firmwares will help them.


Make sure you set a high enough LUKS key iteration count and/or a very complex passphrase, so that they can't image the LUKS header and brute-force your passphrase off-device.
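To make that concrete: the iteration count divides the attacker's guess rate, so even modest passphrase entropy gets expensive fast. A toy estimate (the hash rate and entropy figures below are illustrative assumptions, not benchmarks of any particular hardware):

    # Illustrative only: assumed attacker hash rate and passphrase entropy.
    ATTACKER_PBKDF2_PER_SEC = 2_000_000   # single-iteration PBKDF2 ops/sec (assumed)
    ITERATIONS = 1_000_000                # what a generous cryptsetup --iter-time buys you
    ENTROPY_BITS = 40                     # ~ eight random lowercase letters

    guesses_per_sec = ATTACKER_PBKDF2_PER_SEC / ITERATIONS
    years = (2 ** ENTROPY_BITS) / guesses_per_sec / (3600 * 24 * 365)
    print(f"~{years:,.0f} years to exhaust the keyspace")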


A rooted android is even less secure.



A rooted Android is probably the easiest to own over the air with a push notification, so yeah, that's a great idea! NOT


Could you expand a little on what you're referring to? Is this a specific vulnerability?


A powered down device rarely has that vulnerability.


A powered down device isn't exactly terribly useful.


You can't send a powered down phone a push notification for post-hoc analysis.

You would have had to know the target and push a vulnerability beforehand, which wouldn't have helped in this case.


So power it on? It will still boot with encryption. Isn't Android encryption an extension of ext4 that only protects some data? It's not full-disk / LUKS, last I knew.


Despite the assertion of this article, it doesn't actually give any evidence to support the claim that Apple is capable of writing this backdoor. The important question is whether it's possible for Apple to update the OS on the phone (or to load a program into memory that runs on the phone) via DFU mode or something similar without triggering a wipe of the phone. And this article doesn't even acknowledge that question, it makes the blind assumption that this is possible. But is it? As far as I know, nobody has ever updated an iPhone over DFU mode without erasing the phone. It's plausible that Apple has the know-how to do that, but it's also plausible that the device firmware may have been written to trigger a wipe the moment any modification is made via DFU mode.

As a side note, the author mentions that Apple has updated the Secure Enclave with increased delays in the past without wiping data, though they state that only Apple knows how it really works. I just want to put forth the theory that maybe the Secure Enclave allows its firmware to be updated if and only if the user's passcode is provided at the time the OS tells the Secure Enclave to prepare for a firmware update. That would be a reasonable way to ensure the Secure Enclave can't be subverted.


One would hope that the Secure Enclave only allows itself to be updated after the PIN has been entered successfully.
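Nobody outside Apple knows how Secure Enclave firmware updates are actually gated, but the policy being hoped for here is easy to model. A speculative toy sketch (everything below is invented for illustration; it is not Apple's design):

    import hashlib, hmac

    class ToyEnclave:
        """Toy model: firmware swaps require proof of the user's passcode."""
        def __init__(self, passcode):
            self._salt = b"device-unique-salt"   # stand-in for the hardware UID
            self._verifier = self._derive(passcode)

        def _derive(self, passcode):
            return hashlib.pbkdf2_hmac("sha256", passcode.encode(), self._salt, 100_000)

        def update_firmware(self, image, passcode):
            # A real device would also check the vendor's code signature on `image`;
            # the point here is only the passcode gate.
            return hmac.compare_digest(self._derive(passcode), self._verifier)

    enclave = ToyEnclave("1234")
    print(enclave.update_firmware(b"new_fw", "0000"))   # False: update refused
    print(enclave.update_firmware(b"new_fw", "1234"))   # True: user consented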


The security architecture described here seems pretty clever. Is this degree of security unique across mobile devices? If the phone in question was from a different manufacturer or ran a different OS, would the FBI have to ask its creator for help?


Chrome OS was already doing chain-of-trust booting when it was first announced/revealed/open-sourced, though maybe with a hardware switch to enable flashing other loaders. (That has probably been removed since.)

There are Android full disk encryption schemes, and of course phones with signed bootloaders.

https://nerdland.net/unstumping-the-internet/pattern-unlock-...

http://www.extremetech.com/mobile/216560-android-6-0-marshma...

What Apple did that was so valuable is provide a very clear, almost abstract implementation, from scratch, hitting every point along the way (randomized device private keys, a read-and-execute-only Secure Enclave, signed loaders, proper AES(-XTS?) full-disk encryption, probably also requiring a strong password, full lock after ~48 hours; sure, it'd be good if this could be customized to something lower).
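For anyone who hasn't seen chain-of-trust booting spelled out, the idea is just that each stage refuses to hand off to a next stage it can't verify. A toy sketch (names and key handling invented for illustration; real boot ROMs verify asymmetric signatures, not HMACs):

    import hashlib, hmac

    VENDOR_KEY = b"baked-into-the-boot-rom"      # illustrative stand-in

    def sign(blob):
        return hmac.new(VENDOR_KEY, blob, hashlib.sha256).digest()

    def boot(stages):
        # stages: list of (name, image, signature); halt on the first bad one
        for name, image, signature in stages:
            if not hmac.compare_digest(sign(image), signature):
                raise SystemExit(f"refusing to boot: {name} is not vendor-signed")
            print(f"verified, running {name}")

    loader, kernel = b"loader image", b"kernel image"
    boot([("bootloader", loader, sign(loader)), ("kernel", kernel, sign(kernel))])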


Would setting this precedent to enable brute-forcing the PIN on a less secure model really be that dangerous, then? This isn't a request to circumvent encryption on all models. It would be worrying if the government were asking for a backdoor into the Secure Enclave feature to retrieve the encryption keys, but that's nowhere near what this request is actually detailing.

I still remain opposed to any kind of circumvention that reduces security, which this definitely does. I'm just questioning whether it's something to be so shocked about, since it's not far removed from the kinds of requests they have complied with in the past.


If Apple can update the firmware of the Secure Enclave, could they not also write one that leaks out all the data contained within it?

Presumably this could then be used for offline attacks against the image dumped from the phone's flash memory.


The part that I find most interesting as a former enterprise systems administrator from the 90's is that the employer owns the device but does not have a pass code for it. Is this the normal IT policy for these kinds of devices?


It's bad security to tell your manager your passcode/passwords, even for accounts they can get into through admin channels. The risk isn't the manager; the risk is that someone overhears you telling them the password, finds the piece of paper it's written on, etc.


Maybe? I work at a 50-100k-person software corporation; my employer owns the phone and PC I use, but does not know my phone PIN or LUKS password.


Obviously the government should not be allowed such power, simply because it cannot be trusted. Everyone knows it would use this backdoor against everyone, without limit, and everyone knows the lengths to which the US will go to spy on its own people and everyone else. The fact that this request is being made at all is further evidence of the government's intentional malevolence. This behavior should not come as a surprise anymore; any reasonable person by now knows what to expect from the US government.


I am wondering if there is another way to get in. There are lots of security researchers out there who can hack into iOS or any other mobile OS. Wouldn't it be simpler to hire somebody who could do it? There are probably many ways to get access to this device other than guessing the PIN.

Update: In the meantime I talked to my security engineer peers, and it is not feasible to carry out an attack this way. The user partition remains unmounted until the PIN is provided after boot.


> There are probably many ways to get access to this device other than guessing the pin.

It's encrypted data using the PIN (and a key embedded in the phone). There's not another way in.


The problem is that you are thinking of offline access, where encryption is relevant; what I am thinking about is online access. For example: let the device boot, connect it to a familiar Wi-Fi access point that has been modified to redirect normal iOS traffic to a site that infects the phone with malware, opening it to unauthorized access. I am not sure if that is feasible, though.


The phone can't decrypt itself, it literally doesn't know how.


So I've been wondering: since the PIN is obviously not a strong cryptographic secret, the way the encryption works is basically security by obscurity.

All an attacker would have to do is clone the contents of the device's SSD and somehow read the secret key that is embedded somewhere else. I'm not sure how feasible the latter part is, but surely this shouldn't be beyond the capabilities of US three-letter agencies?


It's much more complicated than that (which is why the FBI needs help). The encryption uses the PIN and a key that is in the phone. If you take the image and try all the PIN combinations you will fail because you don't have the embedded key.

http://www.darthnull.org/2014/10/06/ios-encryption

> The UID key is used to create a key called “key0x89b.” Key0x89b is used in encrypting the device’s flash disk. Because this key is unique to the device, and cannot be extracted from the device, it is impossible to remove the flash memory from one iPhone and transfer it to another, or to read it offline. (And when I say “Impossible,” what I really mean is “Really damned hard because you’d have to brute force a 256-bit AES key.”)

Newer phones also include a secure enclave that introduces another key and hardware restrictions on timing. The FBI's request wouldn't make sense for a modern iPhone.
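A compressed sketch of why imaging the flash and guessing PINs off-device gets you nowhere (parameters are illustrative, not Apple's actual key derivation, which runs inside the crypto engine):

    import os, hashlib

    UID = os.urandom(32)   # device-unique key fused into the silicon, never exported

    def disk_key(passcode):
        # The passcode alone is weak; tangled with the UID it only works on-device.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), UID, 100_000)

    on_device  = disk_key("1234")
    off_device = hashlib.pbkdf2_hmac("sha256", b"1234", b"\x00" * 32, 100_000)
    print(on_device == off_device)   # False: right PIN, wrong (unknown) UID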


How does "cannot be extracted" work? There must be some physical representation of the key inside the phone, so surely it should be possible to retrieve it somehow (e.g. using a scanning tunneling microscope or whatever)?


I don't know if this is implemented in the iPhone's Secure Enclave, but many modern HSMs are designed so that physical tampering (such as extracting the chip for analysis) damages/destroys the data.


The PIN is just used for getting access to the actual key that decrypts the content, as far as I know.


That's exactly what I was talking about: the actual key has to be stored somewhere, too. Why can't we just read it using an STM?


What I don't understand is why Apple can't just unlock the phone the same way they would for any typical customer who forgot their passcode. I've never owned an iPhone; what is the recovery process like for a typical customer who can't get into their phone?


What would happen when other governments, their courts, and their agencies make similar demands?

Has Pandora's box been opened?


FBiOS :)


Technically feasible, sure. But with a higher risk of introducing security vulnerabilities than most features.


Why can't they just use a PIN-code robot? https://www.youtube.com/watch?v=k_n5W69OdKM


There is a setting that makes the phone wipe all data after 10 failed attempts. While it's off by default, the FBI is worried that the setting is turned on.


Because they don't want to spend the time.

After too many incorrect PINs, there's a time delay before another attempt can be made.

The FBI is also thinking about the next time. They want to be able to take a phone, plug it in and brute force the PIN to gain access.


I have a feeling it's more about setting the precedent, as others have said. There are strategies that can bypass the timeout periods and automatic wiping while bruteforcing, and I'd be surprised if the FBI didn't have at least one rig set up to do it.


It would take forever; there's a timeout penalty for entering the wrong code.
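Roughly how long is "forever"? The delay values below are approximately what Apple has documented, though exact numbers vary by iOS version, and this ignores the optional wipe after 10 failures:

    # Escalating delays after failed passcode attempts (assumed values, in seconds).
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}
    HOUR_AFTER_NINE = 3600

    total = 0
    for attempt in range(1, 10_000 + 1):          # every 4-digit PIN, worst case
        total += DELAYS.get(attempt, 0) if attempt <= 9 else HOUR_AFTER_NINE
    print(f"~{total / 3600:,.0f} hours, i.e. about {total / 3600 / 24 / 365:.1f} years")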


We need to create devices that RESIST (warrantless, unconstitutional) mass surveillance and hide all of our data and metadata from everyone. Obama set a terrible precedent by warrantlessly scanning 'metadata' about who we contact. We're innocent by default, period.

I agree with the first post. We need to be creative and find a way to resist government surveillance, and the piece where the engineering seems impossible is allowing an occasional breach of security for extreme circumstances.

What's extreme? Well, first, physical possession of the device should be required. Second, it should take resources only a nation-state would be able to afford. Want to decrypt an iPhone? It's going to cost > $5 million in processing power. Any criminal would move on.


The FBI's position seems entirely defensible. The phone data may yield important information - accomplices, contacts etc.

It also seems pretty disingenuous/hypocritical for Apple to plead "customer privacy" when the ENTIRE BUSINESS MODEL of much of the smartphone and app industry (from which Apple directly benefits with a 30% commission) is predicated on abusing customer privacy.


> It also seems pretty disingenuous/hypocritical for Apple to plead "customer privacy" when the ENTIRE BUSINESS MODEL of most of the smartphone and app industry is predicated on abusing customer privacy.

Apple uses privacy as a major selling point. Apple has also proven very bad at abusing customer privacy for profit; they have even shuttered their own advertising service.


Apple does not use customer data as part of their business model. This is a unique Google/Android feature.


also unique to Facebook, Yahoo, Amazon, all SEOs...


Sources to back up your claim?


How about this?

"...we never sell your data."

http://www.apple.com/privacy/approach-to-privacy/#personaliz...


It seems that the extraordinary claim requires data. I haven't seen the line item on Apple's earnings report that breaks out where they're making money from customer data.


We don't actually know that. The best we can say is that they don't seem to use customer data. I'm not saying they do, but we can't demonstrably say they don't, either.


Sure we do, but you are welcome to continue to be skeptical.


So prove it, then. I'd actually love to be proven wrong about that.


The burden of proof lies on the accuser...

Or so they say.


That's not really how that works. I think you should look into it a bit more. I'm not accusing them of anything. I'm saying we can't prove that they aren't using customer data. I'm basically saying you can't prove a negative.


> The FBI's position seems entirely defensible. The phone data may yield important information - accomplices, contacts etc.

They should already have this information from their 'metadata' collection programs. If the FBI were doing its job at any point, this wouldn't even be a subject of debate.



