Indeed, this is precisely why Apple is writing an open letter. If this were an iPhone 6, they would have simply told the judge "no" and that would have been the end of the story. But since it's a 5C, it is possible, and Apple doesn't want to do it to avoid setting a precedent. If they cooperate with law enforcement to backdoor this phone, they will face much more pressure to comply with any future laws requiring backdoors to be built in.
> Apple doesn't want to do it to avoid setting a precedent
Unfortunately that's exactly what they're going to end up doing with this faux resistance. It seems like in this case there is a master key that only Apple has. If the device's security is broken in this manner, then this is a terrible place to make a stand, as Apple will have no choice but to eventually comply.
Next time, with an actually secure implementation, the stance will be "you protested last time and gave in; do it again". And when the USG realizes Apple isn't bluffing that time, its bolstered sense of entitlement will result in the inevitable law forcing Apple back to a backdoored, nearly-as-secure scheme.
To the first order, USG doesn't care about the argument that foreign governments could also compel Apple, since that simply reduces to traditional physical jurisdiction. And governments seem to be more worried about protecting themselves from their own subjects than from other governments.
We can only hope that the resulting legal fallout is implemented in terms of the standard USG commercial proscriptions based on the power of default choices, leaving Free software to continue to be Free.
Apple doesn't have the master key. Apple can't create a master key. Apple can, if they are forced to, create a variant of their software (which doesn't exist today) that will allow the FBI to try all keys and hope that they find one that works.
Apple could be forced to write software that removes the rate limiter, and the FBI could still be stuck without access, because it's possible the user chose a passcode with too much entropy to brute-force.
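To make the entropy point concrete, here's some back-of-envelope arithmetic. The ~80 ms per guess figure is the key-derivation timing Apple has cited for iOS passcode unwrapping; treat it as an illustrative assumption, not a measured number for this device.

```python
# Worst-case time to exhaust a passcode keyspace, assuming ~80 ms
# per guess (an assumed figure based on Apple's stated key-derivation
# tuning; the exact delay on a 5C may differ).

SECONDS_PER_GUESS = 0.08

def worst_case_seconds(keyspace: int) -> float:
    """Seconds to try every passcode in the keyspace."""
    return keyspace * SECONDS_PER_GUESS

pin4 = worst_case_seconds(10 ** 4)    # 4-digit numeric PIN
pin6 = worst_case_seconds(10 ** 6)    # 6-digit numeric PIN
alnum8 = worst_case_seconds(62 ** 8)  # 8-char mixed-case alphanumeric

print(f"4-digit PIN : {pin4 / 60:.0f} minutes")
print(f"6-digit PIN : {pin6 / 3600:.1f} hours")
print(f"8-char alnum: {alnum8 / (3600 * 24 * 365):.0f} years")
```

A 4-digit PIN falls in minutes once the rate limiter is gone; a long alphanumeric passphrase stays out of reach for centuries even with Apple's full cooperation, which is exactly the parent's point.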
If this software variant can be installed on a locked device and post-facto modify its security model, then either that device was insecure or there is a master key.
If an attacker can modify the code, they simply nullify that check (as we're discussing here). The prior state was insecure because the whole point of crypto/trusted hardware is to protect against such attacks, and the prior state should never have allowed a code update on a locked device (if we're talking about trusted hardware).
If we're not talking about trusted hardware, then naive code which calls sleep() is defective for the same reason - the security of the system cannot depend on running "friendly" code. See Linux's LUKS which has a parameter for the number of hash iterations when unlocking, which sets the work factor for brute forcing.
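The LUKS point can be sketched with PBKDF2 (the KDF LUKS1 actually uses): the brute-force delay lives in the math itself, so an attacker who controls the code can skip a `sleep()` but cannot skip the iterations. The passphrase, salt, and iteration counts below are made up for illustration.

```python
# Sketch of a key-derivation work factor in the spirit of LUKS's
# iteration count. Each guess costs the attacker the full iteration
# budget; no "friendly" code is required to enforce the delay.
import hashlib
import time

def derive_key(passphrase: bytes, salt: bytes, iterations: int) -> bytes:
    # PBKDF2-HMAC-SHA256; the stored key was derived with N iterations,
    # so every brute-force guess must also perform N iterations.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, iterations)

for iters in (1_000, 100_000):
    start = time.perf_counter()
    derive_key(b"hunter2", b"illustrative-salt", iters)
    elapsed = time.perf_counter() - start
    print(f"{iters:>7} iterations: {elapsed * 1000:.1f} ms per guess")
```

This is the contrast with a naive `sleep()` rate limiter: the work factor holds even when the attacker runs their own cracking code offline.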
If this still isn't apparent, you need to try thinking adversarially - what would you require to defeat various security properties of specific systems?
An attacker can't modify the code. The code isn't public. Only the sole keeper of the code can modify the code; it's proprietary software. Further, the code is signed with the author's private key, so even if an attacker could modify the compiled code (via a decompiler, for example), they still can't inject that modified code into the hardware without signing it.
> Only the sole keeper of the code can modify the code, it's proprietary software
LOL!
> Further the code is signed by author's private key
This is the crux - if Apple is in a privileged position to defeat security measures and you're analyzing security in terms of Apple/USG, this counts as a backdoor. It doesn't provide full access, but it does undermine purported security properties of the system.
It's quite possible to implement a system with similar properties that doesn't give Apple such a privilege. It sounds like they didn't.
> An attacker can't modify the code. The code isn't public. Only the sole keeper of the code can modify the code, it's proprietary software.
This is not correct. Reverse engineering is a thing. Proprietary software just makes it harder. People modify proprietary code all the time.
> Further the code is signed by author's private key, so even if an attacker could modify compiled code (via a decompiler for example), they still can't inject that modified code into the hardware without signing.
As long as anything like this exists and can be used to flash a new system image while data remains intact, Apple claiming they have a system secure against the government is extremely negligent.
An OS signing key is never a replacement for bona fide, user-initiated upgrade intent.
In designs with trusted hardware to prevent evil maid attacks, the boot trust chain should use a hash rather than a signature. This hash is updated only when the trusted chip is already unlocked.
To avoid creating useless bricks, said trusted hardware should allow the option to wipe everything simultaneously. But nothing more granular.
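A toy model of the hash-pinned design described above: the trusted chip stores a hash of the approved image rather than a vendor public key, so not even the vendor can swap images while the device is locked. All names here (`TrustedChip`, etc.) are invented for illustration; real secure-boot hardware is far more involved.

```python
# Minimal sketch: boot trust anchored to a pinned hash, updatable
# only while unlocked, per the scheme described in the parent comment.
import hashlib

class TrustedChip:
    def __init__(self, approved_image: bytes):
        self._pinned = hashlib.sha256(approved_image).digest()
        self._unlocked = False

    def unlock(self, passcode_correct: bool) -> None:
        self._unlocked = passcode_correct

    def boot(self, image: bytes) -> bool:
        # Boot only an image matching the pinned hash exactly;
        # a vendor signature on some other image is irrelevant.
        return hashlib.sha256(image).digest() == self._pinned

    def update_pinned(self, new_image: bytes) -> bool:
        # Re-pinning requires an already-unlocked chip, i.e. genuine
        # user-initiated upgrade intent, not just a signing key.
        if not self._unlocked:
            return False
        self._pinned = hashlib.sha256(new_image).digest()
        return True

chip = TrustedChip(b"os-v1")
print(chip.boot(b"os-v1"))          # True
print(chip.boot(b"os-v2-signed"))   # False: wrong hash, signed or not
print(chip.update_pinned(b"os-v2")) # False: chip is locked
```

Under this scheme the court order in question would be moot: with the device locked, there is no party, Apple included, who can load modified firmware.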
It absolutely is a game. And governments don't always succeed in winning the game against powerful coalitions. Besides "the government" is a large, fractured entity with political divisions and its own internal power struggles. If one subset of "the government" can look impressive by milking some showy support of Apple, and use this to defeat internal rivals, they absolutely will.
The vast majority of people in the US agree that, when it comes to the illusion of keeping them safe, Apple should bend over and give up the info.
Now, I personally do not agree with this stance, but it's obvious to me which way the wind is blowing.
Even if they think security is more important than liberty, iPhones are more important than security, so it evens out. It's a gamble but Apple may win this.
Why are you saying this? There is always a choice. TC may choose to go to jail. His successor may also choose to go to jail. To the point that Apple shareholders may choose to let Apple go bust and burn the servers with the source code of iOS. You can argue that this is silly, idiotic, and what not. But there is always a choice. Some brave people have chosen to go to jail before.
This is not about going to jail. No one is going to go to jail. If Apple loses, they will have to comply (assuming it is technically possible to do so). They are going to appeal what is (at this point) an order from a magistrate judge. They have at least three higher forums (district judge, court of appeals, supreme court). If and when they get a final, adverse order they can't appeal, they are going to comply.
Apple also has a form of resistance not available to you or me. They don't have to fall on their sword. A sword as big as theirs can be used to decapitate (or, more aptly, recapitate) the FBI. With a 12-figure bank account, you can get a lot of people elected...people who will be more than willing to replace those in charge at the FBI.
What's more, they don't even have to actually do it, they just need to make the FBI believe that they actually would do it if the FBI presses the issue.
Right, but the FBI knows they wouldn't. This whole situation is generating good PR for Apple. Getting people elected who could change things would be a waste of their money: if known, it is very bad PR (the public is generally against lobbying like this), and the good PR created by this "fight" would end immediately.
The Feds will always try, but in this case there is some push-back. The fact that this case has been made public and Apple is fighting against this is a victory in and of itself.
And if they do have to comply this time, it would be nice if Apple were very vocal about continuing to make it impossible to comply in the future with newer iPhones.
I'd like to know what would happen if Apple said "Looks like we can't guarantee the safety of our users as an American company anymore. We're moving our headquarters and signing key to Iceland. For now, we'll keep our US company as a subsidiary which will provide R&D etc. to Apple Iceland as a service since we can't just move the office, but Apple is now an Icelandic company."
What? This is literally a game. What are the feds going to do against Apple, one of the most valuable companies in the history of modern civilisation? Have them pay some fines? That's about the worst they can do. They can't shut down the company.
They do that all the time for not complying with edicts.
For those of you who have never experienced it, even 48 hours in jail is a truly miserable and ugly experience, one that no corporate titan is interested in.
The implications are huge, however. AFAIK most smartphone users use iPhones. People will be outraged -- and not only people, but _MANY_ businesses that depend on Apple tech, and many governmental agencies both in the USA and abroad.
If they actually try to hit that hard they'll find themselves in a very bad PR situation. They might not care for that but the consequences of such an action will bite them really hard on the ass.
Does the federal government think this a game? Apple doesn't run things, the people do, and will, in the end, use their full power to get what they desire.