I hypothesize that this is a coordinated yet simple ruse to rebuild trust in these brands post-Snowden [1][2]. There was similar press coverage regarding the DEA and iMessage encryption that was misreported in much the same way [3]. The Intercept (where Glenn Greenwald now reports) has a story on what data Apple can still easily hand over even if you do believe they can't decrypt individual machines [4]. But maybe you don't believe it, given the report from the ACLU on backdoors built into iPhones that circumvent encryption [5], and the Hope X talk on backdoors security researchers independently discovered [6].

More importantly, Apple's warrant canary was removed, which means either that they were served with a National Security Letter or (if you're an optimistic sort of person) that they are no longer committed to notifying consumers in the event that they have been - which flies directly in the face of all the recent PR talk about a commitment to security [7]. Plus, remember: Apple can push whatever software they want to your personal device. That's how smartphones work.

We are led to two questions:

A) Why wouldn't the same tactics (National Security Letters and ORCHESTRA-type attacks) still work [8]? Don't we remember from the Snowden leaks that NSA agents infiltrate tech companies and backdoor software at the source when other avenues are closed or gridlocked?

B) Why all of the publicity about how secure Apple's products are from snooping? Do we really think we can escape ubiquitous global surveillance that easily?

I'm sorry. Investigative bodies don't publicly announce which technologies they can't track. There is no phone you can buy on the mass market that will keep your data safe, with the possible exception of the BlackPhone [9].

[1] http://www.businessinsider.com/apple-google-meet-with-obama-...

[2] http://www.huffingtonpost.com/2013/12/17/obama-tech-executiv...

[3] https://www.schneier.com/blog/archives/2013/04/apples_imessa...

[4] https://firstlook.org/theintercept/2014/09/22/apple-data/

[5] https://www.aclu.org/blog/technology-and-liberty-criminal-la...

[6] https://pentest.com/ios_backdoors_attack_points_surveillance...

[7] https://gigaom.com/2014/09/18/apples-warrant-canary-disappea...

[8] http://mirror.as35701.net/video.fosdem.org//2014/Janson/Sund...

[9] https://www.blackphone.ch/

Additional reading:

(1) https://www.schneier.com/blog/archives/2013/06/the_problems_...

(2) https://www.schneier.com/blog/archives/2013/10/defending_aga...

(3) https://www.schneier.com/blog/archives/2012/08/is_iphone_sec...




But the article isn't just talking about the NSA and National Security Letters - this is about law enforcement. If Apple and Google claim to be technically unable to comply with LE requests, it won't take long to see whether that is indeed the case - unlike national security-related demands, law enforcement won't be able to keep their successes or failures at demanding access to this data a secret.

If this were tech companies claiming to no longer comply with NSLs, I'd be at the very least suspicious, as there'd be no way to test those doubts short of another Snowden. But this is law enforcement, subject to all kinds of scrutiny in courts across the US. It'll be much easier to see whether Apple and Google are successful at resisting their demands.


A very good point. Local police will likely be able to access the device, but only after calling up the chain. Previous versions of iOS and other smartphones had forensics kits that made it trivial for local law enforcement to grab data out of memory (via DMA over FireWire, for example) or through device-management services. I expect those kits to continue working against devices that have not been powered off.


hhhhh, keys in RAM. Life is so difficult.


> But the article isn't just talking about the NSA and National Security Letters - this is about law enforcement.

No it's not. It's just that "law enforcement" sounds more comforting than "tyranny".

Here's how to interpret "law":

    Law = A written order issued by your rulers. 
    Lawful = Good = Anything your rulers want you to do.
    Unlawful = Bad = Anything your rulers don't want you to do.
    Law enforcement, verb = Forcing you to do what they want you to.
    Law enforcement, noun = People whom laws don't apply to.


"Law = A written order issued by your rulers. "

Don't be so obtuse. A law is a written order, adding the remainder is just an inflammatory accusation that undermines the public at large.

You may believe that the public, or those governed by whichever particular law you select, are too scared or ignorant and have the law imposed on them by outside interests, but the greater numbers always win in the end. A particular rule of law may be arduous today and for many more days to come, but one day the rule will change.

This is evident throughout history and will continue to be.


> A law is a written order, adding the remainder is just an inflammatory accusation that undermines the public at large.

Undermines the public how?

> A particular rule of law may be arduous today

"Rule of law" is a misnomer. It's actually rule by those who decide what the laws are. That would be the "elected representatives", i.e. politicians, of course.

In other words, politicians are our rulers because they make the rules that are ultimately enforced at gunpoint, if you don't feel like obeying at first.

A law is just text somewhere. Even if the text contains a decree on what everyone must or must not do, that alone does not change people's behaviour one bit.

For example, if I write down on a piece of paper that you have to give 30% of your income to me, will you do it? OK, what if I threaten you with imprisonment if you don't?


> OK, what if I threaten you with imprisonment if you don't?

You and what army?

(Not trying to be snarky. That quote seems very apt.)


You seem to be overlooking the point. Would it be morally permissible for me to scribble down arbitrary rules and enforce them on you if I had an army with which to ensure your compliance?

Laws are just arbitrary rules decided on by a small group of people, much like they were with Kings and their inner circles. Laws are enforced in much the same way too - there's no practical difference between getting assaulted by the King's Guard and getting assaulted by men in blue costumes.


> unlike national security-related demands, law enforcement won't be able to keep their successes or failures at demanding access to this data a secret.

They can and they do.

http://en.wikipedia.org/wiki/Parallel_construction


The thing is, parallel construction won't work unless another legitimate path exists anyway.

Scream it all you want, but if there's no other way to the conclusion, there's still no way to use the evidence short of outright lying and falsifying evidence.

I don't think parallel construction is nearly as big of a threat as people seem to make it out to be. It gives law enforcement nothing more than a hint and some unusable evidence. There still needs to be a path that works legally.

And that's not even getting into the fact that iOS is heavily reverse engineered, often in search of backdoors and cryptographic vulnerabilities, and that Android is open source and publicly reviewable. I've reviewed some of the key derivation code myself, as I was curious whether it was being done properly.
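
For reference, the key-derivation step being described (stretching a short user passcode into an encryption key via a salt and a high iteration count) looks roughly like the sketch below. The parameters are illustrative, not the actual values used by iOS or Android:

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Stretch a short passcode into a 256-bit key with PBKDF2-HMAC-SHA256.

    The iteration count makes offline brute force of a short passcode
    expensive; real devices additionally mix in a hardware-bound key so
    the derivation cannot be run off-device at all.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations, dklen=32)

salt = os.urandom(16)
key = derive_key("1234", salt)
assert len(key) == 32
assert derive_key("1234", salt) == key            # deterministic given the salt
assert derive_key("1234", os.urandom(16)) != key  # new salt, new key
```

The iteration count alone can't save a 4-digit passcode from an attacker with the salt in hand, which is why the on-device hardware key and rate limiting matter so much.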

I'm all for paranoia, it just needs to be useful paranoia under a given threat model. Beyond that, it's nothing more than speculation and a waste of time.


It's not about the rules of evidence. It's about the rule of law. Parallel construction is an unconstitutional and dangerous abuse of law-enforcement power that cannot be tolerated.


I don't see why it's such a big deal for law enforcement investigators. They'll still be able to force Apple and Google to send trojan clients to targets, as happened with Hushmail, won't they?


Yes, but it still makes their jobs a lot more difficult.

I'm of the opinion that this is not a ruse or scheme, and that law enforcement are genuinely dissatisfied with this. Note that law enforcement and the NSA have a tenuous relationship at best.

Even if the NSA still do have privileged access after default mobile encryption is fully rolled out, law enforcement generally will not be able to tap into that except in extreme and rare cases.


Law enforcement is far overreaching, in my opinion. This data is still hackable because all data is hackable. The fact that it is inconvenient is a good thing. This is like saying you should only be allowed to have car or house locks that law enforcement has keys to. They can still acquire the data, but they will need to spend significant resources to do it. Invading your citizens' privacy should be difficult!


> This is like saying you should only be allowed to have car or house locks that law enforcement has keys to.

Well... there's no real need to say that, because it's already the case. If they want to open your locks, they will.


You're not obligated to make the lock unlockable for them, though.

The government can probably break many consumer-grade encryption schemes if they so choose, but much like having to break into your house through your locks instead of merely unlocking them, it raises the cost to law enforcement of doing so, and incentivizes more restrained choices (e.g., not taking literally everything they can get their hands on).


They can ask. A court order to reveal your password is enough, and if you don't comply you usually run into trouble. This Wikipedia page http://en.wikipedia.org/wiki/Key_disclosure_law explains how it works in some countries.

There has been a ruling about that in the USA recently http://blogs.wsj.com/law/2014/06/26/mass-supreme-court-defen...


So how does that work if you are using an encryption method that allows plausible-deniability?

"Legally, you must give us the key which probably does not exist"


Or putting it another way, law enforcement will still be able to break into individual phones, but indiscriminate collection of data on a large scale will become that much more difficult. Which is a good thing.


I agree.


“What concerns me about this is companies marketing something expressly to allow people to place themselves beyond the law,” Comey said.

Sure, they're dissatisfied. But the roots of their dissatisfaction seem to be that they've tasted the forbidden fruit, and now believe that they have a fundamental right to watch our communications. Their fundamental attitude is that they should have visibility, and that anyone who wants privacy must be trying to hide something. It's just the old saw "you don't need secrecy if you've nothing to hide", restated from the law enforcement perspective.


The tech crowd always seems to conclude that because the powers are over-reaching and bad, they must also be unnecessary in a practical sense. It is perfectly possible that the surveillance powers really are needed to fight crime. Maybe organised crime and terrorism would increase if all the vulnerabilities that governments use were patched. Technology will undoubtedly have a profound impact on crime, and the consequence could be a technical arms race between the public, governments, and criminals. A frightening prospect!


>law enforcement generally will not be able to tap into that except in extreme and rare cases

Like when they pull someone over for having a tail light out? I can't reconcile your statement here with what we already know about parallel construction.


Okay... so I guess someone here knows more about parallel construction than I do; however, while they were kind enough to let us know this by downvoting my apparently wrong post, they were not kind enough to share their thoughts.

Are law enforcement agencies not getting data from the NSA to use in arresting and prosecuting defendants via parallel construction?


According to Binney, they almost certainly are. I think they downvoted because Binney cites the DEA, CIA and FBI - which are law enforcement - whereas in this thread posters have taken "law enforcement" to mean your friendly neighborhood municipal police officer.


I have to wonder if this is in part driven by a disdain for civilian observation. Seems like every week I hear about people having their phones taken away, at least temporarily, for videotaping police actions, only to get them back with the video deleted.


Apple has put their reputation on the line and said "send us an NSL if you like, we have nothing we can give you". And the cryptographic principles, if executed properly, are sound.

Maybe they're lying. Maybe Snowden 2.0 will come out next year and tell us the truth and instantly destroy their credibility. That's a gamble I wouldn't take with my company, but it's plausible.

See I figure, if you're a threat to National Security, the NSA still has options. They just don't include monitoring over the wire or asking Apple or Google for it.

edit: The Intercept article [4] you mention above suggests to me more that they aren't yet finished implementing it properly and less that they are lying. I would take it as a work in progress.


What makes you think it's such a big gamble? People just don't care about privacy as much as the news would have you believe.

Don't believe me? Quick, think of one of the largest and most consistently flagrant private entities who violates the privacy of its users on a regular basis, and is well known for it.

Did you say Google? Facebook? Now quickly think of two of the largest companies on the Internet, in both revenue and traffic volume.

Apple isn't taking nearly the risk you're suggesting, because people just don't care.


" Quick, think of one of the largest and most consistently flagrant private entities who violates the privacy of its users on a regular basis, and is well known for it."

And then you go on to claim Facebook and Google as examples of your "violation of privacy". That's like saying my email provider violates my privacy because my email goes through their servers. Or the post-office violates my privacy because my snail mail goes through them. Quite a bit of a stretch, if you ask me.

What people do care about is targeted invasion of privacy, for lack of a better phrase. It's one thing having anonymized data "abused" for targeted advertising and selling as aggregated statistics. But it's completely different if you have an entity that can read your emails at will, and decide to throw you in a cage if you say the wrong thing to the wrong person.


Until Apple promised it so explicitly, I'd have said you might be right. True, Google and Facebook make zillions off of monitoring people for advertisers. That will make it harder for them to follow suit; there will be far more weasel words and such when they try.

I think those of us that are willing to pay the premium for something that promises more have higher expectations. If they are lying, I believe we'll know within the next year or two and we'll get to find out if you're right.


Out of interest do you have an example of Google's weasel wording this issue? They're getting criticised by all the same people and seem to be doing the same thing - is there actually an advantage either way here?

Just to clarify, I'm skeptical on both sides - both companies were in the prism leaks after all. But I haven't seen weasel wording and I'm curious if I missed it.


Actually, no, that was just a prediction. I am making an assumption - and maybe not a fair one - that because Google mines user data on the web that they also mine user data on the phone.

Skepticism seems warranted. With Apple, I try to be skeptical, but with Google, I always assume that I am the product until they demonstrate otherwise.


I do believe in that claim.


I don't believe in that claim.


You could say that people don't take design seriously, and use Google and Facebook as examples.

But Apple does. And though most people won't consciously notice that their products are well designed, it is a big part of what makes Apple "magical".

People do care. We care and while we may be a tiny minority right now, there are a lot more of us than there were 10 and 20 years ago.


Yeah, that's how I see it.

Since Apple's software isn't open source, it's possible they're lying. However, if they are, at some point the government is going to try to present evidence they gleaned through this lie, and that fact will leak out.

I think it would be an enormous risk for Apple to blatantly lie about something they put in black and white on their website. So I tend to think they're telling the truth.

Maybe one day, we'll get a real, bonafide, non-niche open source based phone where these things can be audited.


> However, if they are, at some point the government is going to try to present evidence they gleaned through this lie, and that fact will leak out.

No, they are not. That is the whole point of parallel construction.


But this isn't just about the FBI. It's about all of law enforcement. Sure, the FBI may be practiced enough to hide the facts of discovery, but will every law enforcement agency in the US be as adept? If Apple is lying, or law enforcement finds a way to circumvent the encryption, it will most likely leak at some point, and leaks are becoming more and more common, especially with highly visible targets like Apple and Google (both black hats and white hats are itching to show up the tech giants).


Why do you assume the fact will leak out? I thought the FBI disclosed they used parallel construction as an MO.


DEA, not FBI. Keep your agencies straight.

Granted, both are under the Attorney General (formerly Eric Holder)... but Holder's recent MO has been to improve race relations between the Police Agencies and the general public. (See his exceptional handling of Ferguson... but his less-than-exceptional handling of "Fast and Furious")


If Ferguson was exceptionally handled, merely competently handled situations presumably somewhat resemble the plot of Jurassic Park.


I appreciate the criticism actually. But lets look at this rationally.

When the FBI sent in 40 agents to a town with only 52 cops, to investigate a single murder... Eric Holder was sending a very strong message to the Ferguson community.

Remember, Ferguson Police and the St. Louis Police were the bad guys in Ferguson. The FBI came in with their Black leader (at the time: Eric Holder) and reassured the community that the African American President (and African American-led FBI) got their back.

Eric Holder then announced that the 40 Agents were going to hold an independent investigation and conduct a 3rd independent autopsy.

With the FBI's arrival, the riots immediately stopped and trust was restored.

Now true, the Ferguson Police and St. Louis Police were _absolutely_ terrible. But the Federal response (specifically FBI's under direct order of Eric Holder) was extremely effective and exemplary IMO.

Remember, we have a federal system in the US. Cities are independent of the county, counties are independent of the state, and states are independent of the nation. Eric Holder holds no responsibility for the poor behavior of Ferguson City cops or St. Louis County Cops. But... he was able to use the FBI's influence to pressure the local cops to do the right thing.

Remember too: Eric Holder does not have the authority to prosecute Darren Wilson on murder or assault. The best the FBI can do is prosecute him (and the police department) on racism charges. Murder / Assault is a charge that can only be delivered from the local government in this case. For the most part, the Feds don't have much legal authority over the situation.


They don't have to present evidence when they can just make their target disappear off to Guantanamo or some eastern European torture camp.


It's extremely difficult to implement cryptography correctly, and then cryptography != security != privacy.

You should also look through the other linked articles. Some of them include features built into iOS devices that already circumvent encryption.


I intend to. But my initial position is that it's far more likely that Apple will fail to be perfect than that they are intentionally orchestrating a coverup.

That said, I think all this syncing and cloud stuff - in its early days, at least - is way overcomplicated and more error-prone both in its features and its security than it needs to be. I expect mistakes from all implementations for quite some time to come.


> It's extremely difficult to implement cryptography correctly

The smartest CS undergrad at any vaguely reputable school could probably do this correctly. It shouldn't be a problem at all for one of the biggest companies in America if they actually care.

> then cryptography != security != privacy

Which is a more interesting point and assuming the left side was taken care of we could rapidly approach the right side by doing things like providing more granular permissions to applications.


> The smartest CS undergrad at any vaguely reputable school could probably do this correctly. It shouldn't be a problem at all for one of the biggest companies in America if they actually care.

You are very wrong about this.

> assuming the left side was taken care of we could rapidly approach the right side by doing things like providing more granular permissions to applications

It's actually an unsolved problem. Granular permissions have been tried before.
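
To illustrate the first point: the math can be textbook-correct while the implementation still leaks. A classic pitfall is verifying a MAC with ordinary `==`, which short-circuits and leaks timing information; the names and message here are made up for the sketch:

```python
import hashlib
import hmac

key = b"server-side secret"           # hypothetical shared secret
msg = b"amount=100&to=alice"          # hypothetical message to authenticate
tag = hmac.new(key, msg, hashlib.sha256).digest()

def verify_naive(msg: bytes, tag: bytes) -> bool:
    # BUG: '==' returns at the first differing byte, so an attacker who can
    # time many attempts may recover a valid tag byte by byte.
    return hmac.new(key, msg, hashlib.sha256).digest() == tag

def verify_safe(msg: bytes, tag: bytes) -> bool:
    # Constant-time comparison: runtime does not depend on where bytes differ.
    return hmac.compare_digest(hmac.new(key, msg, hashlib.sha256).digest(), tag)

assert verify_safe(msg, tag)
assert not verify_safe(b"amount=9999&to=mallory", tag)
```

Both functions return the same booleans for the same inputs; the difference only shows up under a timing attack, which is exactly why this class of bug survives testing.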


If some government agency wants to clone your phone's data, isn't the process as simple as getting an NSL to own your email/phone number for a few hours, going through the password-reset sequence to own your iCloud account, and "backing up" the data to a new phone they have access to?

An easier method may be to get someone at an Apple Store to do a password reset on a phone number. Any local police department could probably do this - kind of scary now that I think of it.

How can Apple or anyone prevent this?


Considering that one of Apple's core axioms is "no comment", it's entirely sensible and expected that they close off any avenue for anyone, however influential, to compel Apple to say anything they don't specifically intend to.


It's not really a gamble when you're forced to do something. You simply have no choice but to do what the government tells you to or lose your business.


Then they had better have perfect security so there are no leaks proving that, or they're going to lose their business anyway. I don't buy it; they'd fight that, and they'd leak that they're fighting it.


They have leaked it: Apple's warrant canary was removed.


That definitely could imply they have received an NSL. But as a user, when I say "they would leak that they're fighting it", I would consider that completely insufficient. If they are pulling the kind of crap you say they are, then it will come out. Guaranteed. And then nobody, including me, will ever trust them again.

editing my comment since I can't reply to hellbanner: Percentage doesn't matter, it only takes one. One person to destroy a hundred billion dollar company overnight. I am not saying "Tim Cook can not possibly be lying." I am saying "If Tim Cook lied, he just made a hundred billion dollar bet that he can keep a secret." Personally, I doubt he's that dumb.


I'd love to believe that, but then: http://www.reuters.com/article/2013/12/20/us-usa-security-rs... Plenty of people were furious with RSA over that, yet they still seem to be in business.

Then again, RSA isn't in the business of making consumer hardware. I'm not sure what difference to expect that to make.


Sadly I bet most people would just shrug and say "well they are doing it for our own good".


(Clarifying that I understand your position:) You're assuming that a large enough percentage of Apple employees would refuse secrecy bribes, or overcome the fear of individual NSLs, and leak something in public in a situation where they don't fear for their safety?


How does this protect against attacks via the baseband radio processor? That thing is a black box running proprietary firmware, with pretty much unrestricted access to memory (the way I understand it; someone please correct me).

So it is a bit farcical to say "these are all secure". If you happen to know how the baseband processor works (say, you are friends with Qualcomm), you can try to pull the encryption password right out of memory.


If a (co)processor, firmware, battery, daughterboard, memory, (really anything on the north/south bridge), UEFI certs or code, harddrive, OS, microcodes, transistor doping amounts, protocols, touchscreen, drivers, services or crypto standards have either flaws or backdoors the encryption could be circumvented. Apologies for the incomplete list.


I think it's great to see people being skeptical of security claims from phone manufacturers. Trying to secure a normal cellphone is pretty much impossible, and if you're storing sensitive information on one, you are just waiting to get burned.

I think all this "sound and fury" is likely a ruse to entice iOS and Android users into a false sense of safety post-Snowden disclosures. Being able to encrypt your drive doesn't matter if your OS and its applications are exploitable. Last time I checked, there is almost zero open-source firmware out there, so the application processor can encrypt everything hitting disk while the baseband processor is used to get DMA.

Time to roll out the hypothetical child-molester straw man...


> being able to encrypt your drive doesn't matter if your OS and its applications are exploitable

Because, you know, security measures that aren't 100% perfect are on an equal footing with no security at all. Seriously? That's a huge fallacy.

In the end, it's all about the cost. When speaking of the NSA, we are primarily concerned with mass surveillance, because let's be honest: if you're targeted directly, you don't stand a chance, since they can always infiltrate your home and watch your fingers typing your password. If these companies are raising the cost of mass surveillance, and encryption does just that, then that's a good thing. It is in their interest to do so, because the bad press they are getting is hurting their bottom line - you may not see it, but post-Snowden, governments and big corporations are at least starting to consider software/hardware stacks provided by non-US companies, and now they have the ultimate argument for the balkanization of the Internet, which can't be a good thing.

But let's also think about things closer to home. I'm not from the US, and I couldn't care less about the NSA. But I do care about my personal data ending up in the wrong hands - personal emails and photos, details of my accounts, projects, written-down feelings and so on.

There are always organized crime syndicates looking to make a quick buck. There are always incompetent clerks in government institutions who, out of an oversized sense of responsibility, do stupid things. For example, my personal identification details ended up in a local newspaper by mistake, because a non-public contract leaked out of a public institution. So how can I trust these people to handle my data? How could I let any cop inspect my laptop or phone on the spot as part of the routine checks which, from what I hear, are becoming more common?

Yeah, encryption is not a good solution in the face of insecure apps, binary blobs and a potent global adversary. Thing is, for most people that global adversary is not the immediate threat they are facing and even for that global adversary, encryption makes surveillance more expensive.


GP is actually making a factual statement, not just waving his hands around. Data-at-rest encryption doesn't matter if one can gain execution control with supervisor-level privileges (or control another processor, or part of the process, which does). This is why security is hard. These aren't NSA-level attacks; they're what's used for the jailbreaks that come out for every version. It's a good idea to ask questions before getting fired up if it isn't your area of expertise.


Well, I disagree with his factual statement and I tried stating the reasons why.

I have my Android encrypted and I do feel safer, because I've got my 2-factor auth generator on it and now at the very least I feel safe about losing it. So why are we talking about a false sense of security, when an encrypted phone is factually more secure than one that isn't?
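
For concreteness, the 2-factor codes mentioned here are generated entirely on the device from a stored secret, which is exactly why that secret is worth protecting with device encryption. A minimal HOTP/TOTP sketch per RFC 4226/6238; the secret below is the RFC test key, not anything real:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA1 the counter, then dynamically truncate to N digits."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(secret: bytes, period: int = 30) -> str:
    """RFC 6238: HOTP over the current 30-second time step."""
    return hotp(secret, int(time.time()) // period)

# RFC 4226 test vector: first code for the ASCII secret "12345678901234567890".
assert hotp(b"12345678901234567890", 0) == "755224"
```

Anyone who reads that secret off an unencrypted phone can generate the same codes forever, so disk encryption here is not just a feeling of safety.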


It's already there, in the third paragraph of the article:

“This is a very bad idea,” said Cathy Lanier, chief of the Washington Metropolitan Police Department, in an interview. Smartphone communication is “going to be the preferred method of the pedophile and the criminal. We are going to lose a lot of investigative opportunities.”


I believe the decision to start encrypting data will just make it harder for local law enforcement to obtain your data. When they subpoena Apple for your data, they will get the encrypted data. The NSA will probably also receive your data encrypted; however, they probably won't have any problem decrypting it.


When they subpoena Apple, they will get cloud data which will not be decrypted. The issue is whether they would be able to give Apple a device and say 'decrypt plz'.


If the device is backed up to iCloud, Apple already has the decryption key and will have to provide it if they receive a subpoena. The only way you're safe, even in theory, is to do only local encrypted backups.


The whole point of the updated system is that they don't have your decryption key. They will still turn over encrypted data, but LE won't be able to decrypt. That might be a leap of faith though.


> The whole point of the updated system is that they don't have your decryption key.

It's a nice thought.[1]

> While Apple does not have the crypto keys that can unlock the data on iOS 8 devices, they do have access to your iCloud backup data. Apple encrypts your iCloud data in storage, but they encrypt it with their own key, not with your passcode key, which means that they are able to decrypt it to comply with government requests.

[1] https://firstlook.org/theintercept/2014/09/22/apple-data/


So you are saying that if you lose your device there is no way to recover your data?


No, he's saying the decryption key will be a simple passcode for an encrypted private key stored on Apple's servers, or something even worse than that. Promising, ain't it? ...given the examples we have of the kind of passwords people use...


Is the decryption key something the user chooses like a password, or is it random noise generated on the device that the user does not need to know?


Not sure. I suspect it is chosen by the user, going by what I've read.


You can be technically unable to decrypt the messages, but the metadata is still available.

Once they have your metadata matching, they can subject you to increased scrutiny.
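
A toy sketch of the point: even when the body is opaque, the envelope stays readable to the carrier (or anyone on the wire). The XOR "cipher" and phone numbers below are illustrative stand-ins, not a real scheme:

```python
import hashlib
import json
import time

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR with a SHA-256 keystream -- a toy stand-in, NOT a real cipher."""
    stream = b""
    block = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + block.to_bytes(8, "big")).digest()
        block += 1
    return bytes(p ^ s for p, s in zip(plaintext, stream))

body_plain = b"meet at the usual place"
message = {
    # Envelope: readable by the carrier no matter how the body is encrypted.
    "from": "+1-555-0100",
    "to": "+1-555-0199",
    "timestamp": int(time.time()),
    "size": len(body_plain),
    # Payload: opaque without the shared key.
    "body": toy_encrypt(b"shared key", body_plain).hex(),
}

# Who talked to whom, when, and how much: all still in the clear.
print(json.dumps({k: v for k, v in message.items() if k != "body"}))
```

Sender, recipient, timing, and message size are enough to build a social graph, which is why "we only collect metadata" is cold comfort.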


The Indie Phone might be another, currently in development: https://ind.ie/phone/ I suspect there will be some competition on privacy that Apple/Google/Samsung will be up against, at least pushing some market leaders towards privacy a bit. But in the end there is no privacy on a network given enough time, so all of this is surface PR.


What will really blow your mind is that Snowden is a PsyOp.

If Snowden is saying things governments don't want you to hear, it's highly strange that the government-controlled mainstream media keeps yapping on and on about Snowden and publishing "his" material.

Cue shadowban in 3.. 2.. 1..



