
Despite the length of this article, I see no answer to the obvious rebuttal about terrorists: if you make WhatsApp or some other app not end-to-end encrypted, the terrorists will make their own app. It's not difficult to generate your own symmetric key and use openssl to encrypt and decrypt information sent with that key. Sure, you can force Apple to reveal the contents of the hard drive, but what's the difference whether Apple is the one generating the ciphertext or the terrorists' own app is? The difference is that the consumers' protection has backdoors, but the terrorists' protection does not.
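The openssl route the comment describes really can be this simple; a shell sketch (the message, key handling, and filename are made up, and `-pbkdf2` assumes a reasonably recent OpenSSL):

```shell
# Generate a 256-bit symmetric key and share it out of band.
key=$(openssl rand -hex 32)

# Encrypt: AES-256-CBC with a key derived from the shared secret via PBKDF2.
echo 'meet at dawn' | openssl enc -aes-256-cbc -pbkdf2 -pass "pass:$key" -base64 > msg.enc

# Decrypt on the receiving end with the same shared key.
openssl enc -d -aes-256-cbc -pbkdf2 -pass "pass:$key" -base64 < msg.enc
```

Nothing here depends on any app-store vendor, which is the point: a backdoor mandate on WhatsApp doesn't touch this.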



Apple could, for example, switch out the version of an app on my phone with a backdoored version of the same app. The backdoored version could upload a copy of every message I send to law enforcement. It wouldn’t even be difficult - just a keylogger would do the trick, or the OS could periodically take screenshots of the app while it is open and send them somewhere nefarious. Those latter techniques would also work with web apps. If this sort of thing was done selectively, I doubt anyone would notice.

If the government passes laws requiring backdoors in WhatsApp, terrorists can use another app. If the government passes laws requiring Apple and Google to add backdoors to their phones, it will be almost impossible to avoid government surveillance. (And if the approach of the Australian government is anything to go by, we won't even be able to tell this has happened. Instead they'll pass a law demanding backdoors on request, then legally forbid Apple et al. from making any of the details public.)

This is one of the reasons I buy phones from Apple. Unlike Google, Samsung, Huawei, etc they have at least made their opposition to government meddling very clear. And as this article points out in detail, they have put their money where their mouth is. If some of the sales price of my iPhone pays for Apple lawyers to fight for my digital rights, that is a tax I pay willingly.


Ironically, using a corporate store for app deployment could become less safe than downloading something through your browser at that point.

I give Apple the benefit of the doubt and appreciate their stance, but with a legislative basis they too would have to comply.

I also think that professional criminals wouldn't use their phones in the first place if it is tied to their ID in any way. Sure, you might catch some of the riff-raff, but I think strong encryption is more beneficial for any form of security in the broader picture.


I don't quite buy the "switch the app" argument, as code signing easily fixes that. Intercepting at a higher level, such as with a keylogger or screenshotter, would work, but then you'd apply the same argument: the terrorists would just go one level higher. In fact they could go all the way to open source processors running their own code doing all the same things. Yes, it would delay them a few years to get there, but once there there would be little the US could do to stop it. At the network level, we simply don't sign hardware, so without some major refactoring of the core system, there's little to do there. Still, at every level of this back and forth, consumers have to give up more and more privacy so the government can step in and get the same level of information about terrorists it had before, until the terrorists eventually win (or we completely lose privacy and trillions of dollars in infrastructure). So why play the game?


> In fact they could go all the way to open source processors running their own code doing all the same things. Yes, it would delay them a few years to get there, but once there there would be little the US could do to stop it.

Well, as long as you can't easily build a private fab capable of producing such processors, good luck with that. There are not that many fabs out there. FPGAs aren't any better, since most of them can't even be made to work without large quantities of proprietary blobs.


A new virtual machine underneath can see what every app is doing.


Every cryptographic protocol has a weakness in its implementation if not in its spec (and if not in the tool itself, then in its various dependencies).


Uhh.. no. In fact it doesn't.

If your goal is to secure the entire internet via SSL, or to get the mass market to use PGP-signed emails, then the tremendous technical and cultural hurdles in place make a 'truly secure', widely accepted implementation a near impossibility.

But if your goal is to secure the communication between trained people in a 'terrorist cell' or other small group, then that is pretty easy. It's really only as hard as you want it to be.

For example, any decent programmer could write a program that uses a 'one-time pad' of random data to encrypt communication. A USB flash drive filled with randomly generated data, plus the program itself, is all that would need to be exchanged ahead of time. The biggest challenge is making absolutely sure that every section of the one-time pad is used only once and is destroyed afterwards. You don't really need to know anything about encryption, math, or protocol details to make something like that work.

And if correctly done it'll be impossible to crack.
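A minimal sketch of such a pad program (names and message invented, Python only for illustration): the same XOR operation both encrypts and decrypts, and the used section of the pad is discarded afterwards.

```python
import os

def otp_xor(data: bytes, pad: bytes) -> bytes:
    """XOR message bytes against pad bytes; the same call encrypts and decrypts."""
    assert len(pad) >= len(data), "pad exhausted -- never reuse pad material"
    return bytes(m ^ p for m, p in zip(data, pad))

# The shared pad: random bytes exchanged ahead of time, e.g. on a flash drive.
pad = os.urandom(1024)

msg = b"meet at dawn"
ciphertext = otp_xor(msg, pad)
assert otp_xor(ciphertext, pad) == msg  # decrypting with the same pad section

# Critically: discard (and securely destroy) the used section of the pad.
pad = pad[len(msg):]
```

A real tool would also have to wipe the used bytes from the physical medium, which is where most of the operational difficulty lives.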

And we are dealing with threats and adversaries that are much more sophisticated than that. Even small terrorist groups are more often than not state-funded one way or another. Foreign threats are sophisticated enough to create their own encryption. Domestic threats and cartels are sophisticated enough to hire competent programmers to do the work for them. Pedophiles are not idiots either; many of them are talented technical people who will have no problem avoiding government backdoors in commercial software and hardware products.

The threat Americans face with encryption isn't that encryption is too strong. It's that Americans don't use it enough and don't use it properly.

I find the idea that Americans are under threat due to a lack of backdoors a fallacious one. Legislating that backdoors must be put in place only increases threats.

The only thing that laws like that would accomplish is to make it illegal for Americans to be secure.

Take away the legal ability to have secure software, and the only people who will have secure software are criminals. Criminalizing good software is never going to be productive.


USB stick? The entertainment industry provides an excellent source for the distribution of one-time pads: just agree on some particular stream/CD/DVD/etc. ("number 27 on this week's top 40") and use the LSBs in some agreed order.
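A sketch of what that extraction would look like, assuming the 16-bit PCM samples have already been decoded into a list of integers (the helper and sample values are made up; the replies below address whether such bits are actually random enough):

```python
def lsb_pad(samples, nbytes):
    """Pack the least-significant bit of each PCM sample into pad bytes."""
    bits = [s & 1 for s in samples[:nbytes * 8]]
    if len(bits) < nbytes * 8:
        raise ValueError("not enough samples for the requested pad length")
    out = bytearray()
    for i in range(0, nbytes * 8, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b  # MSB-first packing
        out.append(byte)
    return bytes(out)

# Eight samples whose LSBs are 1,0,1,0,1,0,1,0 yield one pad byte 0b10101010.
print(lsb_pad([1, 2, 3, 4, 5, 6, 7, 8], 1))  # b'\xaa'
```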


A one time pad must be truly random to be secure. A music stream is far from that.


'The Nth 4096 bits from this week's mod(N,100) top video on YouTube'

Can you tell me what's wrong with that approach? In my head, it seems reasonable.


There are recurring patterns in a video or audio stream, and more often than not these coincide with recurring patterns in the message, revealing information. For the sake of an example, take some pages of a novel as the one-time pad. There will be statistical variations in the ciphertext (not all symbols occur with equal frequency), and the combination occurring most often will correspond to an "e" in the plaintext and an "e" in the one-time pad, because both occur most often in English. You can calculate the probabilities for each combination and thus deduce the plaintext, provided it is long enough. For a one-time pad to be secure it must be truly random, or at least cryptographically strong pseudo-random.
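A small Python experiment illustrates the leak, using two made-up English passages as "message" and "pad": wherever the two texts happen to contain the same character (spaces and common letters line up often in English), the ciphertext byte is 0, producing a spike that a truly random pad would not show.

```python
from collections import Counter
import os

# Two unrelated English passages: one as the "plaintext", one as a (bad) "pad".
plaintext = (b"it was the best of times it was the worst of times it was the age"
             b" of wisdom it was the age of foolishness it was the epoch of belief")
bad_pad   = (b"call me ishmael some years ago never mind how long precisely having"
             b" little or no money in my purse and nothing particular to interest me")

bad_ct  = bytes(m ^ p for m, p in zip(plaintext, bad_pad))
good_ct = bytes(m ^ p for m, p in zip(plaintext, os.urandom(len(plaintext))))

# Coinciding characters XOR to 0; count how often that happens with each pad.
print("English pad, zero bytes:", bad_ct.count(0))
print("Random pad,  zero bytes:", good_ct.count(0))
```

Run it and the English "pad" shows several zero bytes while the random pad shows essentially none; with enough ciphertext, that skew is exactly what lets an analyst peel the message apart.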

As a way out you can agree on a seed for a secure pseudo-random algorithm, but that's commonly called a password or pre-shared secret. In fact, it's more secure to just use the string 'The Nth 4096 bits from this week's mod(N,100) top video on YouTube' itself as the pre-shared secret.
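That last point can be made concrete: treat the sentence as a passphrase and derive a keystream from it. This is an illustrative hash-plus-counter construction, not a vetted stream cipher; a real system would use something like HKDF plus an AEAD.

```python
import hashlib

def keystream(secret: bytes, nbytes: int) -> bytes:
    """Derive a deterministic keystream by hashing the secret with a counter."""
    out = b""
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(secret + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:nbytes]

secret = b"The Nth 4096 bits from this week's mod(N,100) top video on YouTube"
msg = b"meet at dawn"
ct = bytes(m ^ k for m, k in zip(msg, keystream(secret, len(msg))))
pt = bytes(c ^ k for c, k in zip(ct, keystream(secret, len(msg))))
assert pt == msg
```

Both sides can regenerate the same keystream from the shared sentence, which is why the sentence itself, not the YouTube video, is doing all the cryptographic work.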


The LSBs are pretty close to this (unless it's Cage's 4'33).


Maybe, but I wouldn't want to rely on it. I don't know anything about real-world music encodings, but in the analog signal you have zero-crossings at fairly regular intervals.


USB stick... I'm sure it'll work out great. What if you have to give it up with a gun to your head?

<< Many of them are talented technical people that will have no problem avoiding government backdoors in commercial software and hardware products >>

Hand wavy as heck.

Then you meander. Not sure what you're responding to.


It's trivial to lock the USB stick in such a fashion as to be impossible to decrypt in a practical time frame.

Furthermore, it's practical to communicate in such a fashion that grabbing one party only grants you access to communication intended for that party.

If really paranoid, it might only grant you access to the communication sent between the moment of compromise and the moment his fellows realize he is burned.

Maybe nothing at all, if you can't successfully coerce anyone and all devices are locked.


> It's trivial to lock the USB stick in such a fashion as to be impossible to decrypt in a practical time frame.

That is definitely not true if your adversary has the ability to control the endpoint and might even reflash the firmware of your USB stick.

If you use OTPs in such a threat scenario, it's safest to use old-school, easy-to-burn paper OTPs with manual encoding/decoding.
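The manual scheme is the classic add-mod-26 paper cipher; Python here only illustrates the arithmetic normally done by hand with a printed pad (the pad group below is the well-known textbook example):

```python
A = ord('A')

def otp_encode(msg: str, pad: str) -> str:
    """Classic paper OTP: add pad letter to message letter mod 26."""
    return "".join(chr((ord(m) - A + ord(p) - A) % 26 + A) for m, p in zip(msg, pad))

def otp_decode(ct: str, pad: str) -> str:
    """Reverse: subtract the pad letter mod 26."""
    return "".join(chr((ord(c) - ord(p)) % 26 + A) for c, p in zip(ct, pad))

pad = "XMCKL"                      # one pad group, used once and then burned
ct = otp_encode("HELLO", pad)
print(ct)                          # EQNVZ
assert otp_decode(ct, pad) == "HELLO"
```

No device to reflash, nothing to seize but paper, and the used group goes in the fire, which is the whole appeal in that threat model.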


All public encryption algorithms have backdoors in their implementation and sometimes (as with Elliptic Curve standards from NIST adopted in the browser) in their spec.

You might get lucky if you have a cryptographer design you a custom algorithm, but that's mostly security through obscurity, and if a state actor really wanted to defeat it they might just kidnap the cryptographer at gunpoint and have them reveal how.

Cryptography as a weapon against state actors is NO LESS BRAINLESS than the right to bear arms to protect against the US gov. Just completely useless, if not brain dead.


> All public encryption algorithms have backdoors in their implementation ...

Okay, this is the first time I can recall having ever asked the following:

Source?

I mean, if you're going to make that type of absolute claim, I must ask for some references to support it.


Although you're right that nothing but OTPs gives you unconditional guarantees, and that cryptography is still a black art (with lots of unrealistic conditional proofs), good cryptography can be completely open and will be no less secure if published, so there is really no need to kidnap the cryptographer.

There is also good reason to assume that if e.g. you make your own Feistel cipher out of existing cryptographic primitives without caring too much about performance, then it will be secure enough against state actors.

Nowadays side channel attacks seem to be the rule, and there is no way to secure the endpoints without developing the whole technology in-house - which is essentially impossible even for organized crime. So the whole discussion is essentially moot, the FBI can buy 0-day exploits on the black market or develop their own like everyone else.


If this was the case, there would be no push in government sectors to diminish cryptography and provide backdoors.

Or perhaps they are double-bluffing us? ;-)


>All public encryption algorithms have backdoors in their implementation and sometimes (as with Elliptic Curve standards from NIST adopted in the browser) in their spec

Let's see your proof demonstrating this.


Encryption is not trivial to implement right, but it is also not impossible to defend against reasonable threat models. You make claims without giving any proof.


The whole point of a one time pad is that it is never used again and destroyed after use. Even the sick fucks in the CIA don't think they can get a key sequence from you with torture or threats after it has already been stomped and put in a microwave.


Any tool can be used to undo itself... if not directly then by proxy.


This is one reason that NIBOR requires that a new virtual machine can always be installed to fix issues with the previous one.



