Jan Koum is a WhatsApp co-founder. WhatsApp is notable in this debate specifically because it partnered with Moxie Marlinspike's Open Whisper Systems for end-to-end encryption.
WhatsApp is still closed-source, though, and I'm unsure about the veracity of the end-to-end encryption claim; I sure hope it's true. In the BBC's Panorama documentary about Edward Snowden, when discussing the death of Lee Rigby, a representative of Facebook says they're capable of decrypting messages (whether that applies to WhatsApp or just Facebook Messenger is not clarified).
You might be interested in this set of videos[1] of public testimony by WhatsApp's Deputy General Counsel at a hearing on cybercrime in Brazil in December 2015, where he explains in some detail the types of information they do and don't have. He explicitly notes that they are incapable of accessing the content of end-to-end encrypted messages, and also explains that they don't retain copies of any communications once they've been delivered to recipients.
I think that only messages sent between two Android devices are encrypted using E2E. If you send a message to your mom (who uses a BlackBerry), the message won't be encrypted with that protocol.
What about iOS? And how does your device know whether to encrypt or not? It's useless to talk about encryption if the code is hidden. What if tomorrow they silently turn off the encryption?
Sorry, I meant for the pre-compiled versions on Google Play and the App Store. The vast, vast majority of WhatsApp users aren't going to want to, or don't know how to, compile the app for their phone.
They have to. A phone number is not a key, so there's no way to avoid that.
However, I've seen leaked screenshots suggesting that at least the Android client is getting the ability to see what the keys are and pin them. Hence, trust-on-first-use (TOFU) encryption. It's about the best you can do in a consumer product.
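Roughly, the pinning model works like this (a minimal sketch of TOFU, not WhatsApp's actual code; all names here are illustrative):

```python
# Trust-on-first-use (TOFU) key pinning, sketched. The client remembers
# the first key it sees for each contact and flags any later change,
# which could mean a MITM attempt -- or just a reinstalled app.

pinned_keys = {}  # phone number -> first key fingerprint seen

def check_key(number, fingerprint):
    """Pin a contact's key on first sight; flag any later change."""
    if number not in pinned_keys:
        pinned_keys[number] = fingerprint   # trust on first use
        return "pinned"
    if pinned_keys[number] == fingerprint:
        return "ok"
    return "KEY CHANGED"

check_key("+15551234", "ab:cd:ef")  # first contact: key gets pinned
check_key("+15551234", "ab:cd:ef")  # same key again: fine
check_key("+15551234", "12:34:56")  # different key: raise a warning
```

The weakness, of course, is the very first contact: TOFU can't tell a legitimate first key from an attacker's first key, which is why it's "about the best you can do" rather than a full solution.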
Last time I checked, WhatsApp didn't require any kind of OOB key exchange, or the ability to set your own password. So any claims about having end-to-end encryption are misleading.
> Last time I checked, WhatsApp didn't require any kind of OOB key exchange, or the ability to set your own password.
Neither does Signal/TextSecure. Signal most definitely has E2E encryption.
However, when your primary device is wiped (or changed)[0] your Signal crypto keys are changed and your conversation partners are alerted to this fact.
Do you know what happens when the same thing happens to a WhatsApp user? (I don't. :( )
[0] Or, y'know, one just clears all Signal app data
If someone generates the keys, that same someone can decrypt. You cannot establish a secure connection without someone's trusted server. By that definition, the same server could MITM you.
> If someone generates the keys, that same someone can decrypt. You cannot establish a secure connection without someone's trusted server. By that definition, the same server could MITM you.
Every word in your first and third sentences is true. As written, your second sentence is iffy at best. [0][1]
However, the situation that those sentences describe does not apply to end-to-end encryption in Signal. Signal's crypto keys are generated on the end-user's device and never leave that device.
Signal neither requires OOB key exchange, nor does it require that you set a password to encrypt Signal data while on disk.
Because Signal has the same properties that you claim are indicators of a lack of E2E encryption, but actually is E2E encrypted, it's clear that these properties are not proof positive of a lack of E2E encryption.
So, I'll reword my previous question, which you failed to address:
What happens when a WhatsApp user gets a new phone, wipes his existing phone, or simply clears all locally stored WhatsApp app data? Does he get a new set of crypto keys, and are his conversation partners alerted of this fact as happens in Signal?
I ask because I don't use WhatsApp and don't know the answer to the question.
[0] Consider Diffie-Hellman key exchange. This is a method whereby two parties who wish to communicate securely over a hostile channel can do so without divulging any key material to eavesdroppers listening in on the exchange.
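A toy sketch of the idea (the numbers here are far too small for real use; deployed systems use vetted groups or elliptic curves such as Curve25519, which Signal uses):

```python
import secrets

# Toy Diffie-Hellman over a small prime. Illustrative only: this
# demonstrates the math, not a secure parameter choice.
p = 2**127 - 1   # a Mersenne prime, tiny by modern crypto standards
g = 3

# Each party picks a private exponent and publishes only g^x mod p.
alice_priv = secrets.randbelow(p - 2) + 1
bob_priv = secrets.randbelow(p - 2) + 1

alice_pub = pow(g, alice_priv, p)   # sent over the hostile channel
bob_pub = pow(g, bob_priv, p)       # sent over the hostile channel

# Each side combines its own private key with the other's public value.
alice_shared = pow(bob_pub, alice_priv, p)
bob_shared = pow(alice_pub, bob_priv, p)

assert alice_shared == bob_shared   # same secret, never transmitted
```

An eavesdropper sees only `g`, `p`, and the two public values; recovering the shared secret from those is the discrete logarithm problem. (Note this alone doesn't stop an *active* attacker who substitutes their own public values, which is why key verification still matters.)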
[1] To address the possibility that you're talking about key verification, rather than secure peer-to-peer session creation: if you like, Signal will show you the fingerprint of your encryption key. You can use that fingerprint to perform any OOB key verification that you want.
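For illustration, a fingerprint is just a short, human-comparable digest of the public key; both parties compute it locally and compare it over some out-of-band channel (a phone call, in person). The format below is made up for readability; Signal's actual fingerprint format differs.

```python
import hashlib

def fingerprint(pubkey_bytes):
    """Hash the public key and format the digest in readable groups."""
    digest = hashlib.sha256(pubkey_bytes).hexdigest()
    # Keep the first 40 hex chars (160 bits), grouped for easy reading.
    return " ".join(digest[i:i + 4] for i in range(0, 40, 4))

print(fingerprint(b"example public key bytes"))
```

Because the digest is deterministic, both sides get the same string if and only if they hold the same key; a MITM who substituted keys would produce mismatched fingerprints.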
You contradict your previous statement that Signal does not provide OOB key verification when they show you the fingerprints, because that is essential to avoid being MITMed.
> You contradict your previous statement that Signal does not provide OOB key verification...
I do no such thing because I never made such a statement. I addressed your claim about mandatory OOB key exchange, not optional verification. To jog your memory, the two times I brought up OOB exchange were as follows:
>> Last time I checked, WhatsApp didn't require any kind of OOB key exchange, or the ability to set your own password.
> Neither does Signal/TextSecure. Signal most definitely has E2E encryption. [0]
and
> Signal neither requires OOB key exchange, nor does it require that you set a password to encrypt Signal data while on disk. [1]
Notice that I talk about how Signal doesn't require OOB key exchange, not that Signal doesn't provide OOB key verification.
It's gotta be so frustrating. Here's this little thing that could answer all your questions, maybe. If you could just open it, you could easily solve many mysteries. And the only people who can help won't.
I don't envy the job of law enforcement. It must feel like at times that everyone is standing in your way. But where does it stop?
If only we could track everyone all the time...
If only we could watch everyone in their homes all the time...
If only we could open everyone's safes whenever we needed to...
Yes, you could solve many mysteries with all of the keys. But it's not your information. You're not owed it. No one is owed the answer to any question.
I hope law enforcement understands someday what a destructive request they've made, but I'm guessing like anything else addictive, that one taste will just lead to more.
Many of us often complain that our jobs would be much easier if only. This is a reminder to step back and realize that, sometimes, your job isn't supposed to be easy. Your goals are not necessarily your entire organization's goals, nor the entire nation or world's goals.
A programmer complaining about bureaucracy usually does not have the same moral high ground. That's probably the difference. Nobody in society complains when the newest feature of your web app is delayed, but when it's about crime and terror, it's different. That's where they are coming from.
I totally disagree, but that's how many people see it (I would guess).
I really respect Jan. I saw him talk at Startup School a few years ago and what really impressed me was his non-traditional path to becoming a tech billionaire. He had a long career at Yahoo! before founding WhatsApp. He's a role model we should be looking at. Not young college kids who can dupe old investors into giving them millions for vaporware.
It's also important that he's speaking out in opposition to these government tactics. Hopefully Zuckerberg will follow suit but if history tells us anything it's that Facebook is rather compliant and doesn't take security seriously.
I don't think he should be considered a role model. Almost no one will be able to follow his route and become a billionaire with a relatively simple app. I also don't think that him becoming insanely rich automatically means he has to be great / a role model.
As opposed to older people who can dupe young investors into giving them billions for vaporware?
OK, that's mean. WhatsApp and its success is not vaporware. And they DID charge money at the time they were sold to FB, albeit pitiful amounts from virtually nobody. But the asking price was so high and so out of whack with the financial potential of the app, that I hesitate to describe Koum as a traditional businessman. "Sell my app to a company that's terrified of anything that smells social being successful without them" is still not a business model, not even now.
I agree that a good portion of the valuation must have been largely driven by fear, and by their potential to destroy that much value rather than create it. The factor that both challenges and supports your argument is how small their team was at acquisition (reportedly 32 engineers, 55 total). Charging $1 a year per user after the first year was bringing in revenue, and it wouldn't take much to be a "traditional business", i.e. profitable. They also had zero-rated revenue-sharing deals with Reliance Telecom in India years ago as another potential source of revenue – I suspect relationships like this played a larger role in the acquisition than is acknowledged, as it was thought they would lay the groundwork for internet.org.
Your comment made me curious about what their actual revenues and costs were leading up to the sale. Putting (massive) stock-based compensation expenses aside, it looks like their revenues and operating costs were actually pretty close[1]. That's not delving into the potential revenue growth connected to payments and B2C interaction, following the path already being laid out by WeChat[2]; Facebook has decided to attempt this with Messenger instead of WhatsApp. I'm not saying that they would have quickly found enough revenue to justify a $20B valuation, but that it resembled a traditional business a lot more than many other social startups did, and they were kept lean and probably better positioned to capitalize on growth than others.
[1] http://techcrunch.com/2014/10/28/whatsapp-revenue/ "For the year ending December 31, 2013, WhatsApp had $10.2 million in revenue … net cash used in operating during this period was only $9.9 million"
In all seriousness though, who is making a role model of young college kids duping old investors into giving them millions?
I'm not aware of that actually existing outside of an extraordinarily tiny enclave. The rest of America does not make such vaporware failures into any kind of role models.
Quite the opposite probably, I see far more mockery than anything. Such failed start-ups are constantly made fun of in the business press. For years now, essentially every major business publication has been waiting with bated breath for the latest bubble crash so they could mock what ultimately became the unicorn bubble.
Zenefits for example has been getting lambasted, and they're not even vaporware, they actually have a real business of some caliber in a serious business category. Color was vaporware, and it was endlessly, and universally mocked. I can't think of good examples of college kids becoming role models for pushing vaporware.
As of December 1, 2015, WhatsApp has a score of 2 out of 7 points on the Electronic Frontier Foundation's secure messaging scorecard. It has received points for having communications encrypted in transit and having completed an independent security audit. It is missing points because communications are not encrypted with a key the provider doesn't have access to, users can't verify contacts' identities, past messages are not secure if the encryption keys are stolen, the code is not open to independent review, and the security design is not properly documented.
The EFF scorecard is an embarrassment which is essentially equivalent to one of those "comparison tables of our competitors" on a SaaS website. That's a good analogy for it, because it uses the same questionable metrics and even more questionable ranking system that one of those tables would use. The scorecard gives Signal the same ranking as Cryptocat - that's an instant negative result for its usefulness.
The EFF scorecard isn't meant to measure how "secure" an app is; it's supposed to measure how secure and free the application could potentially be.
Having keys held by the creator is a sure way to undermine any security model; having externally auditable code is a measure to ensure people can independently verify whether the code is secure or not (or whether it's sending a carbon copy to its creators), etc.
Why should I trust a company that works with the NSA and makes money on information about me to protect my privacy and "fight" the government? I see his post as nothing more than a PR move - "we're the good guys" - before we find out that they cooperated with the agencies. Am I alone in this opinion?
It's a corporation whose whole point of existence is making a profit and selling overpriced stuff. It's the government's job to take care of people, give them safety and privacy, and protect them from the illegal and harmful practices of corporations.
Yeah, remember when users stopped using Facebook, Google et al in droves after the whole PRISM debacle, they still haven't financially recovered from that one, have they?
I doubt it. Google has shown little spine so far in standing up to the intelligence community or in securing user data. They spent some legal money to defend Appelbaum and encrypted GMail data in transit between data centers, but that's all that comes to mind.
Standing with Cook would be a good step for them but I'm not optimistic.
Google 'appeared' to have a negative reaction to the NSA jacking their inter-DC lines, which at the time were not encrypted.
Google will not make public claims about their infra regardless.
Google may stand up to the .gov but it may choose to be more quiet.
Anyone that stands up to the police-state-mentality is honorable. The situation we have now is fucked.
Its not "Armageddon" or anything - but the situation is fucked.
Do you have any idea how many people at places like Goog and FB and Appl are very progressive - don't give a shit about pot, molly, LSD, etc., as they have all been partying and have had growth experiences because of it? A fucking ton.
It is disingenuous for these companies and their employees not to have the balls to stand up to the government's overreaching BS. EXCEPT that they all share a tenuous thread of "employment status" with a corporation... thus they don't speak up.
Hey Obama, you smoked pot (that was the point), so release anyone who has ever gone to jail for pot -- or go to jail yourself.
This situation should have ended 20 years ago with the first round of the crypto wars. /sigh/
> Anyone that stands up to the police-state-mentality is honorable.
Exactly. They aren't alone, either - with Apple taking first-mover risk, it should be easier for others to stand up together.
It's important to realize that there isn't much neutral ground here. Now that the line has been drawn, you're either helping implement backdoors or you are making a stand, potentially at personal risk. Collaborators will not find much cover if they attempt to hide behind a Nuremberg Defense.
I strongly recommend that everybody watch Quinn Norton and Eleanor Saitta's amazing talk from 30c3, "No Neutral Ground in a Burning World"[1]. Most people didn't ask for the responsibility, but it's irresponsible to avoid it by pretending that technology is somehow external to politics. Everything we implement has political consequences, and we (the people implementing all of modern technology) need to start thinking about those consequences.
The underlying debate isn't technological, it's social, and it requires a more abstract paradigm of thinking, beyond a "with us or against us" mentality. Everyone involved has been and will be negotiating boundaries that fit their own moral focus.
> Anyone that stands up to the police-state-mentality is honorable.
There's a difference between a police state and a police-state mentality. The latter is a subjective and abstract point of view, not so easy to stand up against.
I'm not sure if we're just seeing different headlines or what, but I've usually seen Google at the frontlines whenever the government oversteps its bounds in data-vacuuming.
Jan Koum is WhatsApp's co-founder and current CEO, and a Facebook board member. I had this in the original title but it seems to have gotten removed—sorry!
WhatsApp and its founders are famous for avoiding drawing any attention to themselves and their company. Their offices in Mountain View didn't even have any signage outside (even to identify them as one of the tenants in the building) right up to the time they were acquired by Facebook. Looks like things haven't changed much since the acquisition.
What is suspicious about Cook's statement is that he was able to publish it. I believe this kind of government request is usually made with a strong non-disclosure agreement, with severe consequences if disclosed. So either the consequences were not that bad, or Apple has chosen to ignore them for the good cause.
Or ... something "fishy" is going on and this article is just a bait by Apple to let its customers believe they do care about privacy ... while in reality the situation might be entirely different. And the "best" thing is that ordinary people will never know for sure, because with Apple's proprietary software philosophy, there is no way to tell.
Sorry, this isn't "suspicious": there is a publicly available court order[1] issued by the US District Court for the Central District of California for Apple to comply with, which you could have found in 15 seconds of Googling. This was not a FISA warrant or National Security Letter.
Just playing devil's advocate here (and I'm in no way full bottle on U.S. law) but from other posts here, one of the most contentious points about national security letters is the very strict gag order that is usually attached to them.
Yes, there's a publicly available court order. How would you know there isn't also an NSL?
> What is suspicious about Cook's statement is that he was able to publish it.
No, it's not.
> I believe this kind of government requests
It's not a request.
> is usually made with a strong non-disclosure agreement
An agreement requires someone to agree. You probably mean a gag order, not a non-disclosure agreement. In any case, whatever you believe is normal is irrelevant: this is a public order, which was widely reported on before Cook's statement.
> So, either the consequences were not that bad or Apple has chosen to ignore them for the good cause.
It's clearly the first: there were no consequences for disclosing the order, since there was no secrecy provision attached to it and, in fact, it had been published and widely reported prior to Cook's response.
This wasn't a national security letter; the request comes from a judge on a district court, as part of a regular pre-trial ruling in a 'normal' criminal trial. Nothing about this is unusual.
This specific encryption case is THE TURNING POINT in this debate.
The NSA has [most likely] found a way to penetrate Apple/Google/MS/FB for specific targets, i.e. they can get the info on any specific person/group covertly. The attack surface is just too large: TAO, zero-days, insider threats, financial threats, etc. The problem with that is needing things like "parallel construction"[1] to legally prosecute.
What the FBI is now doing is using a recent horrible tragedy to force SV companies to establish a precedent. Make no mistake, this has been a long time coming. The Feds want to set a precedent, both legal and 'cultural'.
Ultimately Apple might cave, but the fact that they are raising a stink is very good news. Time for other heavyweights to join the chorus.
https://news.ycombinator.com/item?id=8624212