Piercing Through WhatsApp’s Encryption (thijsalkema.de)
230 points by xnyhps on Oct 8, 2013 | 81 comments



A lot of cryptographic mistakes people make, you can blame on the 1990s. For instance, the ubiquitous CBC padding oracle (most recently of TLS "Lucky 13" fame) is the product of MAC-then-encrypt constructions, where attackers are given the privilege of manipulating ciphertext without having it checked by a MAC. We didn't have a mathematical proof to tell us not to do MAC-then-encrypt until after the 1990s. So if you have that bug, you might consider blaming the 1990s.

But using the same RC4 key in both directions of an encrypted transport isn't just a bug known in the 1990s; it is the emblematic cryptographic attack of the 1990s, the one crypto flaw that even non-crypto pentesters could reliably deploy. For instance, bidirectionally shared RC4 keys broke the Microsoft VPN scheme, a bug discovered by Peter "Mudge" Zatko when there was still a L0pht Heavy Industries.

So my point is, this is a bit sad.

I should add, recycling the keystream of a stream cipher is worse than he makes it sound. The attack he's describing is called "crib dragging" and implies that an attacker has access to plaintext. But attackers don't need access to plaintext to attack repeated-key XOR, which is what a set of ciphertexts encrypted under the same stream cipher keystream works out to be.
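
To make that concrete, here is a minimal sketch in Python (my own illustration, with made-up plaintexts): XORing two ciphertexts produced under the same keystream cancels the keystream entirely, leaving the attacker with the XOR of two plaintexts, which is exactly the repeated-key XOR problem.

  import os

  def xor(a, b):
      return bytes(x ^ y for x, y in zip(a, b))

  keystream = os.urandom(64)   # stands in for the RC4 output under one shared key
  c1 = xor(b"transfer $100 to alice", keystream)
  c2 = xor(b"transfer $999 to mallory", keystream)

  # The attacker never sees the keystream or either plaintext, and yet:
  assert xor(c1, c2) == xor(b"transfer $100 to alice", b"transfer $999 to mallory")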


Dan Boneh's Cryptography course on Coursera covers attacks on key reuse in the first week. Actually, the very first programming assignment gives a few messages encrypted using the same “one-time” pad and asks you to decipher one of them (indeed, with no access to plaintext). It's cool to see a real-world example of such an attack.



A good question is how the engineer implemented a half-baked crypto system with 1990s-style flaws in the first place, considering the startup is relatively well funded.

Ignorance? Lack of research? Lack of industry best practices?

This seems to keep happening all the time.

Although most programming in general is full of hackery and rookie code used in production, so that by itself isn't alarming. I'm just curious whether the security industry in particular is in need of better communication of best practices and things to avoid.

Maybe more work with designers/writers to create online guides is needed.


In my experience, it's because security is usually a half-baked priority #50 at startups. A sum total of a week was probably spent by some back-end engineer two years ago on message security, since it's not a selling point.

They probably record every single message too, since storage without transmission is cheap and avoids a class of bugs with deletion. The stored histories could become a business advantage later: mining chat histories for marketing and advertising data like Facebook, Twitter, Google and everyone else. They say they charge now to avoid a marketing-driven approach, but cable TV made similar promises in its early days too.

Also, they run on such platforms as BlackBerry, Nokia S40 devices, Windows Phone and Symbian, where plopping in some C crypto library probably wasn't practical. From what I know, most crypto libraries are written in some C-family language, or you have Bouncy Castle for Java. Everything else is relatively obscure or broken. So something they could roll by themselves might have been all they had to choose from.

For many companies, security is something you go through the motions for, like you would with government paperwork and compliance, not necessarily because it's important to you. Unfortunately the average person only cares about door-lock security, not real crypto-secured products.


WhatsApp is over 4 years old with over 300 million (claimed) active users. I would expect them to have addressed security by this point. Especially after their previous blunders, I'd imagine hiring a security expert or having a third party audit their code would be at the top of their list. Apparently not.

I think news and concerns like these often don't make their way to the bulk of their users, who probably aren't very tech savvy. If they don't see any user defection as a result of these issues being uncovered, then I'm not surprised by their lax stance on security.


I think companies start addressing things like security more seriously when they start becoming 'comfortable' companies, which means securely profitable. Security is higher on the Maslow pyramid of software company needs than 'is a viable business'. I doubt WhatsApp is securely profitable, or flip-a-switch profitable like amazon.com.

When you are not viable as a business, what you focus on is new features, or begrudgingly addressing the huge amount of technical debt you generated in your early days so you can deliver new features faster. WhatsApp will only address security when it threatens the viability of their business, by which time it will probably be too late.


LOL. You would cringe at the amount of duct-taped infrastructure that plagues most large enterprises. Their viewpoint often is, "We have a firewall and antivirus, so we should be good, right?"


> I'm just curious whether the security industry in particular is in need of better communication of best practices and things to avoid.

The thing is, there is no such thing as a security industry that would be interested in pushing such best practices. It's a little bit like occupational safety. There are companies that sell hearing protectors, i.e. solutions to some specific aspects of occupational safety, just like there are companies selling firewalls to protect against some specific aspects of IT security. But you cannot sell one product to protect against all IT security threats that emerge in a software company, as there are security concerns in every aspect of an IT product. On the server side it is about protecting the user content against outside threats (firewalls, SQLi, password hashing, log protection and redaction, strong SSH logins, DoS, etc.); on the protocol side it is about authentication and encryption (MITM attacks, content separation (think of the BEAST attack), strong crypto, using the right crypto for the job, DoS); on the client side it is about folk security models and social engineering (strong passwords, password resets, two-factor authentication, social engineering, UI design).

Occupational safety on a construction site has to be considered by every worker and in every part of the construction. IT security is just as much pervasive to software engineering. You think implementing a search feature for your website is not security sensitive? You might just have compromised your user login because of cross site scripting. Logging debug information is harmless? Not so much if your database password is output to the user in case of a poorly written exception filter. Heck, even simply storing some user data in a dictionary can be the cause of a DoS [1].
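
As a rough sketch of that last point (my own toy example, not the construction from the linked talk): when every key hashes to the same bucket, each insert has to compare against everything already stored, so filling the dictionary goes quadratic.

  import time

  class Colliding:
      """Pathological key type: every instance lands in the same hash bucket."""
      def __init__(self, value):
          self.value = value
      def __hash__(self):
          return 0                      # all keys collide
      def __eq__(self, other):
          return self.value == other.value

  def fill(n, make_key):
      start = time.perf_counter()
      d = {}
      for i in range(n):
          d[make_key(i)] = i
      return time.perf_counter() - start

  print("colliding keys:", fill(5000, Colliding))   # grows roughly quadratically
  print("honest keys:   ", fill(5000, int))         # grows roughly linearly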

Basically, what I'm saying is that IT security is not the responsibility of some specialised security industry, but of the whole IT industry.

[1] http://events.ccc.de/congress/2011/Fahrplan/attachments/2007...


I only took two crypto courses during my degree and these attacks were demonstrated as examples of simple things no one should get wrong. It's just plain embarrassing for these devs to screw this up.


Yeah, a lot of MS's protocols and file formats from the 1996/1997 era have this flaw, for example.


Here's what I'm sad about. Does every single web and mobile app that gets made by anyone these days now require an extensive knowledge of how to do security right? If so, that sucks, given how big the field is. Or do we all need to go and hire tptacek for a quarterly security audit? I imagine that can get quite expensive. It really gets in the way of just making things and putting them up; I think it kind of kills the spirit of creation and entrepreneurship. :( I mean, it's great for people who are truly interested in security, but what if you're not? Are you doomed to fail at the startup game if you don't know security well?


I've been told to just use NaCl (http://nacl.cr.yp.to/) and its related libraries in other languages. Apparently it "just works" for most basic crypto.

What's harder is systems security, which you can't just abstract away into a library.


Does NaCl automagically solve the problem of streaming encrypted data? I don't think so. You still have to know quite a lot to not mess up.


Yes, NaCl automagically solves this problem. No, you don't have to know anything not to mess NaCl up. That is the premise of the library.
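
For what it's worth, a minimal sketch of what that looks like with PyNaCl, the Python binding (the message is obviously made up):

  import nacl.secret
  import nacl.utils

  key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)   # 32 random bytes
  box = nacl.secret.SecretBox(key)

  # encrypt() picks a fresh random nonce and authenticates the ciphertext,
  # so keystream reuse and missing-MAC mistakes are taken off the table.
  sealed = box.encrypt(b"attack at dawn")
  assert box.decrypt(sealed) == b"attack at dawn"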


When in doubt, don't reinvent the wheel. If TLS covers your use case, use TLS.

Unfortunately, it appears that you are not at all doomed if you don't know security well. I say "unfortunately" because I'd rather have fewer secure products than the massive array of insecure products we enjoy today.


And I think a large part of that is because of the unbelievably irresponsible legislation on the matter - where the punishment for "hacking" is often punitive, but the punishment for negligence is often non-existent.

I think it's kind of like cars and the corresponding (lack of) traffic & vehicle safety rules a century ago: right now it's everybody for themselves, but what you really need are rules that encourage those in a position to cause improvement to actually improve.

Developers like us and firms that employ them need to be liable for security exploits - preferably with good enforcement with many small fines rather than a few "examples" that get the book thrown at them. And hackers, even if dubiously motivated, shouldn't be driven underground by punitive measures because that causes a really unhealthy head-in-the-sand dynamic.


According to wikipedia this company has 300 million users each paying some form of currency every month.

Is it that much to ask that a company that large performs a security audit quarterly?


Fair enough, but I'm asking about side projects.


> Are you doomed to fail at the startup game if you don't know security well?

Absolutely not. WhatsApp has a history of poor security yet it is a wildly successful startup.


On the other hand, it depends on how you "market" your product. Interestingly, WhatsApp doesn't seem to claim that their software is secure on their web page. Yes, it says "Simple. Personal. Real time messaging." People can argue that "personal" implies a secure implementation, but there is no claim of extensive security audits or super ciphers.

Maybe it is people's false expectation that whatever they send with their smartphone is secure.


Until mid 2012, WhatsApp didn't even have any encryption: https://www.demworks.org/blog/2013/05/instant-messaging-smar...

So yes, clearly security is just not a priority for that product.


Actually, doing good enough security* is fairly easy even if you don't know how crypto works. You just have to remember a few important axioms:

1. You are not smart enough, so don't try to be clever.
2. Use off-the-shelf, audited tools, implementations and protocols.

And that is it... While I admit that there is a chance for "any crypto product/expert to be compromised", I distrust my own brain to get everything right way more than that.

*Disclaimer - good enough for ensuring sexted stuff remains private, not good enough for nuclear launch codes.


Every single business doing really well needs to pay close attention to security (as well as to other areas).

But if you are a startup, just starting I mean, I don't think you need to be a security guru at all. WhatsApp started out sending messages in plain text and look how many users they got. That's because users didn't care about security until the company was so big that security became a real issue.


I sure hope so. Whoever does security right has a competitive advantage in my book.


I guess for most users good UX or a service they want to use trumps security most of the time.


> Does every single web and mobile app that gets made by anyone these days now require an extensive knowledge of how to do security right?

Everyone should know the basics, and should know enough to know when to consult someone else or learn the not-so-basics. They should also know the key rules:

* Security should never be an afterthought unless you are doing proof-of-concept development, and even then, as soon as you move from PoC to prototype/alpha, relevant security should be baked into the design

* Relevant is the key word above. Everyone should know when security is needed/desirable. Not being aware that bad security is a problem is not acceptable these days, if it ever was

* Don't try to be clever and roll your own

* Don't be lazy or NIHy and roll your own because you have trouble integrating with something else

* Get someone else to review the design and code occasionally, however informally - mistakes will be made, particularly in security, and finding your own is sometimes more difficult than finding someone else's. Also, your users won't find security issues like they will easily spot functionality ones.

* Don't make any security claim that you can't definitively back up. You make the rest of us developers look bad when you get called out on the matter.

* Better still, make sure that your users know where your security is intentionally lacking ("this is not designed as an NSA secure service, we do not encrypt data server-side, if you are storing a secure document here encrypt before sending" or such).

* Absolutely everyone who implements something that needs to authenticate a user needs to know what good password handling is; even if you use third-party services for authentication, you need to understand what is going on in order to judge whether a given third party is good enough at what they do.
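
To sketch that last bullet (standard-library Python only; the iteration count is a placeholder you would tune for your hardware):

  import hashlib, hmac, os

  def hash_password(password, salt=None):
      # Slow, salted key derivation; store the salt and digest, never the password.
      salt = salt or os.urandom(16)
      digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
      return salt, digest

  def verify_password(password, salt, digest):
      candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
      return hmac.compare_digest(candidate, digest)   # constant-time comparison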

> It really gets in the way of just making things and putting them up; I think kind of kills the spirit of creation and entrepreneurship.

If what someone is "just putting up" is something that holds personal or otherwise sensitive data, then I don't think this is a bad thing. If they are collecting that sort of thing (actively or passively) then they should be careful about it or should reconsider doing it at all (or should warn their users that what they are "just putting out there" is not production ready yet so may not be secure).

Of course, if you are not doing anything with sensitive data these requirements reduce greatly. If you are doing anything that requires authentication then you still need proper auth handling code no matter what, though.

> Are you doomed to fail at the startup game if you don't know security well?

If your startup handles personal or otherwise sensitive data you should be, if by "startup" you mean that you intend to make a living (or just make a name) with what you are doing. This sounds harsh, but that is just the way it is. The world is harsh.


In other news, WhatsApp's website got hacked, well, defaced this morning:

Screenshot: http://i.imgur.com/wY2zDl7.jpg

Source (German): http://stadt-bremerhaven.de/server-von-whatsapp-gehackt/


Well, the update says it was the DNS servers that got "hacked", not the WhatsApp servers. Not that this makes me like this thing any better, but I thought the correct info should be put into the discussion. ;-)


I wasn't aware of this important difference, thank you!


I'm surprised they'd make such a rookie mistake when there are hundreds of good encryption methods online to crib from, just a Google search away.


Here's a fun thought experiment: find the first "good" encryption method you could crib from off of a simple Google search, and provide the Google search that found it.


"Encrypting data: Use AES in CTR (Counter) mode, and append an HMAC."

Google search: "cryptography answers" :-)
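
Taken literally, that advice works out to something like the sketch below, using the Python `cryptography` package; the key handling and message framing here are simplifying assumptions, not a vetted design.

  import hashlib, hmac, os
  from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

  def seal(enc_key, mac_key, plaintext):
      nonce = os.urandom(16)                          # fresh counter block per message
      encryptor = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).encryptor()
      ciphertext = encryptor.update(plaintext) + encryptor.finalize()
      tag = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
      return nonce + ciphertext + tag                 # encrypt-then-MAC

  def unseal(enc_key, mac_key, sealed):
      nonce, ciphertext, tag = sealed[:16], sealed[16:-32], sealed[-32:]
      expected = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
      if not hmac.compare_digest(expected, tag):      # verify before decrypting
          raise ValueError("bad MAC")
      decryptor = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).decryptor()
      return decryptor.update(ciphertext) + decryptor.finalize()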


[cryptography answers] gives me:

* Hottest 'cryptography' answers [stackoverflow.com]

* Hottest 'cryptography' answers [bitcoin.stackexchange.com]

* 'cryptography' Answers By New Users [bitcoin.stackexchange.com]

* CISSP Exam Cram, Second Edition [safaribooksonline.com]

* CEH® Certified Ethical Hacker Study Guide [safaribooksonline.com]

* CISSP Rapid Review [safaribooksonline.com]

I think you wanted to suggest searching for [cryptographic right answers]. But of course, nobody will search for that.


Looks like Google is reordering results for us. When I'm logged in, daemonology.net is the first result I get for [cryptography answers]; when I'm incognito, it's 3rd, after two crypto.stackexchange.com pages.


Sadly, I'm not sure that would have helped. AES-CTR is vulnerable to the exact same reused-key vuln. Picking the wrong stream cipher isn't the problem.


Picking the wrong stream cipher isn't the problem, but it is a problem. RC4: Just Say No.


Does it have advice for selecting an IV?


CTR mode doesn't have IVs. It has a counter: Start at zero and count upwards.


Depending on whether the key may be reused, having a non-zero nonce may be a good idea too...
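
One common layout, as a sketch (an assumption, not the only convention): a 16-byte initial counter block made of a per-message nonce followed by a block counter that starts at zero.

  import os, struct

  def initial_counter_block(nonce=None):
      nonce = nonce or os.urandom(8)              # must never repeat under one key
      counter = 0                                 # then count upwards, one per block
      return nonce + struct.pack(">Q", counter)   # 2^64 blocks before it wraps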

---

And I just realised who I am replying to, of course you know this.


I was being a bit facetious in discriminating between "IV" and "nonce"... they're almost two sides of the same coin.


And don't overflow.


If you need to send more than 2^68 bytes of data, you've got bigger problems than your crypto breaking.


You're assuming a correct implementation with a 64-bit counter, though.


Yes. I'm also assuming people have correct implementations of AES.


I have never exploited an incorrect implementation of an AES core in a real application, but have exploited "broken" counters.


Or spend the money to hire somebody who knows what they are doing. WhatsApp has 300 million monthly users and charges $0.99 per year; I'm sure they can afford somebody.


We are talking about a company that makes a crappy product with obvious and blatant limitations (such as not being able to use the same account on different devices, or with different SIMs, or on an iPad, let alone on a PC... and if you have a bad connection messages can arrive out of order - is it so hard to use timestamps? Why are messaging apps getting wrong the things that IRC was doing fine back in the '80s?).

To be honest, I'm not surprised.


Yes, everything they do feels rather sloppy - of course that doesn't stop any of my "normal" friends from almost requiring it for any kind of messaging. A necessary evil in my opinion.


They don't have a subscription model on the iPhone and the first year is free on the other devices. Then you subtract the 30% the App Store gets and you're left with mid-to-low 8-digit revenue.


They got $8 million in funding a couple of years back. One would hope that would be sufficient.


Surely the proper answer is: use TLS!

Then they just have to google to get the latest TLS best practice cipher suite settings etc.

People shouldn't be using crypto libraries directly; they should just be using TLS to talk client-server, and then it's just a configuration problem, not a code problem.

And think how much time they'd have saved too :)
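
For a rough sense of what "just a configuration problem" means on the client side, assuming Python's standard library and a hypothetical hostname:

  import socket, ssl

  context = ssl.create_default_context()   # sane defaults: certificate checks, modern ciphers
  with socket.create_connection(("chat.example.com", 443)) as sock:
      with context.wrap_socket(sock, server_hostname="chat.example.com") as tls:
          tls.sendall(b"hello over an authenticated, encrypted channel")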


I've heard that openssl enc -none is blazingly fast!


Agreed. It can be fun and/or sometimes necessary to roll your own, but, at least get it reviewed by someone who knows what they're doing - anyone on the IETF Kitten working group, for example (for a SASL mechanism).


Or, use RFC 4121. It's not just for Kerberos: GSS EAP (Moonshot), BrowserID (Persona), and SAML EC all use it for message integrity and privacy. And those specifications were all written by smart people (save for myself, at least) who could have rolled their own if they wanted to.


For the love of almighty and merciful God please do not use GSSAPI. MAC-then-encrypt? CRC checksums? All-zeroes IV? CBCMAC? This stuff died in the '90s for a reason.


GSS-API says nothing about any of the things you mention, it's an abstract standard (like SASL) that can be profiled for many different authentication and message integrity/confidentiality systems. The DES-based Kerberos GSS-API mechanism you're thinking of should be dying out by now.

Granted, RFC 3962 does MAC-then-encrypt. This is being addressed in draft-ietf-kitten-aes-cbc-hmac-sha2.


Those are all things I took out of 3962. But furthermore, I'm now even more confused by your recommendation. If it's a metastandard (which has always been the rap on stuff like SASL), how does it help them design good crypto?

I can't see any reason in the world to adopt GSSAPI. I strongly recommend that nobody else do so.


I wasn't recommending GSS-API per se, just the message protection services from 3961/4121.

It was just a suggestion. Feel free to roll your own!


I love how exactly this mistake is covered in detail in the first week of Dan Boneh's crypto course:

  https://class.coursera.org/crypto-008/class
The Russians made the same mistake in WWII, but WhatsApp shows the relevance today.


https://heml.is/ currently looks like the best concept of a solution to the problem - if they keep their promise:

> Will it be Open Source?

> We have all intentions of opening up the source as much as possible for scrutiny and help!

But it's not done yet.


And it only targets iOS and Android...


Which are both bad platforms if you want to maintain privacy.


Exactly. And it sucks donkey balls.


Alternative...Google Hangouts?


More secure in a sense, but why should anyone need to give the keys to their instant messaging castle to Google?

I'm shocked that a startup with a similar approach to WhatsApp hasn't made a reasonably rigorous application yet.


There are already several messaging applications that are much better than WhatsApp (Line or WeChat, to name a few without including the Big G), but people keep sticking to WhatsApp due to the network effect (it's what your friends have).

It's rather sad, given that with practically any other application (Line, WeChat, GTalk, Skype) you can talk via phone, tablet or PC, and with WhatsApp I have to message my friends with a tiny phone screen while my wonderful desktop keyboard and dual monitors sit there laughing at me.


> why should anyone need to give the keys to their instant messaging castle to Google?

Equivalent to cc'ing the NSA.


Yes, that was exactly my point :)


Let's be honest, I'd rather have the NSA snoop on my texts than a criminal. Both are bad but the NSA is at least slightly better.


The trick is to let nobody snoop on your texts.


Which, as we can see in the GP, is hard ;).

WhatsApp - broken encryption
Google Hangouts - NSA
Texts - NSA (and others)
Facebook Messenger - NSA
iMessage - probably NSA, though I don't know.

and so forth. Most commonly used text services are woefully leaky.


Threema comes to my mind:

https://threema.ch/en/

Unfortunately, Threema is not FOSS either.


Unfortunately, like Whatsapp, it's also only one device per account.


Silent Circle is a good alternative. I work for them, so I can recommend them, but I can't speak for how secure the other products are.

The apps developed by Moxie Marlinspike also seem good to me, I've used RedPhone and it's pretty nice. The texting app sends SMS, though, it doesn't go through the Internet, so it might not be what you're looking for.


I agree that Silent Circle is a good alternative. There's just one worry:

Silent Circle's mail service was shut in the aftermath of the Lavabit shutdown. Why shouldn't that happen to the messaging service? What's the difference?


The mail service was impossible to transparently make end-to-end secure (because of the nature of email), the other services already are. SC felt that, if it can't be end-to-end secure, it shouldn't be offered at all.


TextSecure should release a new version soon that is able to use the device data channel instead of SMS.


Why not create an account at a Jabber service (like dukgo.com) and use a free Jabber client like Xabber, Beem, Gibberbot? Separate contact lists, standards compliant, OTR support…


Jabber is, for various reasons, not a good solution for mobile platforms. To be able to receive messages a client has to be always online, which is not something you want ;) ...and it's also quite battery consuming.

If Jabber worked, we wouldn't be talking about WhatsApp/Hemlis/Threema.


Not until they implement real end to end encryption, like OTR.



