This argument keeps coming up, and while its premises are sound, its conclusion never sits right with me: "Don't use cryptography." That advice isn't practical for developers. There are plenty of systems we have to design where crypto is not optional. Examples:
* Storing passwords. You can't store them in plaintext.
* Signing requests (like in the OP's example). What are the alternatives? You can store some kind of authentication in the cookie, which then gets sent with each request. You can use HTTP auth. But in any scenario, you're definitely using SSL, and you're probably using some other kind of crypto on top of that.
* Encrypting cookies. You should store as little as possible in cookies--ideally just a single identifier. But even so, it's very important to make cookies tamper-resistant. You can't, e.g. just have a plaintext cookie that says "userid=42."
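To make the first bullet concrete, here's a minimal sketch of salted password hashing using Python's standard-library `hashlib.scrypt`. The cost parameters are illustrative, not a vetted recommendation; in production you'd typically reach for a maintained library or your framework's built-in password hasher:

```python
import hashlib
import hmac
import os

def hash_password(password):
    """Return (salt, digest) for storage; the plaintext is never stored."""
    salt = os.urandom(16)  # random per-user salt
    digest = hashlib.scrypt(password.encode("utf-8"), salt=salt,
                            n=2**14, r=8, p=1)  # illustrative cost parameters
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Recompute the hash with the stored salt and compare."""
    digest = hashlib.scrypt(password.encode("utf-8"), salt=salt,
                            n=2**14, r=8, p=1)
    return hmac.compare_digest(digest, stored_digest)  # constant-time compare
```

Even this tiny sketch illustrates the point: there's no way to meet the "can't store plaintext" requirement without touching crypto primitives.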
So what does the OP propose we do? If this article were making the usual case that we shouldn't implement our own crypto algorithms, I'd agree 100%. But he seems to be taking it a step farther, and saying we shouldn't even use existing crypto libraries, because we'll misuse them. But what are the alternatives?
* Despite all the press password hashing gets, bad password hashes are rarely the worst mistakes people make in their applications. I'm glad there's a meme now about using bcrypt or scrypt, but I don't think people should be afraid to deploy password hashing.
* The alternative to signed requests is to credential the requests directly and not build delegation features. Many applications that use signed requests to solve authorization problems are in fact overbuilding; they're in effect building the infrastructure for federated authorization to solve simple point-to-point authentication problems. Just use HTTP Auth and a long random credential.
* You said it right here: just use random tokens to key a serverside session store.
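The "random tokens" suggestion is about the simplest piece of crypto there is. A sketch in Python, using the standard-library `secrets` module, with a plain dict standing in for a real server-side store:

```python
import secrets

sessions = {}  # stand-in for a real server-side store (database, Redis, etc.)

def create_session(user_id):
    token = secrets.token_urlsafe(32)  # ~256 bits from the OS CSPRNG
    sessions[token] = user_id
    return token  # set this as the cookie; the token itself carries no data

def lookup_session(token):
    return sessions.get(token)  # None means no such session
```

Because the token is just a random lookup key, there's nothing in the cookie to tamper with or decode.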
So, I'd suggest those are three bad examples.
But backing your point out a little: so what if you feel like you need to use crypto in your application? Cryptography doesn't get easier just because you want/need it to. I find myself tearing my hair out at the circumvention/liberation tech people, all of whom respond to the same argument with, "So what if the engineering is hard? People need this stuff." Well, they need it to work, too, don't they?
> I don't think people should be afraid to deploy password hashing.
Neither do I, but the whole "don't use crypto" meme says the opposite.
> Just use HTTP Auth and a long random credential
Yes, that is a viable alternative. But it is also one which uses crypto. You're generating a random sequence, and you're using SSL. So once again you run up against the "don't use crypto" argument.
> You said it right here: just use random tokens to key a serverside session store.
But again, you've already introduced secure random number generators and SSL.
The point I was making with all these examples is that they are commonly encountered and can't be solved without crypto. Some of the solutions are easier to screw up than the others, and that's definitely worth talking about. But "don't use crypto" is too simplistic for these kinds of use cases.
Just accept the fact that people who say "don't use crypto" aren't saying "don't hash passwords", nor are they saying "don't generate random numbers", and move on.
Sure, but then what are they saying? "Don't use cryptography" is a quotation taken verbatim from the OP, and I've heard similar statements all around in the last few years. A reasonable person reading that statement would interpret it at face value: "Don't use cryptography" means that very thing.
So what I'm suggesting is that the "don't use crypto" meme should go away and be replaced with something more helpful and more specific. For example, the responses you've given on this thread have been both helpful and specific. I'm arguing that people should say the kind of stuff you've said, rather than the unrealistic "don't use crypto."
Isn't the actual meme "Don't build your own crypto", rather than "don't use crypto"?
I remember at least 10 years ago getting (and seeing a lot of) the advice "use SSL for data in transit, use GPG for data at rest". Those two principles, combined with the somewhat more recent "just use (b|s)crypt" for password hashes, would still give your average non-crypto-expert developer a pretty good fundamental starting place.
The former ("don't build your own crypto") is clearly correct.
The latter ("don't use crypto") is what I disagree with. One example of this meme comes from the OP, who ends his article with: "Save yourself the trouble. Don't use cryptography. It is plutonium."
It really seems like this is an argument that seeks to make it harder to understand a problem, rather than easier. I'm just not interested in the semantic debate, sorry.
"Don't use crypto." - you
"That's terrible advice, we need crypto for x,y,z"
"I don't really mean don't use crypto"
"You just said that!"
"You're just talking semantics."
Few do, but I genuinely thought that the OP was, so no semantic debate was intended :) (And I have actually seen people say that elsewhere.) While the OP didn't mention random numbers specifically, his prohibition on using crypto seemed general enough that I assumed it to include random numbers. Ditto for hashing. Certainly, his method of argument--describe the technique, show a vulnerability the reader might not have heard of--could be applied to both password hashing and random numbers.
I realize, however, that you have a more charitable interpretation of what the OP was suggesting. I think you and I mostly agree about the substantive issue: Crypto should be used, but as you said, developers owe it to users to make it work. My disagreement isn't with you but with the OP and others who seem to suggest tossing aside crypto because it's too hard to get right.
I just realized I haven't proposed a positive alternative to the "don't use crypto" meme, though. I honestly don't know what the answer is, I'm afraid. Realistically, lots of devs need to use crypto, and we can't all develop your level of expertise in that area. (The founder of a security consultancy will always know a lot more about security than generalist app developers. Only so much time in the day.) So becoming a true crypto/security expert can't be the solution, even if that would be the best one. The best realistic solution I can think of is for authors of crypto libraries to provide enough documentation that devs can use them safely.
> * Storing passwords. You can't store them in plaintext.
Why not? Just write upfront on the signup form, "we don't hash or protect your passwords in any way. Do not reuse passwords from other sites. Create a unique password and retrieve it using a password manager."
Password hashing is a losing battle:
* The users who aren't educated enough to use unique passwords are the same ones who will always use a really weak one anyway, and those will always be crackable.
* Salting is nearly pointless. The GPU killed the rainbow table, the "bad passwords" keyspace is small enough that it's easy to recompute, and in most hacks I've seen, the database has been compromised together with the source code, so the salt is never really secret.
* The first advice after a hack is still always going to be "change your passwords".
> Just write upfront on the signup form, "we don't hash or protect your passwords in any way. Do not reuse passwords from other sites. Create a unique password and retrieve it using a password manager."
For one, I'd consider that bad business. If you want to make money, it's not a good idea to declare to users that your system is insecure.
More importantly, you're now putting all the responsibility on the user. Yes, it's good for users to think about security. Yes, it's impossible to 100% guarantee the security of your users' passwords. But disclaiming all responsibility? We're the ones with more technical knowledge, not our users. We should bear as much of the burden of security as we can.
> Password hashing is a losing battle
Your argument here seems to boil down to the idea that GPUs are now capable of cracking any password hashing scheme we have, assuming weak passwords. My understanding was that this was not the case, but perhaps I'm wrong. As far as I know, you can set the difficulty factor in Bcrypt high enough that it's impractical to crack on any commodity hardware. There's also scrypt, which is supposedly even stronger in this respect, although I don't know if it's been adequately vetted yet.
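The work-factor idea is easy to demonstrate with the standard library's `hashlib.scrypt`: the `n` parameter scales the cost of computing a single hash, so an operator can raise it until brute-forcing weak passwords on commodity hardware becomes impractical. (The parameter values below are purely illustrative.)

```python
import hashlib
import os
import time

password = b"correct horse battery staple"
salt = os.urandom(16)

# Each quadrupling of n roughly quadruples the CPU and memory cost of
# one guess, for the attacker as well as for you.
for n in (2**12, 2**14):
    start = time.perf_counter()
    hashlib.scrypt(password, salt=salt, n=n, r=8, p=1)
    elapsed = time.perf_counter() - start
    print(f"n=2**{n.bit_length() - 1}: {elapsed:.3f}s per hash")
```

The defender pays that cost once per login; the attacker pays it once per guess, across billions of guesses.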
> Just write upfront on the signup form, "we don't hash or protect your passwords in any way"
Can you legally absolve yourself of all responsibility that way? Say Toyota sells cars that explode when the engine is revved beyond the red line. If the sale contract says, "Toyota is not responsible if you do not follow the instruction manual," does that mean they cannot be sued?
> Encrypting cookies. You should store as little as possible in cookies--ideally just a single identifier. But even so, it's very important to make cookies tamper-resistant. You can't, e.g. just have a plaintext cookie that says "userid=42."
Note that you possibly don't want to _encrypt_ the cookie (particularly if it's transmitted over SSL); you certainly want to _authenticate_ it.
I may not care if the user knows that he is user ID 42; I very much care if he's able to tell me that he is user ID 43 instead.
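A minimal sketch of that distinction in Python, using the standard-library `hmac` module. The key and the `value|mac` cookie format here are made up for illustration; real frameworks ship their own signed-cookie machinery:

```python
import hashlib
import hmac

SECRET_KEY = b"illustrative-server-side-secret"  # in practice, load from config

def sign_cookie(value):
    """Append an HMAC so the cookie is tamper-evident (not encrypted)."""
    mac = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()
    return f"{value}|{mac}"

def verify_cookie(cookie):
    """Return the value if the MAC checks out, else None."""
    value, _, mac = cookie.rpartition("|")
    expected = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()
    return value if hmac.compare_digest(mac, expected) else None
```

The user can still read `userid=42` in plain sight, but changing it to `userid=43` invalidates the MAC.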
Yes, that's what I'm driving at: You have no choice but to learn the crypto libraries and use them. I've argued before that authors of crypto libraries have a sort of professional duty to document their libraries well, including info on all the mistakes developers are likely to make. I think if you're going to claim to offer crypto for average developers, you owe it to them to document it properly. It's always seemed a little unreasonable to me to blame non-cryptographers for being unaware of obscure vulnerabilities. We can't all be experts in everything.
Yeah, agreed about the cookie. I guess I should have said "a plaintext cookie only containing 'userid=42.'" Because clearly you can't just let people edit their cookies and take on any user ID, session ID, or other such identifier.
> Also, about the cookies, you can definitely have a plaintext cookie saying userid=42, as long as it's signed.
Not necessarily good enough. If you just sign a cookie that says userid=42, anyone who ever manages to read it can now authenticate as your user whenever they like. You just turned any session hijacking attack into an account hijacking attack.
Do what tptacek says, "just use random tokens to key a serverside session store" and don't write your own crypto.
Wouldn't the random tokens also be vulnerable to a replay? If I copy a session cookie from one client to another, regardless of how that cookie authenticates itself, won't most servers accept it? Seems to me that once the client has been compromised, session hijacking is going to happen.
The server can be set to check that the client IP address and HTTP fingerprint match. That makes the replay attack harder but not impossible. It also introduces some usability concerns--valid sessions may expire sooner than intended.
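A hypothetical sketch of that check; every name here is made up for illustration, and as noted, it raises the bar rather than eliminating replay:

```python
sessions = {}  # token -> client attributes recorded at login

def bind_session(token, ip, user_agent):
    """Record the client fingerprint seen when the session was created."""
    sessions[token] = {"ip": ip, "ua": user_agent}

def session_valid(token, ip, user_agent):
    # Reject a replayed token presented from a different client fingerprint.
    s = sessions.get(token)
    return s is not None and s["ip"] == ip and s["ua"] == user_agent
```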
The token is just a key to a server-side session that will be deleted eventually, for example when the client logs out. If someone can hijack a session, that's obviously already a problem, but unfortunately XSS attacks are fairly common as they are something that can happen in any number of places on a website, and even vigilant developers will probably miss some. (See, for example, the recent XSS attacks reported on Paypal).
If you use a random token to a session that will be invalidated, then the attacker is at least time-limited. If you just naively sign an auth cookie and trust it later, he has a replay attack that will work permanently until you change your code.
As for your other suggestion, restricting sessions by IP address can be done no matter where the session data is stored. It will help to mitigate session hijacking in general, but it also degrades the experience, especially for mobile users, who will have to re-login every time they change IP addresses. It's particularly a problem if a form submission fails for this reason, because they'll probably get redirected away to a login page and lose any information they entered.
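The time-limited token described above might look like the following sketch; the TTL and the in-memory store are illustrative stand-ins:

```python
import secrets
import time

SESSION_TTL = 30 * 60  # illustrative: sessions live 30 minutes
sessions = {}  # token -> (user_id, expiry timestamp)

def create_session(user_id):
    token = secrets.token_urlsafe(32)
    sessions[token] = (user_id, time.time() + SESSION_TTL)
    return token

def lookup_session(token):
    entry = sessions.get(token)
    if entry is None:
        return None
    user_id, expiry = entry
    if time.time() > expiry:
        del sessions[token]  # expired: a stolen token is now useless
        return None
    return user_id

def logout(token):
    sessions.pop(token, None)  # invalidating server-side kills the session
```

Unlike a self-validating signed cookie, every token here can be revoked or expired server-side, which is what bounds the damage of a hijack.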
Yeah, both this and the original article are vulnerable to replay attacks. A token is much better, but there are times when you don't want to hit the DB, such as for non-critical data like displaying the user's name on the page. Signing a cookie is reasonable then.