
Being aware of exploits and protecting against them comes with the territory. Luckily, there are things like owasp.org to help developers keep up on web security. However, security is hard and it can't be done absentmindedly. There is no getting around that.


If the standards were more strict, some of these issues would not exist. I see this as exploiting a lot of slop in protocols. It should not be possible to interpret a URL as anything but a URL, yet here it's being reflected back and interpreted as something else entirely.


It has nothing to do with the standards being strict; the standards can be as strict as they want. If the standards are strict and useless, no one will follow them, instead implementing something less strict and more useful.

For example, when downloading a file from a website, what default name should you use for it? There is a header to tell you, but not every page supplies such a header, so the browser needs to do something. It picks the last component of the URL as that filename. However, URLs are somewhat more complex than you might expect, so this becomes more complicated and can lead to attacker-controlled ways to manipulate this filename.
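
A rough illustration in JavaScript (made-up URLs, deliberately naive logic):

    const { URL } = require('url');

    // Naive default-filename logic: take the last path component.
    const simple = new URL('https://example.com/files/report.pdf');
    console.log(simple.pathname.split('/').pop()); // 'report.pdf'

    // But URLs are trickier than they look. pathname keeps its
    // percent-encoding, so a later decode step turns this "filename"
    // into a path traversal:
    const tricky = new URL('https://example.com/dl/%2e%2e%2fevil.exe');
    const name = tricky.pathname.split('/').pop(); // '%2e%2e%2fevil.exe'
    console.log(decodeURIComponent(name)); // '../evil.exe'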

Now, you could make a more strict spec, for example by forbidding downloading files unless the filename is properly specified, or forbidding using any kind of default filename and making the user choose it themselves, or something of the sort. But if any browser vendor implemented this more strict spec, they would instantly annoy a lot of users who would find things breaking that used to work, and they would be likely to switch to another more permissive browser.

Security, compatibility, and robustness are hard factors to balance. Just blaming this on "slop in protocols" is a vast oversimplification.


>> For example, when downloading a file from a website, what default name should you use for it? There is a header to tell you, but not every page supplies such a header, so the browser needs to do something. It picks the last component of the URL as that filename.

Yeah, and that's slop in the protocol. If the header were required, everything would still work; web sites would just have to fill in the header. What's easier: complying with a protocol where your site breaks if you don't, or having Swiss cheese and then making site developers learn a bunch of security best practices and hoping they get it right?

Also in there is the good old "this site wants to blah blah" prompt asking the user to decide. If you have to ask, the answer is "No! Fix your site so it's not on the user to decide." Broken certificates? Not my problem; the browser should just say "sorry, site security is busted" and leave it at that. It's an old debate, but AFAIC there is no debate, only laziness.


I get that. I just dread the days when malpractice for programmers is as common as it is for doctors. I like building functionality, not fortresses.


"If this was such a beneficial mutation, why isn’t it more common?"

Under natural selection it would only necessarily become more common if it affected reproduction. It may begin to affect reproduction now that people are sleeping less: those that don't cope well (like those without the gene) may die before they reproduce. But until recently, when sleep deprivation became a larger issue, I doubt it ever affected reproduction enough for natural selection to do its job.


I think if people who need to sleep more are at a competitive disadvantage compared to other people, they will be less able to provide for the next generation, which will allow natural selection to slowly weed out those who don't have the gene.


I don't know the history, but if that were the case then the non-compete should be between the owner and the buyer, not employer and employee. It makes no sense to go after the employees to prevent competition from the former owner.


Haha, indeed. Who would ever have thought to put a picture on a white background? Gotta protect that IP!


If you're not bothering with the HTTP protocol then you don't even need a certificate at all. You just need something like SSH's 'known_hosts' file. All you need to know when initiating communication over a secure channel is that the key the server sent you is in fact the key for that server. In fact, if you're shipping something with the app, just include the server's public key directly and forget the whole key exchange process. Then use the server's public key to send a symmetric key generated by the app itself (sort of like TLS does) for the duration of the connection. This provides forward secrecy as well. Simple. Elegant. Secure.

The only catch is that if you change the server's key it will break all clients. However, you could always hard-code a second public key, not stored on your servers at all, as a fallback for catastrophic incidents where your private key gets exposed. Once you fix your security hole you swap in the fallback key and clients are secure again.
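
Roughly what I have in mind, as a Node sketch (a generated key pair stands in for the pinned key so it runs standalone; purely illustrative):

    const crypto = require('crypto');

    // In a real client the server's public key ships with the app
    // (pinned, like a known_hosts entry); generating a pair here just
    // makes the sketch self-contained.
    const { publicKey, privateKey } = crypto.generateKeyPairSync('rsa', {
      modulusLength: 2048,
    });

    // Client: generate a fresh AES-256 session key and wrap it with the
    // pinned public key using RSA-OAEP.
    const sessionKey = crypto.randomBytes(32);
    const wrapped = crypto.publicEncrypt(
      { key: publicKey, padding: crypto.constants.RSA_PKCS1_OAEP_PADDING },
      sessionKey
    );

    // Server: unwrap with the private key; both sides now share sessionKey.
    const unwrapped = crypto.privateDecrypt(
      { key: privateKey, padding: crypto.constants.RSA_PKCS1_OAEP_PADDING },
      wrapped
    );
    console.log(unwrapped.equals(sessionKey)); // true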


Actually, when I said 'certificate', I only meant the public key; that's basically what you were describing. I rely on RSA-OAEP to detect if someone's been trying something funny (I've been told on StackExchange Crypto that I can rely on it to tell me if a message I'm decrypting was encrypted using a non-matching public key). However, sending a client-generated symmetric key using the server's public one is not perfectly forward-secret: if the server's key ever gets compromised, prior recorded communication can be trivially decrypted. I want to minimize the damage done if a server key is exposed.

Furthermore, a compromised pseudorandom number generator on the client can compromise the security of communication. To mitigate that, at least partly, both parties in my scheme contribute half of a session AES key.
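
Roughly like this (illustrative, not my actual protocol code):

    const crypto = require('crypto');

    // Each side contributes 16 random bytes; the 32-byte AES-256 session
    // key is their concatenation, so one side's weak RNG can't fully
    // determine the key on its own.
    const clientHalf = crypto.randomBytes(16); // generated on the client
    const serverHalf = crypto.randomBytes(16); // generated on the server
    const sessionKey = Buffer.concat([clientHalf, serverHalf]); // 32 bytes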


You're correct about the forward secrecy. That's what I get for commenting at 2am. I was thinking of TLS, where there can be a frequently rotating asymmetric key pair. The symmetric key in TLS isn't used for PFS, only for performance. In practice, though, those keys almost never get rotated.


> mixing of HTTP/HTTPS content

This is ALWAYS a bad idea. A few points:

- The only purpose of HTTPS and encrypting communications in general is to prevent MITM attacks.

- If your server is even willing to serve unencrypted requests, it exposes your users to sslstrip attacks (http://www.thoughtcrime.org/software/sslstrip/). In reality you should still serve HTTP requests, but only to force a redirect to HTTPS. In addition, you should use the 'Strict-Transport-Security' header; it's the ONLY way to prevent future sslstrip attacks. See the sketch after this list.

- Even worse, if you don't set the 'secure' flag on session cookies at a minimum (thus forcing logged in users to HTTPS only) you expose your users to session hijacking without even putting up a fight.

- If you aren't going to bother with forcing HTTPS all the time then there's not much point as you've opened up your users to simple sslstrip attacks followed by session hijacking or even worse, script injection or redirects.
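
Something like this, e.g. with Express (an illustrative sketch, not a drop-in config):

    const express = require('express');
    const app = express();

    app.use((req, res, next) => {
      if (!req.secure) {
        // serve HTTP only to bounce the client straight to HTTPS
        return res.redirect(301, 'https://' + req.headers.host + req.url);
      }
      // pin the browser to HTTPS for a year, subdomains included
      res.set('Strict-Transport-Security', 'max-age=31536000; includeSubDomains');
      next();
    });

    app.get('/login', (req, res) => {
      // 'secure' keeps the session cookie off plain HTTP entirely
      res.cookie('session', 'opaque-token', { secure: true, httpOnly: true });
      res.send('ok');
    });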

One fundamental concept many people seem to forget is that the dangers of not encrypting communications extend beyond snooping. Attackers can actually modify the data stream to inject and/or replace content, or even redirect users entirely.

Also, even if you only serve secure content yourself, including insecure content on your page exposes security vulnerabilities. Say you include unencrypted images on your site and some of your users are on a client with an image-rendering vulnerability that allows remote code execution (this has happened in just about every browser). All a MITM has to do is replace the requested image with content that exploits the vulnerability. Now imagine you included unencrypted JavaScript that actually executes on your site (like from a CDN). The possibilities are endless.



It's not necessarily "advanced", but it is unexpected behavior I think. I've run into this same sort of issue myself. It didn't take ages to figure out, but it was annoying. It's one of those things that separates an experienced JavaScript dev from the rest.


    "You could stare at that function for hours without seeing 
    the problem. It requires knowledge of how JavaScript’s
    String.replace() works:"
Okay, but needing "knowledge of how JavaScript's String.replace() works" shows that those guys' experience is not the greatest. I'm pretty sure every decent JS developer learns this after a short amount of time, because "String.replace" is used pretty often.
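
For reference, the usual String.replace() traps (I'm guessing at which one bit them):

    'a-b-a'.replace('a', 'x');    // 'x-b-a'  — a string pattern only hits the first match
    'a-b-a'.replace(/a/g, 'x');   // 'x-b-x'  — you need a global regex to replace all
    'price'.replace('i', "$'");   // 'prcece' — '$' sequences in the replacement are special
    'price'.replace('i', '$$');   // 'pr$ce'  — a literal '$' must be written as '$$'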


See, this is what bothers me. Too many people seem to think that there is a conflict between evolution and creationism. The thing is, both can in fact be true. Who's to say that there isn't a creator orchestrating processes? Such a being wouldn't work by magic, but by a profound understanding of natural processes. How would this be any different from us discovering how life was created and then using that understanding to create it ourselves? The whole back and forth over evolution between religious and non-religious folk is ridiculous. Neither side will ever get anywhere because the whole premise of the debate is a non sequitur.


You might be less bothered if you took creationist in this context to imply "young earth creationist," which is almost certainly what's intended.


Yeah, that's what I meant, sorry. YEC are really the ones making most of the noise and raising most of the money for stuff like school board elections, so...


There isn't sufficient (any) evidence to justify belief in a creator, magical or not. You are free to believe that there might be a creator, but it isn't any more rigorous than believing in unicorns or flying spaghetti monsters. I realize this isn't r/atheism, but it bothers me that otherwise scientifically minded people allow themselves a soft spot in their ontology to incorporate cultural myths.


I never stated whether I believe in a creator being or not, nor will I since it is not relevant to this conversation. The belief that a creator is orchestrating things is what's known in science as a theory. The point I was making is that neither theory is mutually exclusive. It does not follow that if one is true, then the other is not.

Also, if you are indeed scientifically minded then you must be open-minded to all theories that explain the state of things unless there is sufficient evidence to the contrary. It is not enough to say that there isn't sufficient evidence; remember, absence of evidence is not evidence of absence. To rule out, without evidence, a theory that could explain our observations is not scientifically minded at all.

To have a productive conversation, if you believe that the two theories are mutually exclusive then please explain what about one of them being true, means the other cannot be. If you don't believe they are mutually exclusive then my original point stands that the statement by 'spiritplumber' is a non sequitur.


I'll simply repeat what I said already: the belief in a creator is as arbitrary as belief in unicorns or the flying spaghetti monster. The only thing that distinguishes it is its long running place in cultural mythology. I highly doubt you'd be presenting the point were it not for your personal belief.


So if I believe I have the right to drive really fast and think the laws need to be changed, but you believe I shouldn't have that right, does that mean you consider me a second-class citizen for claiming a right you'd deny me? That seems like a logical fallacy and a strawman. Regardless of where you or anyone stands, let's at least frame the debate right. No one thinks that LGBT people are second-class citizens. To frame the debate like that is an appeal to emotion (also a logical fallacy) and ultimately fruitless, as it avoids the real debate.

