Unhappy Security Dialogs (usersinhell.com)
63 points by nantes on July 12, 2011 | 25 comments



To be fair, there is a very good reason for the "wait three seconds before you click install" thing in firefox:

http://www.squarefree.com/2004/07/01/race-conditions-in-secu...


Finally, an explanation. Thanks.

That delay has driven me nuts since it was implemented. It's the one thing I hate about Firefox: could never understand why the one button I wanted to click was never active when I wanted to click it, then became active after some flurry of activity trying to make it work.

Time delays are not a suitable solution to bugs. Time is money; don't waste mine on some obscure, simplistic fix when there is another solution (even if the latter is harder to implement). If you _are_ going to use a delay, do something to indicate that the system knows time is passing and the user should be patient: if the "yes" button is going to activate after 3 seconds, show a countdown timer or "thermometer" or burning fuse or something, so I know to wait until the indicated time is reached. Don't just do nothing for a while.

At least now I know why it happens. I do appreciate that.
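
A visible countdown is cheap to build, too. A minimal sketch, assuming the dialog is ordinary DOM and its install button has the id "install" (both names are illustrative):

    // Disable the install button for 3 seconds, but show a countdown so the
    // user can see the delay is deliberate rather than the dialog being broken.
    const installBtn = document.querySelector<HTMLButtonElement>("#install")!;
    let remaining = 3;
    installBtn.disabled = true;
    installBtn.textContent = `Install (${remaining})`;

    const timer = setInterval(() => {
      remaining -= 1;
      if (remaining > 0) {
        installBtn.textContent = `Install (${remaining})`;
      } else {
        clearInterval(timer);
        installBtn.disabled = false;
        installBtn.textContent = "Install";
      }
    }, 1000);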


> Time delays are not a suitable solution to bugs.

"People trying to be evil" isn't really a bug. Is a safe with a timer not a suitable solution for a McDonald's or a bank?


Oh wow, very interesting. I had definitely always thought that it was because Mozilla wanted the user to read the dialog. That's quite an innovative method to trick the user into installing the bad code.


Fascinating! I always assumed (as did everyone else, I think) that the countdown, coupled with the stern warning in bold text, was to make you think twice about downloading extensions, not to thwart malicious websites.

So why's it gone in Firefox 4?


You know, I've clicked on those warning messages for the better part of 10 years now, and never once noticed that some of them were red and some were orange.

It's exactly the same dialog, with exactly the same fields. The only difference is an icon that you'd never realize was different unless you had two dialogs open side by side. If a guy who writes software for a living can't tell them apart, I don't think the average user stands a chance.

If they actually want to fix this, they need completely different looks for the two popups. Or better still, simply skip the warning altogether for signed apps. If you downloaded it off the internet with the intention of running it, you're going to want to run it. Popping a dialog in your face isn't going to change that fact.


If you skip the dialog for signed programs you may as well skip it for unsigned programs as well. Anyone can sign a program, it's only relevant for security if the user reads the signature and makes a decision based on what it says.

From that perspective, maybe the dialogs for signed and unsigned programs don't _need_ to be very different; most users don't care. The interesting distinction is between "this is a program, which may take over your computer" and "this is a music file, which can do no harm".


Just my two cents: I think that the updated version still doesn't do enough to make the two dialogs look different. For something like this, I would like the two to be distinguishable even when you're standing six feet away from the screen.


Or, more importantly, you need to be able to tell them apart when they're not side-by-side at all, shown 3 days apart. I thought that was the author's point, but the solution seems to have the same problem.


The redesign looks better, but conflates "this program is by example.org" and "this program is trustworthy" even more than the original.

The basic issue, of course, is that those are completely different statements, and only one of those can be easily checked.


Or you can go the iOS route and allow users to install only the apps you've approved. Here, the user doesn't have to worry about installing a virus/malware, but it results in a walled-garden situation.


IE9 doesn't even let you open unsigned binaries any more without going through quite a number of questions and expanding collapsed UI elements, so that might actually be a bit better.

On the other hand: the only thing that accomplished is that I've now stopped using IE even when I quickly have to download something inside my development VM.


The author's main issue seems to be with the explanation that running software from the internet is risky, since he removes that in both dialogs. And yet it's unquestionably true: running code from the internet, be it signed or unsigned, is a pretty risky thing to do unless you know what you're doing. His main argument for removing it is that most people ignore your warnings anyway. However true that is, it's certainly not true for everyone.

The advice really doesn't seem to fit the real world. Suppose you had a place with a dangerous undertow and a sign to warn people about it. If five people died one year after choosing to ignore the sign, few people would agree that a sensible course of action would be to remove it.


Every time I see a dialog like that, I think to myself,

"Yes, I want to run it. That's what I told you to do, stupid computer. What kind of moron do you take me for, anyway?"


"a SSL certificate isn’t a 100% gauruntee of safety"

What have SSL certificates got to do with code signing? I have a feeling it involves speaking authoritatively on a subject you clearly don't understand ("properly-encrypted checksum" indeed).


They do have a big overlap. Both SSL and code signing use signatures to convey the 'trustworthiness' of a certain source.


Not really. SSL solely exists to authenticate domain names. Having 15USD and a domain name does not make you trustworthy.

Likewise, code signing is an anti-tamper and publisher identification mechanism rather than anything which gives the end-user a reason to trust any signed executable. Again, having 15USD does not make you trustworthy.
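
To be concrete about the "identification, not trust" point: the certificate tells you who signed, and nothing more. A minimal sketch with Node's crypto module (the file name publisher.pem is made up):

    import { X509Certificate } from "crypto";
    import { readFileSync } from "fs";

    // Read the publisher's certificate and print who it identifies.
    // This is identification, not an endorsement: anyone with 15USD can get one.
    const cert = new X509Certificate(readFileSync("publisher.pem"));
    console.log(cert.subject);   // the entity the CA says signed the code
    console.log(cert.issuer);    // the CA that vouches for that identity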


I should have added "to the end-user" using happy/sad iconography.

Nothing in the world can make you trustworthy; I don't want to get into that discussion, which is also why I put it in scare quotes.


No, but knowing who you are makes you trustworthy, because if the code installs viruses you can be arrested. You are no longer some anonymous person in Eastern Europe.


SSL cert != knowing who the other party is


I can fix the security dialogs to help dumb users even more. I see a solution where the operating system computes an MD5 sum of the executable code, then uploads the user satisfaction results to a common website that other operating systems can query.

When you download a program, you will get one of three results: a big green bar, indicating lots of users who used this code were happy; a red bar, indicating lots of users who used this code were unhappy; or a grey bar, meaning nobody has ever used this code.

Red = Do not run this unless you want pain. Grey = It's new, so it could well be new malware. Green = It's probably good (though it could be a virus masquerading as a useful app).

You could get around this system by spoofing positive results, but it makes the job of the virus writer harder, because they have to work hard to "make it look legitimate".
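
A rough sketch of the lookup step, in case it isn't clear. The reputation service, its URL, and its response format are all invented for illustration:

    import { createHash } from "crypto";
    import { readFileSync } from "fs";

    type Verdict = "green" | "red" | "grey";

    // Hash the executable and ask a (hypothetical) reputation service how
    // people who ran the same bits rated it. Requires Node 18+ for global fetch.
    async function reputation(path: string): Promise<Verdict> {
      const digest = createHash("md5").update(readFileSync(path)).digest("hex");
      const res = await fetch(`https://reputation.example/lookup/${digest}`);
      if (!res.ok) return "grey";                  // nobody has seen this binary
      const { happy, unhappy } = await res.json();
      return happy > unhappy ? "green" : "red";
    }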


How many people will know to go and rate their downloads after using them enough to have an informed opinion? I'd almost rather have the person who runs the site certify software than crowdsource it.


MD5 sums only verify that the binary you downloaded is not corrupted (in other words, the bits you got are the bits the server intended to send). This is not the same as cryptographically signing executables, since anybody can still replace the original file and md5sum with a malicious file and an md5sum for the malicious file. Cryptographic signatures can't be forged by technical means short of stealing someone's private key.
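
To make the difference concrete, a small sketch with Node's crypto module (the key and signature file names are made up): anyone can recompute the checksum for a swapped file, but the signature only verifies against the publisher's key.

    import { createHash, createVerify } from "crypto";
    import { readFileSync } from "fs";

    const data = readFileSync("installer.exe");    // file names are illustrative

    // A bare checksum: whoever can replace the file can replace this too.
    const md5 = createHash("md5").update(data).digest("hex");

    // A signature: only the holder of the publisher's private key could have
    // produced a signature that verifies against their public key.
    const ok = createVerify("sha256")
      .update(data)
      .verify(readFileSync("publisher_pub.pem"), readFileSync("installer.sig"));
    console.log(md5, ok);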


His suggestion involved the operating system computing the MD5 digest of the executable file before running it. In that case you would not be checking whether it matches a digest provided by wherever you downloaded it from; there is no foo.md5sum for them to replace.

This scheme is easily subverted in many ways: by having the executable misbehave only occasionally, or only long after the user could identify it as the source, or by a collision attack against the hash function.

Also note that digital signatures are only as strong as their components. It doesn't matter that you signed it if you used MD5 in the process of signing it: an attacker just needs a binary that has the same digest as the one the signature attests to, which enables attacks that completely bypass the parts involving the asymmetric keys.


Sounds like a package manager.



