Firesheep, a day later (codebutler.com)
77 points by cdine on Oct 26, 2010 | 15 comments



I'd be interested to hear thoughts on HTTP digest auth as an alternative to full end-to-end encryption for avoiding these attacks.

Personally I'm hopeful that Firesheep will be what it takes to persuade browser vendors (and the HTML5 crowd) that real usable support for HTML login forms based on HTTP digest authentication is a necessity.

There are some pretty significant issues involved in rolling out full-on SSL which, while not insurmountable, do lead one to wonder whether a more lightweight solution like HTTP digest auth might be sufficient for most non-security-critical cases.

On this topic, http://www.cgisecurity.com/2010/01/weaning-the-web-off-of-se... is worth a read.
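
For concreteness, here's a rough sketch of the RFC 2617 Digest response computation, just to show that only a hash of the password ever crosses the wire. (TypeScript with Node's crypto module; every value below is an illustrative placeholder, not a real exchange.)

    // Sketch of the RFC 2617 Digest response for qop="auth".
    import { createHash } from "crypto";

    const md5 = (s: string): string =>
      createHash("md5").update(s).digest("hex");

    function digestResponse(
      user: string, realm: string, password: string,
      method: string, uri: string,
      nonce: string, nc: string, cnonce: string, qop = "auth",
    ): string {
      const ha1 = md5(`${user}:${realm}:${password}`);
      const ha2 = md5(`${method}:${uri}`);
      return md5(`${ha1}:${nonce}:${nc}:${cnonce}:${qop}:${ha2}`);
    }

    // This hash goes into the Authorization header; the password never does.
    console.log(digestResponse("alice", "example", "s3cret",
                               "GET", "/inbox", "abc123", "00000001", "xyz"));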


Definitely. The hardest thing to sell, when it comes to regular HTTP digest auth, is the user interface. People (clients) truly prefer pretty and insecure over ugly and secure.


For sure. Hence the need for browser vendors to provide hooks (via new HTML5 form attributes, a Javascript API, or ideally both) that let us develop our own usable, HTML-based session login and logout forms on top of HTTP auth.


This is incredibly necessary for HTTP authentication to be useful, and, bizarrely, it has been given little to no attention over the years. There's a proposal for HTTP authentication in HTML dating all the way back to 1999:

http://www.w3.org/TR/NOTE-authentform

So far, only some fairly convoluted Javascript hacks make it workable in practice (roughly sketched below):

http://www.peej.co.uk/articles/http-auth-with-html-forms.htm...
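
Roughly, the trick described there is to prime the browser's HTTP auth cache from script, so that an ordinary styled HTML form can sit in front of a Basic/Digest-protected site. A hedged sketch of the idea in browser-side TypeScript (the path and error handling are placeholders):

    // The optional user/password arguments to open() make the browser perform
    // the HTTP auth handshake itself and cache the credentials, so the site
    // can show its own login form instead of the native auth dialog.
    function httpAuthLogin(user: string, pass: string): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", "/protected/", true, user, pass);
      xhr.onreadystatechange = () => {
        if (xhr.readyState !== 4) return;
        if (xhr.status === 200) {
          window.location.href = "/protected/"; // credentials are now cached
        } else {
          alert("Login failed");                // placeholder error handling
        }
      };
      xhr.send();
    }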

Also, HTTP authentication is a benefit to RESTful service design. Cookies have always been the wrong solution to this problem, and forcing every web service to move to HTTPS feels like missing the point.


It's all been discussed by the WHATWG/W3C, and the conclusion was that cookies have too much momentum, Digest is not secure enough, and everyone should be using HTTPS.

http://www.w3.org/html/wg/tracker/issues/13?changelog

http://lists.whatwg.org/htdig.cgi/whatwg-whatwg.org/2008-Nov...

Digest is vulnerable to MITM attacks. It's only secure against passive sniffing, but if you can sniff, you can usually also modify responses and spoof DNS/ARP/base stations. If everyone switched to Digest, it would only be a matter of time before someone wrote a Digest-stripping Firesheep 2.
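
To make that concrete: the heart of such an attack would be a single header rewrite. A sketch only, assuming the attacker is already relaying the victim's traffic (the relay plumbing is omitted and the realm is made up):

    // Downgrade the server's Digest challenge to Basic before it reaches
    // the victim's browser.
    function stripDigestChallenge(
      headers: Record<string, string>,
    ): Record<string, string> {
      const challenge = headers["www-authenticate"];
      if (challenge && challenge.startsWith("Digest")) {
        // A browser that obliges replies with base64(user:password),
        // which is as good as plaintext to whoever is in the middle.
        return { ...headers, "www-authenticate": 'Basic realm="example"' };
      }
      return headers;
    }

    console.log(stripDigestChallenge({
      "www-authenticate": 'Digest realm="example", nonce="abc123", qop="auth"',
    }));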


1. Even if Facebook used HTTP digest auth for people logging into facebook.com, I don't see how I could use HTTP digest auth for people logging into my website using their Facebook ID. (Also for OpenID, signing in with Twitter, etc.)

2. As a pragmatic matter, it seems more likely that Microsoft would update the various versions of Internet Explorer on Windows XP to support SNI, than that all web browsers (including Internet Explorer on Windows XP) would be updated to support HTTP digest auth with a customisable UI.

Having said that, I'm in favour of web browsers supporting both technologies.


> In the past, an SSL service required a dedicated IP address. This isn’t true any more with the advent of Server Name Indication (RFC 3546) and improvements in TLS.

If any of your users are using Internet Explorer on Windows XP, then this seems to still be true, alas - http://www.alexanderkiel.net/2008/04/22/status-of-tls-sni/

This isn't an issue for the likes of Facebook, of course, but it is a problem for sites small enough to be on shared hosting.
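
For reference, SNI is just the client naming the host it wants inside the TLS ClientHello, which is what lets one shared IP serve many certificates; a client that doesn't send it gets whatever certificate the default vhost presents. A minimal sketch using Node's tls module (the IP and hostname are placeholders):

    import * as tls from "tls";

    const socket = tls.connect(
      {
        host: "192.0.2.10",        // one shared IP (TEST-NET placeholder)...
        port: 443,
        servername: "example.com", // ...but the client says which site it wants
      },
      () => {
        // The server can pick the matching certificate during the handshake;
        // a client that omits SNI gets the default vhost's certificate.
        console.log(socket.getPeerCertificate().subject);
        socket.end();
      },
    );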


About 60% of client computers are running Windows XP, according to http://marketshare.hitslink.com/operating-system-market-shar...

Clearly they're not all running Internet Explorer, but equally clearly it's far too early to lock out clients that lack SNI.


Since all of the browsers on Windows XP use the Windows SChannel API, none of them support SNI. That includes the beloved Google Chrome, FireFox, and Safari. (Not sure about Opera on Windows XP.)

http://en.wikipedia.org/wiki/Server_Name_Indication#Support

That is unfortunately an issue that only Microsoft can rectify unless developers on the Windows platform want to take the time and effort to re-implement parts of the SChannel API.


> none of them support SNI. That includes (..) Google Chrome, FireFox, and Safari

You're wrong about Firefox! I'm writing this from an XP box and I've just tested an SNI site with Firefox 3.6. It works with SNI just fine.

> Not sure about Opera

Opera also works with SNI.


My apologies regarding FireFox. I don't normally use FireFox, and when I do it is from a Mac OS X machine.

Either way, Internet Explorer doesn't support it, and as such it is still a no-go from a usability standpoint, since XP still has such a large market share.


He's just saying that Facebook can deal with it much better than a small shop.


Would a mixed (http + https) site with cookieless http traffic (from a different domain) be secure? Could something like this be included in the spec?


If you used cookies with the HttpOnly and Secure flags, then the cookies alone should be safe, but the site wouldn't be.

Someone could sniff login forms via injected scripts (XSS = game over) and modify the page's content via unprotected CSS and images (which aids phishing and clickjacking).
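
For reference, setting those flags is just a matter of adding them to the Set-Cookie header. A minimal sketch with Node's http module (cookie name, value, path, and port are placeholders):

    // HttpOnly keeps the cookie out of document.cookie; Secure keeps it off
    // plain HTTP connections.
    import { createServer } from "http";

    createServer((_req, res) => {
      res.setHeader(
        "Set-Cookie",
        "session=opaque-session-id; Path=/; HttpOnly; Secure",
      );
      res.end("ok");
    }).listen(8080);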


Ahh, OK, that's true; I hadn't thought of that. Is there a caching solution that supports HTTPS?



