I'm interested in how they managed to cleanly remove the entire meat of the blog post without damaging any other aspect of the site. Is JavaScript really that ... petty ... or did they have to work hard to achieve that effect?
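For the curious, the usual way a page ends up blank without JavaScript is that the post body never exists in the served HTML at all, only a shell that a client-side script fills in. Something like this hypothetical sketch would produce exactly that effect (the API path and markup are mine, not theirs):

```typescript
// Hypothetical client-side-only rendering: the HTML ships with an empty
// <div id="post">, and the article body only appears after this script runs.
// Browsers without JavaScript (w3m, lynx, many crawlers) see the empty shell.
async function renderPost(slug: string): Promise<void> {
  const res = await fetch(`/api/posts/${slug}`); // post body lives behind an API (assumed)
  const post: { title: string; html: string } = await res.json();

  const container = document.getElementById("post");
  if (container) {
    container.innerHTML = `<h1>${post.title}</h1>${post.html}`;
  }
}

renderPost("some-post-slug").catch(console.error);
```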
Judging by what w3m and lynx show me, they're doing User-Agent sniffing to serve up pages. Isn't that a violation of the Geneva Convention at this point?
> The latest version of Safari, Chrome, Firefox or Internet Explorer is required
I see two browsers on a fairly fast release schedule. Do they really update their User-Agent filter on Firefox's release cycle? On Chrome's release cycle? At that point they might as well be specifying patch levels; pick a minimum version and stick with it.
Or, you know, design the site to modern standards in the first place, because if you absolutely need everything in the HTML5 spec just to host a blog, you might as well actually listen to a web dev once in a while.
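To be concrete about the sniffing complaint: the behavior in w3m and lynx smells like a gate along these lines. This is a hypothetical Express sketch, not their actual code, and the regex and route names are made up, but it shows why the filter has to be re-tuned every release cycle:

```typescript
import express from "express";

const app = express();

// Hypothetical "modern browser" check; the version numbers here are illustrative
// and will be stale the moment any of these browsers ships a new release.
const MODERN_UA = /Chrome\/(?:9\d|\d{3})|Firefox\/(?:9\d|\d{3})|Version\/1[4-9].*Safari/;

app.get("/blog/:slug", (req, res) => {
  const ua = req.get("User-Agent") ?? "";
  if (!MODERN_UA.test(ua)) {
    // w3m, lynx, older browsers, and bots all get lumped together and turned away
    res.send("The latest version of Safari, Chrome, Firefox or Internet Explorer is required.");
    return;
  }
  // Real content only for "approved" User-Agent strings
  res.send(`<article>actual post content for ${req.params.slug}</article>`);
});

app.listen(3000);
```

Feature-detecting on the client, or just pinning a minimum version and leaving it alone, avoids the whole version-chasing problem.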