Given the size of Google, I wonder if there was a whole team tasked with this development. A 404 page for a company as big as Google is an awfully big responsibility for just one person :)

To be honest, I prefer the idea of a more intelligent 404 page. You'd think Google would have sufficient horsepower to make a good guess at what you might have been trying to find.




Google's most frequently accessed pages (most prominently, the homepage itself) are designed by a team of pretty high-level engineers to reduce latency and bandwidth requirements by stripping bytes. Every change is reviewed with extreme scrutiny. I'm sure the 404 page was designed with similar care.

If you view source, for example, you'll see the image is base64-encoded, and they don't even bother closing their tags, because that's more bytes and browsers don't notice. This was very carefully engineered.
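Roughly the sort of markup being described, as a sketch rather than Google's actual source (the data URI here is just a 1x1 placeholder GIF):

  <!DOCTYPE html>
  <title>Error 404 (Not Found)</title>
  <p><b>404.</b> That's an error.
  <p>The requested URL was not found on this server.
  <!-- inlining the image as a base64 data URI saves an extra HTTP request;
       the optional closing tags above are simply omitted -->
  <img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" alt="">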


Seems weird to have Arial in the font stack, as Arial is the default sans-serif font on Windows (and Helvetica, which has the same metrics, is the default on OS X).


Google uses Arial in the font stack across all of their pages, for consistency. I assume the following rationale is at least mostly correct.

If a computer doesn't have Arial as its sans-serif default for some reason, they still don't want the page to look any different whenever the machine is capable of rendering it correctly, because a page that looks off hurts their brand image.

Imagine if you did a Google search and everything looked just a little bit off (and you weren't a developer, so you didn't know why): you might think someone was hacking Google and stealing your information or changing your results. Consistency builds trust, and that's important to them.
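For reference, the declaration being discussed is just something like this (a sketch; the exact rule on Google's pages may differ):

  <style>
    /* Name arial explicitly before the generic family, so a machine whose
       sans-serif default was changed still renders the page identically. */
    body { font-family: arial, sans-serif; }
  </style>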


Why did they not remove whitespace too?


I'm guessing it's not much of an issue when it's sent compressed over the wire.


Neither are closing tags...


Who in the world downvoted this? Please own up. Can you show that gzip does not, in fact, perform well at compressing repeated strings of text such as closing tags?
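If anyone wants numbers, here's a quick Node.js sketch to run yourself (exact byte counts will vary with the gzip level and input):

  // compare raw and gzipped sizes with and without closing </p> tags
  const zlib = require('zlib');

  const withTags    = '<p>That is an error.</p>'.repeat(100);
  const withoutTags = '<p>That is an error.'.repeat(100);

  console.log('raw:    ', withTags.length, 'vs', withoutTags.length);
  console.log('gzipped:', zlib.gzipSync(withTags).length,
              'vs', zlib.gzipSync(withoutTags).length);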


Wasn't me, but why make gzip do the work when you can do it once, easily, yourself? Sure gzip can handle it, but by the same reasoning the servers could serve the closing tags too, and Google strips those anyway. The discrepancy is weird, is all.


No idea, I saw that too and was puzzled.


Don't know if it's Firefox inserting them, but I do see closing tags, apart from <p> elements, and that's allowed.


There is no <head>, </head>, <body>, </body>, or </html>.
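Which the spec allows: the parser implies those elements. This is a complete, valid HTML5 document:

  <!DOCTYPE html>
  <title>Still valid</title>
  <p>The html, head, and body tags are all optional; the parser inserts the elements anyway.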


This was something I was thinking too. I watched a video from Matt Cutts yesterday where he mentioned they have a "team" (I guess it could be just 2 people, but it sounds like more) entirely dedicated to handling 404 pages that return 200 response codes. I too wonder whether they had a team just for this, or maybe a "usability" team was tasked with it?


That would be a completely different thing, though: 404 error pages that incorrectly return a 200 OK status can cause broken links to appear as duplicate content when crawled. I hadn't thought about this before, but I'm sure it does require a dedicated team to distinguish this kind of content.
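Not that this is how Google actually does it, but the crudest version of a soft-404 probe is easy to sketch (Node 18+ with the global fetch; the helper name is made up):

  // Request a path that should not exist and check the status code.
  // A 200 here suggests the site serves error pages with the wrong status.
  async function looksLikeSoft404(origin) {
    const junkPath = '/' + Math.random().toString(36).slice(2);
    const res = await fetch(origin + junkPath);
    return res.status === 200;
  }

  looksLikeSoft404('http://www.example.com')
    .then(soft => console.log(soft ? 'likely serves soft 404s' : 'returns real 404s'));

The hard part, and presumably what that team works on, is the content side: telling an error page apart from a real page when the status code lies.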


>> I prefer the idea of a more intelligent 404 page

There's a script for that: http://googlewebmastercentral.blogspot.com/2008/08/make-your...

Example:

  <script type="text/javascript">
    var GOOG_FIXURL_LANG = 'en';
    var GOOG_FIXURL_SITE = 'http://www.example.com';
  </script>
  <script type="text/javascript"
    src="http://linkhelp.clients.google.com/tbproxy/lh/wm/fixurl.js>
  </script>


This is already shown by the rareness of a Google 404: you hardly ever get one these days, except when a server crashes for about a second (where my preference would be "you are the lucky winner who clicked your mouse at the same second as our server crash!").


Making the new 404 HTML page was probably someone's "20% project" for a year.


I've heard about the 20% thing, but look at the page and ask yourself: anyone could have made that in 5 minutes, plus maybe a week for discussion.


Of course. I was being sarcastic.


Oh, ok haha



