
Making a website 2 to 10 times faster is not that hard: put it on a CDN. The usual suspects for increasing performance are (sketched briefly after the list):

- compression

- caching (origin, edge, browser)

- persistent connections

- TCP optimizations
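
The first three are mostly a matter of sending the right response headers. Here's a minimal sketch using only the Python standard library (the handler, port, and cache lifetime are my own illustration, not from the article): gzip compression negotiated via Accept-Encoding, browser/edge caching via Cache-Control, and persistent connections via HTTP/1.1 keep-alive. TCP optimizations happen at the OS or CDN level and aren't shown.

    import gzip
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    BODY = b"<html><body>hello</body></html>" * 100

    class Handler(BaseHTTPRequestHandler):
        protocol_version = "HTTP/1.1"  # keep-alive: the client can reuse the TCP connection

        def do_GET(self):
            accepts_gzip = "gzip" in self.headers.get("Accept-Encoding", "")
            payload = gzip.compress(BODY) if accepts_gzip else BODY
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            if accepts_gzip:
                self.send_header("Content-Encoding", "gzip")  # compression
            # lets browsers and edge caches hold the object for an hour
            self.send_header("Cache-Control", "public, max-age=3600")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)

    if __name__ == "__main__":
        ThreadingHTTPServer(("", 8080), Handler).serve_forever()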




Caching is easy. The hard part is knowing when to expire an object in cache.


That's only the hard part if the site doesn't tell you when to expire it. Squid (and many others) has excellent cache expiry and replacement heuristics, but Squid can only cache 20-35% of the web, because it's simply not safe to cache the rest: it's session-based and could be different for every user, or it's SSL-encrypted and can't be seen by the proxy, or it explicitly disallows caching with Cache-Control or Expires headers, etc.

Even if the site doesn't tell you how long you can cache something, it's reasonably safe to guess based on the age of the object. A five-year-old object is probably not going to change in the three days it takes to run through a full replacement in your cache, while one that is 30 minutes old could change dozens of times, so Squid will send an If-Modified-Since a few minutes later, with gradually lengthening periods between checks as the age of the object increases.
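
To make the age-based guess concrete, here's a rough sketch of that kind of heuristic (the "LM-factor" idea: hold an object for some fraction of its current age, capped at a maximum). The 10% factor and 3-day cap below are illustrative stand-ins, not Squid's actual defaults.

    from datetime import datetime, timedelta

    def heuristic_freshness(last_modified, now, lm_factor=0.10,
                            max_lifetime=timedelta(days=3)):
        # How long to serve the cached copy before revalidating with
        # If-Modified-Since, when no Cache-Control/Expires header is present.
        age = now - last_modified
        lifetime = timedelta(seconds=age.total_seconds() * lm_factor)
        return min(lifetime, max_lifetime)

    now = datetime(2008, 1, 1)
    # a five-year-old object can sit for days before a recheck...
    print(heuristic_freshness(now - timedelta(days=5 * 365), now))  # capped at 3 days
    # ...while a 30-minute-old one gets an If-Modified-Since within a few minutes
    print(heuristic_freshness(now - timedelta(minutes=30), now))    # 0:03:00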


Yeah, I wonder if they're just systematizing the YSlow suggestions.



