
> If they serve me a slightly stale version of the remote resource (5 minutes/whatnot) that's fine.

Not all sites are configured to do this. Some pages are expensive to render and have no cache layer.




I get that; my point is that this is exactly the problem.

They solve the DDOS issue by requiring JS captchas (which fundamentally breaks the way the internet should work), rather than serving a cached copy of the page to reduce load on the real host.

Requiring JS doesn't disambiguate between well-behaved automated (or headless; I use a custom proxy for a lot of my content browsing) user agents and malicious ones. It breaks /all/ of them.


Some people shoot themselves in the foot, yes. But there is no reason not to have some amount of microcaching: even a very short TTL puts an upper limit on the request rate per resource hitting the real backend behind the caching layer.
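To make that concrete, here is a minimal in-process sketch of the idea (hypothetical names like fetch_origin; in practice you'd do this in nginx, Varnish, or a CDN rather than application code). Even a 5-second TTL bounds origin hits at roughly one request per resource per window, regardless of how many clients ask:

  import time
  import threading

  TTL_SECONDS = 5.0          # "slightly stale is fine" window
  _cache = {}                # url -> (expires_at, body)
  _lock = threading.Lock()

  def fetch_origin(url: str) -> str:
      # Hypothetical stand-in for the expensive page render.
      return f"rendered {url} at {time.time():.0f}"

  def get(url: str) -> str:
      now = time.monotonic()
      with _lock:
          entry = _cache.get(url)
          if entry and entry[0] > now:
              return entry[1]            # serve the cached copy, skip the backend
      # Roughly one backend call per url per TTL window; a real cache would
      # also coalesce concurrent misses to avoid a thundering herd.
      body = fetch_origin(url)
      with _lock:
          _cache[url] = (now + TTL_SECONDS, body)
      return body

  if __name__ == "__main__":
      for _ in range(3):
          print(get("/expensive-page"))  # only the first call hits the "origin"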



