Hacker News

For sure, but the approach is quite viable. If 19 out of 20 searches by a user are almost instantaneous and the single novel one requires a few seconds, they'll assume a hiccup in their internet connection and still view the site as "really fast". It's certainly useful for limiting demands on expensive hardware.



20% of all Google searches are brand new


Then 80% aren't, and those will perform very well.


The other 20% also perform well. I have quite literally never encountered a Google search that took more than some tens of milliseconds for a round trip.


You're absolutely correct. Caching common search queries lets the site dedicate hardware to processing the "expensive" novel queries, with the goal of having both kinds complete in roughly the same time.

Without caching, the cost of operating the site would dramatically escalate.
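The cache-in-front-of-an-expensive-backend pattern described above can be sketched in a few lines. This is a minimal illustration, not Google's actual architecture; `run_query` is a hypothetical stand-in for the expensive search backend.

```python
from functools import lru_cache

def run_query(q: str) -> str:
    # Hypothetical expensive backend call; in a real system this
    # would fan out to index servers and rank results.
    return f"results for {q}"

@lru_cache(maxsize=1024)
def cached_search(q: str) -> str:
    # Repeated queries are answered from the cache; only novel
    # queries (the ~20%) pay the full backend cost.
    return run_query(q)

cached_search("weather today")  # cache miss: hits the backend
cached_search("weather today")  # cache hit: served from memory
print(cached_search.cache_info().hits)    # → 1
print(cached_search.cache_info().misses)  # → 1
```

A production cache would also normalize queries (case, whitespace, spelling) before lookup, which is why raw "brand new" query percentages overstate the miss rate.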


And use 80% of the resources?


Eh, a non-trivial percentage of those are likely just brand-new misspellings.



