
We're using Next.js at my current company with a custom MongoDB based CMS.

Next has a feature called Incremental Static Regeneration[0] which lets us grab the top ~100 pages from the CMS at build time, generate the pages, then cache them for however long we want. The rest of the pages are fetched when first requested, then cached for the same amount of time. After that time expires, they're re-fetched from the DB and re-cached. Overall I think we're down to around 5-10% of the database load we had before, when the approach was -- you guessed it -- hit the DB on every page load _just in case_.
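
For anyone who hasn't used ISR, here's roughly what that wiring looks like in a pages-router Next.js app. This is a minimal sketch, not our actual code; getTopPages, getPageBySlug, and CmsPage are made-up stand-ins for a CMS client:

    // pages/[slug].tsx -- hypothetical file layout
    import type { GetStaticPaths, GetStaticProps } from 'next';
    // Made-up CMS client; stands in for a MongoDB-backed CMS.
    import { getTopPages, getPageBySlug, type CmsPage } from '../lib/cms';

    export const getStaticPaths: GetStaticPaths = async () => {
      // Pre-render only the top ~100 pages at build time.
      const pages = await getTopPages(100);
      return {
        paths: pages.map((p) => ({ params: { slug: p.slug } })),
        // Everything else is rendered on first request, then cached.
        fallback: 'blocking',
      };
    };

    export const getStaticProps: GetStaticProps = async ({ params }) => {
      const page = await getPageBySlug(String(params?.slug));
      if (!page) return { notFound: true };
      return {
        props: { page },
        // After this many seconds, the next request triggers a re-fetch
        // from the DB in the background (600 is just an example value).
        revalidate: 600,
      };
    };

    export default function Page({ page }: { page: CmsPage }) {
      return <h1>{page.title}</h1>;
    }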

Sit the Next.js site behind Cloudflare, and we also don't really pay data transfer costs. Our servers are just low-tier GKE nodes, and we serve around 3k visitors at any given time, sometimes spiking up to 8k concurrent.

[0] https://nextjs.org/docs/basic-features/data-fetching#increme...




Even database queries aren't that slow on reasonable hardware, as long as the queries are simple. The problem appears once you have dozens of DB queries per page. It's really not a fair comparison to the site this topic is about, but for trivial queries you can easily get a few thousand requests per second out of Postgres on desktop hardware without any real tuning, as long as the DB fits into memory.

But static content is of course still much faster and also much simpler.
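
If you want to sanity-check the Postgres numbers yourself, here's a rough throughput sketch using node-postgres. The users table, pool size, and query count are all assumptions, and real numbers depend heavily on hardware and schema:

    import { Pool } from 'pg';

    // Connection details come from the usual PG* env vars;
    // a pool size of 10 is an arbitrary choice for this sketch.
    const pool = new Pool({ max: 10 });

    async function main() {
      const N = 10_000;
      const start = Date.now();
      // Queue N trivial primary-key lookups; the pool runs 10 at a time.
      await Promise.all(
        Array.from({ length: N }, (_, i) =>
          pool.query('SELECT * FROM users WHERE id = $1', [(i % 1000) + 1])
        )
      );
      const secs = (Date.now() - start) / 1000;
      console.log(`${N} queries in ${secs.toFixed(2)}s (~${Math.round(N / secs)} qps)`);
      await pool.end();
    }

    main().catch(console.error);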



