> The basic problem with this approach is well understood: As the corpus of content on a given site grows, render time quickly approaches time between updates. In other words, you start waiting for renders to complete before you can change something else on the site.
Is this limit still a real concern (honest question)? Hugo seems to benchmark at around 1 ms/page.
What page counts are typical for large sites, and how fast would per-page rendering need to be for this to stop being a problem even at the largest scales?
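For a rough sense of the numbers, here's a back-of-envelope sketch. The ~1 ms/page figure is the Hugo ballpark mentioned above; the site sizes are illustrative assumptions, not data from any particular site:

```python
# Back-of-envelope: full-rebuild time at an assumed ~1 ms/page
# render speed (Hugo's ballpark). Page counts are illustrative.
MS_PER_PAGE = 1.0

for pages in (1_000, 100_000, 1_000_000):
    seconds = pages * MS_PER_PAGE / 1000
    print(f"{pages:>9,} pages -> ~{seconds:,.0f} s per full rebuild")
```

So at these assumptions, 1,000 pages rebuild in about a second, 100,000 pages in under two minutes, and even a million pages in roughly 17 minutes. The quoted limit only bites once a full rebuild takes longer than the gap between edits.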