Hacker News

I'll take a stab at this; Kyle, just shoot me if I get something wrong below :D

1) There's a server-centric approach and a client-centric approach:

--a) hand-maintained HTML + PHP falls into the first camp

--b) React (or Angular/Vue) falls into the second

2) If you go with the second camp (b), you end up with a higher initial page load time (due to pulling in the whole "single page app" experience), but very fast transitions to "other pages" (really just showing different DIVs in the DOM)
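To make (2) concrete, here's a toy sketch of what an SPA "page transition" amounts to once the bundle has loaded (hypothetical example, not Gatsby's or React's actual router):

```javascript
// Every route's content shipped down in the initial bundle, so
// "navigating" is just swapping which chunk of markup is shown --
// no network round trip involved.
const pages = {
  "/": "<div>Home</div>",
  "/about": "<div>About</div>",
};

let currentHtml = pages["/"]; // the expensive initial load already happened

function navigate(path) {
  // No fetch here: a "page transition" is just showing a different DIV.
  currentHtml = pages[path];
  return currentHtml;
}

console.log(navigate("/about")); // -> <div>About</div>
```

That's why in-app navigation feels instant: the cost was paid up front in the initial bundle download.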

3) Gatsby does some very clever things under the hood to make it so that you get all the benefits of the second camp, with virtually none of the downsides.

4) There are of course all kinds of clever code-splitting, routing & pre-loading things Gatsby does, but I hope I got the general gist right.

If not, Kyle, get the nerf gun out! -- how would you describe the Gatsby (& static sitegen) benefits? :)




Gatsby also lets you use React components to do some pretty clever things around image resizing, effects, etc. that you might expect from a static site generator but couldn't achieve with just a frontend framework.


(3) is incorrect, Gatsby initial page load times are mostly really bad.

(2) is both overstated and overvalued. It's overstated because loading a static HTML page from a CDN is extremely fast. Too many people who point at this advantage for SPAs are thinking back to pre-CDN usage with slow origin servers. Of course there are still use-cases where going to network is not wanted, but these aren't the primary use-cases that Gatsby covers.

It's also overvalued in that most users are not getting to a page by navigating in a loaded site, they are coming from a social or search link (again, for the sort of use-cases that Gatsby pages are built for).


> (3) is incorrect, Gatsby initial page load times are mostly really bad.

This has not been my experience: all the HTML is ready to go as soon as the last byte arrives, so other than blocking CSS, rendering can begin ASAP. At that point no JS is required to read the page, so things are generally pretty snappy while we wait for React's hydrate to kick in.
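The hydration idea mentioned above, in a rough sketch (a simulation of the concept, not React's real API): the server-rendered HTML is already on screen and readable, and the client JS only has to attach behavior to markup that already exists.

```javascript
// Stand-in for a DOM node; in a browser this would come from
// document.getElementById on the server-rendered markup.
const serverHtml = '<button id="buy">Buy</button>'; // arrived fully rendered
const node = { id: "buy", onclick: null };

function hydrate(el) {
  // Attach listeners to the existing markup instead of rebuilding it.
  // The user could already read the page before this ran.
  el.onclick = () => "added to cart";
  return el;
}

hydrate(node);
console.log(node.onclick()); // -> added to cart
```

So "time to first paint" can be good even when "time to interactive" lags, which is exactly the split the PageSpeed numbers below surface.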


Pick a few random sites from the Gatsby showcase on their homepage and run them through webpagetest.org Simple Testing.


shopflamingo.com, ideo.com, ca.braun.com, and bejamas.io are all blazingly fast for me.


Ideo has a pretty bad PageSpeed Insights score (41): https://developers.google.com/speed/pagespeed/insights/?url=...

First paint: 4.1 seconds. Time to interactive: 11.5 seconds.

I wouldn't say that's very fast.

Edit: I didn't check the other three.


shopflamingo.com gets 47

ca.braun.com gets 77

bejamas.io gets 96

So it's implementation-dependent, I guess.


You mean your computer or on webpagetest.org?


So, for (2) above, not sure I understand:

camp 1) at best, a TCP connection is re-used, and the HTML for "page 2" is fetched over the network, parsed, the CSSOM is applied, and then the whole caboodle* is painted on screen.

camp 2) the CSSOM is applied and "page 2" is painted on screen (possibly even faster if the browser cached "page 2" in a texture on the GPU, so the CSSOM application step may be optimized away)

So I genuinely don't understand how fetching a "page 2" from a CDN

(we use Cloudfront & GCP's CDN at https://mintdata.com, so I'm basing my experience on this)

is faster than the SPA approach?

I am genuinely curious on the above -- not trying to start a flame war :D

* Yes, apparently caboodle is a word?! I had to Google it just like you to make sure :)


It's not faster than the SPA approach. It's just not very much slower. It used to be much slower before CDN usage was common.
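A back-of-envelope version of that point, with all numbers being illustrative assumptions rather than measurements:

```javascript
// Hypothetical per-navigation costs in milliseconds.
const cdnNav = {
  rtt: 30,   // round trip to a nearby CDN edge (assumed)
  parse: 20, // parse HTML + apply CSSOM (assumed)
  paint: 10, // paint (assumed)
};
const spaNav = {
  rtt: 0,    // content already in memory
  parse: 5,  // DOM update (assumed)
  paint: 10,
};

const total = (t) => t.rtt + t.parse + t.paint;
console.log(total(cdnNav)); // 60
console.log(total(spaNav)); // 15
```

The SPA transition wins, but the CDN path is already fast enough in absolute terms that most users never notice the gap, especially when they land on one page from a search or social link and leave.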



