
THIS. If you want a fast website, remove all JS and make everything simple. For maximum performance put the static site on a CDN. It isn't hard.



The OP isn't talking about rendering (at least as I understand it, which isn't well either); they're asking why, after rendering, it takes so long to switch between tabs, or why creating a new clean tab takes more than a millisecond.

I, like the OP, don't understand why that part is hard. Does the browser have to re-download and re-run all the JavaScript each time you switch to a tab? Are we misunderstanding how websites work? Will removing all JS fix the behavior the author mentions? Obviously doing that will make pages load faster, but that's not what they seem to be asking. Is the OP's question not meaningful because of some aspect of page rendering?

Honest questions, I have basically no knowledge about this topic, but have wondered the same thing.


And, apparently, site owners don't want a fast website. Empirically, they want the slow, JavaScript-frameworky, image-heavy, ad-heavy, social-media-link-ful site they have. Unfortunately, as a user, I want their website to be the opposite of what they want it to be.


Reader mode is such a relief. I wish there were a way to enable it by default.


There are extensions that enable reader mode by default, for example [Automatic Reader View][0].

[0]: https://addons.mozilla.org/en-US/firefox/addon/automatic-rea...


Thank you! I had never heard of this, and I'm installing it now.


I've been ad-blocking for a long time, but I've recently started default-blocking all JS, remote fonts, and even CSS. It's made my browsing significantly faster.


This only works if you have a reasonably small site. If you have a lot of pages, like a news site, you start running up against the fact that your filesystem is not a great database. You can get a significant speedup by putting content in a database with a good index.
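For illustration, a minimal sketch of that indexed-lookup idea in Python, using SQLite with a hypothetical articles table (the schema here is made up):

    import sqlite3

    # The PRIMARY KEY on slug creates a unique index, so a lookup is a
    # B-tree search instead of a scan over a huge directory of files.
    conn = sqlite3.connect("articles.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS articles (
            slug TEXT PRIMARY KEY,
            html TEXT NOT NULL
        )
    """)

    def get_article(slug):
        row = conn.execute(
            "SELECT html FROM articles WHERE slug = ?", (slug,)
        ).fetchone()
        return row[0] if row else None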


I don't follow this. It's pretty hard to get any database to be as fast as serving a file off disk, which is why web-server caches are often just pre-rendered pages stored as files on disk.

Filesystems are a great "database" for large quantities of information, as long as the data doesn't change and you don't need to run queries over it. There is quite a lot of range between "reasonably small site" and "news site" for which static pages work well. You could even run a successful news site with static pages, as long as you didn't want features like comments and trending articles: lots of articles get created, but news articles don't usually change once published.

Changing layout and themes, however, could be no fun with static files.
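A minimal sketch of that pre-rendered-cache pattern, with a hypothetical cache directory and a stand-in render function:

    from pathlib import Path

    CACHE_DIR = Path("cache")  # hypothetical cache root

    def render_page(slug):
        # stand-in for whatever actually builds the page
        return f"<html><body><h1>{slug}</h1></body></html>"

    def get_page(slug):
        cached = CACHE_DIR / f"{slug}.html"
        if cached.exists():
            # cache hit: a plain file read, no rendering work
            return cached.read_text()
        html = render_page(slug)
        CACHE_DIR.mkdir(exist_ok=True)
        cached.write_text(html)  # next request is just a file read
        return html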


Lots of filesystems struggle with hundreds of thousands of tiny files. You also have to make sure that every tool you use can handle lots of files without blowing up. Searching without an index becomes difficult too.
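One common workaround for the directory-size part, sketched here with a made-up layout, is to shard files into subdirectories by hash prefix so no single directory grows huge:

    import hashlib
    from pathlib import Path

    ROOT = Path("pages")  # hypothetical content root

    def shard_path(slug):
        # maps a slug to pages/<h0h1>/<h2h3>/<slug>.html,
        # spreading files evenly across 65,536 directories
        h = hashlib.sha1(slug.encode()).hexdigest()
        return ROOT / h[:2] / h[2:4] / f"{slug}.html"

    print(shard_path("some-article"))

That caps directory sizes, but it doesn't help with search; that still wants an index.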


I'd imagine a hybrid could also be a good choice. If you're a news site, you could keep the last week's worth of stories on a CDN, for instance.
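Roughly, the cache policy could key off article age; a sketch in Python, with made-up thresholds:

    from datetime import datetime, timedelta, timezone

    def cache_control(published):
        # Hypothetical policy: stories under a week old get a short
        # CDN TTL; older stories are treated as effectively immutable.
        age = datetime.now(timezone.utc) - published
        if age < timedelta(days=7):
            return "public, max-age=3600"
        return "public, max-age=31536000, immutable"

    print(cache_control(datetime.now(timezone.utc) - timedelta(days=2)))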


Except then you have to render the HTML on the server, which increases latency significantly.

I'd prefer to generate all the HTML files for a site every update, just like Jekyll or Hugo does.
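In that spirit, a toy sketch of regenerating every page on each update (the content source and template here are hypothetical; Jekyll and Hugo do this for real from Markdown and templates):

    from pathlib import Path

    TEMPLATE = "<html><body><h1>{title}</h1>{body}</body></html>"

    # hypothetical in-memory content source; a real generator would
    # read Markdown files or a database
    ARTICLES = {
        "hello-world": ("Hello, world", "<p>First post.</p>"),
    }

    def build(out_dir="public"):
        out = Path(out_dir)
        out.mkdir(exist_ok=True)
        for slug, (title, body) in ARTICLES.items():
            (out / (slug + ".html")).write_text(
                TEMPLATE.format(title=title, body=body)
            )

    build()  # every update rewrites the whole site as plain files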



