Well, yes, how well this works depends on what sites you tend to use. The vast majority of the sites I frequent either degrade gracefully in the absence of JS, or never used it in the first place.
I think there are three or four that both require JavaScript and that I want to use badly enough to allow it.
When I encounter a website that doesn't work without JS, I just move on. But I understand that others may not want to do the same.
I have an ephemeral container (systemd-nspawn with the -x switch) with an almost-default Firefox configuration (only uBlock Origin added) for such websites. After I'm done, I close the browser and everything gets deleted.
Mine has the advantage that the browser processes don't have access to anything important, in case of an RCE vulnerability. An attacker would just see a vanilla Debian install with no juicy user data, and only the ~/Downloads directory linked from my "real" system.
Mostly, yes. The bonus is that you don't use your normal settings and extensions, which also means ALL websites work, regardless of privacy-related settings that might break some of them.
So it's a handy alternative.
I do not, sorry. I have to set up a blog some day, but I have been putting it off because most of the time I can't think of anything interesting to say. :)
Anyway, I was mostly inspired by an article on how to set up Steam in a container - it has all the details, including how to pipe PulseAudio inside (so in my case, YouTube videos in Firefox can have sound). Except mine is Debian-based (so debootstrap instead of pacstrap to populate the container).
I've found that about half of the sites that render a blank page without JS are just setting style="visibility: hidden" on the <body> element. I cannot think of any good reason for browsers to continue to allow that CSS property to be set on that element. "Flash of unstyled content" is not a valid concern here.
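For sites that pull this trick, a bookmarklet or userscript can simply undo the inline style. A minimal sketch (the `unhideBody` helper name is mine, not from the thread; in a browser you would call it as `unhideBody(document.body)`):

```javascript
// Minimal sketch: strip the inline "visibility: hidden" trick from the
// <body> element. Written against a plain style object so the logic also
// runs outside a browser; in a bookmarklet, pass document.body.
function unhideBody(body) {
  if (body.style.visibility === 'hidden') {
    body.style.visibility = 'visible';
  }
  return body;
}
```

This only clears the inline property the site set; a page that hides content via a stylesheet rule instead would need a user style overriding `visibility` on `body`.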
> "Flash of unstyled content" is not a valid concern here.
Except it really is. Most websites are attached to businesses in some way. If your statistics show that a flash of unstyled content makes 10% of your traffic leave the site after less than a second, you do what you can to fix it, and unfortunately that's often hiding everything until it's ready.
Pragmatically, most businesses would give up users who don't like tracking long before they give up users who care about styling, because there are just a lot more people who care about styling. It's an unfortunate fact of web life.
> and unfortunately that's often hiding everything until it's ready.
These sites usually aren't successful in that goal. Sure, they may be able to prevent content from showing up until the webfont has been loaded, but in my experience it's still extremely common for content to seriously jump around as the ads continue to load, especially on smaller screens (mobile).
So using "flash of unstyled content" as an excuse just doesn't hold up: users still have to learn to give a site a few extra seconds to settle before it's safe to interact, or the content reflows under your finger and puts an ad where you wanted to tap. Breaking accessibility "for the sake of preventing FOUC" is dumb when you still have FOUC.
A similar tactic I've also seen that is even more unjustifiably anti-user is when the <body> element has "overflow: hidden" set until some heinous script that does its own poor implementation of smooth scrolling can get up and running. These sites are universally improved by blocking such scripts and enabling native scrolling. This is one of the reasons why I believe browsers should be bundling together a large number of permissions that are off by default for every site the user has not flagged as being a web app. Google Maps has a good reason to interfere with scroll behavior; a news article does not.
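Short of blocking the offending script outright, the scroll lock itself is easy to reverse, since it is usually just an inline property on <body>. A sketch along the same lines (the `restoreNativeScroll` name is mine, assuming the site used an inline style rather than a stylesheet rule):

```javascript
// Minimal sketch: re-enable native scrolling on a page whose body has been
// locked with an inline "overflow: hidden" by a scroll-hijacking script.
// Written against a plain style object; in a browser, pass document.body.
function restoreNativeScroll(body) {
  if (body.style.overflow === 'hidden') {
    body.style.overflow = 'auto';
  }
  return body;
}
```

Note the hijacking script may re-apply the lock after this runs; blocking the script remains the more reliable fix.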
Some people decide whether or not a site is worth looking at extremely quickly - if it looks "old" or "broken" they hit the back button immediately. They will wait for the first paint though, so it can be better to delay anything appearing until everything is ready. The impact depends a lot on demographics.
A site that renders completely blank without JavaScript is a site that I don't enable JavaScript for. They don't want me to view it, and nine times out of ten I can find the information elsewhere.
Yet I'm convinced you probably do enable js for various payment portals and govt/financial websites, and they often tend to go blank or loop out far more than the average site.
Apart from carefully cultivating a working NoScript whitelist over the years, the simplest solution may be to use a different browser for these sorts of interactions.
Reminds me to backup my whitelist. It's actually quite valuable.
What would be the ways to keep JavaScript virtually “off” but viewable?
I mean, it’s all the same framework-min.js working like a modern COBOL, not some custom mathematically significant compression or advanced dynamic P2P webpage distribution, right? Feels to me like a meta rendering engine could be created so that none of the functions needs to be evaluated at runtime, nor any content fetched by it, or whatnot.
Sites that function without it are the exception, not the rule.