So fundamentally, hardware exploits like Rowhammer and Spectre can be mitigated but never fully patched, which means client-side sandboxing cannot be relied upon.
Seems that a good approach for the average HN user is to disable Javascript for all sites and whitelist only trusted sources. And given the risk of HTTP MITM through compromised routers, never execute any code of questionable integrity, meaning anything delivered over a non-TLS connection.
It goes without saying that one should also run the minimal possible number of native applications: ideally none except perhaps the smartphone OS vendor's trusted applications (your OS vendor can already execute arbitrary code and read memory). Using the OS vendor's web browser with Javascript disabled shouldn't add much security risk on top of that.
Even seemingly trusted websites often pull in JavaScript libraries from third-party CDNs, analytics services, and advertising networks. It's almost impossible to be certain that none of those are compromised.
Decentraleyes[0] is a great plugin that intercepts many popular third-party CDN requests and replaces them with local copies. It's not a silver bullet, but it reduces CDN-based tracking and the chance of the attack described in the parent comment.
Subresource Integrity[1] is a technology that has been available for a while and also mitigates this attack, provided website owners use it properly.
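For context: SRI works by putting a base64-encoded cryptographic digest of the expected file into the tag's `integrity` attribute, and the browser refuses to execute the resource if the fetched bytes don't match. A minimal sketch of computing such a value (the script contents here are just a placeholder):

```python
import base64
import hashlib

def sri_hash(content: bytes, algo: str = "sha384") -> str:
    """Compute a Subresource Integrity value for the given file contents."""
    digest = hashlib.new(algo, content).digest()
    return f"{algo}-{base64.b64encode(digest).decode()}"

# Hash a (hypothetical) local copy of the CDN-hosted script:
js = b"console.log('hello');"
print(sri_hash(js))  # e.g. "sha384-..."

# The value then goes into the tag, roughly:
#   <script src="https://cdn.example.com/lib.js"
#           integrity="sha384-..." crossorigin="anonymous"></script>
# A compromised CDN serving different bytes fails the check and the
# browser never runs the script.
```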
True, but at least the attack surface will have been reduced substantially, which is worth something.
Disabling Javascript served over HTTP is such a good security improvement that I'm surprised no browser vendor has implemented it yet, in the age of Let's Encrypt. It'll break a lot of websites, but that's better than compromised routers and ISPs serving JS malware.
JS broke the presentation vs. data abstraction. It's an obfuscation layer used to make it harder to use data the way you want. It's also a linkrot catalyst. To scrape many sites you must execute an arbitrary program, which may never halt, and save the resulting state. For the majority of cases it's unnecessary: I can't even render the text on many sites unless I linearize or execute the JS. Minor additions to HTML could remove the majority of reasons to use JS; it's entirely possible to build a maps site that works without it right now. It's also a side channel that makes it nearly impossible for browsers to present identical fingerprints.
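The scraping point is easy to demonstrate: on a JS-rendered page, the text simply isn't in the document until a script runs. A small sketch using Python's stdlib parser, with a made-up page as the example:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text the way a static scraper would, skipping <script> bodies."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.chunks.append(data.strip())

# A hypothetical page whose content only exists after running its script:
page = """<html><body><div id="app"></div>
<script>document.getElementById('app').textContent = 'The actual article text';</script>
</body></html>"""

parser = TextExtractor()
parser.feed(page)
print(parser.chunks)  # → [] : without executing the JS, there is no text to extract
```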
I don't know what the solution is other than users understanding the problem.