Please disable JavaScript to view this site (heydonworks.com)
252 points by abusedmedia on Nov 28, 2020 | hide | past | favorite | 268 comments



I must say, I don't understand the disable JS movement. I browse with JS on, and uBlock Origin to block ads. It's rare that I have any javascript-related problems in my web browsing. On the other hand, I definitely use a number of sites that rely on javascript for useful purposes.

If you're worried about tracking, you can block ads and tracking scripts without disabling javascript. If you're worried about viruses, well, all I can say there is that in my experience and understanding, if you keep your browser updated, the odds of getting a virus via browser JS are exceedingly low. Doubly so if you're not frequenting sketchy sites.

I don't know, it seems to me like advice from a time before security was a priority for browser makers, and high-quality ad blockers existed. At this point, I really don't see the value.


On a great many web sites I have to spend the first 60 seconds on the site clicking on "X" boxes in popups to make them go away. In many cases there are actually several layers of popups obscuring the content, and some are delayed so they only pop up after you start reading the content. No I do not want to subscribe to your mailing list. No I don't want to take your survey. No I do not want to "chat" with your bot-pretending-to-be-a-human. Yes this is my sixth article from you this month but I will not be paying for a subscription, because I have enough of those already. Yes cookies are OK but I want the minimal set. And no I will not disable my adblocker because doing so makes this whole bloody nightmare even worse.

Disabling Javascript makes most of this insulting crap go away, and sometimes it is the only way to read the content.


I personally visit "a great many websites" daily, and rarely have this problem. Maybe it's uBlock doing its job, or maybe it's the kind of site you go to?


Or maybe I need to try uBlock again rather than assuming my Pihole is as good as it gets. Thanks for the suggestion.


Of course it isn't. Pihole and similar DNS-based blockers do nothing for those kinds of spam which require performing DOM manipulations to be removed. Check uBlock settings after installing it. It has a separate "annoyances" list. I enabled everything in it a few years ago and never had a single problem. It removes all the GDPR banners, "please give us your email" popups, useless "oh I am so original" plates in forum signatures, etc. etc.


Thanks for the tip. I wasn't aware that uBlock had all those extra options.


I wasn't aware of this option in settings. Thanks.


If the default annoyances lists aren't enough for you, you can also add this one[1]

[1] https://github.com/yourduskquibbles/webannoyances


I've never been bothered by ads, and I want to support the sites I visit, so I never considered using JavaScript blockers.

But I must say I hate GDPR banners and this could convert me.


I would suggest uBlock Origin instead of uBlock.


Pihole blocks server requests, but uBlock can do a whole lot more to the page by blocking specific HTML, CSS, and scripts. uBlock will prevent YouTube ads, for example, whereas Pihole cannot, as they come from the same server as the content.


That may have been true whenever you last evaluated your choice, but I find that today, on-load modals are mostly burned into the HTML. With Javascript disabled, rather than not being there, they're instead always there, impossible to get rid of.


> With Javascript disabled, rather than not being there, they're instead always there, impossible to get rid of.

Inspect element -> remove

Browsers should really add "remove element" directly to the context menu.


Agreed. I also would like a browser that would let me automatically disable any element with a z-index greater than 1.
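
Something like that is already scriptable as a bookmarklet. A minimal sketch, assuming the z-index threshold of 1 from above (it's blunt, and will also hide legitimate dropdown menus):

    // Hide anything stacked above the normal page content.
    for (const el of document.querySelectorAll('*')) {
      const z = parseInt(getComputedStyle(el).zIndex, 10);
      if (!Number.isNaN(z) && z > 1) {
        el.style.setProperty('display', 'none', 'important');
      }
    }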


I just hit reader mode on FF. I didn't know modern sites would even load without JS enabled.


But you do get the irony, right? It's like a browser is not for reading by default anymore, like stepping into a car and putting it in "car mode".


Oh yeah. Browsers are application runtimes. They have been for 20 years now.


>In many cases there are actually several layers of popups obscuring the content, and some are delayed so they only pop up after you start reading the content.

I swear I've had ones that pop up as I move the mouse to close the tab.


Personally I use stock uBlock Origin and SponsorBlock. These two together work reasonably well.


This is practically me


> If you're worried about viruses [...]

Not to mention that a host of vulnerabilities were image-related a few years back (one of the original rootkits exploited a TGA bug).

> uBlock Origin

Honestly, this is the antivirus of the web. I helped my niece set up my old computer for Minecraft today, and she was explaining how her friend had installed viruses (adware, really) 3 times. Every one of those instances was caused by download link confusion for Minecraft mods. Disabling JavaScript isn't going to save you from being tricked into downloading shady software, only an adblocker will.


Yeah! After I install uBlock Origin, people seem to stop downloading malware for some reason. Funny how that works...


> one of the original rootkits exploited a TGA bug

As a lover of old image formats and the security issues they can cause* this sounds fascinating, but some quick google searches don’t seem to surface what you are referencing. Can you share any more details?

* I once fell into discovering a memory disclosure flaw with Firefox and XBM images


There was the github ddos that existed (iirc) as an image that made a request when viewed (I think it actually ran a script) and a couple smaller botnets that used similar functionality in 2018.


Wish I could edit that, I meant jailbreak - which did give root access, technically allowing a host of nefarious use cases.


In my experience Brave Browser (chrome based) runs circles around uBlock origin FYI


No it doesn't. I use Firefox with ubo and Brave (without extensions) for work and I notice no "running circles around" by either browser. While I'm sure brave's native blocking is faster, in human perception the time difference is essentially nil.


Can you say more? I’m a happy uBlock user — what am I missing by not using Brave?


Forced client updates



In case the OP wanted to know exactly how Brave's adblock is different from uBlock Origin instead of a link to the marketing page with links to other things like cryptocurrencies:

Brave claims a speedup over Adblock Plus, but its ad-blocker was inspired by uBO, so the performance is fairly similar; it's just baked into the browser instead of being an extension.

> We therefore rebuilt our ad-blocker taking inspiration from uBlock Origin and Ghostery’s ad-blocker approach.

https://brave.com/improved-ad-blocker-performance/


It's also written in Rust for performance.


I use a 6 year old desktop which wasn't that great even when I built it, and a pretty terrible 8 year old laptop, and I don't have any problems with uBlock Origin's performance. I have almost every filter list enabled (except for some regional ones), which results in 153486 network and 173646 cosmetic filters total.


Same here. My takeaway from these replies is that I'm not missing anything by sticking with uBlock.


With uBlock Origin you're at the whim of the changes Google pushes to Chrome/Chromium to nerf it.


What are you talking about? uBlock Origin is not tied to Chromium, and is not controlled by Google (unlike Brave which is just a fork of Chromium).

Jesus, why does everyone these days automatically assume that everyone else is using Chrome or Chromium? It's almost as crazy as calling Windows a "PC".


Google controls the frontend APIs uBlock origin is allowed to use, and they have pushed changes numerous times in the past (covered on HN) to intentionally nerf uBlock origin because it hurts their bottom line.


I don't understand this way of thinking. Rust isn't magically fast. You can use most languages to write both performant and lazy code.

Also just so you know, Brave isn't "written" in Rust alone, it is a big software with a lot of parts, including but not limited to a rendering engine, a JS VM and a WASM engine.

The Rust part at most (unconfirmed) would be the glue that connects them together, and I doubt that's where the bottleneck is for most browsers.


Is that the new algorithm that made the Rust rewrite 69x faster?

>The new algorithm with optimised set of rules is 69x faster on average than the current engine.

https://brave.com/improved-ad-blocker-performance/


I meant Brave's adblocker was written in Rust, as the context was about how Brave's adblocker differs from others.


"In my experience", followed by "for your information" - what an empty comment. You've not breathed any substance into your opinion with this post.


Developers have been reimplementing W3C specs in Javascript. Forms don't work, password managers have no idea what's going on, back buttons are hijacked, and sometimes even scrolling is thought to be better handled by a 4MB JavaScript file than by the browser.


Whatever one thinks of that though, aren't those good reasons not to disable javascript?


They're an even better reason to find alternative sites.


I must say, honestly, I don't understand the JavaScript (JS) movement. I am not a web developer, perhaps that is the reason.

IME, 9 times out of 10, web developers are using JS for unnecessary reasons. The user-configurable settings of popular browsers make it easy to designate the small number of sites that actually require JS and keep JS disabled for all others. These browsers anticipate that the user will not have one default JS policy for every website; they acknowledge there will be situations where it should be disabled.

However as we all know most users probably never change settings. Doubtful it is a coincidence that all these browsers have JS enabled by default.

The number of pages I visit that actually require JS for me to retrieve the content is so small that I can use a client that does not contain a JS interpreter. Warnings and such one finds on web pages informing users that "Javascript is required" are usually false IME. I can still retrieve the content with the use of an HTTP request and no JS.

There is nothing inherently wrong with the use of JS. It is nice to have a built-in interpreter in a web browser for certain uses. For example, it makes web-based commerce much easier. However, I believe the largest use of JS today is to support the internet ad industry. Without having automatic execution of code by the browser without user review, approval or even interaction, I do not believe the internet ad "industry" would exist as we know it.

I believe this not because I think having a JS or other interpreter is technically necessary, but because these companies have become wholly reliant upon it.

That's why disabling JS stops a remarkable amount of ads and tracking.


The idea that some remote server needs my processing power to display images and text is so ridiculous because it has become normal. These things don't need my processing power, nor do they have a right to it by default.

I browse the web with JS disabled by default. If I encounter a site that has trouble with that, I enable it for that site until I can determine if it is worth leaving it enabled, which usually means at some point I'll be back there again and need it on.

For the most part, it is a superior experience to what I was seeing before with just an ad blocker. The most noticeable thing about it is probably how many images simply don't load because developers lean on JS for loading and scaling them.


I ain’t need no Turing complete sandbox on my box run random code from Internet while I skim through HyperText


CSS is Turing complete.


That's not the point of this. This is not about disabling JS as a user of websites.

Heydon is a developer, and an influencer of developers. He's saying: web development is now absolutely obsessed with JavaScript, and it in no way has to be. The basics, HTML, CSS. That's what's important.


That's a fair take. I was mostly reacting to the other top level comments I saw here.


I keep it disabled just so sites load faster. Except for 1 or 2 sites, I don't care about anything except for the main text on a page, so it's a waste of time to let a ton of javascript run and load/format things that I don't care about.


I generally visit web pages with JavaScript disabled and then immediately click "reader view". But there are certainly a lot of sites where content is loaded with script, and for those pages I must enable JS. Q: Does anyone know of a Safari iOS add-in that allows for whitelisting of specific sites to run JS?


Not sure if this solves your entire problem, but you can disable all content blockers for any site in Safari for iOS[1].

[1]: https://www.macrumors.com/how-to/disable-content-blockers-sa...


So what I should then be looking for is a good content blocker - one that supports disabling JavaScript. Any recommendations?


No clue on that one, sorry. I’ve been using Purify[1] for ages, but have no clue if it blocks JS - I suspect it blocks only some JS, because my experience of the web isn’t trash while using it, but I do have to disable it sometimes to use the heavily animated navigation systems that some sites implement.

[1]: https://apps.apple.com/us/app/purify-block-ads-and-tracking/...


> I don't understand the disable JS movement

Javascript is a privacy and security nightmare. It's almost equivalent to downloading and silently executing untrusted code on your machine. I say "almost" because Javascript code is virtualized and sandboxed. Though I have no doubt people have already discovered vulnerabilities that enable code to break out of the sandbox.


The cost-benefit analysis of JavaScript usage comes down on the side of enabling it for most people, because of how much of the web is completely broken without it. Sandbox escapes are rare but extremely valuable, and they absolutely exist: https://www.computerworld.com/article/3186686/google-patches...


> Javascript is a privacy and security nightmare.

AFAIK, JavaScript the language has neither privacy nor security issues of "nightmare" level.

> It's almost equivalent to downloading and silently executing untrusted code on your machine.

No it's not. The code is run in a VM, which is run in a browser. So, the code is limited in doing things to the browser, which itself is limited in what it can do to your computer (files and whatnot). So it's not at all like running untrusted code "on your machine".

> I say "almost" because Javascript code is virtualized and sandboxed.

It's virtualized (in the browser) such that all the code will run almost the same on different browsers and chipsets. Again, the browser code is what keeps the computer safe from any code it runs, including CSS code or other VMs it may use, like Java or Flash. Also the OS keeps the computer safe from the browser (or at least it should).

So, no it's not JavaScript that is the boogeyman here.


My understanding is that JavaScript is the primary mechanism used in browser fingerprinting and cross-site user tracking/"analytics". Isn't that a rather large privacy and (personal, if not specifically "cyber") security risk?


Yes.

The "security features" of popular browsers will never protect the user from the tentacles of internet advertising. Companies/organizations that author popular web browsers generally rely on the success of internet advertising in order to continue as going concerns; as such, they are obviously not focused on internet advertising, and collection of user data, as a "security threat".


Actually, the main and most used mechanism of cross-site user tracking and "fingerprinting" are cookies, which do not require any JS to work.


How does cross-site user tracking do its thing?


Set a cookie using HTTP headers.

Use a tracking pixel (eg. an image) to make further requests, and the cookie will be included in each request.
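
A minimal sketch of the pixel half, with tracker.example as a hypothetical third party embedded across many unrelated sites:

    <!-- The tracker's first response set a cookie, e.g.
         Set-Cookie: uid=abc123; SameSite=None; Secure
         Every page embedding this pixel sends the cookie back, and the
         Referer header tells the tracker which page you are reading. -->
    <img src="https://tracker.example/pixel.gif" width="1" height="1" alt="">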


I wouldn't say javascript serves a useful purpose anymore as it's being used to generate entire UIs. It's more a requirement, and performance is taking the heat.


Seeing as how this site is in part for job opportunity self-promotion, it seems like a decent way to weed out non-technical employers.


Do you really not understand?

This is not about how "you" do things; it is more about how it should be done! JS almost never provides what I want when I browse. I expect to get some information! I am not at the circus looking for adventures!

People who love using JavaScript to prove they have some kind of taste about UX should use some other platform, one for people interested in show business. Think of it as public transportation being redesigned around whoever the driver is that day! Does that sound OK to you?

The web is just a connection to other people, not a tool for others to bully you by showing off how brilliant "the code" they wrote is!


As someone who has JS off by default for a long time (ever since I discovered how much it could remove annoyances, and this was back when SPAs were basically nonexistent) and is thus often subjected to "Please enable JS" messages which more likely than not will simply make me click the back button[1], I am delighted to see this exists --- I've thought of the idea before, but never did anything with it:

https://news.ycombinator.com/item?id=11411982

[1] I once enabled JS on a site that claimed it would provide "a better experience", and was bombarded with a bunch of ads and other irritations that just made me turn it off again. It was not a "better experience".


It didn't say who it was better for...


>make me click the back button

I forget about the back button. By default, I always open links in new tabs, which means the back button has no data. Also, SPAs have hijacked the back button or just broken it completely, so I've been trained not to count on it behaving as expected. There's also the mobile experience, where getting to the back button itself is often painful after the UI hides navigation from you.

Otherwise, I am 100% in agreement. If a page is so user-hostile as to not offer a friendly non-JS version, the tab gets closed.


> By default, I always open links in new tabs which means back button has no data.

I really wish, even if it was an optional setting, browsers would copy the past history of the source tab when you did that. If I hit back in a tab I opened that way, I still want "where I got here from", not "stay here" or "new tab page", and especially not "close the tab" (thanks a lot, Android Chrome).


Safari does this, at least on iOS. If you open a new tab, then hit back, it just closes the tab and takes you back to the origin tab. If you’ve closed the origin tab, it leaves the active tab open but goes back to what the origin tab was.


The macOS version has this behavior as well.


That sounds like an amazing idea that they should implement!


If an SPA is hijacking your back button it is poorly coded and the coders should be ashamed.


As someone who has JS off by default (via uMatrix) I see a blank page.


You need something to enable the `<noscript>` content to be rendered, like a uBO dynamic rule `no-scripting: $hostname true`. uM doesn't do that.


I think it depends on how you disable it. Anecdotally, I think it’s “if no scripts are executed on the page, render <noscript> content, but if any scripts at all are executed, don’t”.


Arnavion was actually right. No scripts at all were executed.


Does uMatrix still exist (maintained)? I remember reading it is being discontinued.

(I occasionally used it for curiosity, but found it too tedious in the long term. I have settled on CookieAutoDelete, which seem to address most tracking. Not many seem to run a completely server based fingerprint database.)


It is not maintained but the latest version[1] still works fine, and I will continue to run it until that is no longer the case.

[1] A beta that you can download from the github page. I assume the latest stable version also works fine, but the beta had a few additional bugfixes and features and I haven't encountered any instability.


I use NoScript and it works fine for me.


This website does not track you. Correction -> This website does not track you via JS. You have no idea what's logged on the backend.


It logs requests to the site, which is far less invasive than the fine detail of browser fingerprinting and tracking that JS allows: JS can see your mouse pointer's position, how long you spent on each area of the page, which parts of the text you selected, and many many other things.

Things like this are seriously creepy: https://www.crazyegg.com/blog/mouse-recorder/
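
For a sense of how little code that takes, here is a sketch of JS-based mouse telemetry (the /collect endpoint is hypothetical):

    // Record pointer positions, then ship them off in batches.
    let trail = [];
    document.addEventListener('mousemove', (e) => {
      trail.push([e.clientX, e.clientY, Date.now()]);
    });
    setInterval(() => {
      if (trail.length) {
        // sendBeacon survives page unloads, which is why trackers use it.
        navigator.sendBeacon('/collect', JSON.stringify(trail));
        trail = [];
      }
    }, 5000);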


You can implement similar mouse recording via requests for :hover pseudo-elements in CSS. Also, I'm not sure you need JS to get fine fingerprinting and tracking in 2020: https://wiki.mozilla.org/Fingerprinting
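
A sketch of the :hover version (tracker.example is hypothetical): the background image is only fetched when the pointer first enters the element, so the request itself is the tracking signal, with no script involved:

    /* One rule per page region of interest. */
    #pricing:hover {
      background-image: url("https://tracker.example/hover?area=pricing");
    }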


Heatmaps have legitimate, non-tracking, purposes.

eg: hotjar.com sessioncam.com

Legitimate tools for measuring effectiveness of pages with little in the way of nefarious tracking afaics. Also very useful for replaying user errors/problems.


There are many companies with similar products: Inspectlet, Lucky Orange, probably more. This is a cat that will be quite difficult to put back in the bag.


There are also ways to track you on the front end without JS, but I think "This website will not… track you" is just a promise from the author, not something that follows from the lack of JS anyway.


It could also be a canary in case the site gets bought out, and the new owner wants to implement invasive tracking. If a site has "will not track you" when you visit it, but the next visit it is removed...


Ye olde days of tracking just used invisible .gifs and every click was a different webpage so they just tracked which ones were requested to gain interaction metrics.

JS doesn't have any magic to it, location information is opt-in, but your IP is a much better advertising identifier.


An IP behind NAT or CGNAT is not that useful, but many mobile browsers (especially on cheap Androids) leak so many trackable details through headers that it becomes easy to uniquely identify devices/users.


back in the '90s I was into connecting to IRC servers using spoofed IP addresses. The way it worked is you told the software what OS you were connecting to (or it would figure it out itself, I can't recall). Each OS had a unique way of generating TCP sequence numbers, which allowed the software to guess which number would come next.

Nowadays OSes have protection for this sort of thing. But I'd imagine you could still fingerprint an OS like that. Combine that with TLS, HTTP, etc. specifics and you could narrow it down quite a bit I bet.


How are you going from guessing TCP sequences to spoofing IP addresses on TCP connections? Did you breeze over a step or am I missing something obvious?


TCP packets contain sequence numbers that must correspond to the ones sent by the other side. This is an issue if you're spoofing packets because you don't receive packets (containing the sequence numbers) from the other side (they will go to the spoofed address, rather than yours). Without the other side's sequence numbers, your replies will be considered invalid, which means you can't complete the handshake[1] to establish a connection. However, if you can successfully guess the sequence numbers, you can complete the handshake and also write arbitrary data to the stream. You still won't be able to receive data, but for simple protocols like irc, it can still be useful eg. connecting to a server and then sending spam to an user/channel.

[1] https://en.wikipedia.org/wiki/Transmission_Control_Protocol#...


The mitigations for spoofing sequence numbers might be different for each OS, and that would allow the OS to be fingerprinted. See nmap's OS fingerprinting, for example.


Yep, `nmap -O` works pretty well!


> JS doesn't have any magic to it

Canvas fingerprinting, WebGL fingerprinting, GPU, fonts etc etc etc.

Please, stop arguing, JS is a nightmare for privacy. Period


So are DNS and HTTP caches.


>DNS

most people don't run their own resolvers, so at best you're fingerprinting the DNS server of the ISP.

>http caches

can be easily cleared, or mitigated entirely by extensions or browser (eg. multi account containers).


> most people don't run their own resolvers, so at best you're fingerprinting DNS server of the ISP.

That's not how it's commonly tracked. Similar to HTTP caches, you can fingerprint visitors by how quickly a domain request resolves for them. Sure, all of this can be mitigated. But you have to know what to mitigate, and the fact that even the most fanatical privacy folks aren't aware of basic timing fingerprints is a good indicator that no one is mitigating it nearly as well as they might think.


If js were removed from the web tomorrow, the people currently working on tracking protection against js could instead focus on these other mechanisms. Because privacy is an arms race, reducing attack surface is not pointless even if the same tracking can be achieved by other means.


I don’t think JS (or some other runtime) could plausibly be removed from the web on any time scale. People have a (reasonable) expectation that they can do app-like things on a network, and that surface area will find its way to manifest one way or another. At least having it on the web has somewhat of a limiting effect on the entrenchment of the biggest (and worst) privacy offenders, because the barrier to entry is lower than building a wide array of native apps.


> your IP is a much better advertising identifier

Citation needed?


To be fair, it might be illegal for them to keep logs. Probably just timestamp, source IP and requested resource.


For those looking for an easy way to disable JS, uBlock Origin has a button to disable it


There's also a Firefox extension called "Disable JavaScript" that places a JavaScript toggle switch on your address bar and remembers whether you want JavaScript on or off on each specific site that you visit.


Brave browser has a fairly simple process as well. It requires two clicks. Click on the shield on the top right, then click on the "Scripts Blocked" toggle button.


If using Chrome, you can use Cmd+Shift+C to open DevTools, Cmd+Shift+P to open command palette, and then type 'disable javascript' and hit enter.


If using Safari, use System Preferences to assign a shortcut to the "Disable Javascript" menu command (I use command+K).


Or click the lock in the address bar, "Site settings", set "javascript" to "block". This will be sticky across the site, for better or worse.


Can also be default disabled, then permitted for selected sites.


that button has a strange icon... but thanks for pointing it out


It's a code block icon - what's strange about it?


This is also a great resource for getting around JavaScript-based paywalls.


I prefer making all my sites work just as well without JavaScript as with it. This is cool though if you need a quick check to see if JavaScript is enabled.


On what occasion would software out of the box, like a browser, not have JavaScript enabled? I'm genuinely curious; the only time I saw something like this was the Kali distro, which ships NoScript in Firefox.


I sometimes read HN in a text web-browser, but ultimately most people will have JavaScript enabled.

It's worth saying that most people don't even know what JavaScript is, full stop. Weirdly enough, my mother now does, but my younger sister doesn't. We now have a generation that has effectively grown up post-smartphone, which is fascinating to me.


Tor used to have noscript set to the highest level, but in recent years they've changed that.


lynx

links

w3m

that's just off the top of my head.


Dillo and NetSurf won't have JS either, and they are graphical browsers.


Yeah, my first instinct was lynx. Works fine...


On my own site, any JS flourish has a noscript fallback. There's nothing major there in the first place, but it makes me happy to do that.

The other rule is that the JS is all hand-written. No frameworks or other dependencies.


I've also been tinkering more with no framework js, although it's by way of typescript and webpack. And it's really fun, I think there are a lot of cases where you can just opt out of react or bootstrap if you know what you're doing


noscript is, sadly, not perfect, but it works if you stay 1st party.

A great way to make your browsing better is to disable 3rd party scripts by default and whitelist when needed, but <noscript> fails to work in those conditions.


You're thinking about Noscript the plugin. <noscript> is also an HTML tag that can contain content for use when the browser isn't running JS, but which would yield a cleaner page if not present when JS is running.


No, I'm thinking about noscript, the HTML tag.

If you disable 3rd-party JavaScript (using uBlock Origin or others), noscript tags don't trigger, because scripting is still technically turned on. And noscript tags aren't associated with the script they complement, so the browser has no way of knowing which ones to render in the 3rd-party situation.


I've found alpine.js very suitable for this.


Thanks to this I learned how to disable JS on iOS Safari. An annoying number of clicks. So thanks for that I guess


Is there a way to disable JavaScript per website on iOS? Or an alternate iOS browser that supports it?


I use Purify ad blocker only for its JavaScript blocking option. I can re-enable JavaScript on a per-site basis by going through “Purify Actions” from Safari’s Share Sheet.

I combine this with another ad blocker (Wipr) to block everything else.


Do either of those deal with YouTube ads? The old iOS ad blocker I've been clinging to has stopped blocking YouTube ads recently.


Yes, Wipr blocks the ads. You get white nothingness instead of ads, and just have to click the "skip ads" button.


Excellent. I’ve been looking for a way to do this.


I do that with Firefox + uBlock Origin. FF is available on mac.


You can assign a keyboard shortcut to that easily (I bound it to Cmd+Shift+J and find it super-handy):

https://support.apple.com/en-gb/guide/mac-help/mchlp2271/mac

If someone knows how to achieve the same on Linux for Chrome and Firefox, I'd love to hear it (browser plugins are a bit of a security and stability shitshow, so non-plugin solution would be preferred, all else being equal).


He said iOS, not macOS. Most iOS devices don't have a keyboard to do combos like that.


Ugh, sorry, I misread that. The workaround I would recommend on iOS is just turn javascript off in Safari and install another (Safari-skin) browser like Chrome for the times when you need javascript -- or vice versa, depending on what your default preference is.


Mozilla removed the normal settings toggle a while back, so your options are to manually set javascript.enabled in about:config or use an extension. I agree in general that extensions aren't great, but I trust uBlock.

I think Chrome actually is fine although I don't know of a keybinding for it: Don't they still have a toggle right in the site menu (click the icon to the left of the URL to toggle all kinds of these things)?


In Vivaldi (which I'm 99% certain matches chrome for this part) you click the icon to the left of the URL and then click "Site Settings" at the bottom of the resulting menu where you can toggle a slew of permissions, including javascript.

It's about the best UI I could come up with for this particular knob and the other things adjacent to it.


You can still disable JavaScript via the developer tools, although it applies to the current session only, so you must tick that checkbox every time you navigate to a new link.


From a security standpoint you should at least have uBlock Origin installed for Chrome and Firefox. It can block JavaScript easily. I would trust NoScript as well, which provides fine-grained control over scripts. I get not wanting any extensions, but blocking scripts will unfortunately never be viable in a user base that doesn't know what a "script" is...


On iOS Safari?


So running untrusted code on your computer is a bad thing. Well, it should be in a sandbox, but that has many intentional and probably some less intentional holes that let it do stuff on your machine.

So how do I trace what Javascript is doing on my machine?

Generally mitmproxy gives a feeling for what sites the browser talks to. And strace often gives a good feeling for what a Linux binary does. But the browser is too big and complicated to read strace output in most cases.

Can anybody recommend a tool to look what Javascript code loaded by a certain page is doing?


> Can anybody recommend a tool to look what Javascript code loaded by a certain page is doing?

Open your browser's developer tools, go to the Script/Debugger tab and have at it. It's about as obtuse a tool to use as gdb, but you'll see exactly what the code does. Chrome dev tools has automatic formatting of the code, maybe Firefox too. But you'll be stuck with shitty variable names if they've been mangled. Although you could try http://www.jsnice.org/; I had variable luck with it.

It would be interesting to have a browser tool that is like strace and you could filter by calls, so you can see exactly where window.navigator is being used for example, or localStorage.setItem. For now best you can do is searching for "navigator" which works, but can be minified/hidden away by coder as well.


> It would be interesting to have a browser tool that is like strace and you could filter by calls, so you can see exactly where window.navigator is being used for example, or localStorage.setItem.

Exactly, that's what I meant.


Because JavaScript is dynamic, you can often rebind things like window.fetch to trace what’s going on (store the old copy and replace with a new one that does something before delegating). If you can arrange your shim to be loaded before any other JS, I guess you could implement something like object capabilities for JavaScript?
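
A minimal sketch of that rebinding trick, run from the console (or from a user script injected before any page code):

    // Wrap window.fetch so every call logs its arguments and call site,
    // then delegates to the real implementation.
    const realFetch = window.fetch;
    window.fetch = function (...args) {
      console.trace('fetch called with:', args);
      return realFetch.apply(this, args);
    };

The same pattern works for XMLHttpRequest.prototype.open, Storage.prototype.setItem, and so on.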


Note that JavaScript can detect whether developer tools are open and change its behaviour. Or so I've heard.


>Can anybody recommend a tool to look what Javascript code loaded by a certain page is doing?

Almost all modern browsers have debug panels that will list all of the requests made by a page, assets cached, cookies and local db, and of course it's trivial to just view the source of a site and read the javascript if it isn't compressed.


Yes, I am aware of dev tools. I use them to look at network requests or to "steal" my own credentials for curl usage.

Haven't really used the JavaScript debugger, but my guess would be that it's completely infeasible to follow everything a random "modern" website might do. And as you say, some JavaScript might be compressed or obfuscated. What I really would want is a somewhat higher-level / more filtered approach, like how strace lets me trace just file operations, for example.


I'm not sure if it always works on all APIs or browsers, but you can wrap and replace DOM API objects with logging proxies.

Additionally, you can set breakpoints on event handlers and Chromium has deobfuscation built in. You can usually tell approximately what's going on by stepping through the code and watching the variables in local scope.
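
A sketch of the logging-proxy idea, using window.navigator since it's a favourite fingerprinting target. Whether you can shadow a built-in varies by API; navigator, unlike location, is not unforgeable, so Object.defineProperty usually works:

    // Log every property read off navigator, then delegate to the real one.
    const spy = new Proxy(window.navigator, {
      get(target, prop) {
        console.trace('navigator.' + String(prop) + ' read');
        const value = target[prop];
        // Native methods must be invoked with the real navigator as `this`.
        return typeof value === 'function' ? value.bind(target) : value;
      },
    });
    Object.defineProperty(window, 'navigator', { value: spy, configurable: true });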


> but you can wrap and replace DOM API objects with logging proxies.

Right, so you are describing the implementation of the tool I was looking for. Obviously I don't want to do that manually while tracing a page.


Chrome dev tools? You can step through every JavaScript call


I’d love to get rid of JS for my personal sites, but I know of no clean and simple way to render dynamic content otherwise. Server-side rendering seems a bit messy. I want a clean REST API separate from the UI.

How can I generate pages with dynamic content easily? Ideally with absolute minimal dependencies.


Depends on what you mean by "dynamic". Some "do something when the user clicks"-style things can be done with CSS. The best example I know of that is https://git-send-email.io/


Abusing HTML form inputs as a way to store application state in DOM is hostile to usability and accessibility - please don't do this for any sites humans need to use. HTML does have progressive disclosure elements built in with `<details>` and `<summary>` which work without JS (I was hoping the demo was showing that or something like it)
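
For reference, a minimal sketch of that built-in pattern, which needs no form inputs, CSS hacks, or JS:

    <details>
      <summary>Which mail server do you use?</summary>
      <p>Instructions for your mail server go here.</p>
    </details>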


Those HTML elements are radio buttons being used to choose between options of OS, mail server, etc. Each radio button is associated with a label that contains the correct text. Selecting a radio button unambiguously makes a `display:none` element no longer be that.

What part of it do you think is bad for usability or accessibility?


Everything you just said - these form inputs aren't related to a form, and assistive technology won't have any clue about the clever CSS selector hacks and what their effects are.

If you want to build something that's nice for humans and machines, look up best practices for this sort of thing - plenty of information is widely available on how to build things in usable and accessible ways (and it's simpler to do it correctly than to use these 'hack'-like workarounds anyway!)


A REST API is purely a server-side issue, so you don't need JS for that (unless your server is Node of course).

You still need Ajax (at least) for truly dynamic content. The bad news is that Ajax requires client-side JS. The good news is minimal Ajax functionality only takes about 30 lines of pure Javascript. The bad news is you will need to write those 30 lines yourself because every JS library is at least 100x bigger than that and packed with cruft you don't need.

My advice is to pretend React doesn't exist, learn Javascript, and write just the Javascript you need and no more.
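
To make those "30 lines" concrete, a rough sketch of a hand-rolled helper (the /api/posts endpoint and #posts element are hypothetical):

    function ajax(method, url, body, onSuccess, onError) {
      var xhr = new XMLHttpRequest();
      xhr.open(method, url);
      if (body != null) xhr.setRequestHeader('Content-Type', 'application/json');
      xhr.onload = function () {
        if (xhr.status >= 200 && xhr.status < 300) {
          onSuccess(JSON.parse(xhr.responseText));
        } else {
          onError(new Error('HTTP ' + xhr.status));
        }
      };
      xhr.onerror = function () { onError(new Error('network error')); };
      xhr.send(body == null ? null : JSON.stringify(body));
    }

    // Usage: fetch JSON and render it, no framework involved.
    ajax('GET', '/api/posts', null, function (posts) {
      document.querySelector('#posts').textContent =
        posts.map(function (p) { return p.title; }).join('\n');
    }, console.error);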


That still uses Javascript, and that seems to be the parent's point.


It does feel messy if you insist on using a REST API for your server-side pages (rendering React components server-side?). Just abandon it and use an HTML-template-based system instead, like most server-side frameworks do (Django, Rails, Laravel, etc).


While it can definitely be a bit messy, out of the box SSG/SSR support on Next.js is trivial (it’s on by default and there’s an indicator that shows when a page will be built static and clear/simple APIs for dynamic SSR). Certainly not minimal dependencies, and you will hit a webpack wall if you need to change the build. That said, I’m working on simplifying this for my own site (Preact instead of React, static Markdown content, separate style loading, hopefully soon automatic partial hydration for whatever JS does end up on the wire) and I plan to break it all out into simple libraries so people can do similar with the same toolchain without the webpack nightmares I’ve experienced.


How is server-side rendering messy? It's just putting html strings together based on data, no frameworks or libraries needed.


That's what I mean. It feels much messier than JSX, where "putting strings together" is a clean first-class citizen.

With SSR, you need some component that's aware of every change, and that triggers those re-renders at sensible times (every render takes server resources). This all feels messy, compared to rendering just-in-time on the client-side.


I wonder what use case you're talking about. I don't have a lot of cases where re-rendering needs to be dealt with unless I'm making a web app like a multiplayer Photoshop in the browser, and in those cases I won't use SSR. For most information-based sites, e.g. Hacker News or Twitter, there's no need to track changes and re-render; all the data can be gathered on the server side, by querying a database or making requests to third-party APIs, and then it's just piecing together HTML strings.


Check out next.js. You can write vanilla React basically but fetch the props from an API prior to rendering on the server.


How is that not JS?


It runs on the server. The browser can have JS disabled.


But there's nothing on this site once you do disable JavaScript. I was hoping for a 2020 take on a server-side rendered web app.


I see what you did there....

In other news, possibly the best designed website of 2020: http://www.muskfoundation.org/


Similarly, a company with 250 billion USD in revenue (2019): https://www.berkshirehathaway.com/


"Reproduction or distribution of any materials obtained on this website or linking to this website without written permission is prohibited." https://www.berkshirehathaway.com/disclaimer.html

What? :)


From one of my previous comments in another thread (this past week, probably):

> IANAL

> What I think they mean by this is that you shouldn't link to resources on their website to make it seem like they endorse your (product, website, whatever).

https://news.ycombinator.com/item?id=25153873


Haven't the courts ruled you don't need permission to have a link?


Didn't even know this was a thing.


Just make sure that all your links are copyrighted works (authored by you of course) and then DMCA anywhere the link appears.


That's so crazy it might just work. I checked an article about copyright protection for short phrases[0] and learnt that a court ruled that the text “I may not be totally perfect, but parts of me are excellent” is protected by copyright. It would thus be possible for the author of that phrase to register:

i-may-not-be-totally-perfect-but-parts-of-me-are-excellent.com

and sue anyone who links to them. Hopefully the author will be so grateful for this insight that they won't sue me for reproducing their copyrighted work in this comment.

[0] https://fairuse.stanford.edu/2003/09/09/copyright_protection...


There's an ad for Geico insurance at the bottom, lol. What's the deal with that?



Right, still a little odd to have...

> FOR A FREE CAR INSURANCE RATE QUOTE THAT COULD SAVE YOU SUBSTANTIAL MONEY WWW.GEICO.COM OR CALL 1-888-395-6349, 24 HOURS A DAY

...on the homepage of a quarter-trillion dollar company, with no other ads.


The website is really old. In the early days of the web this was normal, as you didn't have the ability to check for these things via Google.


Berkshire Hathaway owns Geico.


Even that website uses Javascript though (to add a Yahoo tracking pixel, it looks like. What is there even to track?)


Number of visitors per day, country origin, etc...



Doesn't work with HTTPS, which I guess shouldn't surprise me given its sparse nature.


The crazy thing is, it's been like that for YEARS.


No HTTPS tho.


It doesn't need https.


Of course it does. Otherwise intermediaries can inject ads, tracking, spoof the content, or even redirect it to a malicious page.


https does not 100% prevent any of those things.


How can someone spoof the page/inject ads if the site is served over https?

They would need to have compromised one of the root certificates on your machine to not give you a giant security warning.

In modern browsers there's not even a button to bypass the warnings (although I know in Chrome you can type "this is unsafe" into a hidden input on the error page and it will let you bypass them temporarily).


MITM - https termination at a gateway or proxy.


It's worth noting that this is the personal website of Heydon Pickering, a prolific (and opinionated!) writer and speaker on web design. He's the force behind Inclusive Components [0] and half of Every Layout [1].

Watch his videos. Check out his articles on A List Apart and in Smashing Magazine, among others. Pay attention, he's very thoughtful and you'll probably learn a lot.

0: http://book.inclusive-components.design/

1: https://every-layout.dev/


I wish that page showed or linked to a guide with the fastest way to disable js with a given browser + os combination.


Wow, something is wrong with uMatrix. I have uMatrix with 1st party javascript disabled by default. Yet this site says "Please disable JavaScript to view this site." To be sure, I curled the source and hosted it elsewhere, and my browser (firefox 84.0b4 on linux with uMatrix) still runs the JavaScript.


The JS is inline. Neither uM nor uBO static rules block that. The only way to block it is a uBO dynamic rule `no-scripting: $hostname true`, which you'll also need to be able to render the `<noscript>` content anyway.

BTW, you probably want to move off of uM given gorhill has abandoned it in favor of uBO. (I converted all my rules to a mix of uBO dynamic rules for JS and static rules for everything else, except for cookies which I still use uM for because uBO can't manage them.)


Is there a guide for moving from uMatrix to uBO? uMatrix works exactly how I want/expect...

BTW, the site works as expected with my Linux/Firefox/uMatrix setup... the inline scripts are disabled by default and I see the page content. I'm not sure why GP had issues.


For a uM rule like:

    foo.com bar.com css allow
which means "allow foo.com to fetch css from bar.com", the corresponding uBO static rule is:

    @@||bar.com^$domain=foo.com,css,allow
The full list of things that can be allow/block'd by uBO is at https://github.com/gorhill/uBlock/wiki/Static-filter-syntax#...

I have a "block everything by default" rule at the top that's:

    *$css,font,frame,media,object,ping,script,websocket,xhr
    *$image,redirect=1x1.gif
    *$csp=worker-src 'none'
    @@*$1p,css,frame,image
which means:

1. Block a bunch of things by default.

2. Block images by replacing them with the built-in 1x1 GIF instead of canceling the request.

3. Disable web workers by setting the CSP worker-src.

4. Override the previous rules by allowing first-party CSS, frames and images. (The @@ means it's an override rule.)

(The fact that my default is to block everything is why the first example I gave above starts with @@ too.)

Web workers can be allowed on a per-site basis by overriding the csp directive with a reset:

    @@||foo.com^$csp
Lastly, I have a dynamic rule to allow `<noscript>` tags to be rendered:

    no-scripting: * true
Then, for every static rule where I enable JS for a domain, I add a corresponding `no-scripting: $domain false` in the dynamic rules.

It's annoying to have to move between static and dynamic rules when deciding to enable JS on a site, but I'm not sure there's a better way. Neither static nor dynamic rules individually support everything that uM could do - static rules can't block inline JS nor render `<noscript>` content, and dynamic rules can't block every kind of request.

Static rules are also nice in that you can have empty lines and comments and arbitrary ordering of your rules, so it's easier to group rules in sections based on the domain names, add comments, etc. Dynamic rules however are like uM's rules and are forced to be sorted by domain name with no empty lines or comments.


I expected this to be implemented with something like document.body.textContent = "Please disable JavaScript to view this site" on page load, which would be enough to work. It's actually exactly that, but with the extra precaution of wrapping the entire page inside a noscript tag. Hilarious. I guess it's useful to avoid any chance of seeing the content in a flash before the script executes.

A good contrast with web pages which are not apps telling you "This app requires Javascript to run".
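
Roughly, a sketch of the inversion described above:

    <body>
      <noscript>
        <!-- The entire real site lives here, so it only renders
             when scripting is off. -->
      </noscript>
      <script>
        document.body.textContent =
          'Please disable JavaScript to view this site.';
      </script>
    </body>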


Most developers don't even care that much about their work; their sites just silently break without JavaScript.


A small snippet to add to your nojs sites. https://ghostbin.com/paste/kupyj

Shows a banner "You Don't Need JavaScript to Run This Site (turn it off here)"

It's a response to all the "You Need JavaScript to Run This Site" banners we see everywhere even on plain text/image sites.


For those using Chrome, this is easy as clicking the lock icon next to the domain name in the address bar, then "Site Settings".


Stop using chrome


I wonder if there's a "please downgrade your browser to view this site" variation...


Please take a seat before clicking this. And yes, it is still actively used by Walgreens.

https://webapp.walgreens.com/SupplierNet/login.htm


I can't even load the page because it requires TLS 1.1, heh.


"This site is designed for IE6."

And it's mostly a big ActiveX control.


Stop using the fastest browser with the best feature set, dev tools and compatibility


Joke's on you, I viewed the content without disabling javascript


It ought to be possible for the page to disable JavaScript with a meta tag:

  <meta name="noscript">
The browser could put an extra icon next to the https padlock so the user would know they were viewing a document rather than an application.


There seems to be an HTTP header[1] you could use. No icon, though. And I'm not sure how much the restriction could be evaded by interactive Turing-complete CSS or similar features.

[1] https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Co...
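
Assuming the header in question is Content-Security-Policy, the same policy can even be applied without server configuration via a meta tag; with script-src 'none', the browser refuses to run any script on the page:

    <meta http-equiv="Content-Security-Policy" content="script-src 'none'">

The equivalent HTTP header is Content-Security-Policy: script-src 'none'.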


I wanted to read the source, but it was all on 1 line, so I fed it to https://validator.w3.org/nu/?doc=https%3A%2F%2Fheydonworks.c...


Right click -> View source -> Ctrl+A (select all) -> Ctrl+C (Copy to clipboard) -> Open your preferred text editor (mine is Notepad++) -> Ctrl+V (Paste from clipboard) -> Replace ">" with ">\r" -> enjoy the source reading.

Takes max 10 seconds, on any site. Can you do it in less than 10 seconds using that validator?


That would give you new lines, but not indentation..

But my point was that the markup is invalid.


Any guesses as to why the source code looks the way it does? I was expecting a traditional website inside the noscript tags, but instead I found an obfuscated letter soup between them.


It's an inline SVG image. That alphanumeric soup is the coordinates of the points (and handles) of a line. Scroll past that, and you get the website.

I have no idea why they made the SVG image inline but the CSS style external, though. That same image is used on every page.


Only inlined SVG can have its elements styled by CSS.


And inline SVG can have its DOM modified, allowing the image to be interactive.

Of course, you’d need to have JavaScript enabled to do that...


Ancient webserver that didn't serve the correct mime type for svg?


Perhaps… but calling the SVG a `.png` and transcluding it into a smaller inline SVG file would still have worked in the vast majority of browsers.


Yeah, putting out obfuscated content while making a statement against client-side JS is very ??? to me


I am not a developer, and I learned how to disable JavaScript in Chrome to read this site, out of curiosity. I then checked a few other sites and learned

* Washington Post no longer has a paywall

* anandtech.com is seemingly unaffected, but tomshardware.com is very different (and less pushy)

* SoundCloud says "Oh no! | JavaScript is disabled | You need to enable JavaScript to use SoundCloud"

* nationalreview.com is broken -- I can only read a few paragraphs in a NR PLUS story, and there's no way to keep scrolling (I can read the article fine in another tab)

* An article I co-wrote, published in a Cambridge University Press journal, is now sans tables and figures, but the console reports no errors or exceptions.

An interesting experiment! Overall, it seems my internet experience is better without JS (but reading an academic article online is way worse).


I must say I had to look for how to disable JS in Firefox + uBlock.

As someone pointed out, there's a button on uBlock to disable it.

I didn't spend much time in Firefox preferences, but I didn't find the option. I'm sure it's there somewhere.

You can open the developer console, go into its settings and look for Disable JavaScript. Checking the box will disable JS and reload the page.


"I have spoken on stages all over the world, including at Fronteers, Beyond Tellerrand, and JS Heroes"

Would love to have heard that final talk!


After disabling js I get a blank page in Firefox. Is that right? CBA to decipher the source or work out how to disable js in Chromium.


It's a personal CV-type website.

Here[1] is the link to disabling JS for a site in Chromium.

[1] chromium://settings/content/javascript


When I enter that into my address bar in Chromium I get a duckduckgo search, but I managed to turn off js from "site settings" and see the site.


Ye whoops, it's actually:

chrome://settings/content/javascript


I disabled it with 4 taps in uBlock Origin, Firefox Android. Open menu, Add-ons, uBlock Origin, the JavaScript icon. Then another tap to reload the page, the back button and the site did show.


Fair enough. For me, using the "Toggle js on and off" extension on Firefox 83.0 on Ubuntu, I see a blank page.


The reason you see a blank page is that the website wraps its content in a <noscript> tag, so "disabling JavaScript" really means getting the browser to render <noscript> content, not just blocking scripts. But it's a bad idea in practice. Most websites work better out of the box if you disable JavaScript but don't render <noscript>, and also disable CSS and inject some custom style to fix the size of embedded icons. I'm not sure there is a public extension that does that, though.


I really don't think that's true. For one thing, anything beyond static content sites are just not going to work. And even content sites could well be unusable without their layout CSS. I really don't see the point in subjecting oneself to such hassle.

I would suggest that most websites work better if you have JavaScript and CSS enabled, since that's what they were designed for, but use an ad blocker like uBlock Origin or uMatrix to remove the ads.


> This website will not [...] tell you fibs

That's literally the only thing it did until I reconfigured my browser to access it. It's a misuse of `<noscript>` and it's completely unnecessarily intruding on how I use my own computer to access the content. I thought that was the kind of thing people here (especially the anti-JS people) frown upon.


I’ll add that I’m currently building a website, taking great care that it will be fully operational without JS, and progressively enhance components where there’s some interaction I want to provide to people who accept it. I wholeheartedly agree that there’s way too much JS trash on websites, and that it’s all sorts of vectors for abuse. But this kind of hamfisted response is really painfully unhelpful. You don’t have to browse with JS enabled, and you’re welcome to encourage people to make the web work without it again. But making content inaccessible without finding an obscure browser setting isn’t going to appeal to anyone except the people who are already locked and loaded.

My philosophy is: set a good example, make the benefit clear, and communicate with other devs who might not realize the horror show they’re sending down the wire/executing in their users’ browsers. But forcing people to figure out how to disable something increasingly hard to disable before they can even hear you is not good communication.

I only clicked the link because I was hoping there would be a regular site under default config and some special treat with JS disabled. Progressive enhancement. That would have been a clever and compelling execution.


For a personal website showcasing their work, I think requiring JavaScript to be turned off is a little too far. I removed JavaScript from my sites to improve accessibility and user experience, but here I have to go into settings and turn off JavaScript, which is a worse user experience than having JavaScript on the site.


You have to turn off js to see the site, but then turn it back on to see the linked gallery. That seems a bit inconvenient.


No thanks, bye.


Ok.


Worth mentioning: Twitter legacy, which does not require Javascript, will be shutting down in 15 days.

https://news.ycombinator.com/item?id=25088561


Erm... I have no idea how to turn off JS, globally or for particular sites, in Firefox for iOS.


In my view, micrototalitarian actions like this are every bit as bad as the ills they seek to cure. Insisting on infringing the liberty of others, before even so much as talking to another person is the very core of what ails our society in this time.


I'd argue that wild histrionics as you have just demonstrated are worse than whatever this is (which certainly does not infringe on anyone's liberty).


It infringes my liberty to browse with JavaScript enabled. Further, when the argument is lost, ad hominem becomes the tool of the loser.


He may have insulted you, but not all insults constitute logical fallacies.

Ad hominem: You're wrong because you're an idiot.

Just an insult: You're an idiot because you're wrong.

Furthermore, concluding that somebody is wrong because they used a logical fallacy is itself a logical fallacy. If I said "2+2=4 because you're an idiot" my reasoning would be fallacious, but to conclude that the answer must therefore not be four is also fallacious.


You don’t have a right to browse someone else’s content.

Calling out your histrionics for what they are is not ad hominem. I’m attacking your statement, not your person.

Further, that’s not some sort of axiomatic law, that’s just a phrase. Even if it was, losers using ad hominem doesn’t mean winners don’t, that’s not how logic works.


Spot on.



micrototalitarian, adjective: relating to a system of government that is a tiny bit centralized and dictatorial and requires a microscopic degree of subservience to the state.


Good things are indeed just as bad as bad things.


But how many disable SSL to view a site once it's blocked for not running on HTTPS?


Some Onions do this.

I think it's a conflict between Tor trying to be as mainstream as possible and Onions trying to reduce users' attack vectors; JavaScript has been used against Tor users.


no


Is there a NoScript equivalent for Chrome on Android?


Yes, it's built into the browser.

First, go to "Site settings" and disable it globally.

To make a site-wide exception, click on the icon in the address bar and enable it for the site there.

It used to be crippled and unusable, but they fixed it recently.

I also recommend Bromite on Android.


Yes, I'm using Bromite actually. Thanks for the note, it's been a while since I looked.


Nice, I wouldn't have known without the title :)


This is really nice work. Even the audio and video use the browser's built-in standard features. Look at the /artifacts page.


SafeScript blocking scripts by default didn't do the trick. Using uBlock Origin to block JavaScript did, though.


It's quite an amusing counterpoint to the interventions required to make the average JavaScript-laden site usable.


Anyone have a screenshot or mirror of the page with Javascript already disabled?



Thanks. I'm using the latest Firefox, and for some reason, this Javascriptless site isn't working at all because I can't see anything.


And the colors are inverted with dark color scheme.


How nice! I hope others support this trend so that the web is clearly split into two: (1) the traditional one, document-based, and (2) applications. Both are very useful, but (2) comes at a price not everybody wants to pay.


Isn't this one of those links used to steal Facebook sessions? I didn't look at the code, but the fact that you get the Facebook "translate" page seems weird. I have seen the same link posted on Facebook.


javascript-golf challenge: expose the content of heydonworks.com from the browser console using only javascript.


This must be the most unnecessarily awkward way of doing it (you have to run it from the console after navigating to the site, so the request is allowed):

    fetch("https://heydonworks.com").then(x => x.text()).then(x => { 
      var f = document.createElement("iframe"); 
      document.body.append(f); 
      f.style.left = "0px";
      f.style.top = "0px";
      f.style.width = "100%";
      f.style.height = "100%";
      f.style.position = "absolute"; 
      f.style.border = 0;
      x = x.replace(/\<\/?noscript\>/gi, ""); 
      x = x.replace(/\<script\>.*\<\/script\>/gi, ""); 
      f.contentDocument.write(x); 
    });


currently exploring view-source:https://heydonworks.com/ by deconstructing the rendered source and injecting the output back into the body.


Well, I went through the trouble of disabling JavaScript and was treated to an utterly mundane blog site with astounding insights worthy of a 22-year-old, such as "if you're a libertarian you're just being exploited by the capitalists, man."



