Often when I use number-input it's because I want a numeric keyboard when people interact with it on a smartphone.
Yes, input type="number" is not the best, but when I ask for 2FA codes or a credit card number a numeric keyboard makes sense. And this is currently the best way I know to achieve it.
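For what it's worth, a numeric keyboard can also be triggered without type="number" by using the standard inputmode attribute; a minimal sketch (the name attributes here are just illustrative):

```html
<!-- Text input that brings up a numeric keyboard on mobile,
     without the spinner/validation quirks of type="number" -->
<input type="text" inputmode="numeric" pattern="[0-9]*"
       name="otp" autocomplete="one-time-code">

<!-- Same idea for a credit card number field -->
<input type="text" inputmode="numeric" autocomplete="cc-number"
       name="cardnumber">
```

The autocomplete="one-time-code" hint additionally lets supporting mobile browsers offer to fill in an SMS 2FA code.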
But zoning is also part of the issue. Residential areas in many places in NA are often strictly residential, designed around the assumption that people can just take the car to go grocery shopping.
In Europe it is more common to have things like a supermarket within walking distance (or at least a short trip by bike).
I think this is the same with every sport, in my experience. I've seen it with swimmers and cyclists. But also if you go cross-country skiing during the winter here in Norway you meet the same kind of people.
Even trusted governments can't guarantee that bad actors cannot access the data captured by the surveillance system.
If a government needs total surveillance, it is because the government doesn't trust its people, which is because the people do not trust the government.
Surveillance cannot fix trust issues; you must solve them with more/better democracy, not more police.
> You don't actually want these changes to end up in the main branch before they are done
I see no problem with that if it is a small change that can be deployed safely (for instance hidden behind a feature toggle where applicable).
We tend to strive for small, short-lived branches. We usually don't want PRs with more than a couple of days of work, to keep them small and easy to test. It also keeps the amount of change going to production at any one time small.
The best case would be if every feature were small enough to be only a few days of work. But our experience is that it is not always easy to achieve that, and sometimes some features will take weeks before they are done.
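The feature-toggle approach mentioned above can be sketched roughly like this (the flag name and config shape are made up for illustration, not from any specific library):

```javascript
// Minimal feature-toggle sketch: unfinished work is merged to main
// but stays dormant until the flag is flipped in config.
const flags = { newCheckoutFlow: false }; // e.g. loaded from config/env

function isEnabled(name) {
  return flags[name] === true;
}

function renderCheckout() {
  if (isEnabled("newCheckoutFlow")) {
    return "new checkout"; // work in progress, merged early but dormant
  }
  return "old checkout"; // everyone else keeps the existing behaviour
}
```

Unfinished work can then be merged and deployed safely in small PRs, and the flag is flipped only once the whole feature is done.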
I agree that code review is not QA. But I read it as the writer of the post coming from a team where they do QA on pull requests, and QA is assumed to be done on a per-PR basis. Here I think different teams will have different approaches and pipelines.
On my current team, we want most tests to be automatic. But the manual regression testing that might be needed is mostly done by other developers. A separate test environment is created for each PR, and the link is posted in the pull request. So when looking at a pull request we expect people to do both a code review and QA.
In my team approving a PR means you approve the code to go to production. So you need to consider both code quality and whatever QA is needed. But that's just our way to do it; other teams do things differently. And I think that's OK, there is no silver bullet.
I often see people speak about requiring users to have JS enabled as a problem. But is that really an issue? I remember 5-10 years ago I would use things like NoScript to disable JavaScript and only enable it on specific sites. But with the way modern frontends are, I would expect most websites not to work at all if I disabled JS. So I would expect most users to have JS enabled anyway.
I can't remember the last time I created a website in a professional setting that would work without JS. Everything is written in things like React, Angular and Vue.js these days. And that seems to be the case for most modern websites. At least things made here in Oslo.
People who have been on the web since forever, understand how it works, and still complain about JS are the weirdest form of tech Luddites there are. Such a weird line in the sand.
It's the kind of thing where you have to try it for yourself. No javascript == best user experience possible.
* instant load times
* stable performance
* no tracking, and sometimes even no ads
* no stupid animations that break scrolling, ask you to sign up for newsletters, etc
* no hijacked back button, or disabled highlighting/copy/paste, or pop ups/unders
Unless you're trying to play games, or use some complicated web application like an IDE, javascript offers absolutely nothing of value to you. It only exists to benefit the owner of the website. For 99.999% of web browsing use-cases, HTML/CSS is all that's needed.
The only situation I can think of where javascript has the potential to offer an improved user experience is with infinite scrolling. That's personal preference though, so if it's something you value, I could understand wanting to enable javascript.
(also, I just checked and noticed that Twitter refuses to work if you disable js. Fuck you, Twitter.)
I'm not that old nor opinionated, but I don't think disabling JS for the majority of browsing is weird. I had a mid-tier Android phone (a Nokia that cost 250 euros) that after a few years of use simply crawled on normal websites that were a bit JS-heavy. I'm not even talking about React/Vue SPA apps.
Sites like the new Reddit design, Facebook, Twitter etc. are just incredibly slow even on my beefy desktop, and it's 100% because of how long the JS takes to execute.
We're offloading the performance hit onto the end user, in exchange for developer experience and avoiding server-side rendering, at the cost of a crummier user experience on many websites these days that strictly wouldn't need JS for most of their functionality. I'm not happy about this trend.
I see people comment this on HN on a daily basis and it always boggles my mind. For me this reads as: 'This thing that was initially conceived years ago, when computers were incapable of much more than displaying text, should never evolve/change to take advantage of current capabilities'.
Yes, initially the web could only display hyperlinked text. The same can be said for many technologies/inventions, should we therefore never expand the capabilities of our tools? What is the difference between the web and your operating system in that regard? Why are OS APIs so different?
We can also look at the positive effect this evolution has had where what used to be platform specific tooling is now often simply available via a URL. I much prefer that over random executables that are not sandboxed and by default have full access to all your data. Yes, this can be mitigated, but the average user won't.
No it's not all sunshine and roses, we've made trade-offs with regards to performance and UX among others, but this is still an ongoing process as the modern web is still relatively young and changing.
The problem is that, usually, turning off "current capabilities" leads to a faster loading, lower distraction, less ad-contaminated, less janky, and all around better experience. In an effort to squeeze every ad dollar out of the eyeballs crossing the page, sites are using modern capabilities as weapons against their users.
If sites continue to work js-free, there's a simple switch to enable to improve my browsing experience.
And it even defeats the anti-adblock crap reasonably often.
We have HTML and CSS, which evolved quite a lot and already allow for what 90% (estimate) of the websites do.
Usually it is people using JS where they should not, and tracking from FAANG and others, that are the reasons to block JS. You are painting a wrong picture there.
If 95% of the web devs used JS in appropriate ways and it was not used so much for spying on people, well, then it would be a different story.
It's possible to create JavaScript-less sites with partial hydration, e.g. by using Svelte and Elder.js. They load fast and work fast. They are favored by search engines. They have far fewer accessibility issues.
These people, who advocate using aged JS libraries to poorly reimplement modern HTML5 features while breaking accessibility, are the weirdest form of Luddites ever.
JavaScript's a massive security threat. It's really weird to me that people seem to just assume it's fine, and isn't the most dangerous damn thing in common use on computers. Every time someone (usually Google) pushes another way for it to touch hardware, I'm surprised that most developers are like "oh good, so glad, can't wait 'till Safari catches up in 5 years". Um... no? It's a terrible idea? Please don't ever?
We ought to be reining in what JS can do and removing access, not adding more. For one thing, it shouldn't be able to send data without our say-so. It's insecure and spying-enabling by design: why does clicking a link mean the page that loads gets to send my mouse movements and keystrokes to its master? That's crazy, and has been a major contributor to the new norm that all kinds of privacy invasion are fine. "It's just 'telemetry', what's the big deal?" Ugh.
"That's alarmist, JS is super secure" right, and most folks weren't worried about their CPUs betraying them until Meltdown and Spectre—smart money says there is a vulnerability we'll find shocking in one or more JavaScript implementations, right now, waiting to screw us.
We spend a ton of time locking down OS API/ABIs to prevent sandbox violations and I wouldn't trust a shared server with sensitive data unless I had an IT team working on it. JS seems to be a lot better with sandboxing though. You still have to really worry about CSRF though. I use multiple profiles to ensure sketchy sites can't get at my data.
I think JS gets a bad name when people use it to make crazy modal popups or inline video ads or change the way the page scrolls. Beyond that, it's cool that devs can get really creative with a website, and I love coding in JS. But you're also adding a lot of complexity for that. HTML/CSS are fine for creating a website that communicates information and maybe even looks nice. And they aren't actually a programming language; they're just data. JS is a full programming language that gives you enough rope to hang yourself, and I think developers kind of go off the rails messing with their sites and ruin the user experience.
It is cool that I can have whoever execute code on my machine without worrying if it will get privileged access to it. That is a pretty amazing feature of JS/browsers.
It is a weird hang up because JavaScript/scripting was already ubiquitous in the 90s! I programmed "Dynamic HTML" pages as a summer job in '98 or so when I was in high school.
There was never a technical limitation to running software over a network, unless you go back to before computer networking was invented, far far before the internet. Putting all the capability into the browser is what people have a problem with.
The modern web is anxiety-inducing and incredibly scary to people that pay attention. I don't want to spend an hour checking the js on sites before I use them to make sure they aren't malicious/mining bitcoin/whatever, so disabling JS is an easy out that preserves my sanity, and gives me a better experience. No cookies, no popups, no paywalls, no ads, no lag.
There's nothing wrong with using a school bus to carry 30 children to a school, just like there's nothing wrong with using JS to render a highly-interactive SPA.
But the vast majority of websites that use React are the moral equivalent of driving an empty school bus to the store to buy a loaf of bread: It's massively wasteful and frankly stupid.
> should we therefore never expand the capabilities of our tools?
Yes that's exactly what the luddites are arguing in favor of, rolling back progress, and dramatically stripping away capabilities. They don't consider any of it to be a net positive, they don't think of it as being progress.
It's not specific to JavaScript, it's far broader than that. It's an ethos.
In my observation they also typically want to go back to not having graphical user interfaces. They like a nice command line interface as a way of life. It seems silly to stop there though. The computer should be gotten rid of just the same to be philosophically consistent.
I think you are right in many respects. I would not consider myself one of the accused luddites, but in their defense, this ethos isn't baseless.
First and foremost, the Luddites you speak of are programmers or sysadmins. They have been using terminals and are still using terminals daily, and they see the benefits that those tools have to offer. Namely, they see composable, interoperable programs that abide by the philosophy that programs "should do one thing well" as the benchmark for real progress. I would say I agree with the merits of this perspective. But I still use VS Code in addition to vim, because I'm not a zealot and there are times when I want to edit something in a very flexible way that VS Code better facilitates.
Both ways of doing things have their merits I suppose. It just hurts a bit to see something simple and powerful be wrapped and rewrapped in progressively less helpful proprietary systems and given a JS front end that lacks all of the focus and freedom that charmed us with the systems to begin with.
You just restated the belief as if the reason is self-evident. What's the reason? Security? Even displaying a JPEG has had security vulnerabilities. You can't really seem to escape that just by not executing code. And no PDFs too, I guess, because they contain code?
Reducing the attack surface is the pragmatic thing to do and it just happens that js alone makes a several orders of magnitude difference on its own. Don't let perfect be the enemy of the good.
Amen. If memory serves, HTML was SGML-inspired, and once XML became popular there was an XHTML standardization push for a brief time.
I think you could argue that the ubiquity of HTML led people to see value in something like XML back in the late 90s / early 00s, and that HTML drove XML's invention and adoption, not the other way around.
Yes, but JavaScript has been a thing since the 90s and was widely used by websites even back in the days of GeoCities. So it's really a bit weird to complain about something that has been part of the web almost since its beginnings, and arguably has been one of the main drivers of the 'rise of the web' itself.
You do realize those people you call weird mainly refuse to enable JS by default because of the amount of tracking it imposes, right? Not to mention those "designed" articles (The Verge style) that load at least 3 megs of a god-awful amount of crappy slider libraries and crappy stuff?
This is such a basic thing to refuse that what’s actually weird is that you find it weird.
I worked with someone not so long ago who was on a 750 KiB connection. I myself remember that speed being blazingly fast at one point, so fast that we could start listening to a song before Kazaa finished downloading it.
3 megs of unnecessary data meant that he would sit 3 unnecessary minutes waiting for a page to load.
I had a bad connection for the majority of my internet-using life. People don't understand how much of the data being loaded is superfluous. But to call people who worry about that weird is truly strange to me.
Same here. I'm still on a 3Mb DSL connection, and to be quite honest, according to my graphs, it's closer to 196KB/sec. Why in the world would I enable JS when I mostly don't need to? I'm just trying to read text!
If you chose to flat out disable JS in 2021 (by all means, go for it), understand that you are the Amish of the Internet -- so, please, behave like the Amish of the Internet. Be cool with the consequences of your deliberate choice. Do not expect other people to work around you or, worse, resent them for not doing so.
This is not to say that JS isn't being abused a lot -- but then again, it's soft tech, it's the internet and the abuse comes at little cost (compared to most industries). I always thought it's a huge part of what makes it fun and speeds all progress along like crazy.
Calling someone Amish of the internet because they refuse to be tracked, refuse to take part in voluntary remote code execution or just do their computing on a computer from 2010 on a poor internet connection is just sad.
If you want to live in a world where everything is a SPA by all means go for it, but be prepared to be criticised for making a blog post or a news story that is literally a bunch of text paragraphs interleaved with images require Javascript.
I don't know turminal, that really doesn't sound like me.
Anyway, I thought I jotted my point(s) down quite amusingly, but oh well, hit and miss. Restated more plainly:
- I am acknowledging what the world is and where that leaves anyone disabling JS indiscriminately (sic) in 2021, solely because of the technology and not because of individual usage.
- I think it's important to acknowledge what the world is to have any serious discussion about the world. How about you?
- I made no statement about what I want the world to be. You have been inferring that (wrongly, but, oh well, it barely matters).
- Being compared to the Amish is not "sad" to me and it's not meant to be offensive. What I think is sad is if you chose to be disconnected (like the Amish) but unclear that you are disconnected because of personal considerations (unlike the Amish) and then get angry at the thing you are disconnected from because it doesn't care about your personal considerations.
> I can't remember the last time I created a website in a professional setting that would work without JS.
A website which does not work without JavaScript is by definition unprofessional. That doesn't apply to web applications, which of course use JavaScript heavily, and need to (at least until WebAssembly is ready for prime time — and even then JavaScript is required as a shim).
The Web is a web of hyper-linked pages, documents. It does not require JavaScript, although of course many pages are improved with the dynamic behaviour enabled by JavaScript.
An odd definition of unprofessional. Building and testing a site for running without JS is a non-zero cost. But I've never seen usage stats that show that demographic above 1.5% of the browser market share – and those are old stats. It was sub 0.5% for the sites that I ran at the time (2018).
I don't mean to sound flippant, nor do I mean to insult anyone's favorite browsing style, but insisting on this might be shouting into the wind? I could never make a business case for it, at least.
> That doesn't apply to web applications, which of course use JavaScript heavily
I propose that rather than ‘Web applications’ these be called ‘browser applications,’ because they are applications that run within the browser runtime. It is basically a coincidence (and a convenience) that they happen to rely on the same technology used for the Web.
What I don’t understand is why folks write browser applications for things that the Web platform already handles well. The Web is pretty cool! Not perfect, but then nothing is.
The web is an open specification that has evolved greatly over time. Pretending these principles are religious is exactly the antithesis of an open standard. It requires debate and discourse.
FWIW, I do still totally use NoScript, and am somewhat picky with what I enable; quite a few blogs (and news sites too) are in fact readable enough without it. And in the case of blogs in particular (especially on the Blogger site), I quite often just skip reading them if they don't show anything with JS disabled, as this is plain dumb first of all, but also tends to be a strong signal that the particular site will be super slow, annoying, and hard to read with JS enabled anyway.
If the website is a blog post or some news, it should work without JS. You don't need an SPA to show some rich text.
I use Vivaldi because it makes it easy to have JS off by default and enable it for each website that really needs it. (I do not know if Chrome/Chromium makes this easy too.)
> I use Vivaldi because it makes it easy to have JS off by default and enable it for each website that really needs it.
So you fill in this entire form, hit submit, and then find out that you really needed JS and now you have to fill in the form all over again. How is that easy?
I am a developer; I can guess which sites need JS, like payment sites, or sites that are actually SPAs. Check all submissions from HN and let me know how many have forms, load fine without JS, don't complain that JS is off, and then fail on submit.
It is easy because I don't get popups for cookies or for allowing location access, and no fancy scroll effects; I also disabled animated GIFs.
If you are a webdev and you don't work on an SPA, but for some reason your site is half broken with JS off, then use the standards and put a message that informs the user that JS is required.
I would not set this up on some random person's computer. Ad blockers too can break shopping carts, and some websites beg you to stop the ad blocker, so I would not install an ad blocker for any random person either. But here on HN, JS off by default is good advice; most should understand that if a page is blank or some button does nothing, it might be a JS-related issue.
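The standard way to show such a message is the noscript element; a minimal sketch:

```html
<!-- Rendered only when scripting is disabled or unsupported -->
<noscript>
  <p>This page requires JavaScript. Please enable it for this site.</p>
</noscript>
```

It costs nothing for users with JS on, and turns a mysteriously blank page into an explained one for everyone else.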
Whose fault is that? The one who disables the JavaScript not actually needed to submit a POST request, or the one who implements a web service which requires client code execution to receive a POST request (or worse, who uses GET for POST)?
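For reference, a plain HTML form submits a POST request with no client code execution at all; a sketch (the action URL and field name are made up):

```html
<form method="post" action="/subscribe">
  <label>Email: <input type="email" name="email" required></label>
  <button type="submit">Subscribe</button>
</form>
```

The browser serializes the fields and sends the POST itself; JS is only needed if the site chooses to intercept the submission.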
The best justifiable use case I've seen for building a site that also functions without JS is accessibility, in the Section 508 sense. It's been a few years though, so perhaps new standards have emerged for this purpose.
There are. You can make a complex JS browser application that's completely accessible if you follow every single recommendation, and ARIA-tag every element that needs it.
That said, a lot of what you are doing in a browser application is re-implementing your own version of stuff that the browser already does, and the browser's version of it is accessible by default. So a JS-free application is almost always fully accessible for no extra development cost (as long as you check it for things like foreground/background contrast).
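To illustrate (a sketch; the save() handler is hypothetical): a native control is accessible out of the box, while a re-implemented one has to add the semantics and keyboard handling back by hand.

```html
<!-- Native: focusable, announced as a button, Enter/Space work for free -->
<button type="button" onclick="save()">Save</button>

<!-- Re-implemented: role, tabindex, and keyboard handling must be added back -->
<div role="button" tabindex="0" onclick="save()"
     onkeydown="if (event.key === 'Enter' || event.key === ' ') save()">
  Save
</div>
```

Every div-based widget like the second one is extra work just to get back to where the first one started.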
I often browse the internet through Firefox Focus with JavaScript disabled, and the web is a breeze and really fast. Additionally it helps in escaping paywalls on most news sites.
But the search is not as good. Or 'good' might be subjective in this case. I tried using DDG multiple times, but every time I end up just doing the same search on Google because it is much better at giving me the result I want.
Vulkan support is one of the main features of Godot 4.0, which is still not in alpha. But from what I've gathered, the alpha release is pretty close. No date announced, though.