The “Developer Experience” Bait-And-Switch (infrequently.org)
275 points by feross on Sept 11, 2018 | 217 comments



I'm a JavaScript developer, and I strongly agree with this article.

The web experience has gotten worse these past few years. I don't want a stupid autoplaying video following me around on the damned page. I don't want Facebook to track me endlessly and mastermind what I see.

There are some really cool things the web has gained. WebRTC, ubiquitous web previews, and using mp4s to add cheap animation. I'm also glad Flash has finally been killed off.

We should be focusing on providing well crafted, clean and slim experiences that are thoughtful and not bloated. Get rid of the 10+ trackers, the kitchen sink approaches, and all the cruft that isn't directly useful. At the same time, let's maybe not have a single entity own this slimmed down platform (AMP), but have it be the normal experience.


The web is designed by marketing people and implemented by programmers. That's because the incentive to fund the implementation comes from the desire to sell things. One of the things that I always found interesting about the web was that as soon as there was a graphical browser, things went in that direction pretty quickly.

Before that, our internet experiences were pretty technical in nature. More than the web, I remember being blown away by mail and news readers, like nn or gnus. The things we built were very practical in nature.

Contrast that to the demo scene. To me, the demo scene epitomises the marriage of technical expertise and artistic endeavour. Talk about "well crafted, clean and slim experiences that are thoughtful and not bloated". Take all that and crank it up to 11 :-) Of course, not necessarily utilitarian.

I'd love to see some artistically inclined people respin some of the concepts of the demo scene and build something practical that also has the ability to floor you from a technical and experience-oriented perspective. Of course, you could never fund it (that's why the marketers dominate the web), but the world of hacking has never needed funding. We're "starving artists" who aren't actually starving because we have high paying side gigs ;-).


> I don't want a stupid autoplaying video following me around on the damned page. I don't want Facebook to track me endlessly and mastermind what I see.

Those autoplaying videos and trackers are the reason these pages are there in the first place. All the free content and services on the web have to be paid for somehow, and that is the way.

It's not as simple as:

> Get rid of the 10+ trackers

Sites like Facebook can't just do that; that's how they make money and stay in business.

This is a business-model problem, not a technical problem. If anything, new technology allowed these business models to be slightly less annoying and damaging to the user experience.


> All the free content and services on the web have to be paid for somehow, and that is the way.

This is false. We had a perfectly healthy ecosystem before ads invaded the web. In most ways it was better because it was built by people who cared, rather than by companies who wanted to extract money and power from their visitors.


The web built by people who care and don't earn money from it still exists. Its audience is a smaller proportion of the total audience than it used to be.

And also, people who care want a pay raise too. They enjoy living in a better flat, going on better holidays, buying a better bicycle for their children, or they eventually have periods of life where it is impossible to have a job, a family, and a time-consuming hobby at the same time.


Who said they couldn't be paid? Patreon, Kickstarter, etc. show that you can do what you care about and still make money ethically.


I very intentionally wrote "pay raise" instead of "some minimal amount of money".

Also, both a successful Patreon and a successful Kickstarter require significant effort to pull off. You have to create materials, build a community, promote constantly, and so on. It is a lot of managerial work.

We are not talking about "buy a third yacht" levels of money here. We are talking about "I can finally afford the dentist or something else basic" amounts of money here. It makes a difference in people's lives.


Are you suggesting that everyone who wants a pay raise deserves it just because they can build a website? If so, then I want a pay raise. But I'm unwilling to sacrifice my users' security and privacy to do it.

Nobody said it was easy to raise money. If it was, everyone would do it.

Crowdfunding and donations make plenty of money to afford a dentist or other basic needs. There are plenty of full-time groups making money with it. I don't know why you think that's not possible when it is already happening.


You are changing the topic. It started with a false dichotomy between ads and caring for content, and continued through "this form of monetization is more effective than the other one". I am suggesting that using "want money" as a dirty word, as if it were something people (who don't even earn a lot) should be ashamed of, is a manipulative, false argument.

Most crowdfunding campaigns fail. Most pay-for-medical campaigns fail. Most, including the successful ones, don't earn enough to be full-time.

An inspirational feel-good article about a successful campaign is not representative of the average case.


Can you point to where I said that wanting more money is a "dirty word"? Or can you point to where I said it was wrong to want more money? I don't see anything like that in any of my posts, and I certainly don't believe that to be true.

Most advertisement based sites fail. Does that mean we should also abandon that model?

Nobody said that the average campaign should succeed. I definitely don't believe that.

You can't just keep putting words into my mouth and then claim that I am the one changing the topic.


But is it sustainable that way?


>>> All the free content and services on the web have to be paid for somehow, and that is the way.

>> This is false. We had a perfectly healthy ecosystem before ads invaded the web. In most ways it was better because it was built by people who cared, rather than by companies who wanted to extract money and power from their visitors.

> But is it sustainable that way?

Yes. It'd be different, but sustainable. Sites like Wikipedia are proof of that.

It's baffling that people claim that the web requires ads and trackers to survive when a counterexample is one of the top three links on nearly all of their searches.


> It's baffling that people claim that the web requires ads and trackers to survive when [obviously it can]

It's a question of the content you put into the word "web". As a corollary: I believe that "TV" would not exist without advertising.

I know that electromagnetic radiation, signal processing, and cathode-ray tubes are wholly orthogonal to Kellogg's selling cereal. But the modern television media landscape and its dedication to Brave New World style audience-capture is unfathomable without people splashing money at it to sell things.

So what is "the web"? Telnet, TCP, and HTTP? We don't need your stinking banners! We have oodles of excess server bandwidth and our websites are rarely more than 10K (of gorgeous hand-crafted fully-accessible HTML, natch).

But is "the web" your advertiser-supported newspaper? Your advertiser-supported searches? Your advertiser-supported social media news feed? Your advertiser-supported free phone app? Your advertiser-supported image host? Those things are, so far, only sustainably available through advertising.

Crack the micro-currency and micro-valuation code for everyone online and maybe that "web" can live without ads... we're not there yet though :)


> Those things are, so far, only sustainably available through advertising.

Really? Then I guess sites and creators that are sponsored via donations never got that memo.

We have plenty of money coming in via ethical channels. Sure, not as much as with advertising, but IME most sites have severely bloated budgets compared to what they actually need for sustainability, and that's why we have the problems that exist today.


You've moved goal-posts ;)

Sorry, but there are no donation-based major national news organizations, donation-based search titans with historically large market caps, or donation-based social media empires whose CEOs' political whims shake national elections the world round. Facebook, Google, PictureHostOfTheMoment, and the media giants with traditional business models are the essence of what most people spend their time on online. That's "the web" for most people most of the time.

I mean... Grab a random high-schooler's phone and see what percentage of the services they're using are (in)directly funded by advertising and advertisers.

Your observations about the economics of the bulk of web-development are probably correct, but relatively minor compared to the mega-billions in suboptimal efficiency in every other aspect of those organizations... And the issue is always that "sustainability" is not growth, and growing frequently requires actions that are unsustainable in the long-term to achieve. Businesses grow, transition, and reshape themselves within the context of new revenue streams, and acquiring those new revenue streams is how you make owners richer and secure any desired financing.

So yeah, really, as of today "the web", a la "TV", is a pure advertising apparatus to capture audiences and move product, with peripheral benefits in political manipulation. Other models have not been validated by the market, nor by consumers. Remember, we all want ads, in the abstract.


You went from "isn't sustainable" to "has to be a worldwide major economic and political powerhouse" and claim that I'm the one that moved the goalposts?

lol, ok


> You've moved goal-posts ;)

> Sorry, but there are no donation-based major national news organizations, donation-based search titans with historically large market caps, or donation-based social media empires whose CEOs' political whims shake national elections the world round. Facebook, Google, PictureHostOfTheMoment, and the media giants with traditional business models are the essence of what most people spend their time on online. That's "the web" for most people most of the time.

Actually, I think you're the one that's moved them. I don't think anyone can claim that we'd get a clone of today's web if we took tracking and advertising out of the picture. However, that doesn't mean we wouldn't get a different web that was good and valuable. We might even get something that's more valuable than today's. Projects like Wikipedia prove that extraordinarily valuable things can be built online from volunteer labor and donations. If you personally need every idea to be validated by the market, then that's your validation.


Doesn't Wikipedia run on donations? Or are you saying you think that works for the majority of sites?


It does. No, that would not make sense for the majority of sites. The majority of sites have a limited audience (topically, or geographically), so costs would be nowhere near as high, and could be handled by personal input plus donations (for private sites), or written off as a marketing expense (for company sites).

Yes, websites do not have to pay for themselves directly. The idea that they should underlies a lot of the web's current problems.


This is an easy argument to make when there's conveniently no way for consumers to decide how they want to pay. Realistically, humans are just lazy about finding decent ways of monetizing their time. I can't say I'll shed any tears if a business relying on ads goes out of business from my ad blocker: they always could have asked me for my cash.


Consumers will pay for the services that consistently deliver value, but most of media out there delivers value occasionally. And that’s assuming some exclusives, where a competing publication is not a click away.

When even the mighty NYT and WSJ have issues monetizing their Web presence, what hope is there for smaller fish?

Do I read an occasional article from The Guardian or Venture Beat? Yeah, happens every now and then. Am I willing to pay for it? Unlikely.


Where media struggles is in spending hundreds of dollars per page of content that has an effective lifespan of about 1 to 5 days before it goes off to the virtual landfill (formerly a real landfill), never to be seen again.

Maybe they need to ditch the daily blog-feed style of journalism and direct their resources toward building a more cohesive/holistic archive of events that provides some lasting interest and value, so they won't need to desperately squeeze revenue from each article as aggressively as possible before the aggregators drop it.


I think they try to. But it might be that it is actually much cheaper to produce small daily articles that junior people can write than extensive, high-quality pieces. That puts you under constant pressure.


Which is precisely why they have a problem with monetizing. I can't really sympathise with that. Why should I care if the business model of flooding people with crap, occasionally containing some gems, is in trouble? Drop the crap, drop the frequency, send only gems when you have them.


Well, that goes against the attention economy we have today. Slow news is not news. I think what you want is magazines. It is precisely why they are not dead. In fact, they had a dip, but now they are growing again.


Economist Espresso is an excellent example of this in action.


IMO this has already been proven to work as long as it's easy to pay. Just think of all the successful phone apps where the paid version is there to remove ads.


> This is an easy argument to make

For the record, I'm not making any value judgement or "argument" here. I'm pointing out that GP's complaining about issues that aren't really technical, but fundamental business practices.


I would like to see stats on that. In my experience people are very hesitant to pay even if they recognize they derive value from your service. It's just not the "thing that everyone does".


I think a bunch of smart people have spent a bunch of time on it and what we have currently simply makes the most money. Of course that may change once the majority block ads.


Consumers have basically already decided. How many people use incognito windows to get around the NYT paywall, for example?


Still waiting for the Spotify of news sites. I guess it's harder than it is for music?


Spotify to the music industry is like putting some ice on a broken neck. The music industry at least has concerts and merchandise to drive revenues up.

The problem is that news is very expensive to produce; particularly investigative journalism that might or might not pay off. Local news at the town or suburb level has a similar issue, where the market is quite small to begin with. Strip away the expensive parts and you're left with cheapo aggregator papers. But what happens when there's nothing left to aggregate?


It's interesting to see how no one even seems to try. In Europe there's Blendle [1], but that's only an aggregator and you still have to pay for each article you read (with hilarious differences in pricing). I've always wondered if they'd be more successful with a flat fee for complete access to all articles for all their papers and zines. But I guess the publishing houses don't consider this viable as it would hurt their subscriptions (who's gonna subscribe to just one paper when there's access to ten more for roughly the same price?).

[1] https://blendle.com


At least for my part Blendle has been a huge win for publishers. I've probably spent more money on Blendle articles in just a year or so (even asking for my money back a number of times) than I have on regular subscriptions.

Considering the fact that I'm generally not too interested in mainstream news, and somewhat frugal, I imagine there's quite a number of people who spend even more.


Do autoplaying videos make more money or just push up meaningless metrics? Personally, an autoplaying video means an instant back-button press or tab close.


They do for certain demographics, especially when you take into account how little people read these days.


> All the free content and services on the web have to be paid for somehow, and that is the way.

This is blatantly false... Significant quantities of digital work product are produced for free with little concern for payment.

Perhaps you meant "most" or "the majority of" or even "significant quantities"?


Perhaps governments could make ad-driven revenue illegal, forcing the internet to drop that particular business model.

Get rid of false free.


A half-way house which could be interesting to explore would be to make advertising “static” again, i.e. all visitors to a given page see the same ads, or at least one of the same ad-rotation with the same probability distribution. The intrusiveness and complexity mostly seem to come from targeting ads against visitors rather than page content.

There are still a few sites which run simple static “someone paid us and we edited in a link using notepad” style ads. Frankly, they’re higher quality and more likely to be relevant than anything that comes through the big ad-targeting networks.

(I’d vaguely hoped GDPR might have this effect. In fact, it just seems to mean business as usual with a slightly increased floor on the amount of process and bureaucracy...)


And eliminate access to ad-supported services that many - especially the poor - would no longer have access to. No ads is just as dumb as all ads.


Any examples?

BTW, back in the 90s there were a ton of free sites, no ads. Mainly hobbyist sites etc. No reason that still can't happen.


I'm building this whole platform with tens of thousands of lines of code, which (in my opinion) is pretty unique. I use it every day with a couple of people who visit, but it's designed to handle much more. I'm not getting anything in terms of money out of it; if anything, I'm losing money on hosting and such. I'll leave the link here: https://github.com/madprops/Hue . Also, there are other successful open source solutions having a big impact right now, like Mastodon, a very active Twitter alternative: https://en.wikipedia.org/wiki/Mastodon_(software)


This is offtopic but I just have a small suggestion for Hue - When I checked it out, it required me to login/register right away. It would be nice if you could provide a demo version where one can check out the service without having to register first.


It's part of the natural move towards a centralized web. You want a website you can trust that delivers good content, as do other people, which increases the load on the host beyond what minimal finances can cover, resulting in ads. That said, a lot of sites aren't that trustworthy and don't deliver good content, but are riddled with ads to try to make money.


These sites are still there. In fact, many more of them, since the cost of hosting has dropped to near zero, especially for static content.

The new development is huge sites providing expensive content and services for free, and using ads and tracking to pay for it.


Besides the entire news industry? YouTube? Most informational sites besides Wikipedia? How about Google Search? Believe it or not, it costs money to run a business and not everything makes sense as a subscription model or pay-as-you-go.


> This is a business-model problem, not a technical problem.

The technical solution would be to introduce a highly optimized first-class browser API "just send everything I do to these trackers" without JS or including remote scripts.

Browser vendors adding thin veneers of security (while also introducing side channels) and web developers finding ways of circumventing these mechanisms, with some of these circumvention mechanisms being standardized again is a bizarre ritual dance that benefits no one. (Add to this the fact that the biggest browser vendor is also the biggest beneficiary of the whole ad racket.)


Yeah, it is a business model problem. My solution is to not visit sites that have this business model for most needs. Sometimes I tolerate it for specific content, but mostly I try to stick to text-based sites.


WhatsApp showed us that running a paid, low cost, privacy oriented service can be profitable.

I'd probably pay for that, again, as soon as they were completely untied from Facebook and had apologized. (Yep, they made a very public promise.)

I'd probably pay similar or slightly larger amounts for a number of other things as well (actually I do, so my point is rather that I'd pay for stuff :)

Now before anyone thinks: "great, lets paywall all the things!" here's an important catch:

1. I'm happy to pay for my newspaper. I'm happy to buy the occasional newspaper at a newsstand.

I do not want to subscribe to every newspaper that anyone shares a link to, just to read one single thing that is relevant to me. I love the idea of Blendle, but it seems media (and my feeling is even a lot of HNers) don't get it.

2. I'm happy to pay a certain large cloud company, which I don't like as much as I used to, for an increased storage quota. I'm happy to pay reasonable amounts for apps, services, etc. I'm happy to pay per use for things I use less often. I particularly like the JetBrains model of pay-to-own (even if I personally prefer their competitors' products ;-)

I still do not want every app to pretend they are a service that I should pay a monthly fee for.

Also: I don't really mind relevant ads. The other day I actually clicked on one. Because it was relevant.

But there is no reason to track me around the web: most ads I ever clicked on voluntarily (that would be 5 - 10 for the last 10 years ;-) could probably be inferred from context: I was reading a blog post about something related or searching for something related.

Also (and here I go against many HNers), ads should preferably be injected server side.


Blendle is nice in theory but their article pricing is really off-putting. I know that it's the publisher's choice to set prices for individual articles, but I'm not going to pay half of the actual paper price to read one single article.

I use it to read a select few things that I'm really interested in, but I'd love it to be more of an individual aggregation of stuff (think Flipboard) where I don't mind reading lots of articles because they're essentially cheap enough for me not to care. Not going to do that, though, when a 300-word article costs up to 1€.


That's sad to hear. (Not using Blendle on a daily basis.)

I was always waiting for more articles to get available on Blendle, but if pricing is going through the roof it will be harder to recommend.


I'd love to be able to support individual authors, or 'collectives', regardless of where they publish. I suppose that's hard for Blendle to arrange though.


Does Facebook have 10+ ad trackers? You're logged in. They have plenty of engineers optimizing performance.

This seems more likely for news sites that are desperate for revenue and use multiple ad networks.


There's another profit model on the web, and it usually involves the user just entering their credit card. Hell, it doesn't even need all of them to do that, just enough. I absolutely use services like Patreon to support artists who make all their work available for free. I tip open source projects (usually every time I install it on a new PC). I happily pay for decent email.

Sites like Facebook and Twitter don't seem truly sustainable, and they aren't free.


> I'm also glad Flash has finally been killed off.

I actually think JavaScript video has been worse for users. Flash was easily blockable and worked fine for the majority of users. It was developers and Apple's need for control that killed off Flash.


Flash was a battery hog. HTML5 video is less so.


There is no 'HTML5 video'. There is an HTML5 tag that indicates a video object, which is delivered using the same codecs that Flash used.

All that changed is that the UI moved into the browser.


How much of that is codec driven?


Well designed software focused on user happiness doesn't make money. All those things that we hate, they are put in there because they make money.


Actually come to think of it, I've made some open source apps that people asked if they could donate to, because they were so excited by the possibilities my apps created, and by the convenience they now had because of them. I created a Patreon for this but have yet to link to it anywhere because I keep sinking into cynicism from all the ways I haven't been able to make money. But that's not the attitude I had when I created my window managers. Software that focuses on user-happiness can make money through donations. Just, probably not a full-time income.

EDIT: Apparently that's not true, you can make a full time income. Godot makes $10k per month in donations. And a month ago it was $8k. Wow.


Please post your Patreon link or a link to the link.


My patreon is on my HN profile, but I plan to put it in the readme of a new open source window manager I've been working on when I announce it. I figured out a way to emphasize ease of use and user experience so much more than I was able to with Mjolnir or Phoenix. It's really exciting and I think it'll make automation a lot more accessible for a lot more macOS users.


To add to the list of programmers making money: Evan You, creator of VueJS, makes around $11,000 a month on Patreon. The creator of the CSS library Bulma makes several hundred dollars, and the Quasar Framework makes a couple thousand.


Evan actually makes $16k/month [1]. Wow. If only I could figure out how to make a quarter of that from my open source work, I'd be in business.

[1] https://www.patreon.com/evanyou


When you say “make money”, there is a problem. Maximal efficiency does not necessarily correlate with making money. Are bloated and obnoxious ads making money? Depends on who you ask. And on what metrics we are using to determine actual value added.

One could make the argument that ads are as much of a waste of energy as mining on a proof-of-work blockchain. There is an implicit subsidy and externality in both cases.


> Software was never about making users happy, it was about tricking users, manipulating them into giving you either their money or their time

Calm down a bit with your generalities. If you sell the software (or the platform it runs on), there is no manipulation or trickery involved. It's only the "vast majority" if you look at freeware/ad-based.


You're right. I deleted that part of my comment. It was more ranty than I prefer to be. And I had things like Google and Facebook in mind, and 99% of the free games on the App Store or web.


One example in the article is code.gov: the video is impressive, the loading time on the Android Go device abysmal. And this has nothing to do with earning money. It is just bad software.


How about iOS?

I'd argue that's well designed software which is focused (albeit not exclusively) on user happiness.

Apple leaves a lot on the table when determining what to monetize and what trade-offs to make to avoid making users uncomfortable or infringing on their privacy.

Is that benevolence? No, but it doesn't have to be.


Did you actually perform experiments to substantiate your claim? As in: Do you have numbers that clearly show that there is a net benefit when you make the user experience as bad as it is today?


This is specifically applicable to content distribution. For products with paying customers the exact opposite is true. That’s not to say every company that makes a product will be good at it, but good UX absolutely does drive profits.


There are some really cool things the web has gained

I'd like to hear about these things, if any. I'll generously assume you aren't seriously referring to "web previews" or animations.

I'm also glad Flash has finally been killed off

No, that's an absolute tragedy. Some of the youngsters here might not be aware, but there was a time when 99 percent of the garbage in a website could be turned off with a single blocker, all without impacting anything necessary.


> 99 percent of the garbage in a website could be turned off with a single blocker, all without impacting anything necessary

I was there and I remember no such thing. On the contrary, I remember people building whole sites in Flash. Killing that off is a Good Thing.

I think the problem with the web is too much freedom given away by tools for free, which leads to the kind of maddening abuse that Flash was in the past, and that modern web development is today. Publishers have too much capability, and too much control over the rendering of content.


Don't forget our good friends ActiveX, Shockwave (whose relationship to Flash I've never been able to determine), and RealPlayer, which had an uncanny ability to be required for every online class assignment I ever had. If you were hapless enough to be running a Linux desktop, as I was, then have fun!

The modern JavaScript experience might be a bloated disaster, but at least it's a cross-platform disaster.


Shockwave was the player for content packaged for web from Macromedia Director, a product focused on creating multimedia binaries, often distributed on CD.

Shockwave had much more powerful video and 3D rendering capabilities than Flash, but Director arguably died after Macromedia failed to deliver a timely OSX port.


I had forgotten about real player. It was so awful...


> I remember people building whole sites in Flash.

Like some kind of Single Page App that takes forever to download, breaks the back button and has obfuscated source code? Lucky we're past that phase!


Yes, but with Flash pages scrolling rarely worked at all (people were too lazy to implement it), text wasn't searchable or copy-pastable, and no keyboard shortcuts worked, anywhere.

JS SPAs are bad, but not that bad, because browser defaults are sane, and getting around them is hard. In other words, the web works by default, and developers have to work to break it. Flash had no defaults; you started with a blank slate and implemented everything yourself.


> No, that's an absolute tragedy. Some of the youngsters here might not be aware, but there was a time when 99 percent of the garbage in a website could be turned off with a single blocker, all without impacting anything necessary.

Some of the youngsters here might not be aware, but there was a time when sites could reliably send raw TCP and UDP through a popular plugin that implemented a different, and weaker, version of the same-origin policy than the browser did. (For example, in earlier versions of Flash, any video could send anything to any port > 1024 on its hosting server!)


Yep. Wishing Flash gone is misguided.


Ask anyone who lived through the world of Java applets how awesome the rise of Flash was at the time. It was great for a lot of things, but unfortunately, like many things in life (JavaScript?), people abused it.


I liked Flash too, but it being owned by Adobe killed it. If it were open, I'd be a lot sadder about it dying.


It wouldn’t have died if it were open. It would have been improved and more secure.


Is no one on HN willing to wrestle with the actual premise of the article?

> A rhetorical substitution of developer value for user value [...]

> The “developer experience” bait-and-switch works by appealing to the listener’s parochial interests as developers or managers, claiming supremacy in one category in order to remove others from the conversation [...] The unstated agreement is that developers share all of the same goals with the same intensity as end users and even managers. This is not true.

> Shifting the conversation away from actual user experiences to team-level advantages enables a culture in which the folks who receive focus and attention are developers, rather than end-users or the business. It naturally follows that teams can then substitute tools for goals.

Our community is absolutely infested with this attitude right now.

Tell me: When was the last time someone provided battery life impact numbers for React Native? If you use it, was that even part of your evaluation criteria?

The OP is absolutely right: there is massive conflation of _developer convenience_ with _user experience_. The modern frontend web dev scene is merely one of the symptoms. The fundamental assumption that "good for developers" must by definition equal "good for users" is a) usually unstated and b) absolutely unquestioned. I agree with the OP that it is also often c) wrong.

In some cases necessity has caused some developers to internalize and Stockholm-syndrome themselves into believing some cross-platform monstrosity or PWA-in-a-mobile-app is better than, or just as good as, a native mobile app. Not because it is, but because they have no choice given their current budget and staffing constraints.

A successful startup requires clear level-headed thinking. Do not mistake expedient or necessary decisions for optimal ones and be willing to re-evaluate your strategy at any time. Above all don't talk yourself into believing user experience doesn't matter.


We live in a world of 'ship MVP, gather customers, exit'

Of course the developer experience is going to be front and center, when you're trying to start a new product and don't want to invest actual money into it for real developers, you need something quick and dirty to move the project along.

I'm firmly in the camp that javascript 'webapps' are crap and make the web less web-like, the web I came to enjoy. I reluctantly admit that the ship has sailed and I enjoy the web less now.


Customers are somewhat complicit too, by choosing buggy products with more features over stable products with fewer. It turns out having a buggy product is still more desirable than not having the product at all. Also, if asked to put a price on stability or performance... I suspect that number is going to be smaller than necessary.


I think the take away lesson is: Don't write great software, have a great sales team. At least, that seems to be the sentiment of all the failure stories posted around these parts.


Working in enterprise software I unfortunately agree with that statement.

Our product is terrible and yet we keep signing all of these giant contracts. It is a bit disheartening to realize that as long as our stuff "almost works" and we keep pushing out features, we will keep growing.


> Our product is terrible and yet we keep signing all of these giant contracts. It is a bit disheartening to realize that as long as our stuff "almost works" and we keep pushing out features, we will keep growing.

If you have an enterprise support model for your product, client turnover would be a leading indicator of how successful the software really is. High renewals and good margins make a successful software product. Whether or not the companies that buy it are completely wasting their money doesn't seem to matter much, as long as they don't all go belly up at the same time.

It's also important to realize: if your company is so bad, yet getting so many contracts, how bad could those other companies actually be? The answer is: really, really bad. Fortunately, the IT industry as a whole has convinced the entire world it needs to 'go digital' or their competitors are going to beat them to the market with their 'data analytics.' While it's probably true for many businesses, I think the vast majority of it is companies wanting to appear cool and hip.


> client turnover would be a leading indicator of how successful the software really is

Not necessarily. After the original investment and installation, there is at least one person, and possibly several higher-ups, with a vested interest in the project being a success, and ending the support would make them look bad. Going through all that again and replacing your software won't happen until a new batch of managers champions the cause, and it will possibly happen even if your software is best in class.

I think a better measure would be their willingness to adopt other products your company builds.


There are just as many garbage native applications. There's no good reason a good app can't be built on web technologies. At least with the web, the app is going to work cross-platform and be sandboxed.


Russ, what's the value of UX on products that don't exist yet? DevEx is the egg that creates the UX chicken.

I love your attention to UX, but clearly developer experience is above user experience in simply a chronological sense.


I'm not making that argument, though that is often the reaction I get.

My argument is that you should understand when you're choosing to compromise the user experience, why, and how compromised it is. For example, you should know roughly how much worse the user's battery life is because of your decision. That makes the decision an informed one, based on your goals, budget, time to market, and so on.

To claim a decision born of necessity comes with no compromises is delusion. That delusion might not kill you but it points toward muddled thinking. That's dangerous at the best of times, doubly so for a startup.


Yep. UX is our job (as devs); it should be our main focus, and sacrifices of it should be made minimally.

For better or worse, it does seem the days of Apple engineers spending weeks perfecting an animation behavior are largely over.


This is such an interesting counterpoint. So really there needs to be a balance in tool choice that finds an optimum between deliverable UX and enough DevEx that devs can stay sane enough to get the product built in the first place, so the UX can exist.

Surely we're not going to start arguing that the ultimate-UX app is hand-written assembly (at the most absurd end of the argument ;-) )


I think some of the earlier usages of this blurred things, because the users were the developers in many ways. Emacs, for example, has a fun anecdote about basic manuals that taught their secretaries Lisp.[1]

That is, even non-developers were empowered by the same tools as developers.

That said, I can't but agree with your point and the premise. Worse, how many technological waves today have an implicit "and then migrate all current users" that is completely ignored by the team/developer?

[1] https://www.reddit.com/r/emacs/comments/3tj71x/even_office_s...


It seems to me that the article is written from the perspective of someone working for a global, consumer-focused business (or one with aspirations to become one), without much acknowledgement that other organizations have different needs.

For many businesses, it's okay not to sell to everyone. You can sell an Android or iPhone-only app, even though that leaves many people out. You can sell games that only work on some platforms. Sometimes it's okay to sell to businesses assuming they're office workers with desktop browsers and fast connections. You might not have to support multiple languages or international sales. And so on. These things are neither necessary nor sufficient for some forms of success.

Of course there are other organizations that do need to do these things.


The vast majority of modern websites could easily deliver their existing functionality (except with better interactivity) with server-side rendering and just enough plain-vanilla javascript to handle ajax calls.

If you don't believe this, I suggest you carefully reexamine your assumptions. A number of them are almost certainly false.
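
For concreteness, here's roughly the shape of "just enough" vanilla JavaScript I have in mind (a hypothetical newsletter form; the element IDs and endpoint are invented for the example):

    // The server renders the whole page; this is the only script it ships.
    // Assumes a <form id="subscribe-form"> and a #status element exist.
    document.querySelector('#subscribe-form').addEventListener('submit', async (event) => {
      event.preventDefault();
      const response = await fetch('/subscribe', {
        method: 'POST',
        body: new FormData(event.target),
      });
      document.querySelector('#status').textContent =
        response.ok ? 'Subscribed!' : 'Something went wrong, please try again.';
    });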


> server-side rendering

That costs an HTTP round trip, which can easily pay for the cost of tens of kB of JavaScript code on each hit.

Since a properly-optimized web app will have all or most of its JS code set to be cached indefinitely, the cost of the JS code is near zero; basically, just the cache lookup time. Contrast the HTTP round-trip, which you pay on each such interaction.
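
As a sketch of what "cached indefinitely" can mean in practice (assuming an Express server and fingerprinted build filenames; both are my assumptions, not anything from the article):

    // Serve fingerprinted assets (e.g. app.3f2a9c.js) with a far-future,
    // immutable cache policy. A returning visitor never re-downloads or even
    // revalidates the file; only the filename changes on deploy.
    const express = require('express');
    const app = express();

    app.use('/static', express.static('build', {
      maxAge: '365d',   // Cache-Control: max-age=31536000
      immutable: true,  // ...plus "immutable": skip revalidation entirely
    }));

    app.listen(3000);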

If you can render something on the client side without reaching back to the server, you should.

> plain-vanilla javascript

XHR and the browser DOM are horrid, inefficient APIs, in terms of lines of code per amount of functionality delivered to the end user.

I recently wrote some jQuery code to prototype a user-requested feature which ballooned to about 3x the size when the person managing the project demanded that it be done without any external dependencies. (It was about 2.7x the lines of code and 3.5x the bytes of code, the difference being that the XHR + DOM lines tended to be longer.)

In the same project, a separate rewrite from plain-old-JS to jQuery cut the code size nearly in half.
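
To make that gap concrete, here's a hypothetical before/after (not the actual project code): the same fetch-and-render written against raw XHR + DOM versus jQuery.

    // Raw XHR + DOM (the "no external dependencies" version):
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/items');
    xhr.onload = function () {
      if (xhr.status === 200) {
        var items = JSON.parse(xhr.responseText);
        var list = document.getElementById('items');
        list.textContent = ''; // clear previous contents
        items.forEach(function (item) {
          var li = document.createElement('li');
          li.textContent = item.name;
          list.appendChild(li);
        });
      }
    };
    xhr.send();

    // The jQuery equivalent:
    $.getJSON('/api/items', function (items) {
      $('#items').empty().append(items.map(function (item) {
        return $('<li>').text(item.name);
      }));
    });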

Since this plain-old-JS code was inline on the page, the user now pays for it on each page load, whereas an aggressively cached static library pull will be paid for only once as long as the user doesn’t toss the browser cache and re-visits the app often enough to keep the cached JS code in the cache.

Developer efficiency and user efficiency don’t have to be at odds. There’s a wide gray area between the article’s cherry-picked examples and a web app engineered for efficiency.


You're considering network transmission time, by which cached JS is indeed free, but ignoring parse/JIT time and CPU usage as factors. Processors aren't getting any faster, and all that client-side rendering code has a very real cost in terms of compute-bound work for the user's hardware.

The usual answer I've heard here is to use an SPA, but those have problems of their own. They chew up huge amounts of memory if you use tabs, their UX is worse than if you'd just built normal webpages, you have to deal with making it indexable, and so on.

Honestly, I'd be ecstatic if most pages loaded in two or three network round-trips. A cold request takes maybe 100ms when I'm on a cell connection, but JS-heavy pages I've seen tend to have load times measured in seconds.


Here are some relevant test results which are about 4.5 years old now:

https://modernweb.com/is-jquery-too-big-for-mobile/

At the time, your 3 round trips would have cost about a full second, which is roughly the same as the worst case JITting time for jQuery in those same tests. That worst-case result was on a slow Android 2 device.

Cellular network speeds have gone up, but certainly not as much as mobile processor speeds, so if you re-did those tests today with modern networks and devices, it’s probably a net benefit to pull the jQuery once, then JIT it on each page load, as compared to making even a single extra round-trip per page load.

> Processors aren’t getting any faster.

That’s true on the server side as well.

The article talks about externalized costs, but if you just shift the computing burden to the server, how do you pay for that?

You could load the page with more ads, which eats up the network bandwidth and JIT savings you just bought by moving the processing to the server.

Or, you could charge the users more than you currently do, which is economically little different than shifting the computing burden to the users, implicitly requiring them to buy faster mobile devices and better mobile data plans.

Consider also that the number of bugs created per line of code is roughly constant for each pairing of developer and programming language, so who pays for the costs of the extra bugs you’d expect to find in 2-3x more LOC?

TANSTAAFL.

> JS-heavy pages I've seen tend to have load times measured in seconds.

I doubt you’re comparing apples to apples.

There certainly are many very fat JS-heavy pages on the Internet, but what’s your comparison? If the JS-light alternatives aren’t accomplishing the same ends, then it’s not a fair comparison. You can’t compare, for instance, a web IRC gateway app to Facebook, even though you can use both to transmit plain text to another person.

Not that I’m defending Facebook. I’m just pointing out that they’ve got a wholly different thing going on there than the IRC gateway.


Don't even need Ajax anymore. You can render on the fly, and deliver new HTML over websockets. The gap between SPA and classic server-side rendering is getting smaller fast.

https://youtu.be/Z2DU0qLfPIY?t=14m58s
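
The gist, as a minimal sketch (just the shape of the idea; this is not Phoenix's actual wire protocol, and the message format is invented):

    // The server renders HTML and pushes fragments down a websocket;
    // the client just swaps them into place.
    const socket = new WebSocket('wss://example.com/live');

    socket.addEventListener('message', (event) => {
      // e.g. { target: '#cart', html: '<ul>...</ul>' }
      const { target, html } = JSON.parse(event.data);
      document.querySelector(target).innerHTML = html;
    });

    // User interactions go up as small events; the server re-renders and replies.
    document.addEventListener('click', (event) => {
      const action = event.target.dataset.action; // e.g. <button data-action="add-to-cart">
      if (action) socket.send(JSON.stringify({ action }));
    });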


I watched 3:15 of that and he was still just getting started. Do you think you could provide a link to the good part?


I can recommend watching the whole thing, but I thought this bit was a pretty cool demo: https://youtu.be/Z2DU0qLfPIY?t=41m20s


Thank you!


JS is still required. Sending plain HTML is not the best path forward.


The idea in this case is that you'd have a bit of JS that would take care of updating the DOM, binding event handlers (which are specified and handled server-side), and so on.

How Phoenix.LiveView would do this exactly is still being worked on, but Drab (https://github.com/grych/drab) will probably offer some inspiration.


What are you going to do with your AJAX result without client side coding? I don't buy this argument at all. You could ditch AJAX and do everything server-side like the old days (and it might even perform better given how heavy modern sites seem to be!)


AJAX requires client-side coding. I didn't say you didn't need client-side coding; I said you didn't need very much. AJAX (or the equivalent via websockets) is still necessary for partial page rendering and decent interactivity, which I think are still desirable features for the modern web. But of course for strictly static sites like most blogs you don't even need that.


Shove them back into the DOM with a little bit of JS? AJAX allows for some legitimate improvements if you need to work around the document model, but these days it's abused to extreme levels.


It's not that trivial; yeah, you stick it back into the DOM somehow eventually. But you might not be getting back full HTML; you might need to modify it by adding some CSS classes or event listeners. You might only be getting back data from which you generate the DOM.

Basically you need a full programming language, the DOM, events, etc., like we have now. But I fully agree that it's abused currently.

Full disclosure I contribute to the madness by developing ReactJS applications.


I suspect Phoenix.LiveView will end up taking care of lots of the common use-cases with a relatively small client-side API, as much of the specifics would be handled on the server (including the code that gets triggered by an event). For the bits that really do need it, one would fall back on the 'regular' approach of writing client-side js.

In my experience at least, a ton of the React work I do is quite similar within and across projects, and with many of the 'special bits' I'd be happy to resort to a more standard approach if it allows me to avoid React altogether (not that I hate React, I just love how much simpler everything would be if I can avoid it).


I don't think anybody would argue that server-side rendering couldn't deliver any and all experiences. Clearly even gaming is possible with a thin client (https://parsecgaming.com/).

What you lose at that point is latency. A round-trip to even the closest CDN is going to be vastly slower than rendering the next frame on a GPU with direct access to write to your screen.

It's not that things are impossible server-side. It's that clearly server-side rendering is not the universal answer for more-performant applications.


Some things will always need heavier processing client-side. If your site depends on high frame rates and a GPU, it's not in the "vast majority" category I was talking about, and my argument doesn't apply. My complaint is that far too many web designers building a site for a newspaper or a blog or a retail storefront seem to think they need a framework appropriate for a FPS game.


"If you disagree with me, you're wrong." Wonderful argument.


What I believe it's saying, and accurately so IMO, is there are a lot of so-called "best practices" which are dogmatically followed without stopping to critically think about whether or not they actually make sense in the context in which they're being applied.

My takeaway from "If you don't believe this, I suggest you carefully reexamine your assumptions." is "If you believe something is true, stop and think critically about _why_ you think it's true. Be open to the fact that this process may lead you discover it isn't true after all".


That's a lazy and ineffective way of making an argument. It's akin to telling your children to go spank themselves.

If OP wants to make a convincing point, the proper way is to propose a convincing argument, not "Think about how you're wrong."


Honestly, I didn't read the entire article. By the sixth paragraph, I already identified that the author was blaming the use of a tool for bad product management and software development practices.

All of the problems blamed on "javascript" in this article aren't a problem with javascript itself, but the improper usage of certain frameworks in certain circumstances. Using a big bulky framework for an internal application where you know the user is constrained to machines that can perform well with that stack is fine. Using that same framework on a public web application that will also be hit by mobile applications is not. The problem isn't the framework, it is that the product team did not properly vet the performance impacts of the development team's choice of framework or did not properly specify the performance requirements for the product. One other option would be that the governance structure for ensuring that those requirements are met before release was insufficient. Either way, the problem isn't "javascript", but the SDLC processes used to build the software.


Now if the majority of new websites exhibit that problem, how do you classify it? Can you really blame the whole world for misusing a tool, without taking a critical look at the tool itself?


> Can you really blame the whole world for misusing a tool, without taking a critical look at the tool itself?

I’m reminded of all the folks who — in 2018! — try to claim that C is a perfectly good tool for writing large, secure systems. Never mind that it loads a gun, puts it in your hand, cocks the hammer, points it at your foot, puts your finger on the trigger, and pulls it back just to the release point: the fault is really only yours if you shoot your foot — or so these folks say.


The problem of C is the success of C. There have been some good, secure, large codebases written in it. Some truly beautiful code, when you scrutinize it.

This creates the illusion that the language itself may be safe, if only you do everything right. That last part is often forgotten by the time people make the decision to write something large in it.

There is no problem with safety nets. Good debuggers (extreme ones like Common Lisp, allowing you to rewrite broken code on the fly and resume from that spot). Good type systems (Ada, the MLs, going so far as to eliminate most errors regarding common mistakes like array-out-of-bound exceptions). Good test suites (comprehensive unit and regression test suites, fuzz testers) which are, admittedly, more pervasive across many languages (the option is available, at least). Formal methods (even the lighter weight ones, like design-by-contract). Automation, to minimize the impact of human error and fat-fingering things. There should be no concern about choosing a language which is semantically closer to your problem domain, but perhaps of slightly reduced performance, than a language that is semantically distant (see people writing half of Prolog into their complex code base involving knowledge bases and decision trees). Maybe some people will think you're not tough enough, but at least your code doesn't piss off every user.

We cannot trust ourselves to do the right thing at all times. Consequently we must decide on the language and environmental elements which can enforce the right thing, while minimizing the impact to developing our solutions in a reasonable amount of time.


> There have been some good, secure, large codebases written in it

Citation needed. The only one I'm aware of is SQLite, which has an order-of-magnitude more test code than implementation code and still fuzzers are able to find new memory safety bugs in it.


Almost anything by Dan Bernstein might make a good example.


JS bloat and abuse is very, very real - but the issue isn't JS itself but that the ASKS are bloat and abuse. In contrast with your example, the tool (C) is a bad choice for the job (secure systems). For Web, change the ask and the tool selection may very well change.

I'm a web dev since the 90s, and a JS dev for the last several years. I didn't start in JS, and when I'm asked to do different things I will move away from JS without a major fuss. I use it not because JS is my religious zealotry (though I'm not part of the "cool kids hate JS" club by any measure) but because JS is the least painful way to do what I'm being asked to do. Change what you ask, and I'll change my tool selection.

From this perspective, I don't find your example paralleling the situations to be highly applicable.


C is more like operating a very large saw without a handguard. If you're skilled and you know what you're doing, you won't cut off your fingers. Which means the median developer cuts off his fingers.

JS (and, more to the point, heavyweight JS framework code) is more like driving a massive SUV around everywhere because you think having to drive through snow and mud is more likely than having to park downtown, when really, all you do is drive around downtown in the summer. But you don't care, because you have a parking valet.


We all know how scrutinized Linux and BSD kernel development is, and yet CVEs due to memory corruption happen all the time, to the point that kernel safety was the major topic at the Linux Security Summit 2018.

If the best aren't able to write safe C, what does that say about everyone else?


I wouldn't use a power saw without a handguard, either. But I'm not gonna tell Linus the master carpenter that he can't use his antiquated power saw. Sure, he only has eight fingertips, but he also built the tower we're all standing on right now and continues working on it to this day, so if he's really doing something wrong, the only way to prove it is for someone to get a newfangled power saw, with handguards, and build their own tower. And maybe, someday, someone will do that, and we will look back on the days of poor old Linus and his seven fingertips and wonder.


I don't know of a single person who hasn't made very bad mistakes using C. Even the stuff coming from Bell Labs (Plan 9, UNIX) has had trivially avoidable bugs. By your analogy, everyone has lost at least one digit.


Really depends on what we talk about as the tool or the problem.

To the extent the claims is "JavaScript the language is the problem"... if somehow Scheme or Python had become the de facto scripting language of the browser in the mid 90s, we'd be having the same conversation. If JavaScript is obliterated by WebAssembly... we'll be having the same conversation (in fact, I predict webassembly -- along with every other effort to treat the browser primarily as nothing more than the universal VM that succeeded -- will exacerbate this problem).

If the conversation is about the fundamental problem of the various incentives to utilize the full computing potential of the browser as a platform even when a document model will do, then "JavaScript" is just shorthand for that since it's the primary language of utilization, and it's pretty easy to realize the tool is the role it fills rather than a specific language.


I agree and I realize now I need to say this explicitly instead of picking on JS directly.

I thought long about this in the past, and I think JS itself is not the problem - it's just a poor language. The problem is JS + the DOM API + CSS + a bunch of other things that browsers provide. I feel there's too much creative control allowed on the web. The language itself matters little; what does matter is that publishers can shove a ridiculous amount of code at users with little effort, and that the user/publisher balance of control over rendering is so heavily tilted towards the latter.


Majority of new websites dont have big performance problem, at least those I visit.

It seem to me that these debates are happening in alternative universe where all programmers work in well organized companies, withoit deadlines, have time to prototype multiple technologies for all projects and are also all seniors in everything.

In real world, programmers work in chaotic companies with little organization, have short deadlines, have short time to decide which tech, make decision based in what allows them to reach result fast or reliably, are often inexperienced and even experiences seniors don't know everything.

That is how it happens, to the extent there is a problem at all - which the author did not manage to convince me of.


> The majority of new websites don't have big performance problems, at least the ones I visit.

I'd love to visit that alternative universe. I visit many websites each day, and I could count on my fingers the ones without serious performance problems. HN is fortunately one of them.

> In the real world, programmers work in chaotic companies with little organization, have short deadlines, have little time to decide which tech to use, make decisions based on what allows them to reach a result fast or reliably, are often inexperienced, and even experienced seniors don't know everything.

I know that too well, from experience. It's part of the reason I feel the web needs far fewer features - typical companies can't be counted on to use them responsibly.


The majority of new restaurants go out of business, is that an oven problem?

There's some kind of expectation that you can do anything on a website and there will never be repercussions. The blame is so distributed and faultily attributed that short-term results keep the vast number of bad web advisors around to mishandle more sites. Eventually, a relaunch is done to try to clean out the aggregate of mistakes without getting to the embarrassing root-cause analysis.

Not all that different than the restaurant scene in some embarrassing part of the country.


I don't think it's that easy. The product team may vet the performance impact and conclude that the added bloat is an acceptable tradeoff for them against "being able to move faster".

And yet, if everyone is doing that, it leads to a bloated web overall.


Sounds a lot like the argument over gun control, but with javascript it's about footgun control


I completely agree with @jbeckham.

The TLDR posted at the top of this article (i.e., there's too much JavaScript on the web) does not line up at all with the problems presented in the article.

The common denominator for poor web experiences is not JavaScript, it's not SPAs, and it's not client-side rendering. It's a poorly managed software development life cycle: ill-defined requirements, poorly architected SPAs, not vetting performance impacts, etc.

JavaScript is a victim of its own success. It is more accessible to novices and easier to implement than ever before. It's to be expected that with the growing accessibility should follow a growing number of poorly designed JS-heavy apps.

> as many or more JS-heavy performance disasters cross my desk in an average month as in previous years.

I don't think that's a refutation of JavaScript itself. An abundance of bastardized JS-heavy web apps built by novices doesn't refute the usefulness of JS-driven SPAs and client-side rendering.


Alex Russell first diagnosed this problem at Chrome Dev Summit in 2016 in a really amazing talk: https://www.youtube.com/watch?v=4bZvq3nodf4

He makes the point that lots of JS frameworks are so large that you're already doomed before you write the first line of real "app code". The framework itself has already blown your performance budget.


This argument comes up frequently though. As computing power increases, the complexity of applications (no matter what machine they're running on) will increase. What's the point of having faster hardware if the experience is only the same but faster? People want more capabilities, and JavaScript provides that.


One point made in the OP is that not everyone has the latest, fastest hardware. Did you check out the clip comparing code.gov rendering their SPA on a new iPhone vs. a low-end Android device? Millions of people are dealing with extremely subpar experiences because developers are building their website on a work-provided MacBook Pro on a gigabit connection to their office...

Sure, Chrome lets you simulate 2G connections and all that, but last time I checked it didn't simulate the slowdown of compiling 2MB of JavaScript on an old phone. Could be a revealing constraint to put in the developer tools.


You actually can throttle your CPU with Chrome developer tools; I often use it when profiling my client-side code. I also don't like to have the fastest phone (I have a Nexus 5X), because it helps me know that if it works on my phone, it will run great elsewhere. I recommend other front-end developers do the same.
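
For what it's worth, you can script the same throttling outside DevTools. A minimal sketch using Puppeteer's CDP session (the rate of 6 and the URL are placeholder values, not benchmarks):

    // npm install puppeteer
    const puppeteer = require('puppeteer');

    (async () => {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();

      // Same mechanism DevTools uses: slow the CPU down ~6x to
      // roughly approximate a low-end phone.
      const client = await page.target().createCDPSession();
      await client.send('Emulation.setCPUThrottlingRate', { rate: 6 });

      await page.goto('https://example.com'); // placeholder URL
      const metrics = await page.metrics();
      console.log(metrics.ScriptDuration, 'seconds spent executing JS');

      await browser.close();
    })();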


Even the Nexus 5X is an order of magnitude beyond what actually poor, low-income households typically have access to. Think feature phones, not smartphones.


If more complexity brought in more power, nobody would be complaining. The problem is that those capabilities are not there. You get mostly the same experience, with more glitter, but much slower. This is the case for desktop software too, but it's especially pronounced on the web.


You don't get increased capability just because of increased complexity. I worked (until a couple months ago) on a system that was about 10x larger (SLOC) than an earlier similar system. But it didn't offer 10x the value or capability.

Ignoring the costs of maintaining that complexity, it was at best a 2x increase in capability. Accounting for the costs, it was a net loss.

Finally: "same but faster" is a valid capability that people want. I want a compiler that is simply the same, but faster. Once I hit a certain threshhold of speed improvement, then I'll entertain an actual capability improvement (more compiler features like incorporating static analysis or whatever).


Two years ago, it was reported that the average webpage was a bigger download than Doom — 2.39MB.

Which is a better user experience:

1) Download an enormous framework that also runs slower and slower each year on constant hardware, because developers add more features?

2) Download a tiny page which hands heavy lifting off to the server (if you really do that the pages are only about 8kb plus images), and which requests an equally tiny page when the user does something requiring a server response?

It won’t be the same for every app, of course, but most of the content I see online doesn’t need or even benefit from JavaScript. I’ve literally turned it off by default and generally get a better UX… yes with a few exceptions, but only a few.
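
To make option 2 concrete, here's a minimal sketch in Express (routes and markup are invented for illustration); each response is a tiny, complete page, and the server does all the heavy lifting:

    const express = require('express');
    const app = express();

    // The whole page is rendered server-side; no client JS required.
    app.get('/', (req, res) => {
      res.send('<!doctype html><html><body>' +
        '<h1>Articles</h1>' +
        '<ul><li><a href="/article/1">First article</a></li></ul>' +
        '</body></html>');
    });

    // User actions fetch an equally tiny page in response.
    app.get('/article/:id', (req, res) => {
      res.send('<!doctype html><html><body><p>Article body…</p></body></html>');
    });

    app.listen(3000);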


Typically it seems to come from the encouragement to offload more of the website logic to the client side. A WordPress install usually takes about 15MB of disk space; a lot of that is PHP code that runs on your server and will never be seen by browsers, since the work is shared between server and client.

The onset of SPAs is what contributes to this. It makes server-side content generation leaner, although sometimes at the expense of the user experience, with worse page responsiveness during downloads. What's debatable is whether the user benefits are worth the increased upfront UX cost.


Computing power doesn't unconditionally increase. As our computers get better, we produce ever-smaller computers, trading away some computing power for a more compact form-factor.


But what actually happens is that you have faster hardware and the user experience is exactly the same (or even slower) - only the developer experience is improved.


The argument of "JavaScript is bad" or even "JavaScript is necessary but only in small amounts" is getting pretty tired. I don't think anybody could possibly hope to argue that all client-side code is bad, but that's all web+JavaScript is: an easy deployment platform, for both content creator and consumer, for client-side code. Where else could you deploy and within seconds your customers have your new code?

If you're worried about how easy it's becoming to track people in the browser, or to track people in the Internet ecosystem in general, that's an entirely different conversation. JavaScript has absolutely nothing to do with the security and privacy status of the web aside from what it is allowed to do per web standards.

Stop blaming a language or the ability to write anything Turing complete when, clearly, it's just a tool and the same problems exist in every single code distribution arena.


Sorry, the point of the article went way past your head. This is not about javascript or the qualities of the platform. The word 'tracking' is nowhere to be found in the text.

There is just an insane amount of tolerance for what's acceptable performance today, and a huge disconnect with the reality of the average user. Here we are with supercomputers all around, and the web is slow. Slower than it was ten years ago. This has nothing to do with javascript as a language.


> Sorry, the point of the article went way past your head.

This is not useful or helpful tone and really does not belong on HN [1]. My comments addressing security and privacy were in response to the current top comment [2].

I fully realize that performance is the goal of the article. What I strongly disagree with is the article's projected viewpoint of the universality of JS being slower.

> despite the best effort of (...) tools that send less JS by default

> Video of a single slow-loading page lands in a visceral way; abstract graphs don’t.

Saying JavaScript is universally slower is as true as saying client side code is always slower. This is obviously not the case. Just imagine if the next CounterStrike game tried to do everything with server side rendering rather than transferring 5GB of game code one time and tiny fractions of that from then on. Obviously the argument is not universal. Costs can be amortized. Code can be cached and intelligently replaced. There are entire industries where nobody would even think to debate that a lot of client side code is the only way to achieve their goals.
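
To illustrate the amortization point, a minimal service worker sketch (file names are placeholders) that caches the app bundle once, so repeat visits don't pay the download cost again; bumping the cache name is the "intelligent replacement" part:

    // sw.js - cache-first for the app shell
    const CACHE = 'app-v1'; // bump to invalidate and replace cached code

    self.addEventListener('install', (event) => {
      event.waitUntil(
        caches.open(CACHE).then((cache) =>
          cache.addAll(['/', '/app.js', '/app.css'])
        )
      );
    });

    self.addEventListener('fetch', (event) => {
      event.respondWith(
        caches.match(event.request).then(
          (cached) => cached || fetch(event.request)
        )
      );
    });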

I'm not saying everybody does this well, but I'm confident saying that the answer is not going to be universally less JavaScript.

[1] https://news.ycombinator.com/newsguidelines.html

[2] https://news.ycombinator.com/item?id=17963207


> Just imagine if the next CounterStrike game tried to do everything with server side rendering

Okay: https://parsecgaming.com/


:-D https://news.ycombinator.com/item?id=17963838

Edited because I had the wrong link. I had posted that exact service earlier as an example of things being feasible but not best for latency, which is a very real component of user-perceived speed.


Agreed, it's a poor argument.

I once worked for a hardware company which manufactured Set Top Boxes (like TiVO). They could have developed the UI engine in any language, but they chose JavaScript (running on their own custom JS engine). It's surprising how many companies use JavaScript to implement the UI for their hardware devices; to infer that developers only use JS because they don't have a choice is inaccurate.

JavaScript has become a great language. I've programmed in pretty much everything else so I think I'm qualified to say this. In my opinion, if a developer needs to know just two languages today, it's JavaScript and C/C++.

With knowledge of JavaScript and C/C++, you can do anything on any device. A developer who knows JavaScript and C/C++ is usually a pragmatic developer; a good developer.


I once worked for a hardware company which manufactured Set Top Boxes (like TiVO). They could have developed the UI engine in any language, but they chose JavaScript (running on their own custom JS engine). It's surprising how many companies use JavaScript to implement the UI for their hardware devices; to infer that developers only use JS because they don't have a choice is inaccurate.

This is true. Another example is the 2nd and 3rd gen AppleTVs. They were based on iOS but the “apps” for it were all hosted on top of WebKit using Apple’s own markup language - TVML.

While I’m definitely not a fan of the movement toward JS everywhere, I have to admit that if you want to tie your horse to one language, JS is the way to go.


I remember that at the company I worked for, we used the AppleTV as inspiration for our own UI, so it makes sense that we would use similar tech.


You named three languages.


C and C++ are not separate languages, one is a subset of the other. I just mentioned C explicitly because some hardware devices don't have a C++ compiler.


No, they are not; this is a compile error in C++:

   char *ptr = malloc(1024);
And this uses a different evaluation order:

   int i = a ? b > b ? 1 : 3 : 5;
There are other subtle differences when talking about C89 and C++98, let alone when talking about C11 and C++20.


> Where else could you deploy and within seconds your customers have your new code?

Applets, Flash, JNLP, 0install, ClickOnce? JS is the most popular but it isn't unique.


Can you quote the part of the article that blames the language? I read the article but I must have missed that part. Thanks.


"easy deployment platform"

It's a fucking nightmare.


"Write your code here, here and here, sacrifice 4 goats under a full moon, run your through 3 (no more, no less) seperate tools (each with their own config files), send that to an intermediate compiler to generate a dozen more files, don robes and perform ceremony to Elder God(s) of your choice, to send to another sort-of-compiler, sacrifice your intern for good measure, run everything through a packaging tool (chosen using a carnival spin wheel, different for each run).

Discover your code has errors that get silently ignored, repeat the whole thing again, declare any remaining errors clearly blessed by said Elder Gods, deploy, ignore warnings and go on to the next project." :P


This article seems to start with the premise that JS-heavy frontends improve the developer experience. This, I would heavily challenge.

Compared to Rails, Django, Express.js, or any similar more-traditional framework that does server-side rendering with HTML templates and a little bit of JS and AJAX sprinkled in, SPAs built with frameworks like React are significantly more complex and require more specialized knowledge on the frontend. I'm just starting to dive into React and Redux, and the amount of jargon and complexity is mind-blowing. All to solve the complexities of what? Bigco's massive bloated app with thousands of developers working on it?


> Compared to a Rails, Django, Express.js, or any similar more-traditional framework... SPA apps with frameworks like React are significantly more complex

Hmm, I don't agree with that at all. I work with Rails and React daily, and they're about equal complexity to me - React just tries to hide it less.

Rails especially can get incredibly complex the second you try to do something in a way that's not the "approved way".


Do you have any examples of incredibly complex Rails? Also, I thought Rails was supposed to be a lot of magic - hiding the complexity from the developer.


Hey! Sorry, I missed this comment.

I've heard it's changed recently, but for a long time making a model without using ActiveRecord was a huge PITA that mostly engendered "don't do that" comments.


Hi, I'm a Redux maintainer, and spend a lot of time answering questions about React too.

Any specific concerns I can help with?


I really disagree with the philosophy espoused in the article, which is that the web will not succeed for 'users' in this or that place on this or that hardware unless 'developers' decide to do things in a certain way.

This is not what the web is about, at all. The web is not centralised, relying on all-powerful 'developers' to decide to build things for 'users' who are powerless and must accept what they are given. It's the opposite. If people don't like things they can build better things themselves. If something doesn't work for them, they can build something that does. The tools are out there, the knowledge of how to use them is out there and freely available, there are practically no barriers to entry to building stuff assuming you have a computer and an internet connection.

So, yeah, just build things in the way you think is best. If people don't like it they won't use your thing, and will use someone else's thing that serves them better or just build their own. That's both the greatest beauty and the greatest strength of the web.


Cheers to this! ...so much bitching and moaning around here


Just Russell trying to justify his job at Google, nothing more.


"More JS" vs "Less JS" is oversimplified - what are we building here?

- A web app that exists as a tool users log into? This kind of thing used to be a desktop app. I'm perfectly fine with a 1-2MB JS bundle; the images and other data you're downloading will soon be much greater than that. App too big? Split it up into modules (see the sketch after this list). Targeting users with 3G or worse connections? OK, let's look at server-side rendering with a simplified experience. I don't think apps like this are really the problem here. If it is a powerful app, I don't mind waiting 5-10 seconds for everything to load, so long as there is a good user experience around loading. These are the kind of apps I build in React, and they load fast and feel fast, even on cell connections. Our speed issues and optimizations are generally elsewhere, usually on server-side data fetching. Granted, I haven't worked anywhere that has users with flip phones in far-off countries; I would probably choose a different strategy in that case.

- A media-heavy web site that displays content for non-authenticated users? These have tons and tons of tracking JS and other things, not to mention the hated auto-play videos and intrusive ads. I think that's the "CO2" the author is talking about, but the issue is mostly organizational and economic: these companies are made up of large numbers of people who all need "just one more script", or are getting paid to insert garbage into their page. This is a shame, and we can complain about it all we want, but until the economics behind web content improve I don't see that underlying issue getting fixed. This is the problem AMP is trying to solve. I agree that it shouldn't be proprietary, but you can't expect everyone at a media company to think like an engineer and prioritize site load time over the big bucks in analytics and internet chum, unless it gets too bad. Companies will do what they can get away with, whether it's CO2 or JS.
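
As promised above, a sketch of the "split it up into modules" approach: with a bundler like webpack, a dynamic import() becomes its own chunk, so users only download a heavy module when they actually use it (the module and element names here are hypothetical):

    // Nothing from the editor module is downloaded until the user asks.
    const button = document.getElementById('open-editor');

    button.addEventListener('click', async () => {
      // './HeavyEditor' is an invented module name; the bundler emits
      // it as a separate file that is fetched on demand.
      const { mountEditor } = await import('./HeavyEditor');
      mountEditor(document.getElementById('editor-root'));
    });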


"A decent hedge-fund strategy would be to run a private WPT instance and track JS bloat and TTI for commercial-intent sites — and then short firms that regress because they just spend 3-8 quarters rewriting everything in The One True Framework."

For a post that weighs so heavily on the need for evidence, this claim doesn't have any. There have to be cases where bloat has a material impact on a company's finances, but how frequent are they?


Author of the article here; you'd be truly shocked.


I have carefully considered this comment, and decided that it's not evidence.

Now, you're working at Google, and it's definitely possible you're privy to great evidence that you're not sharing. But saying "you'd be truly shocked" is still not really good evidence.

I know there were the Amazon studies showing a tight link between latency and sales. I don't dispute that latency matters. I personally love websites that load fast; I have no JS on my personal site and no more than 20 rules. I just haven't seen actual, up-to-date reasons to believe that modern web development is hurting the bottom line for most businesses.


The entire post is premised on my frustration at not being able to share what I know without punching down. That I can't share specifics is the whole reason to try another tack.

See also: https://wpostats.com


> I just haven't seen actual up to date reasons

Nobody owes you data. If you need that data, get out your checkbook and pay for it - $50,000 will fund most studies.


re: "is it the tool or is it developers using it wrong" it's clearly the tool(s). It's like the JS scene can't leave well enough alone and must constantly reinvent everything, and anyone who preaches restraint is regarded as a luddite who "just doesn't get it."

I mostly blame the "JS thought leaders" i.e. people who should know better. They hype one tool after the next with little to no concept of the cost of obsolescence (having to throw out your 2 year old new web app), which is yet another major issue with JS hype. Then employers ask for the tool, then devs _have to_ learn the tool whether they want to or not if they want to match job descriptions. Everyone's afraid of "falling behind" and as a consequence everyone rushes headlong.

Ironically, given that this article is about DX, the JavaScript ecosystem has in recent years been regarded by many as an insane mess of infinite complexity, majorly frustrating many developers who have jobs besides just learning (or teaching) new frameworks. That is to say, I don't think these costs have even bought us good DX. They've bought a "lowest common denominator" environment where you can hire a new dev bootcamp grad to build your website and your app quickly and cheaply and agile-ly and quickly and cheaply and... did I say quickly? Nevermind the mess you'll spend years cleaning up.

Summary: JS (and beyond) thought leaders & framework authors are selling silver bullets and businesses just can't say no.


I think of a site like Wix that enables anyone to build a website in minutes in a point-and-click interface. Very heavy JavaScript. A basic template makes it super easy for the 'developer' and the result is a huge mess. The landing page I'm looking at loads 102 scripts (2.0MB), 23 fonts and stylesheets, with a 13.8-second total load time.


It doesn't matter to the devs behind Wix because people like me are so few, but I always close Wix sites instantly.

That's because not only do they load a lot of JS, but they load it serially. Each domain requests JS from a new domain, and it takes 3+ rounds of NoScript allowances plus reloading to get anything rendered at all. If they'd just have all the script domains requested in the first load, it'd be fine.
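
For what it's worth, the serial-chain problem has a cheap partial mitigation: declare the third-party origins up front so the browser can start DNS/TLS for all of them during the first load. The plain HTML <link rel="preconnect"> form is preferable, but as a script (domains invented):

    // Warm up connections to known script origins in parallel,
    // instead of discovering each one only after the previous loads.
    ['https://static1.example.com', 'https://static2.example.com']
      .forEach((origin) => {
        const link = document.createElement('link');
        link.rel = 'preconnect';
        link.href = origin;
        document.head.appendChild(link);
      });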


I hate autoplaying videos and bloated adware as much as anyone else on here, and I'm very partial to arguments in favor of increased accessibility. But the thesis of the article is that optimizing cost-to-produce at the expense of time-spent-waiting-for-the-page-to-load is an optimization that fundamentally undercuts the poor.

There wasn't much data showing this to be true, and I suspect the opposite may be the case. If reducing page-load time increased costs, would poor people be happy to pay more for the faster product, or would it just price them out? Or in the case of free services that make money via advertising and tracking, would it be better to do more of that to cover the extra cost of shaving off page-load time? I'm not so sure. "Make it better and more expensive" is an easy argument to make if you're a wealthy software engineer, but I'm not convinced that applying "better and more expensive" as a value helps those lower on the socioeconomic ladder.

Focusing on "make it cheaper to be better" seems like it would do more to make experiences more equitable — i.e. we should improve the developer experience of high-performance open-source tools so that it takes less time and costs less to make better products, rather than driving up the cost of production at the (literal monetary or privacy) expense of users in order to produce a faster product.

Edit: Or, y'know, we should pay CEOs less and use those savings to make better products without increasing the costs passed on to our users... But somehow I don't see that happening any time soon.


Probably off topic, but I have to relate this anecdote because it's just so funny to me.

I took a new job at the beginning of the year. It was to lead development of v2 of an internal app originally written in Angular and convert/upgrade it to React.

I poked around at it a bit before getting started. It seemed fine, maybe a little slow.

My version got to the point where it could get up to a dev environment. It again seemed a bit slow. I took a look at the build process and realized it did not include gzip (express/compression). I added that in and yeah it seemed about right.

I then took a look at legacy production and... you guessed it: in production for over a year this product had no gzip and was serving a JS payload 5x what it could be, around 5MB. And no one noticed!
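
For anyone who hasn't hit this one: the fix really is about two lines (a sketch; many setups do gzip at the reverse proxy instead):

    const express = require('express');
    const compression = require('compression'); // npm install compression

    const app = express();
    app.use(compression());          // gzip responses on the fly
    app.use(express.static('dist')); // the 5MB bundle now ships ~5x smaller

    app.listen(3000);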

My point is people just don't care or maybe they're just inured to it at this point. There's a lot of "oh for every 100ms of loading time Amazon.com loses $xxMM" talk but in the real world? Not what I'm seeing.


Every time I read an article like this, I agree with it, and a lot of developers seem to be the same.

Two things are stopping me from actually doing this. Firstly, and most importantly, my boss isn't reading articles explaining why accessibility and quick load times will make him more money. Secondly, I've been present for some absolute disasters that happened because of the combination of AJAX, jQuery/vanilla JS, server-side MVC, and scope creep, and I don't know which frameworks or development patterns I could use to stop that happening on a project I have more control over, while still keeping that project lightweight.

Articles that give clear and actionable suggestions, or are targeted at the people who make high-level decisions about what features go into a product and how many third-party scripts should be installed, would be more useful than articles about how developers are very naughty for adding 10+ ad trackers and UX tools.


Agreed. No dev in their right mind would intentionally attach these bloated ad/trackers that spawn 20x more scripts, or put up an irritating popup. Most of these are signs that at some point in time there was a bad decision, and a dev had a gun to their head to make that decision material.


I hope that by talking about what it means to build well when trying to serve everybody, we can show businesses how short they’re falling of the mark — and why those common root-causes in JS-centric development are so toxic.

I guess I hope that you're successful, but one would assume that if today's JavaScript climate were so toxic and having a tangible effect on your average user (not just the Hacker News user), then... there would already be swarms of clever individuals capitalizing greatly on this apparently obvious state of the web.


This will fall on deaf ears, and maybe rightly so, if computing history is any guide.

In 1954, the biggest issue John Backus had to overcome for FORTRAN was performance: "it is difficult to convey to a reader in the late seventies the strength of the skepticism about "automatic programming" in general and about its ability to produce efficient programs in particular, as it existed in 1954."

But the world went on to use higher-level languages with worse and worse performance characteristics.

We can't blame developers for trying to use better tools (we could, but that wouldn't change a thing). The DOM, CSS, and the imperative mutation model were a curse on front-end development, ensuring that only companies as big as Google could write something like GMail (for which they wrote their own Java-to-JavaScript compiler first). If something needs fixing, it is the underlying browser model. Maybe it will get fixed sometime, if this trend continues to its terrible but logical extreme and something finally gives.

This is also purely a matter of cost - if businesses do care about performance, then they'll have to put more developers on each team, spend money on training, and ask for fewer features, delivered more slowly. If performance _really_ mattered, they'd see a loss in revenue, and as a natural response start talking about performance in their agile meetings and JIRA cards.


It's an incentives problem.

We could solve it if we checked performance like we do unit tests, and held developers accountable to a literal performance budget set by the business.

> Few teams I’ve encountered have actionable metrics associated with the real experiences of their users.

Imagine getting latency numbers on load times, bundle sizes, TTI, and so forth on every commit. PRs that blow the performance budget would be immediately flagged. You could plan for performance like any other business cost.

Even nontechnical business leaders can understand reports like "3s load time", and set tech team goals based on a relationship between load times and lost sales. "Load time must be under 2s, even on old Android + Edge" is something everyone can understand.
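
A minimal sketch of what such a check could look like in CI (the path and the budget figure are made up; the budget itself should come from the business):

    // check-budget.js - fail the build when the bundle blows the budget
    const fs = require('fs');
    const zlib = require('zlib');

    const BUDGET_KB = 170;        // example figure, set by the business
    const BUNDLE = 'dist/app.js'; // placeholder path

    const sizeKb = zlib.gzipSync(fs.readFileSync(BUNDLE)).length / 1024;
    console.log(`${BUNDLE}: ${sizeKb.toFixed(1)}KB gzipped (budget: ${BUDGET_KB}KB)`);

    if (sizeKb > BUDGET_KB) {
      console.error('Performance budget exceeded; flagging this PR.');
      process.exit(1); // non-zero exit fails the CI job
    }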

The article is right: what's most pleasant for the developer isn't necessarily what's best for the business. If metrics like load times are only being seen by developers, the developers have a moral hazard: either they

1. make choices that benefit themselves, at a cost to the business

2. make their own lives voluntarily worse, without recognition

The article is right again that sometimes, better DX is in fact worth worse performance for the user. As with anything, it depends on the business.

But businesses won't accurately and fairly decide on their preferred tradeoff if only engineers are in charge of it.

We need better performance infrastructure!


As a positive example, GHC has tests that are run for PRs that check performance (in terms of memory allocation): https://ghc.haskell.org/trac/ghc/wiki/Building/RunningTests/...


> This argument substitutes good intentions and developer value (“moving faster”, “less complexity”)

How can one even argue that as "less complexity"? Using a framework vastly increases the complexity of a site. (Though it may hide a lot of it under a rug, at least until the first bug occurs.)


Have the people who complain about JavaScript page-load performance read about netflix.com? Netflix uses JS to render their landing page server side. No client JS necessary. How can a page load faster than raw HTML?
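
A sketch of that pattern (React as a stand-in; I'm not claiming this is Netflix's exact stack, and App is a hypothetical component):

    // server.js - render the landing page to plain HTML on the server
    const React = require('react');
    const { renderToString } = require('react-dom/server');
    const express = require('express');
    const App = require('./App'); // hypothetical component

    const app = express();
    app.get('/', (req, res) => {
      const html = renderToString(React.createElement(App));
      // The first paint needs no client JS at all; a script tag to
      // hydrate interactivity can be added later, or never.
      res.send('<!doctype html><div id="root">' + html + '</div>');
    });
    app.listen(3000);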


You do realise that Netflix doesn't need to depend on or load adverts and mass-tracking scripts, and that, given the company itself is tech-based, their sales team doesn't get the mighty power to force the devs to insert 40 tracking/advert scripts and annoying popups? P.S. Netflix does ~17 AJAX calls immediately after loading to hydrate its SSR'd page ;)


I’m not a JS developer and I don’t have a dog in the fight, but the author’s rhetoric is on point. No idea if it’s legitimate at all, but the carbon emissions analogy seems rhetorically masterful.


What if the issue is not restricted to JavaScript? What if the state of the web is just another case of "victim of its own success"?


In my experience, it's not only a problem with javascript. Developers putting themselves above their users and non-developer teammates seems fairly widespread.


The problem goes beyond dev choices and UX. As an FE dev, gaining experience with the latest, shiniest frameworks is advantageous to your hiring options. I don't see any FE job postings asking for YUI/Dojo/MooTools/Backbone/native JavaScript anymore.


While I do agree with the general point of the article, I think that, like most conservation arguments, it lumps the responsibility for cleaning up after the sins of the biggest polluters onto the daily habits of working-class people.

We aren't going to make up for the emissions of shipping bottled water in barges across the Pacific Ocean by taking shorter showers. And the issue is not using JavaScript for features. The issue is the gigantic ad and tracking networks that are being included, sometimes 10 or 20 at a time, in projects to rob users of their valuable behavioral data. That also just so happens to be written in JavaScript.

But in a world where one can ship entire VR experiences in less than half a megabyte of payload, it's ludicrous that something as simple as a messaging app should take so much in comparison. Then again, it's comparing apples and oranges: the VR experience isn't spying on you (despite the fear mongering to the contrary).


>We’re meant to take it on faith that it will all work out if only the well intentioned people are never questioned about the trajectory of the outcomes.

What a phenomenally articulated phrase with so many applications beyond JavaScript.


Try championing any of this at any company. The thinking these days is that if you aren’t using React and webpack and all the god-awful bloat that comes with them, you’re not building a real application. There are so many new developers working at companies who only know React. They don’t know anything else, and they certainly aren’t engineers. They have been taught one thing, and the groupthink is so thick you can cut it with a knife.

The problem is every React codebase I’ve ever worked on in my full-time and freelance work has been an absolute mess and downright painful to work in. They don’t even fulfill the developer promises. Can anyone even point to a successful open source project using it?

I’ve literally made quite a good freelance business by targeting failed react projects and rewriting them with vanilla js.


When your website has a "loading screen", you fucked something up.


This metric is over-simplified by a huge degree. I also agree with you completely. Made me laugh, too.


I feel like web sites are actively becoming hostile to drive users into running proprietary mobile apps instead of spending time on the open web.


Well, WASM is going to change things a lot, I think, over the next 5 or so years.

Things like https://blazor.net/ (though not necessarily the first wave of these frameworks) are going to become more popular, and there are variants in other languages. I think there will be a slow (very, very slow) decline of JS.


The "hello world" demo app appears to require ~1.3MB of transfer, nearly all of it critical-path: https://blazor-demo.github.io/

This future isn't better.


Most images on the modern Web are bigger than that and pages are full of them.

Also, that is still WIP and doesn't yet make use of the .NET linker.


Images aren't as expensive as critical-path code because we (the browser) parse and rasterize them off-thread. Outlined some of the costs here: https://infrequently.org/2017/10/can-you-afford-it-real-worl...

Excited to see how much smaller a good linker can make this.


JavaScript JIT compilers also don't compile everything in one go.

I can easily imagine asynchronous compilation of WebAssembly modules in background threads as its use increases.


WASM streaming compilation is possible today and runtimes are adding tiered compilers: https://v8project.blogspot.com/2018/08/liftoff.html

Regardless, as long as you're network-bound in the critical-path, that will only help to the extent that partial execution works well. That is baked into HTML/CSS/JS; not so much with WASM.
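
For reference, the streaming part is nearly a one-liner today (the module URL and exported main() are placeholders):

    // Compile the module while its bytes are still downloading,
    // rather than waiting for the full fetch to finish.
    WebAssembly.instantiateStreaming(fetch('/module.wasm'), {}) // assumes no imports
      .then(({ instance }) => {
        instance.exports.main(); // assumes the module exports main()
      });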


How is partial execution baked into JS? IIUC, my whole JS file is fully parsed and resolved before it is evaluated. If you're referring to the use of multiple files for parallelism here, then the obstacle is not wasm itself but the work that's still in progress to define a dynamic linking story for it.

I'd point out that the need for better dynamic linking to enable lower TTI is also a problem faced in the bundling/tooling heavy JS community.


JavaScript can be evaluated top-level function by top-level function and file-by-file, modulo variable hoisting. You can create situations where everything is blocked, but that's not the norm.


Do you have numbers for how much of the web avoids variable hoisting? How much of the web has more than one top-level fn per file? Do these numbers shift significantly between different toolchains or methods of delivering JS?


It would be interesting to try some apples-to-apples comparisons of the parse and initial exec time of JS vs. wasm. The latter was designed with the goal of streaming validation and compilation; JS needs to have a lot of work done before it can load.

Which strongly suggests to me that intuitions about KBs-of-JS vs KBs-of-Wasm are likely to be wrong, especially as the state of wasm implementations improves.


WebAssembly could make things worse.

https://en.wikipedia.org/wiki/Jevons_paradox


Much worse. Hostile code you can't block.


Agreed. Once the final missing pieces are in place (mainly garbage collection at this point), JavaScript will be an opt-in kind of thing rather than the only available choice.

Personally, I might get my hands dirty with Kotlin on the frontend. I'm using it on the backend currently so the transition should be relatively smooth for me and I have some crappy JS code that I would love to get rid of at the earliest convenient moment. I imagine there are also a lot of Android developers that might be willing to work on Kotlin based web apps that like me would prefer to not deal with Javascript.

I imagine short term things will be a bit rough as tooling and frameworks are still evolving. But as that improves, there will be a very sharp increase in the amount of non JS code running in browsers. Currently the big milestones for Kotlin would be the 1.0 release of the native compiler (0.8 is the latest, I believe), related work on WASM support, and the finalization of things like garbage collection in WASM. Looks like that should all be happening over the next 6-12 months.


Embedding a language runtime for something like C# is never going to improve performance. You’re still bound by the speed of the browser engine, and now you’re layering an extra interpreter on top.

Projects like this are exactly what the author was complaining about - good for .NET developers who don’t want to learn JS or Rust, bad for user experience.


You can call me a small town coal miner then.


A huge majority of the JavaScript for the project I work on is tracking scripts explicitly requested by marketing people. They all disagree on where analytic data should be sent to, so it ends up in Google Analytics, and LinkedIn, and FB, and and and...

The developers on my team try to use just enough JS to accomplish what we need, and almost all of the rendering happens server-side.

I think what I'm describing is an under-appreciated piece of the puzzle. How many developers have you met who just loooove Google Tag Manager?



