
It's better (and more efficient) to have a developer spend time writing two interfaces (or "wasting" it, if you consider doing their job a waste of time) than to waste the computer resources of a million people.

Electron is just outsourcing cross platform development to the Chromium project.




Unfortunately, wasting users' computer resources is an externality, and developers are more expensive than ever. Revealed preferences show that customers would rather buy more hardware than pay higher prices for efficient software (within reasonable limits).


I would be careful talking about revealed preferences here - individual customers have close to zero choice in the matter. They select from what's available on the market, not from the space of all possible software.


If efficient market theory is to be believed, then as soon as a competitor shows up offering what users actually prefer, it will gain market share.

The fact that no such competitor exists is evidence that users' preferences aren't what you stated, but are in fact consistent with reality: they prefer free and bloated.


The mistake here is believing in efficient market theory.

It's not so easy for a competitor to show up, because they have to work against heavy first-mover advantage and network effects in software. Those who do a crap job get to market first and set the trend (and the expectations). Moreover, there's a heavy tragedy-of-the-commons component here, the commons being users' computing resources: software is often designed with the implicit belief that it's the only thing running on the user's machine. That's enough to make a sale, and you get no reward for making sure your users can run plenty of other things alongside your application.


Generally I agree with you, but with Electron we're talking about own goals the developers inflict on themselves. No one outside of the HNosphere uses Electron apps.


Plenty of people outside of HN use Slack, Discord, Atom, VS Code, Skype and Twitch.


Right now I have Discord, VS2017, a Postgres IDE and a bunch of Chrome tabs open. Task Manager reports the following memory usage (descending order): Discord: 1168.4 MB, VS2017: 463 MB, Chrome tabs: about 300 MB each, PostgresTool: 152 MB, and the rest below 100 MB. If I don't count the Chrome tabs, Discord uses more memory than all the other apps together, and some of them are "big" apps.

So my high-spec system with 32 GB of RAM can run only about 30 instances of Discord, not even counting the OS.
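
Back of the envelope (a quick sketch in Python, assuming the Task Manager figure above and 1 GB = 1024 MB):

  ram_mb = 32 * 1024           # total RAM in MB
  discord_mb = 1168.4          # Discord's reported memory usage in MB
  print(ram_mb / discord_mb)   # ~28 instances before anything else runs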

Wtf. It feels like malware.


That's why I have a simple rule:

"No JavaScript on the desktop"

And suddenly 8 GB of RAM is at least 4x more than I need (on Linux), even when I'm using "bloated" Java apps...

https://suckless.org/sucks/web/:

  Millions of jobs are based on outputting HTML in an inefficient way.


Then perhaps OS vendors should start thinking about unifying their UI SDKs around web engines, because having to support every OS's finicky toolkit is a waste of engineering time.


That was Boot2Gecko aka FirefoxOS. I would not mind having that on a phone.


> Electron is just outsourcing cross platform development to the Chromium project.

... and ultimately to the consumer, who pays for higher spec'd hardware and more electricity.



