The security aspect was an interesting part of this piece, because one of the main reasons webapps took over from Windows apps is that they were perceived as more secure. I could disable ActiveX and Java and be reasonably confident that visiting a webpage would not pwn my computer, which I certainly couldn't do when downloading software from the Internet. And then a major reason mobile apps took over from webapps is that they too were perceived as more secure, since they were immune to the kind of XSRF and XSS vulnerabilities that plagued webapps.

Consumers don't think about security the way an IT professional does. A programmer thinks of all the ways that a program could fuck up your computer; it's a large part of our job description. The average person is terrible at envisioning things that don't exist or contemplating the consequences of hypotheticals that haven't happened. Their litmus test for whether a platform is secure is "Have I been burned by software on this platform in the past?" If they have been burned enough times by the current incumbent, they start looking around for alternatives that haven't screwed them over yet. If they find anything that does what they need it to do and whose authors promise that it's more secure, they'll switch. Extra bonus points if it has added functionality like fitting in your pocket or letting you instantly talk with anyone on earth.

The depressing corollary of this is that security is not selected for by the market. The key attribute that customers select for is "has it screwed me yet?", which all new systems without obvious vulnerabilities can claim, because the bad guys don't have the time or incentive to write exploits for them yet. Somebody who actually builds a secure system will be spending resources securing it that they won't be spending evangelizing it; they'll lose out to systems that promise security (and usually address a few specific attacks on the previous incumbent). And so the tech industry will naturally oscillate on a ~20-year cycle, with new platforms replacing old ones, gaining adoption on better convenience & security, attracting bad actors who take advantage of their vulnerabilities, becoming unusable because of those bad actors, and eventually being replaced by fresh new platforms.

On the plus side, this is a full-employment theorem for tech entrepreneurs.




> A programmer thinks of all the ways that a program could fuck up your computer; it's a large part of our job description. The average person is terrible at envisioning things that don't exist or contemplating the consequences of hypotheticals that haven't happened.

I'm not sure programmers are much better. There's a long history of security vulnerabilities being reinvented over and over. CSRF, for example, is simply an instance of an attack first named in the 80s ("confused deputies"). And why are buffer overflows still a thing? It's not as if there's insufficient knowledge about how to mitigate them.
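To make the buffer overflow point concrete, here's a minimal C sketch (the function names are hypothetical, not drawn from any particular incident): the unbounded copy and its bounds-checked fix have both been understood for decades.

    #include <stdio.h>
    #include <string.h>

    /* Classic overflow: strcpy() happily writes past the 16-byte buffer
     * when the (possibly attacker-controlled) input is longer. */
    void greet_unsafe(const char *name) {
        char buf[16];
        strcpy(buf, name);                      /* no length check */
        printf("hello %s\n", buf);
    }

    /* The long-known mitigation: never write more than the buffer holds. */
    void greet_safe(const char *name) {
        char buf[16];
        snprintf(buf, sizeof buf, "%s", name);  /* truncates instead of overflowing */
        printf("hello %s\n", buf);
    }

    int main(void) {
        greet_unsafe("hi");                     /* fine only because "hi" fits */
        greet_safe("a string much longer than sixteen bytes");
        return 0;
    }

Memory-safe languages simply won't let you write the first version at all, which is where the "push for safer languages" argument below comes from.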

And blaming this on the market is a cheap attempt to dodge responsibility. If programmers paid more than lip service to responsibility, they'd push for safer languages.


> And blaming this on the market is a cheap attempt to dodge responsibility. If programmers paid more than lip service to responsibility, they'd push for safer languages.

If programmers paid more than lip service to responsibility, the whole dumb paradigm of "worse is better" would not exist in the first place. As it is, we let the market decide, and we even indoctrinate young engineers into thinking that business needs are what always matter most, and that everything else is a waste of time (er, "premature optimization").


> If programmers paid more than lip service to responsibility, the whole dumb paradigm of "worse is better" would not exist in the first place.

I used to think like this but I've come to realize that there are two underlying tensions at play:

- How you think the world should work;
- How the world really works.

It turns out that good technical people tend to dwell a lot on the first line of thinking.

Good sales/marketing types, on the other hand, (are trained to) dwell on the second line of thinking, and they exploit this understanding to sell stuff. Their contributions to a company are, in general, easier to measure than an engineer's, since revenue can be directly attributed to specific sales efforts.

"Worse is better" is really just a pithy quote on how the world works and it's acceptance is crucial to building a viable business. Make of that what you will.


The world doesn't always work that way, though. There are plenty of areas where we've decided that the cost of "worse is better" is unacceptable, and legislated it into only being acceptable in specific situations. Many engineering disciplines, for example.


The prime directive of code written for a company really is to increase profits or decrease costs, though. Most of the time, just getting the job done is all that matters. Critical services and spacecraft code are exceptions.


Yes. Which is precisely the root of the problem. Increasing profits and decreasing costs are goals of the company, not of the people who will eventually use the software (internal tools are an exception). The goals of companies and users are only partially aligned (and the better your sales & marketing team is, the less they need to be aligned).


> And blaming this on the market is a cheap attempt to dodge responsibility.

How many hacks, data breaches, and privacy violations does it take for consumers to start giving a shit?

Also, any programmer will tell you that just because an issue is tagged "security" doesn't mean it will make it into the sprint. Programmers rarely get to set priorities.


> How many hacks, data breaches, and privacy violations does it take for consumers to start giving a shit?

There's a quote by Douglas Adams that pops up in my mind whenever this subject comes up:

> Human beings, who are almost unique in having the ability to learn from the experience of others, are also remarkable for their apparent disinclination to do so.

This is the only explanation there can be for it. Every time there's a breach somewhere (of which there are obviously plenty), there's a big outrage. But those who should go "oh, could that happen to us, too?" choose to ignore it, usually with hand-waving explanations of how the other guys were obvious idiots and why the whole thing doesn't apply to them.

This obviously goes for consumers and producers.


Exactly this. The last company I was at had a freelance sysadmin and a couple of full-time devs. The sysadmin had been banging on for ages that we needed a proper firewall set up. It was only after we thought we had been hacked (it turned out to be a valid ssh key on a machine that we didn't recognize) that we checked and found at least half of the Windows machines were infected with crap. Only then did they get the firewall. We decided not to admit our mistake about the ssh key, as it seemed like the only way to get things done.


> How many hacks, data breaches, and privacy violations does it take for consumers to start giving a shit?

https://en.wikipedia.org/wiki/Say%27s_law

In other words, it takes a better alternative to exist. Better can mean cheaper or faster or easier - a lot of things. That can be accelerated by the economic concept of "war" (i.e. any situation that makes alternatives a necessity).


I don't think it's about "dodging responsibility"; it's just an examination of the tradeoffs involved in development. The code we're developing is becoming more transitory over time, not less. How secure does a system that is going to be replaced by the Next Cool Thing in 4-5 years need to be? It really depends on what you are protecting as much as anything.

The incentives for someone to break into a major retailer, credit card company, or credit bureau are much different from those for Widget Co.'s internal customer-service web database. What I think the article is missing, even though it makes a lot of good points, is that if there's a huge paycheck at the end of it, there will always be someone trying to exploit your system, no matter how well designed it is. And if they can't hack the code quickly, they'll learn to "hack" the people operating the code.


> And blaming this on the market is a cheap attempt to dodge responsibility.

You are oversimplifying. I don't know what programming area you work in (or if it's software at all), but "we work with languages X and Y" is something you'll find in 100% of job adverts.

Tech decisions are pushed down as political decisions by people who can't tell a Lumia phone from an average Android. That's the real problem in many cases.

That there are a lot of irresponsible programmers is a fact as well.


Buffer overflows used to be a thing in all software. Nowadays they're relegated (essentially) to stuff written in C.

It used to be that RandomBusinessApp would hit this stuff; now most of it ends up in Java, so it might still crash, but usually it's mitigated better.
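A rough sketch of that difference in C (checked_get is a made-up helper, just mimicking what a managed runtime does on every array access): the unchecked read silently hands back whatever sits past the array, while the checked one turns the same bug into an immediate, diagnosable crash.

    #include <stdio.h>
    #include <stdlib.h>

    /* C-style access: reading past the end is undefined behavior and
     * typically just returns whatever bytes happen to be adjacent. */
    static int raw_get(const int *arr, size_t i) {
        return arr[i];
    }

    /* What Java-like runtimes do implicitly: bounds-check every access and
     * fail loudly (think ArrayIndexOutOfBoundsException) instead of corrupting. */
    static int checked_get(const int *arr, size_t len, size_t i) {
        if (i >= len) {
            fprintf(stderr, "index %zu out of bounds (len %zu)\n", i, len);
            abort();
        }
        return arr[i];
    }

    int main(void) {
        int data[4] = {1, 2, 3, 4};
        printf("%d\n", raw_get(data, 1));        /* fine: prints 2 */
        printf("%d\n", checked_get(data, 4, 9)); /* aborts with a clear message */
        return 0;
    }

Either way the program stops, but the crash points straight at the bug instead of handing an attacker a read/write primitive.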


> If programmers paid more than lip service to responsibility, they'd push for safer languages.

Most programmers want to do their job quickly and easily, and go home.


I disagree with one premise... web apps weren't ever seen as a more secure alternative to Windows apps. They were seen as easier to deploy. That was Netscape's big threat to MS: you could deploy an app to a large audience easily. It's hard to get across how hard things were back in the day. Citrix came out as an option as well... same deal, easier to deploy.

People really thought ActiveX was brilliant... until security became an issue. I can remember when the tide changed.

Anyway, fair points otherwise. Cheers.


Agreed. They are easier to deploy, even multiple times per day. This is one of their selling points even today compared to native mobile applications, which have other advantages.

Another advantage is that they are inherently available across OSes, and usually across different browsers (but we know what that takes).

Finally, they used to be much easier to develop.

TL;DR: larger audience, lower costs.


I agree. Web apps were easier to deploy, centrally manage, and deliver than desktop apps, assuming you had a stable connection. In fact it was often hard to get people to run apps on the web, because the internet was either slow or ADSL was unstable. SaaS was considered risky.

The true definition of a full stack developer in those days would make today's full stack developers faint.

You had to know how to set up hardware with an OS, your software, and databases, often running your own gear in a datacentre and figuring out your own redundancy, all for the opportunity to code something and try it out. Being equally competent in hardware, networking, administration, scaling, and developing a web app was kind of fun. Now those jobs are split into many jobs.

ActiveX was what Flash tried to be: the promise Java made of one codebase running everywhere.

Seeing WebAssembly is exciting.


> they'll lose out to systems that promise security (and usually address a few specific attacks on the previous incumbent)

This happens in other areas besides applications as well: programming languages, operating systems. It leads to an eternal reinvention of the wheel in different forms, without ever really moving on.


Yep. Databases, web frameworks, GUI frameworks, editors, concurrency models, social networks, photo-sharing sites, and consumer reviews as well. Outside of computers, it applies to traffic, airlines, politics, publicly-traded companies, education & testing, and any industry related to "coolness" (fashion, entertainment, and all the niche fads that hipsters love).

I refer to these as "unstable industries" - they all exhibit the dynamic that the consequences of success undermine the reasons for that success in the first place. So for example, the key factor that makes an editor or new devtool popular is that it lets you accomplish your task and then gets out of the way, but when you've developed a successful editor or devtool, lots of programmers want to help work on it, they all want to make their mark, and suddenly it gets in your way instead of out of your way. For a social network, the primary driver of success is that all the cool kids who you want to be like are on it, which makes everyone want to get on it, and suddenly the majority of people on it aren't cool. For a review site, the primary driver of success is that people are honest and sharing their experiences out of the goodness of their heart, which brings in readers, which makes the products being reviewed really want to game the reviews, which destroys the trustworthiness of the reviews.

All of these industries are cyclical, and you can make a lot of money - tens of billions of dollars - if you time your entry & exit at the right parts of the cycle. The problem is that actually figuring out that timing is non-trivial (and left as an exercise for the reader), and then you have to contend with a large amount of work and similarly hungry competitors.


> concurrency models

We started out with OS threads (I guess processes came first, but whatever) and now we're trying to figure out what the next paradigm should be. It looks to me like it's Hoare's CSP (channels, etc.) for systems programming and actors for distributed systems, both really, really old ideas. To be fair there are other ideas (STM, futures, etc.) that fill their own niches, but they either specialize on a smaller problem (futures) or they're still not quite ready for popular adoption (STM). If this is cyclical then I think we're pretty early in the first cycle.

Sure, the spotlight moves from one model to the other and back, but that's because the hype train cannot focus on many things at the same time, not because the ideas go out of style.


> So for example, the key factor that makes an editor or new devtool popular is that it lets you accomplish your task and then gets out of the way, but when you've developed a successful editor or devtool, lots of programmers want to help work on it, they all want to make their mark, and suddenly it gets in your way instead of out of your way.

Only if it is open source. Seems like Sublime Text (just an example) has avoided this effect... perhaps evidence that open source is not the best model for every kind of software?


How do we fix this?


We don't. Learn to embrace it instead.

There's a flip side to everything. In this case, if you "fixed" this problem, it would imply a steady-state world where nothing ever changed, nothing was ever replaced, and nobody could ever take action to fix the things bugging them. To me, this is the ultimate in dystopias. It's like the world in The Giver or Tuck Everlasting, far more oppressive than the knowledge that everything we'll ever build will eventually turn to dust.

Or we could get rid of humans and let machines rule the earth? Actually, that wouldn't work either: these dynamics are inherent in any system with multiple independent actors and a drive toward making things better. If robots did manage to replace humans (ignoring the fact that this is already most people's worst nightmare), then the robots would simply find that all their institutions were impermanent and subject to collapse as well.


Is there no possibility of steady progress without having to continually discard good solutions and reinvent things (e.g. web development catching up with the 90s)? Someone on this thread said that our field has no institutional memory. Can we at least fix that?


You run up against Gall's Law [1]. The root cause is that many of our desires are actually contradictory, but because human attention is a tiny sliver of human experience, whenever we focus our attention on some aspect of the system we can always find something that, taken in isolation, can be improved. (I'd be really disappointed if we couldn't, actually; it'd mean we could never make progress.) However, the "taken in isolation" clause is key: very often, the reason the system as a whole works is that we compromised on the very things that annoy us.

Remember that in some areas, the web is far, far more advanced than software development was in the 90s. It's not unheard of for web companies to push a new version every day, without their customers even noticing. At my very first job in 2000, I did InstallShield packaging and final integration testing. InstallShield had a very high likelihood of screwing up other programs on the system (when was the last time Google stopped working because Hacker News screwed up its latest update?), because all it did was write to various file paths, most of which were shared amongst programs and had no ACLs. So I'd go and stick the final binary on one of a dozen VMs (virtualization was itself a huge leap forward) where we could test that everything still worked in a given configuration, and try installing over a few other applications that did similar things to make sure we weren't breaking anything else. We never did ship - we ran out of money first - but typical release cycles in that era were around 6 months (you still see this in Ubuntu releases, and even that was a huge improvement on what came before).

And this was still post-Internet, where you could distribute stuff on a webserver. Go back another decade and you'd be working with a publisher, cutting a master floppy disk, printing up manuals, and distributing to retail stores. You'd have one chance to get it right, and if you didn't, you went out of business.

The thing is, many of the things that made the web such a win in distribution & ubiquity are exactly the same things that this article is complaining about. Move to a binary protocol and you can't do "view source" or open a saved HTML file in a text editor to learn what the author did; programming becomes a high priesthood again. Length-prefix all elements instead of using closing tags and you can't paste in a snippet of HTML without the aid of a compiler; no more formatted text on forums, no more analytics or tracking, no more like buttons, no more ad networks (actually, I can see the appeal now ;-)). Require a compiler to author & distribute a web page and you can't get the critical mass of long-tail content that made the web popular in the first place.

You can see the appeal of all of these suggestions now, in a world where things have gotten complicated enough that only the high priesthood of JS developers can understand it anyway, and we're overrun with ads and trackers and like buttons that everyone has gotten tired of anyway, and a few big companies control most of the web anyway. But we wouldn't have gotten to that point without the content & apps created by people who got started by "view source" on a webpage.

[1] https://en.wikipedia.org/wiki/John_Gall_(author)#Gall.27s_la...


You make a lot of good points.

My concern, as readers who have seen some of my other HN comments may guess, is that the next time someone starts over, they'll neglect accessibility (in the sense of working with screen readers and the like), and people with disabilities will be barred from accessing some important things. "How hard can it be?", the brave new platform developer might think. "I just have to render some widgets on the screen. No bloat!" It's hard enough to make consistent progress in this area; it would help if there were less churn.

Edit: I guess what I (very selfishly) wish for is steady state on UI design and implementation so accessibility can be perfected. I know that's not fair to everyone else though. Other things need improving too.


As someone who had to help "teach" JAWS about UI elements on a friend's computer back in '05-'07, I think accessibility should be the first concern. If anything, that's one upside to Google: the spider "sees" like a blind person does. The better-crawled a page is, the more likely it is that you won't lose massive page elements.


FWIW I'd consider it the opposite of selfishness to want to improve accessibility.


Selfish in that, in my heart of hearts, I want what benefits me and my friends (some of them), to the exclusion of what the rest of the industry seems to pursue (churn in UI design and implementation, chasing the latest fashion in visual design).


> Move to a binary protocol and you can't do "view source" or open a saved HTML file in a text editor to learn what the author did

I disagree with that. Using binary formats to exchange data between programs doesn't preclude using textual formats at the human/machine boundary. Yes, "view source" needs to be more intelligent than just displaying raw bytes, but that is already the case with today's textual formats. Everything is minified and obfuscated, so the browser dev tools already have to include a "prettify" option. Moving to a binary protocol would turn that into "decompile" and make it mandatory, but it effectively already is.
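As a minimal sketch of that boundary in C, with a made-up wire format (not any real protocol): the bytes exchanged between programs are binary, but a trivial "view" step renders them back as text for humans, which is roughly what a prettify/decompile button would do.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Hypothetical wire format: [1-byte tag][1-byte length][payload bytes].
     * Assumes the payload is shorter than 254 bytes and fits the buffer. */
    static size_t encode(uint8_t *out, uint8_t tag, const char *payload) {
        size_t len = strlen(payload);
        out[0] = tag;
        out[1] = (uint8_t)len;
        memcpy(out + 2, payload, len);
        return len + 2;
    }

    /* The "view source" counterpart: turn the binary record back into text. */
    static void dump(const uint8_t *buf) {
        printf("tag=%u len=%u payload=\"%.*s\"\n",
               buf[0], buf[1], (int)buf[1], (const char *)(buf + 2));
    }

    int main(void) {
        uint8_t wire[64];
        encode(wire, 1, "hello");   /* binary between programs... */
        dump(wire);                 /* ...text at the human/machine boundary */
        return 0;
    }

Whether that view step is "prettify" for minified JS or a text rendering of a binary module, the human-facing text doesn't have to be the thing that travels over the wire.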

Requiring a compiler to author and distribute a web page is no different than requiring a web server or a CGI framework or the JS-to-JS transpiler du jour. It adds another step in the pipeline that needs to be automated away for casual users, but that's manageable. Even if the web world moves to binary formats (as WebAssembly seems to indicate), your one-click hosting provider can still let you work with plain HTML/CSS/JS and abstract the rest; just like it abstracts DNS/HTTP/caching/whatever.


> the browser dev tools already have to include a "prettify" option. Moving to a binary protocol would turn that into "decompile" and make it mandatory, but it effectively already is.

This will be a legal problem. At least in my jurisdiction, transforming source code (which is what prettifying is) is not subject to legal restrictions, but decompiling binary machine code into readable source code is forbidden by copyright law. (For the same reason, I'm concerned about WASM.)


Steady state progress... towards what?

That one single goal we all share and agree on, and know exactly how to get to so progress can be steady and incremental and continuous?


That's not a million-dollar question but one worth tens or even hundreds of billions. If you can find the answer to it, you'll push us across the hump and away from this local oscillating maximum.


You make the perfect product

You strive for excellence

You keep improving

Like Jiro did with sushi

And then the product dies with you


> The security aspect was an interesting part of this piece, because one of the main reasons webapps took over from Windows apps is that they were perceived as more secure. I could disable ActiveX and Java and be reasonably confident that visiting a webpage would not pwn my computer, which I certainly couldn't do when downloading software from the Internet.

Indeed. And then we made sure all interesting data (email, business data, code (GitHub/Gerrit, etc.)) was made available to the web browser - so pwning the computer became irrelevant.

It's indeed like the 90s - from object-oriented office formats, via macros, to executable documents - to macro viruses - and total security failure. Now we have networked executable documents with no uniform address-level acl/auth/authz framework (as one in theory could have on an intranet-wide filesystem).

So, yeah, I kind of agree with the author - we're in a bad place. I used to worry about this 10 years ago; by now I've sort of gotten used to the idea that we run the world on duct tape and hand-written signs that say: "Keep out - private property. Beware of the leopard."


> I could disable ActiveX and Java and be reasonably confident that visiting a webpage would not pwn my computer

Unfortunately, this is not entirely true. There were bugs in image processing, PDF processing (some browsers would load it without user prompting), Flash, video decoders, etc. IIRC even in JS engines, though those are more rare. Of course, you could go text-only, but then you couldn't properly access about 99% of modern websites.


When there was a bug in PDF processing, you'd end up with an RCE, right?

But downloading an EXE is basically allowing arbitrary code execution on your machine no matter what. So _even with the security bugs_, webapps are basically safer than installing a native app on desktop, at least in its current state.

I see your point, though. There are still a lot of entry points we need to be careful about.


It doesn't help that curl | sh has become trendy.


The JavaScript security model breaks down in the case of file:///; no overflows are required. The security you get today is more flimsy than you probably think. And it used to be far worse.



