If operating systems had ever provided security strong enough to run untrusted code, we wouldn't have needed the browser to do it. It's really a failing of operating system security: even today, no operating system provides an application sandbox as strong and versatile as the browser.
Platforms have fallen back on walled gardens (app stores) with centralized control of all code execution to compensate for the deficiencies of their security models. I don't want a future where the only apps I can run have to be signed and approved by Apple ahead of time. The web is the escape hatch.
The whole security angle is an after-the-fact rationalization, and the vast majority of users (not developers, users) never cared much about it, as anyone in the IT department of pretty much any company can tell you (again: users, not developers, not sysops, not admins, but regular end users).
The real reason we see so many web apps is that the choice of technology for an application is largely made by developers (by "developers" I mean not only individuals but also the companies and organizations that develop applications). The web gives developers almost complete control over what goes on in their application and over what users can do and how; it lets them force everyone onto the same version, allows real-time monitoring of how users use their applications, and provides tighter vendor lock-in than even the most obfuscated native application could (since all the data is stored on the servers).
Security is just a very convenient excuse, since a lot of people shut down their critical thinking whenever it comes up. My guess for why that happens: a lot of people have been shamed by supposed security experts, even more people have mimicked that shaming, and we ended up with everyone just shutting up whenever security is brought up to avoid looking like That Clueless One who would be shamed next. But that is just a guess.
But the real reason is the heavy control that web apps give developers, and the vast majority of users do not really understand how biased against them that setup is.
Have you looked at the network traffic or behavior of most native apps? Continuous monitoring and cloud-based state are the norm everywhere, and for the same reasons.
Not all apps do that (in fact, I cannot think of any desktop application I have installed that does something like this, and if I could, it would get the boot). And because applications run locally, it is possible to monitor and control their behavior: with a server you just don't know what is going on, but with a local app you can at least tell that something is going on.
Also, while the UI is far from ideal (at least on Windows), you can block individual applications from accessing the Internet. It should be much simpler than it is now, though.
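For what it's worth, on Windows this can be done from an elevated prompt with the built-in firewall, no third-party tools needed. A minimal sketch (the rule name and program path below are placeholders, substitute the app you actually want to cut off):

```shell
# Block all outbound traffic for one specific program (run as Administrator).
# "Block ExampleApp" and the path are placeholders for illustration.
netsh advfirewall firewall add rule name="Block ExampleApp" dir=out action=block program="C:\Program Files\ExampleApp\app.exe" enable=yes

# Check that the rule exists.
netsh advfirewall firewall show rule name="Block ExampleApp"
```

Which rather proves the point about the UI: the capability is there, but few regular users will ever find it.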
What do you mean by stolen? Someone making something similar, or making unlicensed copies? For the former, I do not see how web apps prevent that; for the latter, it only matters when the developer charges for each license, which is extremely rare with web apps anyway (the closest is subscriptions and rentals, but that happens with desktop/native applications too, e.g. see Adobe and Autodesk). Even that is really a facet of the heavily developer-biased control I mentioned that web apps provide.
But people still download and run apps every day. Some even prefer it. I doubt this migration was driven by security concerns -- since when are app developers particularly concerned about the quality of the security controls imposed on them? I suspect the shift was more about ease of access, both for the user and for the developer, helped along by easier compatibility.
App developers care about the barrier to getting users to use their app. The barrier to getting a user to click on a weblink is much, much lower than the barrier to getting them to install an app.
This is one of the largest failings of the App Store providers. They should recognize this installation barrier and work towards fast and ephemeral app installs.
Right, but users can only use an app once it has been developed, and ultimately if the user needs an app, they will go with whatever format it's being distributed in. Then there's also the convenience factor — Gmail is a good example of this — where a web app is more convenient than a downloadable app that does the same thing, because it requires no installation or updates.
I would also suspect that most users have no clue about the security implications of using something in a browser versus using an app.
Well, my wife is an exception to that. She installs all kinds of crap apps on her phone.
And we got our phones through our daughter who works at Verizon, so when my wife moved from an Android to an iPhone they called me up and asked for my iCloud password which I, like a dumbass, gave them.
I just checked and I've got four more bullshit apps on my phone I need to delete :D
Sure, running untrusted code will never be 100% safe. But visiting a random website is 99% safe where running random .exe’s is 1% safe. Neither is perfect, but in practice, one is good enough for most situations and the other isn’t.
What counts is whether the platform makes it easy to use and promotes it so that average users actually use it.
If I create a native Windows app and link people the .exe to try it out, approximately 0% of people who run it are going to run it in a securely sandboxed way. If I create a web app and link people it to try it out, approximately 100% of people who run it are going to run it in a securely sandboxed way.
Furthermore, some people will specifically avoid trying out the .exe I send them because they don't trust me fully with everything on their computer. As a developer that wants to show off things I make, I don't want this obstacle to exist. If I make a web app instead, I know it's more likely people will try it out.
Virtual machines per app are a thing, but they have a lot of disadvantages (particularly resource consumption). Various sandboxes have existed for a long time, but I’m curious which ones you would point out that have survived anywhere near the scrutiny and attack surface that browsers have.
Is voluntarily letting Google, Facebook, or Apple control every aspect of our lives by sending them all our personal data, contacts, and network information more secure?
I would rather run every application in its own VM under a different unprivileged user.
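A lighter-weight approximation of the per-user half of that idea already works on Linux today, without a full VM: give each application its own unprivileged account, so ordinary Unix permissions wall it off from your files. A sketch, where the account name and binary path are placeholders:

```shell
# Create a dedicated, unprivileged system account for one application
# ("appuser-foo" is a placeholder name).
sudo useradd --system --create-home --shell /usr/sbin/nologin appuser-foo

# Run the application as that user; its processes and files are
# separated from your own account by standard Unix permissions.
sudo -u appuser-foo /usr/bin/some-app
```

This gives weaker isolation than a VM (shared kernel, shared X11/Wayland session unless you also separate those), which is presumably why the parent wants a VM per app in the first place.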
P.S. The browser is the main attack vector on mobile, not only because it's so complex that bugs are everywhere, but mostly because web app security sucks.
Browsers usually have fairly strong security models, since they are expected to constantly run untrusted code (which has nothing to do with web app security, FWIW). Apps rarely get this kind of scrutiny and often don't (Android) or can't (iOS) employ features that browsers can, such as multiple processes.
Many game engines run untrusted code as well; just think of DOOM mods.
I think the point is that browsers are not as good an application platform as an OS (given the limitations), yet are as complex as an OS and have more bugs.
The fact that mobile apps are terrible is not an excuse for having a terrible document protocol used for applications.
Apple invented mobile apps as we know them today, but native apps in general have served people well for ages.
We are at a point where a native app with some API is more maintainable than a browser app.
And that's not even talking about the ecosystem and its fiascos, like corrupted npm libs used by millions without anyone looking at a single line of source code, or the famous left-pad incident.
It doesn't really matter where the weakness is, if it is exploitable.
As Alan Kay once said, "the web was made by amateurs at best".