I agree. This feels to me more like inducing paranoia to push some hidden agenda. The "open" design is a feature, not a bug, and it's why mobile apps, with their isolated design, only encourage siloed data and horribly wasteful workarounds (upload something from one app to some distant third-party cloud service, then download it again for another app to access it... provided those two apps' developers have explicitly decided to provide such a feature).
Every binary I run could delete all my data or steal my passwords or whatever else. That's not something to be afraid of --- it's something to cherish and be proud of[1], and it's why you don't run anything you don't trust. This "free-for-all" sharing and access encourages the sort of ad-hoc and unpremeditated interactions which are beneficial to the software ecosystem as a whole.
I usually have a magnifier, a color identifier, and a clipboard-listening translator running. None of those would be easily implementable, and some might not exist at all, had computers started out as locked-down systems from the beginning.
There's a difference between locked-down-by-Apple and locked-down-by-you according to your own choices (e.g. Qubes). Being able to run software from others without being completely vulnerable to them is essential to the open ecosystem we want. A free-for-all OS is like meatspace without private property: it can only work at a small scale.
The data siloing on iOS is a UI decision, not a sandboxing decision. Apple decided most people simply don't understand filesystems (probably based on all the people they've seen save their files in the default directory, or on the desktop, or in the recycle bin(!), and then fail to find them again).
In the macOS sandbox, an app can access any file on disk, but only ones the user has implicitly told the app it can access: through the system "open file" dialog, double-clicking, drag & drop, etc.
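To make that concrete, here is a minimal Swift sketch of that pattern (sometimes called a "powerbox"), assuming the app itself is sandboxed with no broad filesystem entitlements:

```swift
import AppKit

// The open panel runs out-of-process; the sandbox grants the app access
// to exactly the file the user picked, and nothing else.
let panel = NSOpenPanel()
panel.canChooseFiles = true
panel.begin { response in
    guard response == .OK, let url = panel.url else { return }
    // Reading succeeds only because the user explicitly chose this file;
    // a sibling file in the same directory would still be blocked.
    if let contents = try? Data(contentsOf: url) {
        print("Read \(contents.count) bytes from \(url.path)")
    }
}
```

If the app needs that access to survive a relaunch, it can persist a security-scoped bookmark for the chosen URL.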
A design that dooms even savvy users to being compromised is simply not acceptable anymore. Most people are not discriminating enough about the software they install. But okay, maybe you actually are, and you never make mistakes[1]. Still: is all the software you use signed? By a certificate traceable to a real, accountable person? Can a software update, perhaps released by someone who has hacked the original developer, come along and turn it into malware? Because all of your apps have full capabilities, such an update can now do things you would never grant if explicitly asked.
Now you're completely screwed. All of your emails, pictures, medical information, other embarrassing personal information/media, bank logins, and so forth are compromised. Maybe you have two-factor authentication for all of your accounts (doubtful), but who cares? The malware can just read the memory of your browser process, quietly wait for you to log in to the various services, and then hijack your sessions in the background.
So tell me: how in the world is this an acceptable situation?
[1] I sure hope you never use any of these new-school package managers for various programming languages, text editors, and such. You know, the ones that grab the latest commits from GitHub repositories run by total strangers and then run them with full privileges.
Even if it's signed software, the sandbox is a useful layer for limiting the scope of any vulnerabilities it has. Say a media player has a bug in its decoder that lets a malicious video file make the player execute arbitrary code. If the program is sandboxed, the attacker shouldn't be able to pivot from that into stealing your SSH keys or recording your screen.
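On macOS, the usual way to get that containment is privilege separation: move the risky decoding into a separate XPC service with a minimal sandbox. A hedged sketch; the service name and protocol below are hypothetical:

```swift
import Foundation

// Hypothetical protocol for a tightly sandboxed decoder helper.
@objc protocol DecoderProtocol {
    func decode(_ data: Data, reply: @escaping (Data?) -> Void)
}

// Connect to the helper, which runs in its own process with its own
// (much stricter) sandbox profile.
let connection = NSXPCConnection(serviceName: "com.example.DecoderService")
connection.remoteObjectInterface = NSXPCInterface(with: DecoderProtocol.self)
connection.resume()

// Feed the untrusted file to the helper; even if a malicious video takes
// over the helper process, its sandbox reaches neither ~/.ssh nor the screen.
let untrustedVideo = Data() // stand-in for bytes from a downloaded file
if let decoder = connection.remoteObjectProxy as? DecoderProtocol {
    decoder.decode(untrustedVideo) { frames in
        print("decoded \(frames?.count ?? 0) bytes back in the main app")
    }
}
```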
> So tell me: how in the world is this an acceptable situation?
It is benefit vs. cost: for me it is acceptable because I want to be able to do whatever I want with my computer, using applications that can freely interact with each other without the OS getting in my way. I want the underlying system to provide the functionality for applications to perform their tasks, not add unnecessary barriers that make some features practically impossible.
Also, FWIW: personally, as a developer I highly dislike the idea of signed software. First, I do not like having to ask (let alone pay) anyone for my program to be runnable by others, and second, I do not like my name being carried alongside my programs unless I explicitly add it.
> It is benefit vs. cost: for me it is acceptable because I want to be able to do whatever I want with my computer, using applications that can freely interact with each other without the OS getting in my way.
It is not the eighties anymore. Software is exploited through malicious files that target vulnerabilities in file readers (e.g. malicious PDFs). Software distribution sites get compromised (Handbrake, Transmission, Linux Mint), etc. Signatures and proper sandboxing reduce the attack surface considerably.
Also, to address your point: sandboxing does not have to be binary. For instance, on macOS a sandboxed app can ask the user for access to the calendar or address book. That way some random app can't just steal your address book.[1] By default an application cannot open any files from the user's home directory unless the user e.g. chooses one using a file-open dialog (which runs out-of-process). With Flatpak on Linux, you as a user can decide to run an application with more entitlements (AFAIK currently only through the command line).
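As a sketch of what that request looks like in practice, using the Contacts and EventKit frameworks (the matching usage-description strings in the app's Info.plist are assumed and omitted here):

```swift
import Contacts
import EventKit

// The system, not the app, shows the consent prompt; without the user's
// approval these calls simply report denial and the data stays sealed off.
let contactStore = CNContactStore()
contactStore.requestAccess(for: .contacts) { granted, error in
    print(granted ? "address book access granted" : "address book denied")
}

let eventStore = EKEventStore()
eventStore.requestAccess(to: .event) { granted, _ in
    print(granted ? "calendar access granted" : "calendar access denied")
}
```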
We need to go to a world where the power lies with the user and not the developer to decide what an app can access, upload, etc. macOS sandboxing and Flatpaks on Linux are now providing part of that solution.
> Also, FWIW: personally, as a developer I highly dislike the idea of signed software. First, I do not like having to ask (let alone pay) anyone for my program to be runnable by others
Flatpaks on Linux are normally signed, but they can be signed with your own GnuPG key. No need to pay or ask permission. Of course, the downside of this model is that it puts the burden on the user to check that the key is owned by a reputable person.
> second, I do not like my name being carried alongside my programs unless I explicitly add it
Most users should not want to run a program by a random person.
[1] Don't say that free software is not vulnerable to such problems: a popular free-software documentation browser embeds a Google ad, and some open source applications use Google Analytics. I do not want to upload any data to Google. Unfortunately, as a user, I can only avoid that by inspecting all of my outgoing connections, which is out of reach for a normal user. I would absolutely be in favor of sandboxes where, by default, an app could not make any network connection.
> We need to go to a world where the power lies with the user and not the developer to decide what an app can access, upload, etc. macOS sandboxing and Flatpaks on Linux are now providing part of that solution. [..] Flatpaks on Linux are normally signed, but they can be signed with your own GnuPG key. No need to pay or ask permission. Of course, the downside of this model is that it puts the burden on the user to check that the key is owned by a reputable person.
I do not see those as placing power in the hands of the user; I see it more as placing power in the hands of the platform holder that the developer develops for and the user uses. The user should be able to do whatever they please, even against the developer's wishes, and for that the platform must allow the user to subvert both the platform's restrictions and the program's requests.
Anything that doesn't put the user in a position of utmost authority and trust does not place power in the hands of the user.
I think the emphasis is supposed to be on the ‘should’. That is, it’s not in the user’s rational interests to run such software. Hence the locked down security model all the platforms are moving toward.
Everything you install from the core Debian stable repositories is pretty much guaranteed not to contain actively malicious code; otherwise it would be discovered and removed. All Debian packages are signed as part of the package-upload process. It's not perfect, but it's been good enough so far. This is a world that is now niche, but it used to be practical!
It's very depressing to be transitioning into a world where you can't use a small, trusted open-source mail client, IM client, etc.; where you have to install huge, untrusted piles of flaky and inefficient code in order to interact with real-world people and institutions; where you can't easily control which version you have installed; and where mobile devices become obsolete after just three years. (Even iPhones - not because of Apple, but because Instagram and Snapchat got too heavy/shitty for the older models! This does not happen for communication over SMTP or XMPP!)
So I see "we need super sandboxes around every little thing" as a sort of band-aid on a much bigger problem, one that is not going to be fixed in the foreseeable future.
Like the XScreenSaver time bomb that popped a nag screen up on people's screens on April 1 a year or two ago? Debian distributed that code for years without realizing it, because no one even glanced at the code. The developer didn't even try to hide the time bomb; he even left angry comments explaining why he added it.
The only reason Debian hasn't yet distributed some really bad malicious code (that we know of) is that they got lucky and no developer has tried to mess with them.
You’re assuming all Debian packages are secure. Like wlesieutre said in a sibling comment, software from a trusted source can and often does have vulnerabilities.
[1] https://boingboing.net/2012/08/23/civilwar.html