This and Stallman's "Right to Read" (posted yesterday at https://news.ycombinator.com/item?id=14332257 ) are particularly relevant given the whole WannaCry situation --- no doubt there will be plenty of authoritarian-minded thinking that computers (and maybe even programming them) should be locked down/regulated more, to stop such attacks.
In some sense, I take solace in the fact that hacks, attacks, cracks, leaks, etc. continue to happen regularly --- they are a sign that there is still some freedom left in society. "Imagine a world without crime" is a somewhat common phrase, and if you actually do imagine it, you will realise that it would pretty much be the world of Orwell's 1984: there is no crime because there is no more freedom of thought or action; everything has come under the control of some central authority.
This goes beyond computers, although they will be a large part of it; it's really a general war on freedom.
> In some sense, I take solace in the fact that hacks, attacks, cracks, leaks, etc. continue to happen regularly --- they are a sign that there is still some freedom left in society.
I think this strongly conflates many issues. First of all, WannaCry spreads through a weakness in Microsoft's SMB implementation. Such a weakness can occur in both walled-garden and open systems.
Secondly, sandboxing, besides improving security, can increase our freedom: we get to decide which application can access what data. In the model currently used on many computers, an application can access all user files and exfiltrate usage patterns, address books, etc. WannaCry would be pretty ineffective if it were sandboxed and could not encrypt users' files. As it stands, such malware (as we've seen with the Handbrake malware) can upload all your passwords, browser history, etc.
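To make the sandboxing point concrete, here's a minimal sketch of deny-by-default file access --- the `Sandbox` class and its path-grant model are invented for illustration, not how any real OS implements it:

```python
# Purely illustrative: a deny-by-default file broker. The Sandbox class
# and the granted-paths model are made up for this example.
import os

class Sandbox:
    def __init__(self, app_name, granted_paths):
        self.app_name = app_name
        # Only paths the user explicitly granted; everything else is denied.
        self.granted = [os.path.realpath(p) for p in granted_paths]

    def open(self, path, mode="r"):
        real = os.path.realpath(path)
        if not any(real == g or real.startswith(g + os.sep) for g in self.granted):
            raise PermissionError(f"{self.app_name} has no grant for {path}")
        return open(real, mode)  # builtin open; access was approved

# Ransomware running under such a broker can scramble its own data
# directory, but it cannot even open the user's documents to encrypt them.
app = Sandbox("some-app", ["/tmp/some-app-data"])
try:
    app.open("/home/user/Documents/thesis.odt", "r+b")
except PermissionError as e:
    print(e)
```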
Sandboxing is fine. It should be up to the user to decide what gets sandboxed and what doesn't.
(I do agree that society is moving towards being totalitarian. Or at least, making itself extremely vulnerable to totalitarianism.)
Exactly. First of all, I'd like to be in control of sandboxing. Secondly, and this is just my dream, I'd love sandboxing to be designed to allow me as a user to easily pierce it, so that I could e.g. augment a sandboxed app with my own code if I don't like how it does something.
Unfortunately, the nature of the world is such that if you allow users to control sandboxing, the next wave of attacks will come from applications that kindly ask users to disable the sandbox because of $reasons.
> Unfortunately, the nature of the world is such that if you allow users to control sandboxing, the next wave of attacks will come from applications that kindly ask users to disable the sandbox because of $reasons.
You can make the best locks in the world, and it's all for nothing if, every time an attacker knocks on the door, the user opens it up and lets them in --- but the solution can't be to weld every door shut.
Security commonly fails at UX. We could do better.
But at some point, if you ask the user "should this app access your private information" and the user says yes, that's what needs to happen, and the user needs to learn when to say no.
Do not make it easy to disable the sandbox, but keep the user in control.
Linux avoids a huge amount of viruses and phishing just because you have to `chmod +x` stuff before running it, for example. A "Software wants to do nasty stuff. Allow | Disallow" prompt just does not cut it.
Also, users will sometimes want to pierce the sandbox, so do not make it an all-or-nothing situation. Make granting common permissions easy and rare permissions hard, and give fine-grained control over them.
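A rough sketch of what that tiering could look like --- the permission names and UI hooks below are all invented:

```python
# Invented permission names and UI hooks; just illustrating the tiering.
COMMON = {"camera", "microphone", "location"}      # one-tap Allow/Disallow
RARE   = {"read_all_files", "raw_network", "install_system_service"}

def request_permission(app, perm, prompt, confirm_typed):
    if perm in COMMON:
        # Routine request: a simple yes/no prompt is proportionate.
        return prompt(f"{app} wants access to: {perm}")
    if perm in RARE:
        # Dangerous request: force a deliberate act, so an app that
        # "kindly asks" can't get through on a reflexive click.
        return confirm_typed(f"Type '{app}' to let it {perm}") == app
    return False  # anything unknown is denied by default

# Example wiring with console stand-ins for the OS dialogs:
granted = request_permission(
    "PhotoApp", "read_all_files",
    prompt=lambda msg: input(msg + " [y/N] ").strip().lower() == "y",
    confirm_typed=lambda msg: input(msg + ": ").strip(),
)
```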
Big data, machine learning, and things like facial recognition in public spaces are increasingly leading us toward a society of total control.
We are not there yet, but I personally fear that all it would take is the political realization that the ability to predict behavior leads to more efficient governance.
Technology has the ability to "set us free" or "enslave us", but it's up to the people to decide which they want in a democracy. In the current conservative climate, we're certainly heading more toward dystopia, racing rapidly to make some of the darker cyberpunk come true.
In short, they identified easily manipulated people by analyzing Facebook profiles, and manipulated them through Facebook ads. For me this is like a real-life botnet. The laws say 1 person, 1 vote, so why not manipulate the person into voting the way "we" want them to? In a big-data way, targeting swing states...
Not GP, but: not just people who disagree with me. People are influenced by what they're exposed to[0], and even if that effect is small on an individual level, in aggregate it can impact large portions of the population if you're able to manipulate the information that reaches people.
Farm animals are more productive, and they spread quickly --- look at how many cows there are.
On the downside, look at how they live.
I'm afraid this comment isn't up to HN standards, but I've always thought the teleological end of humanity was something that resembles an ant colony more than a wolf pack, and I'm not sure there's anything to be done to avoid it.
I find both the ant colony and the wolf pack to be bad outcomes --- and something in between, an actual community that doesn't solely "exist to produce" but also isn't "every man for himself", much better.
It's also funny how "ant colony" seems to be the direction of both the now deceased USSR-style socialism and modern capitalism -- with elites on top, total surveillance, and everything, just with different names.
David Brin calls this drive by the elites a meme of feudalism in his essay _Otherness_:
> Feudalism is one of the oldest. It may appear to be rare nowadays, but some philosophers and historians have called it the “most natural” of human societies, simply because it cropped up in so many places throughout the millennia -- everywhere, in fact, that metallurgy and agriculture combined to let close-knit groups establish and enforce inherited aristocracies.
Well, productive in terms of reproduction. There wouldn't be as many cows or chickens or pigs, etc. without us breeding them, is what I meant. Sorry for the lack of clarity.
We need to be careful with the scope here, I think. "War on general-purpose computing" is already abstract enough most people just roll their eyes and move on; expanding that to "war on freedom" guarantees that almost nobody will care.
I have mixed feelings about what to do. On the one hand, there would be merit in requiring a professional license for programmers like in other engineering fields - a lot of the mess in our industry could be removed if at least some jobs would require a license, which would give both a recognized right to refuse work based on ethical issues (backed by professional association) and the liability in case you fucked up badly and people died.
On the other hand, I fear the day when Turing-complete systems will get regulated and require a (probably expensive) license to use. Like many others here, I benefited a lot from being able to tinker with computers and programming languages in my teenage years, before I had access to formal education on the topic. I would like my children to have the same chance.
In a way, my feelings are reflected on a smaller scale in the way I feel about sandboxing. On the one hand, I appreciate the idea of isolation and don't like user-hostile software to be able to do whatever it wants on my system. On the other hand, I'd love to have the right to breach the sandbox myself and mess with software running on it. On Windows I still can alter GUI elements of running applications; try that on unrooted Android.
If it was just about security vs. freedom, then the problem would be relatively easy; some compromise could be reached (e.g. expanded definition of "life-critical" systems which would require licenses, and also hopefully licenses for jobs requiring use of personally identifiable information). As it is, there are other selfish/malicious actors in play - like music/movie industry pushing for DRM, corporations fighting for their walled gardens, etc. It'll be hard to navigate this problem space.
> On the one hand, there would be merit in requiring a professional license for programmers like in other engineering fields - a lot of the mess in our industry could be removed if at least some jobs would require a license
Right, and we all believe that the Volkswagen debacle really was some nefarious programmer doing it without any knowledge of management.
I'm tired of developers getting blamed for this shit, put the blame squarely where it belongs, on the people with the money who are making such decisions.
Though, to be fair, plenty of developers are cool with such management decisions as long as the company is cool or the decision helps their career too. They may even proactively propose such ideas to management.
Management is ultimately responsible for the overall culture. However, bad cultures find plenty of eager employees. We are talking about skilled people able to find a job elsewhere if ethics were a consideration --- not about uneducated dudes having no choice.
They'd probably find someone willing to risk their license for extra money, but it still makes hiring such people more expensive, and could make the management rethink the whole idea.
Good process - meaning testing by people motivated to find bugs - would do a hundred times more for security and quality than certification of individuals. Houses are not safe because of schools; they are safe because of building codes and fines if you break the code.
It is interesting to compare WannaCry to the Google phishing scam that was going around a couple of weeks ago [1].
Obviously really different issues, but the resolution was that the phishing scam was shut down in short order because it relied on Google for OAuth and they turned the app off and revoked the credentials.
I always like to try to pick out the single most useful sentence or two from long essays like this. Here's my go:
"So when I get into a car—a computer that I put my body into—with my hearing aid—a computer I put inside my body—I want to know that these technologies are not designed to keep secrets from me, or to prevent me from terminating processes on them that work against my interests."
Did we even fight the war? Around here I regularly see people advocating for this sort of lockdown. It used to just be Apple, but now it's in Windows and Chromebooks.
There's a fair amount of lockdown with cloud providers as well. Not DRM-related, but there nonetheless. Egress charges make it unsavory to move some apps out, and proprietary API setups make it more difficult to move as well --- things like Google's Cloud Spanner, or AWS-native features.
The big cloud providers will also, over time, shape what's available in hardware in a way that won't benefit on-premise customers.
I'd point to the SaaS model as the death blow. Instead of running computation on our data, we put our data under third-party control so that they can run their computation on what's now really their data. Chromebooks & co. are just an extension of that in a world where SaaS is the dominant model of regular users' interaction with computers.
No, we never had a chance. If you consider the embedded "secure" CPU controllers in the news lately, we have lost control of PCs as well. With baseband firmware, which was never ours in the first place, we have lost mobile.
Just as in the present we have mystery embedded CPUs all over our computers, so in the future we will have mystery embedded modems siphoning off our data. Offline will not be realistically achievable much longer.
Well, the locked-down nature of iOS and Chrome OS are what make them markedly more secure than a general purpose computer in the face of real-world threats. Just this week, my Mom's Chromebook kept her safe from what appears to have been a targeted attack (someone sent her an email with a link to a malicious .ru domain and my name in the From header). That locked-down design is also why I'm writing this response on a Chromebook.
Of course, we must fight tooth and nail against any regulations that would restrict access to general-purpose computing, or put limits on connecting a general-purpose computer to the Internet. But it's important to remember that there are absolutely legitimate reasons to use and recommend locked-down machines, and that the most technically adept of us will use a combination of both categories of computing device in our lives.
You not getting p0wnd is an example of the interests of the machine owner's shareholders and the interests of the data-input substrate (you) being aligned.
You get this security in exchange for all your personal data being uploaded to their servers (for advertising to try and change your behavior for their clients, and for storage for your government, and perhaps others), and Google getting to decide which apps you can run, and which apps developers can publish in the store.
As long as you stay aligned with the interests of the shareholders of the ad-surveillance company (and the government) it's a great deal.
I've wanted a locked down computer ever since every program that I installed in 1995 wanted to put a useless icon in my tray and take over every file type association. General purpose computers are great when I'm writing the program, but awful when running someone else's poorly written program that wants to take control over my computer.
I like to think of a general purpose computer like a giant truck: great if you like that kind of thing, but not for everyone.
What about a thriving Free Software ecosystem where the code can be audited and modified, and where the best apps (i.e. the ones that follow what users generally accept to be a standard of "respect" to your machine and its user, as well as doing their job) bubble up to the top?
Using various Ubuntu-based distros and Free Software apps over the last 8 years, I have never felt like any program was overstepping the boundaries.
Could start by installing Java, let's see, oracle.com, click, ah, what's this? It's installing the Yahoo! browser toolbar by default??!!!
Yeah. Curated.
And even if you do manage to get a curated software store without lockdown that has a business model, it's not clear to me how you solve the problem of people sharing passwords with sketchy family members and friends who then install hidden software that secretly, behind their back, stalks them without so much as an informational prompt or opt-in.
No, I think there are good reasons for having devices somewhat locked down, especially when it's being done for the sake of the end user.
Ubuntu, Debian, Fedora, Arch; many Linux distributions provide excellent curated software repositories that just work for general purpose software. Have done so for well over a decade. All without locking the user out of his own computer.
And of course Steam, GOG, and the Humble Store provide curated repositories for proprietary games on many platforms.
Linux distros' official package repositories. Free and open source software has had curated software repositories for many years, and since hosting is quite cheap, not much of a "business model" is required.
Don't want to deal with the official repository? You can easily host your own and provide a package that adds your repo and serves the latest and greatest version of your software. Want a solution that targets multiple distros and bundles deps? There is now snappy.
Preventing untrusted users from installing software is also a solved problem: require the root password, or simply don't let them use sudo.
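On a Debian/Ubuntu-style system that's just a couple of commands (group names vary by distro):

```sh
# Give the untrusted person their own unprivileged account; new accounts
# are not in the sudo group by default on Debian/Ubuntu.
sudo adduser guest

# If an existing account has sudo rights, drop it from the group:
sudo deluser guest sudo

# Or, via visudo, make sudo demand the *root* password instead of the
# invoking user's own password:
#   Defaults rootpw
```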
If you give morons and bad people physical access to your machine AND permission to install software, it seems to me that you have put yourself in a bad place where no amount of lockdown can render you safe.
Rather than make the futile effort to render that situation tenable when we know it won't be, how about we continue to improve the actually feasible use cases?
> sharing passwords with sketchy family members and friends who then install hidden software
I guess that's a fair use case for locking down the OS.
But when it comes to various sites giving you bad software, how is locking the device and not giving the user control supposed to help? Any security measures like sandboxing could exist on a device without taking control from the user. And while installing zero programs is pretty secure, the user could make that choice without a lockdown.
> So, which curated software store/repo do you prefer?
Ninite is quite good. Other comments have already pointed out various Linux repos that do a very good job. Also look at the iOS App Store for something with pretty aggressive quality control, even if the rules aren't perfect. (While iOS is locked down, the store would work the same if it wasn't.)
You sound like you've been living under a rock for twenty years! Choose pretty much any Linux distribution and you'll get an excellent package manager that plugs into curated lists of software, AKA repositories.
Can you elaborate? (I'm not familiar with the Windows Store, though I guess it is similar to the Apple and Play Stores.) What do you mean by your statement?
Your point holds true for Windows 10 S, but why do you consider Chromebooks to be locked down? You can enable developer mode, install your own OS on them, or even re-use the Chrome OS kernel for other desktop environments through crouton.
You're also not forced to use Google's Chrome Web Store (except on Windows with the official distribution of Chrome).
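For reference, the crouton route mentioned above is roughly this (commands per crouton's README at the time of writing; they may change):

```sh
# In developer mode: Ctrl+Alt+T opens crosh, `shell` drops to bash.
# Then bootstrap a Linux chroot alongside Chrome OS:
shell
sudo sh ~/Downloads/crouton -t xfce   # install an Xfce target
sudo startxfce4                       # launch the desktop
```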
Well, you can do all that until you can't anymore. The systems are designed in such a way that Google only has to flip a switch and all Chromebooks securely turn into bricks (presumably only the ones that haven't had developer mode enabled).
On top of that, Chromebooks are mostly used in an education environment where they are in fact locked down to Chrome OS for specific email address domains. Then the domain administrator can conveniently switch it into a brick.
Google for (oh the irony) "Widevine" and how it interacts with developer mode. Chrome OS has the same sort of problems that Microsoft's Palladium had: things like Netflix simply won't work as soon as you have the developer switch enabled.
General purpose computing depends on two things: the ability to run your own programs AND the ability to lie about that fact to the network. Without the second, the first is useless.
Google understands this subtle point, but it appears most people do not. So Google is exploiting our ignorance here.
I'm pretty sure Netflix works fine with developer mode switched on. I can confirm right now (though I'm away from Chrome OS devices) that Widevine works fine with Netflix on desktop Linux, and I'm pretty sure I remember Widevine working on my Chromebook with developer mode enabled.
Sort of a side note here, but I think this is a big misunderstanding about Apple. Sounds crazy, I know, but hear me out before judging.
I think with Apple it's more about allowing the end user (the real end user, not the end user's boyfriend/girlfriend/wife/husband, child/sibling/parent, boss/teacher/roommate etc.) to have control over their own information.
In order to enforce this end user control, Apple needs to lock out rogue programs. It can't do so 100%, but it tries to get as close as reasonably possible. They try even harder with mobile devices, because mobile devices tend to be so personal and capture so much private information of one individual.
Apple has a lot to gain from controlling the user experience, preventing malware, and protecting their 30% cut of software sales. It's not merely that they don't have much to gain from protecting users from snooping girlfriends; in almost every other instance they actually stand to gain from helping the other side.
Apple has every reason to help your mom/dad/boss/school ensure that you don't misuse THEIR property.
Your perspective seems to be mostly unfounded speculation that doesn't hold water.
Also because users are used to their phones being able to do less than a laptop. "The app store doesn't have it" is a legitimate reason to not be able to use a program on a phone, but not on a Windows machine (for example).
Chromebooks are more open than nearly any other commercially available x86 or ARM laptop since the IBM PC 5150. Yes, they have a verified boot process, but once you're admin on one you can just swap it to developer mode. It's clearly a feature, not a bug.
I can't tell if you're trolling by making deliberately untrue statements or you have some insane thought process where this statement makes sense to you. The Chromebook is deliberately and consciously a closed platform. You can't write or run arbitrary code on it, you can't swap the OS, you can't even SEE the OS.
That's the primary selling point: "We got this, don't look under the hood." Getting your own boot ROM on there is a huge pain (I had to remove a screw from the main board); there were like ten steps I would never tell a non-expert to attempt.
GP is right, in a sense. Chromebooks (the Intel ones, anyway) typically ship with Coreboot, and Google developers regularly contribute to upstream Coreboot.[0]
While you're correct that it's a pretty closed platform out of the box, you ultimately have more control over the platform than a typical x86 PC, if you go through the hoops of flashing your own Coreboot payload. The fact that it's hardware with official Coreboot support is notable, IMO.
> You can't write or run arbitrary code on it, you can't swap the OS, you can't even SEE the OS.
You can write and run arbitrary code on it, you can swap the OS, and you can even rebuild the firmware from source and flash it yourself (though at that point you void the warranty, since the firmware is in a position to cause hardware damage --- but not for any of the other things). There is a local terminal emulator, and when you turn on developer mode you get a local Unix shell with mount and write access to removable disks, so you can image install media for an operating system. The boot firmware includes SeaBIOS by default because Google (for whatever reason) wants you to be perfectly able to install BIOS-based operating systems as long as they're compatible with the hardware. If it's Linux, then any downstream hardware drivers involved are packaged in Google's public kernel repositories, and if it's a userspace component, it's in the Chromium OS repositories, where you can pull and build the entire operating system (aside from the Pepper plugins for digital restrictions management and Flash).
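For example, writing install media from that developer-mode shell is the stock dd invocation (the device name below is a placeholder -- check yours with lsblk first):

```sh
# From crosh -> shell in developer mode; /dev/sdX is a placeholder for
# your actual USB stick (verify with lsblk before running!).
sudo dd if=~/Downloads/installer.iso of=/dev/sdX bs=4M
sync
```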
Seriously, try doing many of these things with a typical off-the-shelf Windows or Apple laptop and get back to me. On Windows laptops you need to jump through hoops to disable firmware features that prevent you from installing new operating systems; on Chromebooks it's well documented and has no effect on warranty or your ability to roll back the changes. Most (all?) Windows laptops have no vendor-provided open source firmware, and most of them don't even have a third-party one that works. To write an install disk on Windows, you need to install a third-party image burning application, and there's no guarantee that the firmware will even let you boot it. Many Windows laptop manufacturers will either inconvenience or flat-out deny warranty claimants who have replaced the operating system on their laptop. Microsoft does not publish the source code of the NT kernel. Most Windows laptops contain at least one piece of hardware that is not documented well enough for somebody to write a driver for it without reverse engineering.
Apple laptops are in some ways better, in some ways worse here. On the one hand, the firmware does not explicitly prevent you from booting third-party images; on the other hand, the firmware and hardware have undocumented behaviour which makes it exceedingly difficult to get most new operating systems to reliably boot and install. Usually, even if you do manage to install and boot something, some important piece of hardware (the wireless chip, the display output multiplexer, the webcam, the touchpad) is either slightly or completely different from a documented counterpart, and takes months or years to be supported by anything except OS X and the first-party drivers on Windows. Darwin releases are usually eventually open source, and Apple's compilers are partially open source but also include several proprietary components. Most of the operating system libraries that weren't already available when they started are proprietary. On the plus side, you don't really need anything extra to write a boot disk for a third-party operating system. On the down side, there is no provided firmware source code, and the system is sufficiently underdocumented that it's unlikely anyone will ever write an alternative firmware image for your given Apple laptop model.
My experiences with EFF in real life in Europe have been abysmal. I tend not to donate to organizations that are contrarian by statute.
Especially so when you go to conferences and hear panels where an EFF representative is chatting with a Google lobbyist and ends up using sentences like - and I quote - "OMG, you are so right" and "Wow, I can't agree more". The theme was censorship of content, and the critique - clearly instrumental to Google's bottom line - was that content shouldn't be regulated by law at all.
I had the opposite European experience. They seem to do pretty fine and talk constructively during the conferences I've attended where I listened to them, and I'm a regular donor to them, but, as a donor, I have to point this out:
Their communication with donors is almost non-existent. I waited about two weeks for an answer and had to ping them again; they replied but failed to answer my questions, which I pointed out, and I never received a follow-up. Then they sent me a t-shirt to the wrong country (they got the address right but missed the country completely, and I have no clue how that happened), so I had to ping them again to let them know they'd fucked up the country on the package; they didn't reply, but a couple of days later sent another email claiming to have sent a second package to the right country. Meanwhile, my initial questions have remained unanswered for about two or three months now, and all I get from them is "another donation successfully charged" emails.
I don't mind a new security model for personal computing, one that sanely quarantines - or containerizes or sandboxes - an application.
A browser or app process has no business in ~/Library beyond its own Application Support and prefs plist files (and the equivalent locations on Linux and Windows).
If an app needs to access any other artifact in my $HOME, it needs to be explicitly authorized for that specific one (e.g. I want to embed images in a presentation) by a UI element that is part of the OS itself.
Right now, I might as well be running as root on my machine, because as soon as I'm 0wned, my whole life can be siphoned off to some dodgy server on the net.
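As a sketch of that flow (all names below invented): the app never gets to name a path in $HOME itself; it asks the OS, the OS draws the picker, and the app receives a handle to exactly the one file the user chose.

```python
# Invented API sketch of OS-mediated file grants: the app can only ask
# the OS for "a file"; the OS draws the dialog and hands back an open
# handle scoped to exactly what the user picked.
def os_file_picker(purpose):
    """Stand-in for a trusted, OS-drawn file dialog."""
    path = input(f"Grant access to which file ({purpose})? ")
    return open(path, "rb")

class PresentationApp:
    def embed_image(self):
        # No way to list or open $HOME directly; one grant, one file.
        handle = os_file_picker("embed image in presentation")
        return handle.read()

data = PresentationApp().embed_image()  # user picks the file in the OS dialog
```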
The trouble with this sort of article is that it takes too long to get anything useful out of it. A better structure would be an explanation of the core idea at the beginning and the rest of the article to explore and defend the idea. Writing is not an ordered series of deductions as this author assumes. It's far more complicated.
This kind of text is called an essay; it doesn't need to follow the rules for writing an article because it's not one.
The author probably knows how complicated it is to write (he is a professional writer), and still manages to do it well enough to win prizes for his work: https://en.wikipedia.org/wiki/Cory_Doctorow#Awards
The problem is not a lack of access to a general-purpose computer. The problem is the lack of control over the computers you're forced to use in order to manipulate their attached I/O devices such as lights, refrigerators, screens, speakers, insulin pumps, steering and brakes, Internet connections, etc.
The Free Software Foundation spends so much of its energy fighting against commercial operating systems and expends almost zero effort on embedded systems, where catastrophic failure can equal death.
Ironically one of the biggest problems with "Internet of Things" junk appliances is not that they run locked-down DRM-saddled operating systems, but that they run free ones like Linux.
Linux itself isn't a problem, but that they're often bundled with tragically out of date versions of everything and there's never any thought given to how to update them.
This malware incident simply relates to a commercial operating system. Imagine when someone hacks your lights by worming into the Linux install on the controller and demands $50 in Bitcoin to return control to you.
Embedded systems are, in theory, much more straightforward to deal with - they don't have vendor lock-in. Yet. In contrast, the main reason that Windows has a massive userbase is its massive app support, which is a result of Windows' massive userbase.
They do a lot more now. They're often networked and remotely upgradeable.
Sure, you couldn't see the code in the microcontrollers in your old TV - the ones listening for IR signals and driving the LCD. But modern TVs do all of that and more; they have internet connections, some have cameras and microphones, etc.
And even if you couldn't see the code in your old pacemaker (which I would say was still a problem), your new one probably has wireless administration and firmware updates. Those likely have remotely exploitable security vulnerabilities[0], and even if you have one inside your body, you aren't allowed to look at the code yourself.
It used to not matter so much; often it was possible to replace them with SOP PLC-like devices without much work, so it made sense to think of them as hardware.
The relevant issue is "can the product's behaviour be changed after purchase - and if so, by whom?"
This was an important enough consumer rights issue even before the invention of the internet - now that it can be done secretly and remotely, it's paramount.
A corporate setting is completely different. They can choose how you access their own hardware (putting aside personal gripes, for purposes of efficiency and convenience).
However, the real problem is hardware-level DRM implementations and non-open CPUs, which really mean that I cannot truly own my own personal computer.
This 6-year-old article is still relevant, but now most of the public seems very satisfied with iPhones or Android phones as their primary digital device. Even though I can hack on Haskell and Python code on my iOS devices, I see them, and am happy with them, basically as appliances.
When I program, I am in portable environments (Pharo Smalltalk, Emacs with Common Lisp or Haskell) that more or less can sit on top of any general purpose OS.
If our government (I am in the USA) locks down computing devices to an extreme extent, then that will just screw up our economy and more well run tax jurisdictions (countries) will benefit.
I'm really surprised there was no direct mention of the DMCA. That's turned out to be the thing that's given us un-repairable John Deere tractors and printers that won't use 3rd-party ink cartridges.
You will always be able to do general purpose computing, maybe just with a loss of speed. No one is going to back door every 10 dollar microcontroller.
10 dollar microcontrollers are way more powerful than pen and paper. In 10 years they may be as powerful as today's $2000 computers.
It's a matter of scale. You can do general electronics with some AA batteries. If you want to build a nuclear reactor, yeah there will be interest from others in your affairs.
"In 10 years they may be as powerful as today's $2000 computers."
That's been continuously true for some decades now. What might save us is capitalism, in that the bean counters are not permitting $10 controllers for a toaster; those get the ten-cent controllers. The cost of labor to program them is a limiter too: a mass-market toaster with a $50M programming budget will be financially destroyed in the marketplace by a competitor spending $5K on "it's a 60-180 second timer, nothing more" (see the sketch below).
An embedded swamp will form of $25K cars with expensive, insecure software. Your car will get pwned 1000x more often than your toaster, even though the risk to your life is fairly similar.
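That $5K toaster firmware really is about this much code --- a MicroPython-flavored sketch, with invented pin assignments:

```python
# MicroPython-flavored sketch of the whole "60-180 second timer" toaster;
# the pin numbers are made up for illustration.
from machine import Pin
import time

heater = Pin(2, Pin.OUT)  # drives the heating element
lever  = Pin(3, Pin.IN)   # goes high while the lever is latched down

while True:
    if lever.value():
        heater.on()
        time.sleep(120)   # somewhere in the 60-180 second range
        heater.off()
    time.sleep(0.1)       # idle poll; no network stack, no parser
```

There's no network and no input parsing in that loop, so there's nothing for a worm to reach.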