
Biased in favor of empowering individuals, rather than making them serfs under corporate landlords? What's wrong with such a bias?



We lived under that model, and it sucks for a lot of people.

Barons also provided for the common defense. An unlocked software architecture and untrusted code leave people at the mercy of hackers; they don't empower the average user.

We can defend users better than we do now by divorcing the trust model from the paid-vendor model, but it's worth noting how we got here in the first place, and why where we are now is better for a lot of people than where we were. If we can't recognize that, it's hard to make a value proposition that people will actually sign on to.


"An unlocked software architecture and untrusted code leave people at the mercy of hackers; they don't empower the average user."

There are other solutions to this problem that do not involve surrendering control over everything that matters to corporate interests whose priorities may only sometimes coincide with your own.

For example, do we really think a corporation primarily motivated by profit is going to be habitually careful about software security? We know it often is not, because so much of the software produced by big businesses today is terrible in terms of security. What is more telling is that it continues to be terrible even though we are well aware of methods to mitigate many of those vulnerabilities. But those methods usually require more skill and/or more time to get right, which typically means greater up-front costs, and since it turns out that Joe User will often put up with junk security and pay good money for the software anyway, the profit incentive does not produce the outcome we would like.

Note that this doesn't necessarily mean Joe likes the poor security. He might be willing to use the tech anyway because he values what he gets from it more than he believes the dangers are costing him, or because he believes the dangers are inherent in the tech and unavoidable, or because he just doesn't realise the dangers are there in the first place.

This can be solved in a number of ways, but most of them come down to showing people that it's possible to do better by writing software that actually does better, and given the lack of commercial incentives coming from above, the most likely place we'll see that is in community efforts. Those efforts don't even have to produce openly developed software that dominates its market to succeed here; they just have to make it obvious to enough normal people that they don't need to accept substandard junk and should demand better in return for their money.

Much the same argument can be applied to issues like privacy, repairable and upgradeable hardware, open standards for data portability, freely programmable devices, and no doubt many more. If competition in the commercial markets isn't getting the job done, yet we know doing the job is possible, it's up to those of us with the knowledge and skills to push things forward for now. That could mean directly creating better hardware and software, educating our less skilled and experienced peers to improve standards in the industry and make hiring better people less expensive, or simply educating users and pushing for higher standards from the customer's side.


"For example, do we really think a corporation primarily motivated by making its profits is going to be habitually careful about software security?"

Yes. It's the monarchy argument: Apple focuses on security because they own the ecosystem, so if they screw it up they're only hurting themselves, and there's no-one else left to blame.


Let me know when everything on iCloud is fully end-to-end encrypted. Until then, Apple is still knowingly compromising security in the interests of other priorities.

Maybe those priorities are well-intentioned, helping to make sure customers don't lose access to precious photos because they lost some important key. Maybe they are driven by more nefarious motives, such as the endless rumours of government intervention. It doesn't really matter: the gaping security vulnerability is there all the same, much of what Apple does actively pushes users into it, and all the while Apple makes a big deal about the security and privacy of its devices.

At least if you are technically inclined you can read their more technical documentation to understand what is really happening, which is to their credit. But it seems likely that most people potentially affected by that vulnerability won't have done so.


Full end-to-end encryption bumps up against Apple's second goal of user friendliness. If the data they store is encrypted with a key that you and only you own, and you lose that key, they can't help you as a user.

This may be surprising, but a lot of users don't accept that as an answer, and find it a less valuable business model than one where the trusted vendor holds skeleton keys.


Sure, and maybe that's a reasonable commercial position to take, though I note that Apple does use end-to-end encryption for various specific categories of data on iCloud, so the argument about being user-friendly with recovery options is not without its weaknesses.

In any case, iCloud still isn't fully secure, and Apple isn't very honest about that in its marketing. Apple also doesn't give users who do value security and privacy more highly the opportunity to use iCloud with real security, while simultaneously making it unnecessarily difficult to transfer data by other, more secure methods.

As one potential alternative for discussion, you could instead construct a system where users had the option of keeping keys entirely under their control, with suitable backup options, or of creating an online emergency recovery system where partial key data was given to Y trusted friends and recovering that data from any X out of those Y would be sufficient to reconstruct the key. Each friend need only know that they are agreeing to be an emergency keyholder for someone, and no-one else other than the original user and the keyholder need know that the relationship even exists.
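
That X-out-of-Y recovery property is essentially Shamir's Secret Sharing. Here's a minimal sketch in Python, assuming the key fits in a 127-bit prime field; the prime, the function names, and the use of the secrets module are just illustrative choices, and a real system would split raw key bytes using a vetted library rather than hand-rolled field arithmetic:

    import secrets

    # Prime field big enough for a 128-bit-ish secret (2**127 - 1 is prime).
    PRIME = 2**127 - 1

    def split_secret(secret, threshold, num_shares):
        """Split secret into num_shares points; any threshold of them recover it."""
        # Random polynomial of degree threshold-1 whose constant term is the secret.
        coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
        shares = []
        for x in range(1, num_shares + 1):
            y = 0
            for c in reversed(coeffs):  # Horner's rule to evaluate the polynomial
                y = (y * x + c) % PRIME
            shares.append((x, y))
        return shares

    def recover_secret(shares):
        """Lagrange interpolation at x = 0 recovers the constant term."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            # Modular inverse via Fermat's little theorem, since PRIME is prime.
            secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    # Example: split a key among 5 friends so that any 3 can reconstruct it.
    key = secrets.randbelow(PRIME)
    shares = split_secret(key, threshold=3, num_shares=5)
    assert recover_secret(shares[:3]) == key
    assert recover_secret([shares[0], shares[2], shares[4]]) == key

The nice property is that any X shares reconstruct the key exactly, while fewer than X reveal nothing at all about it, so no single friend ever holds anything sensitive on their own.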

The UI for a system like that could be extremely simple. Turn on the emergency recovery system. Identify a list of friends you want to trust. They each get asked if they're willing to help, and you either get a list of people who all said yes, or get prompted to choose more people if anyone declined or couldn't be reached. Once you've got enough, each trustee carries around a suitable fragment of the keys they're protecting on their own secure device(s), in exactly the same way that their own key is held.
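
And the enrollment logic behind that UI could be as simple as something like this hypothetical sketch, reusing split_secret from above; the invite callback stands in for whatever messaging channel the real system would use, and enroll_trustees and its parameters are inventions for illustration:

    def enroll_trustees(key, candidates, threshold, needed, invite):
        """Ask candidates until `needed` accept, then hand each one a share."""
        accepted = []
        for friend in candidates:
            if invite(friend):  # friend agrees to be an emergency keyholder
                accepted.append(friend)
            if len(accepted) == needed:
                break
        if len(accepted) < needed:
            raise RuntimeError("not enough trustees yet; ask the user for more names")
        # One share per trustee; any `threshold` of them suffice to reconstruct.
        return dict(zip(accepted, split_secret(key, threshold, len(accepted))))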



