I agree it is a balancing act. I reviewed the key takeaways and have a few disagreements:
> Right to Verify: Society has the freedom to inspect the source of all software used, and can run it as they wish, for any purpose (Software Freedom).
This is an age-old question raised by proponents of Richard Stallman's vision of how software should be distributed. I strongly oppose this view, and here is why:
- Software that is critical to security, such as cryptographic libraries and protocols, should be open source. But at some point, trust needs to be placed in someone. Those open source libraries could be modified by Apple and we wouldn't know. Ok, then the argument becomes that Apple should open source iOS. Oh... but Apple can do things at the kernel level. Ok, force Apple to open source the kernel. Oh... but Apple can trick us at the microcode level. Ok, force Apple to open source that too! This chain can go on until you end up saying: I am building the phone myself from scratch. Then, you may ask, why should I trust Pine64's hardware? You can't just arbitrarily stop at some point in this chain of trust!
- Software that makes the Clock app tick and the Weather app give weather information has no business in the Right to Verification. A game that keeps you engaged, a Photos app that edits your photos - with everything open source, the world would be worse off. Innovation would cease. Free, low-quality stuff everywhere.
- Users should own their data in an ideal world. I would hold the keys to my own data, but then I realize how misinformed, incompetent and careless the general public is with their own keys. Most people can defend their household and their belongings, but when it comes to technology and abstract concepts - it would be total chaos if every user had to keep their own key safe. I would pay someone to keep it secure for me (sounds insane, I know, but don't we hire security guards?). If that's true, then I might as well pay Apple to keep my stuff secure. When you buy a door lock, you're already placing trust in the vendor who manufactured the lock.
> You can't just arbitrarily stop at some point in this chain of trust!
And you shouldn’t. There should be a way to verify at every single level, even if it might require special tools. Independent verification is the key to trust. But users can also choose to stop at any point if they so wish.
> - Software that makes the Clock app tick and the Weather app give weather information has no business in the Right to Verification.
Actually, such software should be verifiable and open too, because otherwise it has too much power over the user, just like any other software. This is basically what Stallman says: every time users adopt proprietary software in any form, it starts doing things against their will. Power corrupts.
> with everything open source, the world would be worse off. Innovation would cease. Free, low-quality stuff everywhere
I do not see how you came to this conclusion. A lot of free software is of higher quality than its proprietary alternatives. Care to elaborate? You are also using the word ‘free’ as if it were about price. Free software does not mean gratis.
> but when it comes to technology and abstract concepts - it would be total chaos if every user had to keep their own key safe.
I fully agree here. The solution, however, is not to choose an almighty emperor that has infinite power over you, but to allow users to choose whom they trust and to change their mind. I trust Purism with my hardware and software, but I also have the power to change my mind and pay someone else to develop my software, since it’s FLOSS.