Utter and absolute b.s. The purpose of Purism is privacy, not security. They are not claiming greater security than the iPhone or Pixel; they are claiming transparency, open hardware, and a non-commercial alternative mobile OS.
The price of having "a large team of ______" is either closed hardware or complete loss of privacy.
It pisses me off so much when I read well-reputed people like you say ridiculous things like this. How big was Linux's security team when it started out? Look at DuckDuckGo and ProtonMail competing with Google Search and Gmail. Are you saying they won't be able to afford a dedicated security audit team once the product starts profiting in the tens of millions?
> If you want to use a niche phone as a fashion or political statement, more power to you. But if you try to market that phone as "transparent code is the core of secure systems", I'll take issue with that; it's neither a necessary nor a sufficient condition for security.
I will make that fashion statement. And you know what, security is all about risk (and I know you could teach me an entire course on this). As an individual, hardware or software that cannot receive an independent review either due to lack of openness or transparency in the development process carries an enormous amount of risk. Even someone as talented as yourself cannot unilaterally audit Android's codebase, and you can't even start with iOS (even if you could, you'd be breaking some fine print somewhere). The fact that others can audit the code and design is a relevant factor when assessing risk and evaluating security. More transparency == less risk == better security.
> This phone may very well be more "fair" or "ethical" than an iPhone. But if it's not as secure as an iPhone, it's unethical to claim otherwise.
What happened to threat modeling? As an individual, I care much more about Google tracking my activity and sharing it (I'm sure you read the recent AP piece revealing Google doing just that even when location was turned off) than I am worried about Russians using a 0-day kernel RCE or the FBI trying to decrypt my phone. Many of us are just normal people seeking the basic dignity of privacy and property ownership (as in freedom over the phone you own). Their product has a very real and significant impact on reducing the amount of risk I have to accept as an individual who has to use a smartphone. I don't see why you need to belittle their work with obviously non-constructive criticism.
Edit: to all the '*' points you mentioned, how exactly and practically would Purism's lack of those features impact the privacy and security of the Purism phone under real-world threats? Also, in case it wasn't clear already, Google and Apple are considered threats to privacy and security by many who support work like Purism's.
If your phone is not secure against outside, malicious actors, then all the privacy you gain is entirely pointless.
A phone needs to be secure and protect privacy.
Having a security team doesn't mean you get closed hardware; it means you have a security team.
>How big was Linux's security team when it started out?
Not very large, and nowadays there are lots of people working on Linux security; they discover a lot of CVEs that are fixed in the kernel and then backported. Early versions of the Linux kernel weren't very well guarded against outside attackers (in fact, the introduction points of newly discovered vulnerabilities are only just starting to lift away from the very beginning of the git history, around 2.6).
>Are you saying they won't be able to afford a dedicated security audit team once the product starts profiting in the tens of millions?
I think, and this is pure speculation based on the text of the OP, that the statement they made is intended to highlight that security teams cost money, and ProtonMail and DDG can't afford security teams as good as Google's, as a simple function of "how much money can we spend on it".
>As an individual, hardware or software that cannot receive an independent review either due to lack of openness or transparency in the development process carries an enormous amount of risk. [...] More transparency == less risk == better security.
Sing along, kids: "It all depends on your threat model!"
But seriously, this boils down to "open source = security", which isn't remotely true and can be easily disproven by adding a backdoor to any open source project and sending you the result. How many people out of 100 would even understand the code, how many of those are able to find the backdoor, and how many of those can patch it out and compile it without the backdoor?
Open source gives security to people who understand code, not to the people I meet at work who have difficulty opening Facebook if you remove it from their homepage setting.
>What happened to threat modeling? As an individual, I care much more about Google tracking my activity and sharing it (I'm sure you read the recent AP piece revealing Google doing just that even when location was turned off) than I am worried about Russians using a 0-day kernel RCE or the FBI trying to decrypt my phone.
Same trap as before. Lots of people don't worry about Facebook tracking and don't worry about Russians using a 0-day on them. Most people won't buy a phone on the premise that Google won't track them; they buy phones based on the features they offer and the big numbers vendors print on the spec sheet. Of course, lots of people still don't want tracking, but together with the previous group they form the majority of people who might not want tracking but ultimately can't be assed to care enough to spend more than a couple of dollars per year on avoiding it.
>to all the '*' points you mentioned, how exactly and practically would Purism's lack of those features impact the privacy and security of the Purism phone under real-world threats?
First point: separation of concerns is good here. A separated baseband means that if the baseband is compromised for any reason (and even open source gets compromised), it cannot damage the phone itself. This makes hacking the phone from the outside through the telephone network rather difficult.
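To make the isolation argument concrete, here's a rough sketch of what a separated baseband buys you (all names and the message format here are hypothetical, not Purism's actual code): the modem can only push bytes over a narrow channel, and the application processor validates every message before acting on it, rather than the baseband sharing memory with the main CPU.

```rust
// Hypothetical sketch: the application processor reads from an isolated
// baseband over a narrow serial-style channel and treats every message
// as untrusted input. A compromised modem can only send bytes; it cannot
// touch host memory directly (unlike a baseband that shares RAM with the CPU).

const MAX_MSG: usize = 256; // anything longer is rejected, not trusted

fn handle_baseband_message(raw: &[u8]) -> Result<String, &'static str> {
    if raw.is_empty() || raw.len() > MAX_MSG {
        return Err("rejected: bad length");
    }
    // Only accept printable ASCII; a real parser would validate a full
    // AT-command or QMI-style grammar before acting on anything.
    if !raw.iter().all(|b| b.is_ascii_graphic() || *b == b' ') {
        return Err("rejected: non-printable bytes");
    }
    Ok(String::from_utf8_lossy(raw).into_owned())
}

fn main() {
    // A well-formed status line from the modem is accepted...
    assert!(handle_baseband_message(b"+CSQ: 23,0").is_ok());
    // ...while oversized binary garbage from a compromised baseband is dropped.
    assert!(handle_baseband_message(&[0x00u8; 4096]).is_err());
}
```

The point is the trust boundary: even a fully compromised modem is reduced to feeding hostile input into a parser, which is a far smaller attack surface than DMA into the application processor's memory.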
Second point: auditing by security teams improves code security. As mentioned above, the Linux kernel and many other high-profile projects receive a shitload of auditing by security professionals combing through the code, because if they didn't, the world would spontaneously combust about 32 seconds later.
Third point: a secure enclave is very useful. Even some open source projects have them, such as the U2F Zero, because they enable the software to operate on a zero-knowledge principle: it cannot leak your private key if it has no access to the private key. Similarly on a phone, your storage encryption key can be a very safe 512-bit key, and your password is compared by on-enclave software that protects the key itself. This way a state actor or malicious Mossad-level actor can't get your phone's encryption key (though Mossad would just replace the phone with a uranium bar and kill you by cancer, because Mossad doesn't care), because the software will delete the key or simply refuse access if it detects manipulation.
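As a toy illustration of that zero-knowledge idea (nothing below is a real enclave API; the "cipher" and "KDF" are deliberately trivial stand-ins): the storage key lives only inside the enclave, the host only submits a passcode plus ciphertext, and repeated failures destroy the key.

```rust
// Toy illustration of the zero-knowledge principle behind a secure enclave.
// The key never crosses the enclave boundary; only plaintext or a refusal does.

struct Enclave {
    storage_key: [u8; 64], // 512-bit storage key, confined to the enclave
    pin_check: u64,        // verifier derived from the user's passcode
    failures: u8,
    wiped: bool,
}

impl Enclave {
    // Stand-in for a hardened, rate-limited KDF running inside the enclave.
    fn derive(pin: &str) -> u64 {
        pin.bytes()
            .fold(0xcbf29ce484222325u64, |h, b| (h ^ b as u64).wrapping_mul(0x100000001b3))
    }

    fn new(pin: &str) -> Self {
        Enclave { storage_key: [0x42; 64], pin_check: Self::derive(pin), failures: 0, wiped: false }
    }

    // The host hands over ciphertext and a passcode; the key itself is never exposed.
    fn decrypt(&mut self, pin: &str, ciphertext: &[u8]) -> Option<Vec<u8>> {
        if self.wiped || Self::derive(pin) != self.pin_check {
            self.failures += 1;
            if self.failures >= 10 {
                self.storage_key = [0u8; 64]; // anti-brute-force: destroy the key
                self.wiped = true;
            }
            return None;
        }
        self.failures = 0;
        // XOR is a placeholder for real authenticated encryption (e.g. AES-GCM).
        Some(ciphertext.iter().zip(self.storage_key.iter().cycle()).map(|(c, k)| c ^ k).collect())
    }
}

fn main() {
    let mut enclave = Enclave::new("1234");
    let plain = enclave.decrypt("1234", &[0x42u8; 16]).unwrap(); // right passcode: data comes back
    assert_eq!(plain, vec![0u8; 16]);
    assert!(enclave.decrypt("0000", &[0x42u8; 16]).is_none());   // wrong passcode: no key, no data
}
```

A real enclave also does the comparison and the wipe in tamper-resistant hardware, so pulling the flash chip or glitching the host OS doesn't expose the key either.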
Fourth point: getting third parties to evaluate your design is helpful. Again, as mentioned, a lot of high-profile OSS projects have third parties scanning the code, because two pairs of eyes are better than one.
> If your phone is not secure against outside, malicious actors, then all the privacy you gain is entirely pointless.
Not true; even the most insecure phone is secure against some subset of attackers. If you don't trust the maker of your phone, all other security is useless. Imagine going to war while suspecting your body armor and vehicle have been booby-trapped by your own side...
> Having a security team doesn't mean you get closed hardware; it means you have a security team.
Didn't claim otherwise. Closed hardware is needed to control the market well, the alternative being to control user data while opening up the hardware and software. Having a security team dedicated to a baseband audit means you're making a very large profit and are already successful...
> Not very large, and nowadays there are lots of people working on Linux security; they discover a lot of CVEs that are fixed in the kernel and then backported. Early versions of the Linux kernel weren't very well guarded against outside attackers (in fact, the introduction points of newly discovered vulnerabilities are only just starting to lift away from the very beginning of the git history, around 2.6).
The same can be said about Windows security; the point was the lack of dedicated security teams early on. Even making $10M in profit a year, you'll find it difficult to hire even one security person and dedicate resources to support their work. The whole, rather obvious, point you missed here was the correlation between a product's adoption and the ability to dedicate resources like a security team.
> I think, and this is pure speculation based on the text of the OP, that the statement they made is intended to highlight that security teams cost money, and ProtonMail and DDG can't afford security teams as good as Google's, as a simple function of "how much money can we spend on it".
Somewhat. I understood it as "more money means more security, and transparency is much less valuable", which I disagree with. Transparency and good security hygiene are much more important than throwing money and bodies at it.
> But seriously, this boils down to "open source = security", which isn't remotely true and can be easily disproven by adding a backdoor to any open source project and sending you the result. How many people out of 100 would even understand the code, how many of those are able to find the backdoor, and how many of those can patch it out and compile it without the backdoor?
I made a point of security being a measured evaluation of risk. Transparency and being open source are variables, just like having skilled developers, a good security process, good project management, and resources like money and time. You need some of each, but completely ignoring a variable means everything it multiplies is also zero. Open source helps improve security, but only as one variable.
> Same trap as before. Lots of people don't worry about Facebook tracking and don't worry about Russians using a 0-day on them. Most people won't buy a phone on the premise that Google won't track them; they buy phones based on the features they offer and the big numbers vendors print on the spec sheet. Of course, lots of people still don't want tracking, but together with the previous group they form the majority of people who might not want tracking but ultimately can't be assed to care enough to spend more than a couple of dollars per year on avoiding it.
Most people didn't have sex with condoms either, until sex ed came along. After Facebook's recent fiasco, something like 42% of their users either stopped using Facebook or dramatically reduced their use of it. People have no real choice other than Apple or Google, so you're purely speculating here. Most people don't know exactly how bad things are; try showing someone you consider average their Google activity and location history, then offer them a Purism phone, and prove me wrong. Most people want a fancy, full-featured phone, but they also believe their hard-earned money should be enough of a price to pay. They would buy the privacy-enabling phone with the same looks and features at a higher price.
For the rest of what you said, it seems you ignored the part where I said 'real-world threats'. Name one real-world compromise using baseband firmware and I'll donate $100 to a BTC wallet of your choosing.