Debatable as his core message may be, it's usually coherent and logical. However, I'd like to criticize one specific part of this essay: the claim that Chrome auto-updates are a "universal backdoor". Silent updates may feel viscerally creepy, but they differ little in practical consequence from regular click-through ones: most users never verify the binaries of any package update on their system, so making the user click through only forces an attacker with a fake update to wait a little longer. Note also that Chrome auto-updates can be turned off, and both the updater and the updated software are (mostly) open source and you are free to compile your own version, so it's hard for me to see the problem.
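(And verifying a binary isn't even much work where a vendor publishes checksums. A minimal sketch in Python, with the update path and the published digest as placeholder assumptions, not real values:

    import hashlib

    # Placeholders: the downloaded update and the SHA-256 the vendor is
    # assumed to publish out-of-band (e.g. on its website).
    UPDATE_PATH = "chrome-update.bin"
    PUBLISHED_SHA256 = "0123abcd..."  # not a real digest

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    if sha256_of(UPDATE_PATH) == PUBLISHED_SHA256:
        print("checksum matches, proceed with install")
    else:
        print("checksum mismatch, refuse the update")

Very few people bother, which is exactly the point: the click-through step adds no real verification.)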

(For the record, it would be nice to have a system that prevents a single server from targeting one user with a malicious update, perhaps by verifying each update against multiple independently owned servers. However, I don't know of any mainstream software that does this, including free Linux distributions, so it's unfair to criticize Chrome for lacking it.)
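A rough sketch of what that could look like, assuming hypothetical independently operated mirrors that each publish the digest of the same update (nothing like this exists in Chrome today):

    import hashlib
    import urllib.request

    # Placeholder endpoints, each run by a different operator.
    MIRRORS = [
        "https://mirror-a.example.org/chrome-update.sha256",
        "https://mirror-b.example.net/chrome-update.sha256",
        "https://mirror-c.example.com/chrome-update.sha256",
    ]

    def fetch_digest(url):
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read().decode().strip()

    def update_is_consistent(local_path):
        h = hashlib.sha256(open(local_path, "rb").read()).hexdigest()
        digests = [fetch_digest(u) for u in MIRRORS]
        # Only accept the update if every independent mirror agrees,
        # so a single compromised server can't single one user out.
        return all(d == h for d in digests)

The point is that a targeted malicious update would have to be visible to every mirror operator, not just the one serving you.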




> updated software are (mostly) open source and you are free to compile your own version

The "mostly" part kills your argument. That means, the updates contain non-free parts and are hence unsafe. Chrome should be avoided if Chromium or Firefox works for you.


Any binary compiled by a third party could be malicious, whether it's supposed to be compiled from free source code or not, and whether it's Google or Mozilla or Debian. Verifiable compilation (reproducible builds that let anyone confirm a binary matches the published source) would partially solve this, and would indeed make the non-free parts a problem, but that's another thing nobody is doing (yet).
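The check itself is trivial once builds are bit-for-bit reproducible; a minimal sketch, with a hypothetical deterministic build command and the vendor binary as placeholders:

    import hashlib
    import subprocess

    # Placeholders: a deterministic build command and the vendor-shipped
    # binary we want to compare against.
    BUILD_CMD = ["./build.sh", "--deterministic"]
    LOCAL_OUTPUT = "out/browser"
    VENDOR_BINARY = "downloads/browser-official"

    def digest(path):
        return hashlib.sha256(open(path, "rb").read()).hexdigest()

    subprocess.run(BUILD_CMD, check=True)  # rebuild from the published source
    if digest(LOCAL_OUTPUT) == digest(VENDOR_BINARY):
        print("binary reproduces from source")
    else:
        print("mismatch: the shipped binary is not what the source produces")

The hard part is not the comparison, it's making the build deterministic in the first place.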

edit: Of course this also has nothing to do with the silence of the updates. :p


One of the benefits of having the four freedoms is that you, the user, can choose who (if anybody) to trust.

Don't trust Mozilla to build a non-malicious binary from the Firefox source? Get somebody you do trust to build it. Or build it yourself.

Of course, when you get down to it, you have to trust that there exists a compiler that isn't malicious, but I think the point still stands.


One also has to trust the source code of the project. Given the sheer number of unintentional back doors in the form of exploitable bugs occurring in so many projects, this is a complete non-starter to me, even had I the time to completely review the source code of any given project. I will miss mistakes, and I will most definitely miss intentional "mistakes", even if I'm well versed in the programming language at hand and using it daily in a professional capacity.

Open source has some significant values to me, but security is not among them. It may raise the barrier to some very simple, basic, hobbyist style maliciousness going uncaught, but this is by and large not what I'm concerned about. I'm concerned by more sophisticated maliciousness. If I cannot trust your binaries to not contain it, I generally cannot trust your source code to not contain it either, even if I personally review it.


"All bugs are shallow given enough eyeballs". This can apply to intentional security exploits as well. Of course, a small free project may not have enough eyeballs, but at least in a bigger one that does generate some trust.


An oft-repeated phrase, but I need only point to the Debian OpenSSL keygen debacle -- and how long it went unnoticed -- to show how easily extremely serious bugs in code known to be security critical can go uncaught despite the "number of eyeballs".

Bigger projects lead to bigger attack surfaces -- I'd trust the small free project more than I would the bigger one. Less code to review, fewer contributors one must simultaneously trust (I'd model project trust as each contributor being a potential single point of failure) and -- all other things being equal -- the same number of eyeballs per LOC.
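To make that model concrete (purely illustrative numbers, not an estimate of any real project): if each contributor independently has some small probability p of slipping something malicious in, the chance that at least one does grows quickly with the contributor count.

    # Illustrative only: probability that at least one of n contributors
    # is compromised, if each is an independent single point of failure
    # with probability p.
    def p_at_least_one(p, n):
        return 1 - (1 - p) ** n

    for n in (5, 50, 500):
        print(n, round(p_at_least_one(0.001, n), 4))
    # With p = 0.1%: ~0.005 for 5 contributors, ~0.049 for 50, ~0.394 for 500.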

I'd qualify neither Firefox nor Chrome as small projects.


Oft repeated because it contains some truth, but also often overweighted. Availability of source code is not a substitute for a security audit and good development practices. Directly, it helps a little. Indirectly, it helps a lot, because it means you (or anyone else) can pay whomever you like to perform that audit. You can choose who you trust, beyond simple blind trust in the person providing the software.


You don't have to trust a compiler. Even against Thompson's original attack, you could (with sufficient care) audit the machine code - possible, just impractical without a massive investment. More recently, though, someone figured out how to verify that a compiler binary was cleanly built from its source: http://www.dwheeler.com/trusting-trust
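That technique (Wheeler's "diverse double-compiling") boils down to a bit-for-bit comparison. A rough sketch, with hypothetical compiler paths and assuming deterministic compilation:

    import hashlib
    import subprocess

    # Placeholders: the compiler source under test, the binary we want to
    # check, and an independent trusted compiler.
    COMPILER_SRC = "cc.c"
    SUSPECT_CC = "./suspect-cc"        # binary claimed to be built from cc.c
    TRUSTED_CC = "./other-trusted-cc"  # a different, independently built compiler

    def compile_with(compiler, source, output):
        subprocess.run([compiler, source, "-o", output], check=True)

    def digest(path):
        return hashlib.sha256(open(path, "rb").read()).hexdigest()

    # Trusted chain: build the compiler from source with the trusted compiler,
    # then use that result to build it again.
    compile_with(TRUSTED_CC, COMPILER_SRC, "stage1")
    compile_with("./stage1", COMPILER_SRC, "stage2")

    # Suspect chain: self-compile the source with the suspect binary.
    compile_with(SUSPECT_CC, COMPILER_SRC, "stage2-suspect")

    # If compilation is deterministic and the suspect binary really
    # corresponds to cc.c, the two outputs must be bit-for-bit identical.
    print("match" if digest("stage2") == digest("stage2-suspect") else "mismatch")

A Thompson-style trojan in the suspect compiler would reproduce itself into its output and show up as a mismatch, since the trusted chain wouldn't contain it.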



