Hacker News

"The open source community has a long way to go if it's going to clearly demonstrate that it's model is advantageous"

How is it less safe than not having the source, though? Absent the source, how can you trust that the once-trustworthy, if secret, code hasn't been nobbled via hackers, a court order, etc.?

The model of security through obscurity is what's broken, and that's what we've got here. You have to trust Facebook today, and forever, and trust anyone else with the technical or legal power to silently violate its security; something that's much harder to do if the code is available for analysis.




It's not less safe to have the source code. Source code is a good thing. All things being equal, I prefer open source components too. But all things are not equal.

In this case, if you're ideologically attached to open source, you're in luck: the cryptographic components WhatsApp uses are open source, and there's another good messenger that uses them: Signal, by Open Whisper Systems.


The problem with WhatsApp is that it is not open source, so you only have Facebook's say-so that they're using the Signal code, you have to have faith that they've implemented it securely, and you have to trust that, as an American company, the current or future government won't silently compel them to introduce a weakness, such as making the phone, when it receives the message `sing`, send an encrypted copy of the last n messages back to Facebook. I'd be more comfortable using the compilable, open source Signal app, because I at least have the opportunity to see what the code is doing, should I somehow manage to find an unsabotaged version of clang, gcc, VS, etc.


People need to stop writing this exact comment. Even if you're right about open source and security, you can't be right about it for this reason, because this comment is false.

You are not in fact stuck with Facebook's "say-so" about what WhatsApp is doing. If you have the executable binary, you can straightforwardly validate its functionality with a disassembler/decompiler. There are thousands of people who do this professionally.
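To make the idea concrete, here is a toy sketch of the very first pass such an analyst might make: pulling printable strings out of a binary to spot embedded endpoints worth a closer look. This is purely illustrative (real audits use disassemblers like Ghidra or IDA, and `printable_strings` is an invented helper, not part of any real toolchain):

```python
import re

def printable_strings(path: str, min_len: int = 6):
    """Return runs of at least min_len printable ASCII bytes from a file."""
    with open(path, "rb") as f:
        data = f.read()
    # Printable ASCII is 0x20-0x7e; find contiguous runs of that range.
    return re.findall(rb"[\x20-\x7e]{%d,}" % min_len, data)

# Hypothetical usage: flag embedded URLs in a binary you've downloaded.
# for s in printable_strings("whatsapp.bin"):
#     if b"http" in s:
#         print(s.decode("ascii"))
```

String extraction tells you almost nothing by itself; it's the step before disassembling the call sites that reference those strings, which is where the real validation happens.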

There might be some other reason why it's important that WhatsApp's code be published, but it can't possibly be this one.


Has anyone actually done this? I lack the skill to, but for a piece of software this important, surely someone is working on it? And will it be repeated for every update?

Personally, if I were the NSA or GCHQ, I think it would be a wicked smart plan to release this and have it be actually secure, have it audited, gain trust, let people use it for a while, then release an update that shoved all the messages in memory to a server somewhere once people let their guard down. Implausible, yes, but that's the issue with trust, and I'm not sure releasing the clean source would really help. The only defence would be to hold off on updating the app until a trusted party had audited the updated binary. More paranoia: you'd also have to audit the specific binary in the app store for every country with weird laws (e.g. the UK's RIPA) to make sure you didn't get a nice local flavour of poison.
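The "trusted party audits each binary" idea is at least mechanically cheap to check on the user's side: compare the hash of the binary you actually received against hashes published by independent auditors. A minimal sketch, assuming such a published manifest exists (the function names and manifest format here are invented for illustration):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_audited_build(path: str, audited_hashes: set) -> bool:
    """True only if the local binary is byte-identical to one the auditors
    signed off on; a per-country 'local flavour' build would hash differently."""
    return sha256_of(path) in audited_hashes
```

Note this only helps if builds are reproducible, so that auditors compiling the published source get the same bytes the store distributes; otherwise every legitimate rebuild hashes differently too.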


If WhatsApp releases the code, how do you know that's the code they run their service on? Open source does _nothing_ with respect to the trust issue with third party services like WhatsApp or Facebook. It's completely orthogonal.

I was wrong about Heartbleed: it was found through fuzzing (and independently through a manual code audit), two years after the vulnerability was introduced. That's two years during which source code analysis could have been used by bad actors to find the same vulnerability and exploit it.
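For readers unfamiliar with the technique: fuzzing feeds a program huge numbers of mutated inputs and watches for misbehaviour. A toy sketch of the idea, where `parse_record` is an invented stand-in with a Heartbleed-style trusted-length bug, not real OpenSSL code, and real fuzzers (AFL, Defensics, etc.) are vastly more sophisticated:

```python
import random

def parse_record(data: bytes) -> bytes:
    """Toy echo service with a Heartbleed-style bug: it trusts the
    sender's claimed length and reads past the real payload into
    adjacent 'memory'."""
    claimed_len = data[0]
    payload = data[1:]
    memory = payload + b"SECRET_KEY_MATERIAL"  # adjacent process memory
    return memory[:claimed_len]                # no check: claimed_len <= len(payload)

def mutate(seed: bytes) -> bytes:
    """Flip one random byte of the seed input."""
    buf = bytearray(seed)
    buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

def fuzz(seed: bytes, rounds: int = 2000):
    """Throw mutated inputs at the parser; return one that leaks, if any."""
    for _ in range(rounds):
        case = mutate(seed)
        reply = parse_record(case)
        if len(reply) > len(case) - 1:  # reply longer than the actual payload
            return case                 # over-read triggered
    return None
```

Starting from a well-formed seed like `bytes([4]) + b"ping"`, random mutation soon produces an input whose claimed length exceeds the four real payload bytes, and the oracle catches the over-long reply.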

Opening the code doesn't by itself eliminate the vulnerabilities. What it does do is fire the starting pistol on a race between black hats and white hats to find and either exploit or close the vulnerabilities. It's very much a double-edged sword. So what we need is confidence that the white hats are winning that race.


"Opening the code doesn't by itself eliminate the vulnerabilities. What it does do is fire the starting pistol on a race between black hats and white hats to find and either exploit or close the vulnerabilities."

The race depends more on motivation than on whether the code is open or closed. Black hats are often financially motivated (whether they monetize the vulnerabilities directly or are hired and paid by a third party) and factor in their return on investment. Open source merely makes their job easier; given enough motivation, they can and will find vulnerabilities in binaries, remote services, etc. almost as easily.

One problem is that many people believe that "many eyes make all bugs shallow". Numerous examples prove that assumption false. Is it because there is more financial interest in closed systems? Or is it because, in open systems, people automatically assume that because the code is open, someone else must have vetted it? (e.g. "if I'm interested in it, and thousands or millions of others are too, then it's highly likely some expert better than me has already looked at it")



