Hacker News

Just chiming in here: it’s almost all about the graph. If you have the graph, the content is almost irrelevant.

This is why Signal's effort to hide the graph as best they can, using SGX, is such important work. Say what you want about Secure Enclaves; we know of no better way to conceal social graphs.

Yes, there is still potentially some metadata analysis that can be done at the server by correlating IP addresses, but we know Signal doesn't keep those logs because of their response to the sealed subpoena (which they, together with the ACLU, successfully sued to unseal):

https://signal.org/bigbrother/eastern-virginia-grand-jury/

We can only dream of a world where companies are held to this standard of transparency and user privacy.




>...we know signal doesn’t keep those logs because of their response to the sealed subpoena ...

That doesn't prove it. If Signal were, say, an NSA project, they would have to respond to such things in exactly that way to protect the signals-intelligence value of the metadata they were collecting for their primary mission.

After Crypto AG, we know it is a bad idea to trust any particular entity. Something like Signal can only be trusted as much as it can be verified.


Absolutely. You should trust anything as much as you can verify it and no further.

I submit that there is no better option right now.


If you don't trust the people running these things, then Signal is just another siloed messenger whose servers are controlled by a single entity. There are certainly worse, but Signal is not special.


Signal has open clients with reproducible builds. We know that they are keeping their promises about what information is communicated to the backends. That's a step above the other options in common use, and it does make Signal special.


> Signal has open clients with reproducible builds.

Not really. First of all, there is only one Signal client allowed to connect to Signal's servers. And in the real world, the vast majority of Signal users get their APK for that app from the Google Play Store (the Signal team has said they prefer you to use the Play Store as well, rather than direct-downloading an APK from their website, which they offer only grudgingly). That means a state-level actor could possibly carry out a targeted attack to replace the Signal app on a given person's phone with a malicious build.

Also, Signal’s reproducible build system requires a specific version of the Android development kit. It has been pointed out that a state-level actor could be sitting on vulnerabilities in that, and not in the Signal source code itself.


Both these attacks indicate a problem that doesn't have anything to do with using Signal. If the actor can replace apps on your specific phone, then you're pretty fucked no matter what app you use.

If the attack is on the Android dev kit but not on Signal, then the attack isn't on Signal; it's on the dev kit. Unless Signal uses an unusual version of the dev kit, your exposure to this attack is the same as with any other app you would use instead of Signal.


> That means that a state-level actor could possibly carry out a targeted attack to replace the Signal app on a given person's phone with a malicious build.

No, they couldn't. They would need the Signal developers' key. Android requires app updates to be signed with the same key as the original app.
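The same-key requirement above can be sketched in a simplified way: an update is only accepted if its signing material matches the installed app's. The check below compares the v1 (JAR) signature blocks under META-INF/ of two APKs; real Android also uses the v2/v3 signing schemes verified by apksigner, so this is illustrative only, and the file paths are hypothetical.

```python
import zipfile

def v1_signing_blocks(apk_path: str) -> dict:
    """Return the raw v1 (JAR-style) signature blocks from an APK's META-INF/."""
    with zipfile.ZipFile(apk_path) as z:
        names = [n for n in z.namelist()
                 if n.startswith("META-INF/") and n.endswith((".RSA", ".DSA", ".EC"))]
        return {n: z.read(n) for n in sorted(names)}

def same_signer(installed_apk: str, update_apk: str) -> bool:
    """True only when both APKs carry identical signature blocks (same key)."""
    a, b = v1_signing_blocks(installed_apk), v1_signing_blocks(update_apk)
    return bool(a) and a == b
```

Under this model, a malicious build signed by anyone but the original key holder fails `same_signer` and is rejected as an update.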


A state-level actor can get the Signal key either covertly or by simply marching into the Signal offices with either a warrant or (if that fails) guns. Now, whether that will actually happen is a secondary issue -- but I submit that you have a mistaken conception of what a "state-level actor" means in a threat model. The fact that Google Inc doesn't hold the necessary keys but Signal LLC does is not a meaningful distinction to a state-level actor.

That isn't to say "all crypto is hopeless", simply that you shouldn't consider Signal to be state-level actor proof.


The point isn't to build your own and use it; it's to verify that the binary in the app store matches the source they published.


Do you personally do that with every release (which happens every few weeks or so)? Do you know anyone who does that who is trustworthy? If not, it's a fairly useless form of protection.


Are you running an open client? Is anyone?

That's all a smoke screen. Nobody is running an open client with a reproducible build, everybody is running whatever version is downloaded from their app store of choice.

It's not special, and I don't trust it a bit.


The point of reproducible builds isn't to run an open client, but validate that their copy in the app store matches the source they say it does.


But nobody actually does that.


Not manually, perhaps. But automated integrity checks of reproducible builds are trivial to write.
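The core of such an automated check is just a digest comparison: build the APK locally from the published source, pull the store-distributed APK, and confirm the two artifacts are byte-for-byte identical. A minimal sketch (the file paths are hypothetical; real reproducible-build verification for Signal also normalizes signing metadata before comparing):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def builds_match(store_apk: str, local_apk: str) -> bool:
    """True when the store artifact is byte-for-byte identical to the local build."""
    return sha256_of(store_apk) == sha256_of(local_apk)
```

Run on a schedule against each new release, a script like this would flag any store binary that diverges from the published source.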


Have you personally done that? Do you know of anyone who does it and publicly tracks said verification? It doesn't matter how trivial it would be to verify if nobody is actually doing the verification. (Not to mention you'd actually want many people doing it and publicly posting their results, and you'd want to check that your hash matches everyone else's before installing the APK; there is no automated setup for doing that on Android.)


I don't think any significant number of people do it. I don't use Signal specifically, but I don't even know that there is a way for me to actually do it and then track whether that matches the version the iOS app store loaded on my phone, at least not without jailbreaking the phone.


What's better, Signal or Telegram?


In contrast to Signal, Telegram doesn't end-to-end encrypt messages by default (they get stored in plaintext on their servers); it also doesn't protect the social graph, and it even stores your contact list on their servers. Even WhatsApp is more secure than Telegram.


Awesome. Thanks


They are more or less the same thing if you don't trust anyone. Both require you to verify the key fingerprint for a particular contact (safety numbers) if you want effective end-to-end encryption. Both are silos. Telegram is better about distribution and can be obtained from places you might trust more (e.g. F-Droid, Debian). Both have some sort of reproducible-build story. Both could get access to your connections to other users if they wanted, and Signal also insists on access to everyone's phone number. Telegram works on desktop without also insisting that you have the app running on your phone.
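The fingerprint verification mentioned above works because both parties can independently derive the same short decimal string from each other's public identity keys and compare it out of band. A simplified sketch of a Signal-style safety number (the real protocol iterates SHA-512 thousands of times and mixes in user identifiers; the key bytes here are placeholders):

```python
import hashlib

def half_fingerprint(identity_key: bytes) -> str:
    """Derive one party's 30-digit half from their public identity key."""
    digest = hashlib.sha512(identity_key).digest()
    return f"{int.from_bytes(digest[:16], 'big') % 10**30:030d}"

def safety_number(my_key: bytes, their_key: bytes) -> str:
    """60-digit string both sides can render identically and compare in person."""
    # Sorting the halves makes the result symmetric: each party computes
    # the same string regardless of whose key comes first.
    return "".join(sorted((half_fingerprint(my_key), half_fingerprint(their_key))))
```

If the server swapped in its own key for a man-in-the-middle, the displayed numbers would no longer match, which is what the manual comparison is meant to catch.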


Addendum because I can't edit any more. Apparently Telegram needs a phone number as well.


I would be interested to learn whether you have examples of services whose privacy practices you admire more than Signal's.


> Say what you want about Secure Enclaves, we know of no better way to conceal social graphs.

I'm not following. Secure Enclaves have nothing to do with protecting the social graph of Signal users. They're used to store the contact list (and other things) in the "cloud" in a safe way – things that weren't even shared / stored anywhere by Signal before Secure Value Recovery was introduced.



Ohh right, sorry about that – I totally forgot about that feature!



