
It's OK for someone to believe that, but I don't. Unfortunately, there is no practical way to verify it either.



Well, if you're in a position where you can only put faith in someone else's word as to whether it's good for your needs (this is the vast majority of people), there's this: https://community.signalusers.org/t/overview-of-third-party-...


What are you talking about? Signal is open source, and its cryptographic security is trivially verifiable. If you don't trust the nonprofit behind it for whatever reason, you can simply compile it yourself.


> and its cryptographic security is trivially verifiable

That's going quite far. Even with all the details of it documented and open, there's a relatively small number of people who can actually verify that both the implementation is correct and the design is safe. Even though I can understand how it works, I wouldn't claim I can verify it in any meaningful way.


Multiple teams have done formal verification of the Signal Protocol, which won the Levchin Prize at Real World Crypto in 2017.


Sure, there are teams who have done it. But it's not trivial. The fact that there's a prize for it shows it's not trivial. If I choose a random developer, it's close to guaranteed they wouldn't be able to reproduce that. The chances go to zero for a random Signal user.

Alternatively: it's trivial for people sufficiently experienced with cryptography. And that's a tiny pool of people overall.


The idea isn't that you do formal verification of the protocol every time you run it. It suffices for the protocol to be formally verified once, and then just to run that one protocol. If you thought otherwise, you might as well stop trusting AES and ChaCha20.


It is possible for the core protocol to be tightly secure, while a bug in a peripheral area of the software leads to total compromise. Weakest link, etc. One-time formal verification is only sufficient in a very narrow sense.
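That point can be sketched in a few lines of Python. This is a toy, not any real system: the "verified" cipher is a placeholder, and the debug log stands in for any peripheral bug outside the formally verified core.

```python
# Toy "weakest link" illustration: treat verified_encrypt as a stand-in
# for a formally verified protocol and assume it is perfect. A bug in
# peripheral code -- here, a leftover debug log -- still leaks every
# message in plaintext, with the verified core untouched.

def verified_encrypt(msg: bytes) -> bytes:
    # Placeholder for the verified core; not real cryptography.
    return bytes(b ^ 0xAA for b in msg)

debug_log = []  # peripheral code, outside anything that was verified

def send(msg: bytes) -> bytes:
    debug_log.append(msg)           # bug: plaintext retained in a log
    return verified_encrypt(msg)    # the verified path did its job

wire = send(b"attack at dawn")
assert wire != b"attack at dawn"          # ciphertext on the wire...
assert debug_log[0] == b"attack at dawn"  # ...plaintext in the log
```

The formal proof about the core remains true; it just doesn't cover the code path that leaked.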


It is also possible for a state-level adversary to simply hijack your phone, whatever it is, and moot everything Signal does to protect your communications. Cryptographically speaking, though, Signal is more or less the most trustworthy thing we have.


Just look at PuTTY and P-521 keys.

Or go back to Dual_EC_DRBG.

Unless DJB has blessed it, I'll pass.


What do those two issues have to do with each other?


These were showstopper bugs that betrayed anything they touched.

Avoiding this is obviously a huge effort.


Dual EC was a "showstopper bug"?


It did stop OpenSSL whenever you tried to use it in FIPS mode ;)


If you compile it yourself, can you still connect to the Signal servers?


And, even if you can connect with your own client, can you trust the server is running the code they claim it is? They were caught running proprietary server code for a time in 2020-2021. https://github.com/signalapp/Signal-Android/issues/11101#iss... / https://news.ycombinator.com/item?id=26715223


But the client is designed to not trust the server, that's why encryption is end-to-end. So does it matter?


In some sense, no - the protocol protects the contents of your messages. In another sense, yes - a compromised server is much easier to collect metadata from.
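The distinction can be made concrete with a toy sketch (this is NOT Signal's actual protocol; the cipher and all names are invented): the relay only ever handles opaque bytes, yet it can still record who talked to whom, when, and how much.

```python
# Toy end-to-end encryption vs. metadata illustration. The XOR stream
# cipher below is for demonstration only, not real cryptography.
import hashlib
import time

def keystream(key: bytes, length: int) -> bytes:
    # Toy counter-mode keystream built from SHA-256; illustrative only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # an XOR stream cipher is its own inverse

# Alice and Bob agree on a key out of band; the server never sees it.
shared_key = b"established-via-key-agreement"
ciphertext = encrypt(shared_key, b"meet at noon")

# Everything a compromised relay can log is metadata, not contents:
server_log = {"sender": "alice", "recipient": "bob",
              "timestamp": time.time(), "size": len(ciphertext)}

assert decrypt(shared_key, ciphertext) == b"meet at noon"
```

Even in this toy, the ciphertext protects the message body while the log line alone reveals the social graph, timing, and message sizes.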


Metadata, yes. Of course, the protocols, and thus all the inconveniences of the Signal app people constantly complain about, are designed to minimize that metadata. But: yes. Contents of messages, though? No.


If Signal, the service, was designed to minimize metadata collection, then why is it so insistent on verifying each user's connection to an E.164 telephone number at registration? Even now that we have usernames, they require us to prove control of a phone number, which they pinky-swear they won't tell anyone. A necessary privacy tradeoff for spam prevention, they say. This isn't metadata minimization, and a telephone number is a uniquely compromising piece of metadata for all but the most paranoid of users who use unique burner numbers for everything.


This is the most-frequently-asked question about Signal, it has a clear answer, the answer is privacy-preserving, and you can read it over and over and over again by typing "Signal" into the search bar at the bottom of this page.


The answer is not privacy-preserving for any sense of the word "privacy" that includes non-disclosure of a user's phone number as a legitimate privacy interest. Your threat model is valid for you, but it is not universal.


The question you posed, how Signal's identifier system minimizes metadata, has a clear answer. I'm not interested in comparative threat modeling, but rather addressing the specific objection you raised. I believe we've now disposed of it.


I don't believe there has been any such disposition in this thread. There have been vague assertions that it's been asked and answered elsewhere. Meanwhile, the Signal source code, and experience with the software, clearly demonstrates that a phone number is required to be proven for registration, and is persisted server-side for anti-spam, account recovery, and (as of a few months ago, optional) contact discovery purposes.


Yes. There are also libraries that do this, like libsignal.


It’s not practically open source though - how many people actually build it themselves and sideload it onto their Android or iPhone?

How much effort would it be for the US government to force Google to ship a different APK from everyone else to a single individual?


I don't know, a lot? They could with the same amount of effort just get Google to ship a backdoored operating system. Or the chipset manufacturer to embed a hardware vulnerability.


"Here's a court order, you must serve this tainted APK we built to the user at this email"

VS

"You must backdoor the operating system used on billions of devices. Nobody can know about it but we somehow made it a law that you must obey."

Come on, that's not the same amount of effort at all.


Looks like exactly the same amount of effort to me?


Effort, maybe, but not likelihood of discovery.


The cryptography is not where Signal is vulnerable. What Signal is running on, as in operating system and/or hardware that runs other embedded software on "hidden cores", is how the private keys can be taken.

Anything you can buy retail will for sure fuck you, the user, over.


Retail hardware actually has a better track record at the moment than bespoke, closed market devices. ANOM was a trap and most closed encryption schemes are hideously buggy. You're actually better off with Android and signal. If we had open baseband it would be better, but we don't, so it's not.

Perfect security isn't possible. See "reflections on trusting trust".


Bespoke but-not-really-bespoke closed-market devices made by the right people are very secure, but they are not sold to the profane (you).

> ANOM was a trap

Yes, ANOM was intended to be a trap.

> and most closed encryption schemes are hideously buggy

Yes they are. Hence some of us use open encryption schemes on our closed-market devices.

> You're actually better off with Android and signal.

I am better off with closed-market devices than I am with any retail device.

> If we had open baseband it would be better

And the ability to audit what is loaded on the handset, and the ability to reflash, etc. In the real-world all we have so far is punting this problem over to another compute board.

> Perfect security isn't possible.

Perhaps, but I was not after "perfect security", I was just after "security" and no retail device will ever give me that, but a closed-market device already has.

> See "reflections on trusting trust".

Already saw it. You're welcome to see:

  - https://guix.gnu.org/blog/2020/reproducible-computations-with-guix/
  - https://reproducible-builds.org
  - https://guix.gnu.org/en/blog/2023/the-full-source-bootstrap-building-from-source-all-the-way-down/
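The check those projects enable boils down to something very small: if builds are reproducible, "did my compile produce the published binary" reduces to comparing cryptographic digests. A sketch (the file names are hypothetical):

```python
# Reproducible-builds verification in miniature: two independent
# builds must produce bit-for-bit identical artifacts, so the check
# is just a digest comparison.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical artifact names; in practice these would be your own
# build output and the binary distributed to everyone else.
# if sha256_of("my-build.apk") == sha256_of("published.apk"):
#     print("bit-for-bit identical: the binary matches the source")
```

None of this removes the trusting-trust problem entirely, but it shrinks the set of things you have to take on faith.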


Oh, so none of this has anything to do with Signal. Ok!


In theory, "none of this has anything to do with Signal", and you are correct; but back over here in reality, Signal runs on these systems.

Hence the security afforded by Signal is very weak in practice and questionable at best.


> Unfortunately there is no practical way to verify it either.

Discuss an exceedingly clear assassination plot against the President exclusively over Signal with yourself, between a phone that's traceable back to you and a burner that isn't. If the Secret Service pays you a visit, and that's the only way they could have come by it, then you have your answer.


I think the bar for paying such a visit would be infinitely high; they would find a more clandestine way to act on it, to keep the ruse going.


Let us know how that goes



