Signal is secure, as proven by hackers (kaspersky.co.uk)
200 points by cjg on Oct 7, 2022 | 312 comments



Whether Signal is secure really depends on your threat model.

While they never classified them as data breaches, they have had all the numbers in their system extracted a number of times.

If someone was using a real phone number, even if Signal were secure, just knowing the number alone would give an attacker additional information. When Moxie was in charge, he repeatedly refused to allow new users to sign up without a phone number, even though there are numerous ways this could have been done, and they are being done by the competition.

Lastly, all three Signal board members have held the top leadership role in the past year, which is highly unusual; in fact, I have never heard of anything like it.


Signal's non-phone-number-keeping competition keeps a plaintext database of every pair of users that has ever communicated. The reason Signal uses phone numbers is the same as it has always been: to avoid having that database at all, or features that depend on it.


What features depend on the use of a phone number? Sign up and contact discovery? I don’t understand why a plaintext database is the alternative.


Your Signal contact list is your device's contact list, keyed by phone numbers.


For clarity, this assumes you allow Signal access to the device’s contact list, which is not required.


For clarity, you allow the open source client app (which you can audit and compile yourself) to securely share your contact list with a secure enclave (which is open source so you can audit it as well).


The client-side contact list is one of two lists: (a) the device's contact list, if Signal is given access, or (b) the list of numbers that have been manually added to the app, if the user declined to give Signal access to the device's contact list.

Also, I might be wrong, but it appears Signal's secure enclave, which is based on Intel's SGX technology, is known to be vulnerable to side-channel attacks, and Intel's SGX code and hardware are not open source:

https://medium.com/@maniacbolts/signal-increases-their-relia...


Yep, you have to trust the secure enclave for this to work!


No, you have to trust anyone with access to the server; the secure enclave provides no security given that it appears to be vulnerable to side-channel attacks; see the comment you just replied to for a related link.

Basically, all it does is give Signal plausible deniability for public search warrants:

https://signal.org/bigbrother/

It would do nothing to stop national security letters. Basically, you should assume the protection provided by the secure enclave exists in name only; given that the NSA's AT&T national security letters enabled both physical access and system modifications, I see no reason to believe others would not be required to do the same:

https://en.m.wikipedia.org/wiki/Room_641A

Again, the average person does not have a threat model that makes this relevant, but the claim that Signal does not have access to the SGX data is based purely on trust, not mathematical proof.


At that point, your phone is vulnerable. As is your computer, and your connected fridge.

If your threat model does not allow it, use manual e2ee with pigeons. Otherwise, well, secure enclaves will get better with time, which will make Signal better too.


> secure enclave will get better with time

Intel has already dropped SGX from their new lineup of CPUs (11th gen+).

https://news.ycombinator.com/item?id=29932630


Does that mean that the secure enclave technology will die? Feels like:

1. SGX is just an implementation, could be replaced by something else.

2. SGX still works on CPUs that have it.


Signal should make threats clear to users, just as they should have notified all users that multiple 3rd parties downloaded all the phone numbers in Signal. Offering federated servers would also allow users to secure their own servers.


Most of Signal's "competitors" don't just keep a plaintext database of every pair of users that have communicated, they keep a plaintext database of every conversation. (GChat is at least good about making expiring conversations visible and easy.)

There aren't necessarily "alternatives": most of the competitors offer a much richer feature set, and Signal has stripped those features down to what they see as the bare minimum.


It would be more accurate to say that Signal has stripped its features down to those it can implement safely without keeping serverside databases of communications metadata. It's surprisingly hard to do that, and so Signal has noticeably fewer features than things like Matrix, which suffered calamitous security vulnerabilities as a result.


Didn't Moxie explicitly say Signal now needed to keep your contact list on their servers via PINs/SVR? It is encrypted of course, and they felt they had to do that to enable identifiers not based on phone numbers. Some people disagreed on that last point.


I don't see it mentioned in the article (though I might have missed it), but:

> As such, the cybercriminals managed to pull off the attack by impersonating the victim of the attack for roughly 13 hours.

All the contacts of that victim would have received a warning that the victim had changed their encryption keys (WhatsApp has that warning too, but unlike Signal it's disabled by default), and IIRC, that warning shows up before you try to send a message. Even if they're not the kind of people who check whether the encryption keys match (it's very easy to do if you're meeting in person, but not many people do), that warning can be enough to alert the contact that something odd is going on.


"secure" is not a binary state.

Title is low-brow and weak.

"secure from a small group of specific hackers" is not as catchy though.


This "article" was written to subtlety advertise Kaspersky's security app.

> And, of course, install a security app on your smartphone.


Whatever the security of Signal is, all my trust in any service ends when it requires personally identifiable information from me that could easily be replaced by other methods (I have been a systems software developer for ~31 years, don't even try to feed me bs).

And as such, Signal/Telegram/whatever that requires me to enter a phone number has zero trust from me. I don't care who analyzed the protocol's security, what the safety measures are, what cipher algorithms you use, how many hackers tried to break in and failed, or what other PR/SEO methods you use.

Just the fact that you require information as deeply personally identifying as a phone number is a reason that overrides everything else. From my standpoint, software that requires that is a honeypot.

(btw, this is my personal opinion, you don't have to agree - I can (from any device, without giving any PII, now or 20 years in the past) log into an IRC network (vpn/... is outside this topic) and use asymmetric cryptography (with public keys exchanged safely by some other method, steganography anyone?) to chat completely securely. I can send email (again with public keys exchanged safely by some other method) and communicate completely securely. I can use Counter-Strike chat on a random server to do the same. So what does Signal do for me in terms of the safety of exchanged information? Show me a nicer UI so I can use graphic smileys?)


I think you may be misinterpreting the use case for Signal. Signal is not meant to be used for anonymous communication, it's meant for secure communication. Signal's threat model is meant to solve issues with SMS: a MitM can tell a message was sent but not the contents of the message. If you were to replace the phone number with a username of some sort it would be a little more anonymous, but could be easily doxed from users' poor OpSec or poor OpSec of a users' peers. Given Signal is meant to be used by people who don't know what OpSec is, it doesn't seem worth the effort to put up trivial anonymity barriers.


Sorry but anonymity is security. Once you know who to beat, you will get the keys. Wanna bet? We can do an experiment... ;) /s

(anyone down-voting this post - you don't agree? You are welcome to join the experiment too. Why don't you put your livers where your mouth(fingers) are? I know for myself that immediately when you would show me a rubber hose (not even coming to the wrench!!!) I would tell you whatever you want to know. But you can obviously do it better, so let's test it. I have a nice 1kg French wrench waiting for your beliefs (and knees). Yeah, I know I am not politically correct, but facts are facts and you deny them for unknown reasons. /s)


> Sorry but anonymity is security.

I'm actually with you here. But anonymity is HARD. Way harder than people give credit. We're not anonymous here on HN. But Signal is doing this step by step and I think that is the right way. Yes, the wrench attack works, but it doesn't always work. It would be better if it was hard to use the wrench attack, but protecting against it is surprisingly difficult. Signal seems to be taking this into account and also working on this in steps.


Not really. Anonymity is simple. You don't require PII for registration. As simple as that.

What is hard are their other use cases: preventing rotten eggs from entering the system. And they cut a corner here, sacrificing the security of their users to keep the bills down. The phone number requirement is not there to protect you but to protect their ecosystem and, even more, their resources (if I take it as lightly as possible; check the last line).

Again I will use IRC as an example: it was able to connect so many people without requiring PII. Yes, there was spam (same as with email; I host my own mail server and I don't see spam, it is 100% filtered out by rspamd), for what, 30 years(?). They can do it too. I don't see any excuse for their requirement.

But ok, let's be fair: easy to connect. I have around 400 phone numbers in my phone of people I asked for their phone number; they were relevant to me at some point, for instance primary-school schoolmates. How many do you think I communicate with? I counted the history for the last year: 35 numbers. 35! Out of those, I would communicate over a secure channel with 6 (my current company). And now this is an issue?

Let me say it again, with all the facts, I can only take them as a honeypot.


Honestly you seem to just want to hate Signal. That's okay to dislike something but calling Signal a honeypot is a strong statement. That requires strong evidence, not weak conjecture.

It is hard to respond to you in good faith because it does not feel like you want to engage that way. Maybe I am misunderstanding your statement. So I will engage in good faith once more (reminding you that this is a HN rule).

Anonymity is nowhere near as simple as not requiring PII during registration. Even from the provider side. We can agree that it is difficult to maintain anonymity from the user side. But choices made by the provider can make these things impossible. Let's suppose Signal gives me one username that I can use and I must use it continuously. What do I choose? If I use "godelski" then I've made a permanent connection between that identity and my real-life identity. After all, I do talk to friends, family, and coworkers on Signal. If I choose another name and share it through this account, then the two are linked together. All of this is information leakage and highly valuable to OSINT people. I'm sure someone here has significantly more experience than me and would be happy to expand upon this too. But it is extremely naive to believe/claim that anonymity is provided simply by removing the phone number. It is far more complicated than that. Hopefully you now know that.


There's quite a few other hints in the marketing and decisions at Signal that reek a little bit of honeypot (by the NSA).

Super hard to prove though...

For me personally the strongest hints are the fact that it's centralized, there's no getting around connecting to their server(s) for the app/clients to be useful and that it's impossible to know what exactly their server is running (by design?).

Oh, and Signal is based in the US. That fact by itself pretty much means all bets are off when it comes to security or anonymity.

I don't see how the NSA would not be pwning that server or owning/creating that server and/or organization (indirectly or directly).

Or to put it differently: why would it be hard or illogical for the NSA to setup an innocent seeming "good guys" radiating non-profit chat service that is supposedly secure (yet centralized and non-anonymous by design and also a honeypot)?


The whole point of e2ee is that you don't need to know what the server is running, just check the (open source) client.

The point of their private contact discovery is to leverage SGX enclaves so that you can verify what code is running on their server.

Sealed senders allow you to send a message without revealing to the server who you are.

The whole point of Signal is to build something that you don't have to trust. But of course you need to put some effort to understand it (and what it means, e.g. if you don't trust your OS running the client or if you don't trust the SGX enclave).


>Oh, and Signal is based in the US. That fact by itself pretty much means all bets are off when it comes to security or anonimity.

It has to be based somewhere, right? Would you trust it if it was based in China or Russia? Or even mid-tier countries like the UAE or Singapore? Even "neutral" countries like Switzerland aren't safe, as we've seen with Crypto AG.


The Patriot Act clearly compromises every Five Eyes nation. China and Russia have their own equivalents of the Patriot Act, so hosting there is no good either. Mid-tier and neutral countries can be compromised as in your example above. This is all truly depressing, but why does it have to be hosted on a central server?

I hate web3 hype as much as the rest of HN, but this seems like a genuinely useful application for it.


Please don't post insinuations about astroturfing, shilling, bots, brigading, foreign agents and the like. It degrades discussion and is usually mistaken. If you're worried about abuse, email hn@ycombinator.com and we'll look at the data.

https://news.ycombinator.com/newsguidelines.html

https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...


I always took that rule as "don't post insinuations (accusing other HN users of being) astroturfing, shilling, bots, foreign agents...".

Can we not point out when corporate entities are comprised of these things?


See the second link. (And, of course, the insinuation made above applies directly to several members of the HN community).


> why would it be hard or illogical for the NSA to setup an innocent seeming "good guys" radiating non-profit chat service that is supposedly secure

Let's also investigate the inverse side of this. Supposing Signal works, would the NSA not instead want to launch a disinformation campaign against them and exaggerate downsides? I think such action would also be easy and logical.

But I don't think that's happening, to be clear. I think people are just passionate about the subjects, and with passion people are often excessively headstrong (this is something disinformation campaigns prey on, btw. They often play both sides because chaos is often more effective than pushing a singular narrative. See "malinformation").

> Oh, and Signal is based in the US. That fact by itself pretty much means all bets are off when it comes to security or anonymity.

(anonymity. FTFY) I do agree that being US based comes with certain risks, but the US is not authoritarian and is unable to force companies to collect data. This is enough to raise suspicion but not enough to be damning. The suspicion is also reduced given that Signal publicly discloses subpoenas that they receive. Insiders like Snowden also advocate for its usage as well as many major players in the security community, globally. One can come back and suggest that this is disinformation but that increases the complexity of the honeypot campaign and as history has shown, complex conspiracies unravel quickly. Especially in high profile cases, and since Signal is universally suggested as the gold standard by the security community, I'd argue it is pretty high profile.

The problem with conspiracy theories is that it is easy to turn evidence against a conspiracy into part of the plot and coverup. But this just exponentially increases complexity. And anyone that has worked for or with the government will gladly tell you how ineffective they are (often in the form of complaints). After all, two can keep a secret only if one of them is dead. The fact that it is difficult to prove (and people have been trying for over a decade without yielding any more evidence than you have put here, +RadioFreeAsia) is actually evidence to the contrary. More should have been uncovered if there was a real plot (especially considering how complex it would need to be).


I'm pretty sure I've had to register my email with nickserv to join an IRC channel before.


Anonymity is a kind of security. There are other kinds that are trivially not anonymous but should be secured (and private between members), like talking to my family members about our private business.

To the best of my knowledge, Signal's use of phone numbers does not meaningfully compromise the security or privacy of my conversations with my friends or family.


>Sorry but anonymity is security.

Tell me your threat model, and then I'll tell you whether "anonymity is security" or not. Anything else is just you talking past other people.


I would say anonymity is necessary but not sufficient for security. So while you might be able to devise an example that _only_ uses anonymity that can be defeated, that would be meaningless because in real life there would be other measures at work, too.

This is why Omertà–basically a vow of secrecy achieving anonymity as far as the police are concerned–is taken so seriously and why it has been such a big deal the few times it’s been broken. But Omertà isn’t the only tool employed–they use guns, safe houses, all kinds of other tools and methods to ensure their security.


Let me just go ahead and send my mom an anonymous message asking what time she wants to see me this week. Should work great.


If your threat model assumes that you will be tortured, and you will immediately give up any desired information upon being tortured, then you better include some magic amnesia pills with your secure messaging protocol.


> Sorry but anonymity is security.

Anonymity is security through obscurity, by definition.


Obscurity is still security. But it is a speed bump, not a locked door. Security is probabilistic in nature and it is quite confusing that people portray it as a static concept. For example, remote wiping your phone when lost is a common practice. You ACT as if the information has been extracted but you still wipe. Why? Because not doing so gives a potential adversary infinite time to retrieve the data. Similarly, speed bumps make it harder. Every security researcher will tell you that there are no foolproof methods. If there were we wouldn't have this cat and mouse game with hackers. It is all probabilistic, and anything you do to increase your odds against hackers is, by definition, security.


obligatory xkcd: https://xkcd.com/538/


> it would be a little more anonymous, but could be easily doxed from users' poor OpSec or poor OpSec of a users' peers

This is my problem, not my messaging service's problem.

> Given Signal is meant to be used by people who don't know what OpSec is

???


1) Signal uses phone numbers so your social graph can be generated from your device's contacts. Signal itself doesn't know it.

2) Usernames are pretty close. In fact, you can build the beta and use them now. [0] So your worries will be over soon. I really think you're exaggerating the simplicity of replacing the identifier in a way that is not personally identifiable. I mean my name here is only semi-anonymous but it would be a grave mistake to think one couldn't figure out who I am fairly easily.

3) What alternatives do you have? Yeah, there are some that don't require this, but they also haven't had as extensive audits, open-sourced code, or the proof of time. Personally, it does matter to me who analyzed the protocols, safety measures, ciphers, etc., because I recognize that I'm not an expert in everything, AND even if I were, dual verification is better than singular. With something as important as this, N-verification is important. IMO.

[0] https://community.signalusers.org/t/usernames-in-signal/9157...


"easily replaced by other methods" ... like what?

How would this other method make it hard/expensive/infeasible for a malicious company to produce 100M accounts to cause problems with spamming, DoS, and related issues? Phone numbers are finite and cheap, but not zero cost. If sending a spam email cost $0.01, much of spam would disappear.

How would this other method protect a user's social network without having to upload that metadata to Signal? Currently this is handled by a user's address book hosted outside of Signal. How many users would you lose if you required some key exchange in person?

Sure it's easy to think about generating a key pair, uploading the public key to a DHT, and allow people to communicate directly without a central server. However that has a fair number of practical issues that would be worse for the average signal user, not the least of which would be much worse battery life and bandwidth charges.


>How would this other method make it hard/expensive/infeasible for a malicious company to produce 100M accounts to cause problems with spamming, DoS, and related issues? Phone numbers are finite, cheap, but not zero cost. If sending a spam email cost $0.01 much of spam would disappear.

complex captcha and/or expensive PoW. and an option to use a phone number instead
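For concreteness, here is a minimal hashcash-style PoW sketch in Python; the challenge/difficulty scheme is purely illustrative, not anything Signal or its competitors actually ship:

    # Hashcash-style sign-up cost: find a nonce whose hash clears a target.
    # Each extra difficulty bit doubles the expected work for the client,
    # while verification stays a single hash for the server.
    import hashlib
    import secrets

    def solve(challenge: bytes, bits: int) -> int:
        target = 1 << (256 - bits)
        nonce = 0
        while True:
            d = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
            if int.from_bytes(d, "big") < target:
                return nonce
            nonce += 1

    def verify(challenge: bytes, nonce: int, bits: int) -> bool:
        d = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        return int.from_bytes(d, "big") < (1 << (256 - bits))

    challenge = secrets.token_bytes(16)  # issued by the server per sign-up
    nonce = solve(challenge, 20)         # ~2^20 hashes on average
    assert verify(challenge, nonce, 20)

Note the tradeoff raised further down in the thread: honest users and spam farms pay exactly the same price per identity.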

>How would this other method protect a user's social network without having to upload that meta data in signal? Currently this is handled by a user's address book hosted outside of signal. How many users would you lose if you required some key exchange in person?

store it on the server, encrypted by the client. jesus.
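As a minimal sketch of what "encrypted by the client" means here (assuming the third-party `cryptography` package; Signal's actual SVR design wraps the key in PIN-based protection rather than leaving it solely on the device):

    # The contact list is sealed on the device; the server only ever
    # stores ciphertext it has no key for.
    import json
    from cryptography.fernet import Fernet

    device_key = Fernet.generate_key()           # never leaves the device
    contacts = ["+15550000001", "+15550000002"]  # hypothetical numbers

    blob = Fernet(device_key).encrypt(json.dumps(contacts).encode())
    # upload `blob`; the server cannot read or enumerate the contacts

    assert json.loads(Fernet(device_key).decrypt(blob)) == contacts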


So you generate a handle, maybe even a key pair and upload the public half to a server. Then someone you know joins next week and does the same.

How do you find each other?

If it's feasible for an end user, say 30 seconds of a phone's CPU, you can be sure that a spammer could generate millions.


>How do you find each other?

by exchanging public keys over an established communication channel

>If it's feasible for an end user, say 30 seconds of a phone's CPU, you can be sure that a spammer could generate millions.

yeah, that's an inherent limitation of PoW, so it has to be an option rather than a requirement. then people who don't care about privacy can do the SMS thing, and those who do can opt to do the 30 minute PoW.

but this is irrelevant to Signal. it's not the phone number verification itself that people have problem with. if the only purpose it served was to prevent attacks on the service by ratelimiting it to one account per one phone number, that would be largely fine. the problem is that your phone number is your Signal identity, which is utterly moronic for a "privacy-first" service, and the reasoning and excuses for this stink to high heaven.


> by exchanging public keys over an established communication channel

I get the appeal, but seems like requiring out of band key exchange will add significant friction and make signal much less popular compared to the competition. I've got a few dozen signal contacts, only a few of them close enough to do a key exchange. If I only had a few, not sure I'd use signal.

Signal has been resisting making special cases for the few, and I get it. Why spend developer time on some feature that's not going to benefit the majority of users?

> the problem is that your phone number is your Signal identity, which is utterly moronic for a "privacy-first" service

I can't think of anything that would work as well and result in so many (40M monthly) people communicating securely. They do seem to be considering their options to help with this, in particular: https://signal.org/blog/secure-value-recovery/

Allowing social networks, surviving the death of your phone, frictionless on boarding, account recovery, etc all interact in complex ways.

Personally I like that there's not a contact database/social graph inside signal.


>requiring out of band key exchange will add significant friction

how do you exchange phone numbers and email addresses with people?

>and make signal much less popular compared to the competition

is popularity among the (COMPLETELY) tech-illiterate worth giving up privacy for?

>I can't think of anything that would work as well and result in so many (40M monthly) people communicating securely.

and virtually everyone who has ever used the internet has an email, despite its apparently cumbersome sign-up and contact discovery procedures.

>surviving the death of your phone, ... , account recovery

are less likely if your account is tied to the phone number or email. they can be lost, they can be stolen, they can be seized.


> How do you find each other?

You don't, because that's something that's fundamentally impossible to do in a privacy-preserving way - no way around it.

> If it's feasible for an end user, say 30 seconds of a phone's CPU, you can be sure that a spammer could generate millions.

You're picking overly conservative numbers for "feasible". Choose parameters such that it takes something like half an hour on a phone. Inconvenient? Yes, but privacy is inconvenient in general.

And yes, a spammer will still be able to generate lots of these fake identities, but now the cost is much higher, while the expected profit is roughly the same - and that's how you defeat spam.


Well, there are alternatives to Signal. Some use p2p, some use Tor, some use a federated system of servers. None have the network effect to be as secure and as popular as Signal.

So if signal was privacy preserving, didn't show or tell contacts about others using the service, do you think it would be as popular? Or do you think we'd be discussing a different popular e2e encrypted app and complaining about the very features that made it popular?


It sounds like you want an app that sacrifices usability for a promise of security. That's fine, there are options. Of course, some are honeypots run by the US government, and you'd need to really really have an unusual threat model to warrant such care, but you do you.


> How do you find each other?

I don't want to find people on Signal, or be found by them. I want a way to securely communicate with people I've decided I trust enough to be contactable by them - that is, people I've met and sufficiently liked face to face.


I’ve always thought it is dumb for the same reason I think WhatsApp is dumb. In Latin America, WhatsApp exists as a way for people to message each other without paying per text, which is very much still a thing down there. In the US, we get all of our everything unlimited, so we don’t need a wacky app to send texts without paying more money.

But why then do you even need a phone number to use WhatsApp? If it is a texting and calling service that is used to circumvent carrier fees, and is almost exclusively used on WiFi (because data is even more expensive than texts), why not just let them make an account and not use the phone number? That way they don’t have to pay a secondary company to use WhatsApp…


Simply limit incoming traffic to users you've whitelisted yourself. Problem solved.


Let me pay $5 to get a private handle.



This is excellent news! Thanks for the link.


Sounds good, and it could help fund signal without getting advertisers involved.

So you pay $5 on a credit card, which connects you to said handle, much like buying a SIM. How is that any better?

Someone in your social network joins signal next week, how would they find you?


I don't want my social network to find me. I will give them a handle if I do.

I know which of my coworkers use signal, despite them not telling me themselves. That seems wrong.

Buying things anonymously is a solvable problem. Gift cards, cash, etc. Let people resell your $5 sign up card.


Well signal is designed for secure communications for most people. Not designed for anonymous communication. So a key piece of that is easy to use, nice on boarding process, and being useful to new users in that first minute or two.

So the reason signal is so useful is because so many people use it, which depends on things like contacts showing up immediately when you join.

Seems like it's a common theme that people want a messaging client with a huge number of users, but don't want the features that made it popular in the first place.


Mullvad shows that it is possible to receive anonymous payments (e.g., non-refundable cash by mail).

Signal could use payments (e.g., via Monero) as spam prevention and a revenue stream.

This would not damage any current feature/user


How do you pay them without revealing your identity?


Ask mullvad


Monero?


Threema only requires an anonymous ID (and has done so for many years), so somehow it must be manageable.

https://hn.algolia.com/?query=Threema&sort=byPopularity&type...


Right, but they charge, which connects to your identity just like a sim does.


It’s a one-time charge through the App Store and unrelated to the Threema ID. You pay for the app, not for the ID. Threema doesn’t receive any PII about you.


Ask them how many users choose to sync their contacts :). I did.


Don't they store IDs on the server?


Not sure what you’re getting at?


It is as simple as: I don't care. Signal requires personally identifiable information while the keys can be extracted using rubber-hose cryptanalysis ( https://imgs.xkcd.com/comics/security.png ) once you know ---> WHO <--- to beat. This is not secure. Sorry.

I understand the whole concept behind it. I understand why they want a phone number. BUT. This is not secure (oh really, my phone number is hashed; how long do I need to brute-force a few digits in the range 0-9? A few seconds on my laptop, maybe?)
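The brute-force point is easy to demonstrate: the phone-number keyspace is so small that unsalted hashes barely slow anyone down. A toy example with made-up numbers:

    # Recover a "hashed" phone number by enumerating a 10-million-number
    # block; plain SHA-256 offers no real protection at this keyspace size.
    import hashlib

    leaked = hashlib.sha256(b"+15551234567").hexdigest()

    for n in range(10_000_000):
        candidate = f"+1555{n:07d}".encode()
        if hashlib.sha256(candidate).hexdigest() == leaked:
            print("recovered:", candidate.decode())
            break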

On the other side, I can use simple email/IRC/Counter-Strike chat to be MORE secure. Why? Because it doesn't require any PII from me.

So why bother? Yes, sure, I can use it as a nice chat interface, I can use it as another way, besides the ~20 chat applications I need to use, to contact people I know ( https://imgs.xkcd.com/comics/chat_systems.png ). But don't sell me security. By requiring PII they broke it.


A big mistake you make is to think that someone would be powerful enough to find you personally on Signal, but would fail to find you on an insecure chat in a video game.

If you are not interesting for the NSA, giving your phone number to Signal is fine. If you are... run. The counterstrike chat won't hide you.


I'll have my 70 year old parents on another continent install counterstrike so we can text each other. Thanks for the suggestion.


Your 70-year-old parents don't care about the security of communication. Any http (without the 's') chat is quite enough for them. They talked on analog, unencrypted lines their whole lives. Why should they care now? PRISM anyone ( https://en.wikipedia.org/wiki/PRISM )?


That is not true either. My 80-year-old mother uses Signal because she discusses property management and money-related issues with people. She does not trust WhatsApp because it is owned by Facebook (which is weird because she uses Facebook all the time). For her, knowing that cell phone company employees can grab text messages is enough to switch to something that is secure.


So this is the crux of the disagreement. You believe that you either need maximum possible security or none at all, so when you need to be secure, it doesn’t matter if you have to use an inconvenient solution, because it’s that important. Whereas in reality there are a ton of less severe threat models that can be countered with more user-friendly methods of communication. Advertising companies, ISPs, local cops, etc won’t be able to read your Signal messages or find out who you’ve been talking to, and that can be important to many people. NSA or equivalent probably can, if they’re interested in you already, but that’s a concern for far less.


Are you suggesting that people who aren't devoted to the cause of secure communication are not allowed to derive any benefits from secure communication?


I come from point of ignorance and agree with you.

I do not understand what privacy, security, or anonymity I can have when we start with sharing my most identifiable personal information with the app and all my desired contacts. (WhatsApp takes it further by refusing to work without full access to your contacts, which... just, no.)

I feel I'm in a twilight zone as everybody cheerfully regresses to apps you have to bootstrap with your phone number, and which then proceed to basically only work well with your phone (teeny tiny screen with awful keyboard), instead of allowing me seamless access across my communication devices and preferences.

Messaging seemed like a solved problem until we made it inexplicably and massively worse.


> whatsapp takes it further by refusing to work without full access to your contacts

I’ve never given WhatsApp access to my contacts. Works fine. Just don’t get numbers matched to contacts, which makes sense.


Interesting. Can you initiate a message?

I can only talk to people if they've granted access to their contact list and initiate a conversation with me (which is not a solution, just a shifting of the problem to others).

On Android at least, I cannot initiate a conversation with somebody I don't have an ongoing chat with. It won't let me just type in a phone number.

Specifically, clicking "new chat" asks me for contacts permission, and if I say no it simply closes. I have to contact a person out of band and ask them to initiate WhatsApp with me. By no definition is my experience "just fine". Is this not your experience?

Can two people who have not granted access, converse?


> Can you initiate a message?

Yes. There is a little write icon on the top-right of my iOS screen. Click that and paste in the number.


Fascinating. It will not allow me to initiate conversation on android without access to contact list. I wonder why the platform difference!


You can open wa.me/<number> on a browser, it'll redirect to a chat on whatsapp.


Neat!

(I would still not call it "just fine" in terms of normal app usage, but it makes my life easier so I thank you for that :)


Perhaps this has changed, but my experience was that the installation wouldn't proceed without granting access. I could prevent the app gaining access to my contacts by using the Android home/work profile so that it would only see an empty contact database. There are other tricks you can use on Android, but I never got it to work on iOS.


Messaging was a solved problem until we wanted to make it secure and private.


The Signal protocol does not require a phone number or any other PII. This is what has been "proven secure".

The Signal app is a messaging platform built on this protocol by the same developers. That is what requires a phone number.

Anyone can build their own messenger using the Signal protocol and use whatever means of authentication they want.

I think there is also a useful distinction between "anonymity" and "privacy". The protocol is capable of facilitating both. The Signal app itself is primarily focused on just the latter. Signal knows who you are, at least insofar as they know a hash of your phone number. You are not totally anonymous to them. But they do not know what you are saying. Your interactions on their platform are private.


> (I am system software developer for ~31 years, don't even try to feed me with bs)

Whoa now ol' timer! j/k.

I like Signal because it makes switching people away from other platforms easier. It is not anonymous, but it is secure (until proven otherwise). I'll take the latter if I cannot have both.

That being said, they have room to improve. I dislike the work-around I have to do to use two separate phones and keep a single conversation going.


I get what you’re getting at but I would first like to know how you’re keeping your phone number totally anon prior to agreeing or disagreeing with you.

In my eval, phone numbers are barely PII, starting with when ISPs and carriers started selling mobile data 2 decades ago.

So, my phone number is very much out there now. No issues using it for secure comms IMO.

If you want truly anon comms, I’m not using signal, but that’s not what signal does anyway.


The argument I hear in favour of this practice by messaging apps is that this design helps your friends find you easily by your number.

But I agree with you, asking for the phone number and telling me this app is secure is ridiculous. Let me register with something less intrusive, if I want to go the phone number way, then let me use that. Give me an option.


Is there another technique for effortlessly bootstrapping a contact list? Finding someone requires some kind of lookup/search. Using PII makes a lot of sense because that's how people identify each other. Anything out-of-band is quite complex.

If I told my mom to install some app and we'd connect to each other using some random IDs, we'd probably still be using SMS.


>If I told my mom to install some app and we'd connect to each other using some random IDs, we'd probably still be using SMS

This is surprising to me, since it was my mother's generation that got me into Skype (where you have to manually add the other person by their username), almost twenty years ago.

(Even worse, not long before that you had to manually punch the other person's "ID" into the device every time you wanted to talk to them!)


Signal support claims "The Signal service does not have any knowledge of your contacts."

Can someone explain how this is possible? How does my phone know when to notify me that a friend has joined signal?

Edit: apparently it's "private contact discovery", I need to give this a proper read later.

https://signal.org/blog/private-contact-discovery/
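The gist of the problem that post tackles: the naive approach has clients upload hashes of their contacts for the server to intersect, but (as noted elsewhere in this thread) phone-number hashes are trivially brute-forceable, which is why Signal moved the intersection into an SGX enclave. A toy model of the naive scheme, for intuition only:

    # Naive hashed contact discovery -- NOT what Signal ships.
    # The server still learns brute-forceable hashes of your contacts.
    import hashlib

    def h(number: str) -> str:
        return hashlib.sha256(number.encode()).hexdigest()

    registered = {h(n) for n in ["+15550000001", "+15550000002"]}  # server

    my_contacts = ["+15550000002", "+15550009999"]                 # client
    print([n for n in my_contacts if h(n) in registered])
    # -> ['+15550000002']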


Yes, their blog posts are really good. Read about the sealed sender, too! And then realise that even with sealed sender, if you send from your home IP, then you are identifiable. So you should send from a reasonably anonymous IP (w.r.t. your threat model)


I will never get over the fact that the Nintendo Switch operated on random character strings to find friends rather than usernames or something more straightforward for normal users. But the fact that they managed it does mean it's not so large a barrier.


I guess it works because people think of it as if they were phone numbers, plus it removes the barrier of having to choose a username. See also the now defunct ICQ which was fairly popular even outside of tech circles.


Not ridiculous at all, depends on your threat model. It's perfect for me.


So why do we need e2e encrypted messaging? Why do we worry about the "secure" part of messaging? One of the main reasons for me is that it's hard to trust a centralized authority.

Can you trust that the information stored on your messaging app's servers stays secure forever? Can you trust that company to never get compromised? So yeah, they go through all the hassle of making things "secure" but attach everything to something (a phone number) most of us can't get without revealing our actual details. So yeah, it's ridiculous to go through all this hassle to make things private and secure, but force people to use something that de-anonymizes everything about them.


I don't want them to have access to my messages and make a profile out of me, even less of billions of people. That's why I also care about metadata: social graph is valuable for profiling.

I don't write anything that would be interesting to the NSA, I'm not a target for them, so I don't care if they see I use Signal.

BTW, say you have an anonymous Telegram username, I'm pretty sure it's trivial to find your name from the content of your messages. Or just from the metadata.


Repeat after me:

Security and privacy are not the same thing.

Among many other differences: security is a binary (can an attacker gain access to information) but privacy is a spectrum (what kinds of information do I want to leak).


> security is a binary (can an attacker gain access to information)

Only if you're looking through a very narrow lens.

There are a range of potential attackers out there with different motivations and resources available to them. The question you have to ask (for each attacker) is whether the value of the thing you are trying to secure is worth more to them than the cost it would take them to break your security.

In aggregate, that looks a lot like a spectrum of security, with different people being secure against more or fewer different attackers.


"The thing" you are trying to secure is either secure or it is not. That is what I meant by "binary".


"The thing" you are trying to keep private is either private or it is not, too.

Maybe you're saying that some information is more sensitive than others, which is true, but some security failures are worse than others (e.g. denial of service vs. remote code execution, or ability to forge messages vs. ability to decrypt messages).


I agree. Furthermore, it's extremely distasteful that Signal alerts people in your address book when you install Signal.


How else are people supposed to know that they could send a secure Signal message instead of an insecure SMS?

I guess people could call this a growth hack, but without this kind of thing it's likely the number of signal users would be 10-100x smaller and the network effect would mean it's near useless for most.

Just imagine someone installing Signal, not seeing anyone they can chat with, and never using it again.


> How else are people supposed to know that they could send a secure signal message instead of an insecure SMS.

They should know only if I ask them to. I don't want to share this sort of information to everybody in my contact list by default. If I think somebody needs to know, I would tell them. If I don't think they need to know, then they shouldn't be told.

Anyway, the consequence of Signal being this way is that I refuse to install Signal at all, and don't recommend it to any of my friends or family. Instead I tell them that everything on phones is insecure and they should never believe or behave otherwise.


Can I ask you a question? I ask this in good faith, as a matter of curiosity, and am not being critical in any way.

If you do not want someone to message you, why have you given them your number?

My assumption would be that if you trust someone enough to give them your number then it shouldn't matter if they know that there is another app that they can use to message you through. Of course this is only my assumption and I am just curious as to your reason why you would not want some particular person to know that they can message you through Signal rather than some other, less secure, app.


I give people my number so they can call or text me. I don't give people my number so they can be notified when I install any application. Some of the people in my contact list are people I haven't met in 10+ years and I don't really care to prune my contact list, but neither do I care for apps to send them notifications about anything I'm doing.


Fair enough. Thanks for replying.


They aren't. I use Signal as my default messaging application so I get messages from people who don't know what signal is, and I'm fine with that. If they later install signal, I do not need them to know that I was already using it.

To be honest, it doesn't really bother me, but just because I communicate with someone does not mean that I want to share any information beyond the contents of my communications. We might both be Signal users, but you're not my friend. If you were, I would already know it.


> I can send email... and communicate completely secure.

But what about PII? Goes against your Signal argument.

Signal needs a phone number, not necessarily yours.


+1: a globally unique identifier that is reused across multiple accounts is an antipattern in my opinion. This is just as bad as reusing passwords. There should be no need for anyone to reuse email user IDs or phone numbers or credit card numbers. These should be unique per online "relationship", the way your CC chip creates per-transaction identifiers. The fake-account problem needs to be solved another way. You simply can't take these GUIDs back once they are leaked, which leaves your only choices as burning them or accepting the risk of having that data leaked all over the net.
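A sketch of what per-relationship identifiers might look like in code; the HMAC derivation here is my own illustration (in the spirit of per-transaction card numbers), not any existing product's scheme:

    # One master secret on the device, one derived handle per relationship.
    # Leaking a handle burns only that relationship, not your global ID.
    import hashlib
    import hmac
    import secrets

    master_secret = secrets.token_bytes(32)  # never shared

    def relationship_id(peer: str) -> str:
        tag = hmac.new(master_secret, peer.encode(), hashlib.sha256)
        return tag.hexdigest()[:16]

    print(relationship_id("alice"))  # the handle you hand to Alice
    print(relationship_id("bob"))    # unlinkable to Alice's handle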


Your CC has a per-transaction password, but still a unique identifier. Signal works surprisingly similarly.


I really dislike that about the app too. In fairness, I think the goal was to replace Apple Message and SMS which require a phone number but it is also self defeating in a sense because it breaks the privacy and plausible deniability Signal is supposed to offer by tying an app instance to a particular piece of hardware.

Almost every communication app requires a phone number now. The claim is that this limits spam which is probably true but I suspect there is tremendous value in having real phone numbers and the ability to do pervasive tracking. We know mobile phone providers sell data on phones. The best that can be said about the Signal org is they are non profit but this is not very reassuring especially given the threat model of people who might be using Signal.


Half of the world's phone numbers have leaked already (including in a Facebook leak a few years ago).

But that's not what's valuable. What's valuable is the social graph: who are your contacts, and who are you writing to. Signal works very hard to not know any of that. And it does it better than the alternatives I know.


software dev with thirty years experience says: the modern push to make phones into de-facto PII is bluntly obvious. There are historical reasons that this is dangerous in itself, yet USA and others have profited mightily when successfully implementing systems like this. Awake.


>> From my standpoint, software that requires that is a honeypot.

I couldn’t agree more. In August this story was trending here using basic 101 growth hacking tactics https://news.ycombinator.com/item?id=28340542

Trace the source. The account got deleted from Reddit.


What you say appears to be true (to the security amateur in me). What app would you suggest for voice chat that is overall more "secure" than Signal?


There isn't one, publicly. Make one?


Can you not just use a burner number?


Not possible everywhere. For example, in Germany you essentially have to follow normal KYC protocols to get a number.


Signal contact ≠ Phone contact address book.

By disabling Signal’s access to the phone’s contact address book, the Signal server will not have a hash value for each and every one of the (10,000+) phone numbers in your phone’s contact address book. This disabling comes with the small costs of:

- no longer being notified that someone in your phone’s contact address book has recently signed up with Signal.

- losing the ease of NEW lookups of a friend using the expansive 10,000+ contact address book that your phone maintains. Then again, your focus is secure messaging with just your targeted friends, not your crazed ex-girlfriend or deranged boss.

You still have a separate but more secure form of contact address book maintained by Signal (and yes, the remote Signal server has hash values for this smaller, limited set of Signal-capable phone numbers of your friends).

The key thing is that no one else can see the content of your messaging over Signal except those you converse with, unless your phone ends up in the hands of a digital forensics guy before you do “Settings->Account->Delete Account”.

On a separate topic, you should refrain from using an avatar and discourage your friends from using one. That’s an out-of-band lookup that a nation-state or hacker can use to profile you further.


You can't prove that something is secure by "hackers". You can only prove it's not secure.


If only Signal found a way to keep the desktop app linked for more than 30 days. There have been a series of FRs that constantly get ignored and "you're using it wrong" from the developers.

Thankfully the one friend that was using Signal finally gave up on it since everybody else was always missing his messages, so I no longer have to re-link every single computer every time I use the app. (with a 30 day expiry and 3 computers, it meant virtually 100% of the time I tried to use the desktop app I had to relink to my phone).


> "you're using it wrong" from the developers

This kind of response is common with Signal. It won’t even allow backups on iOS and refuses to handle scenarios where one may have a broken or lost device (the only way to retain chats and change devices is through a device-to-device transfer when they’re in close proximity; there is no iOS<->Android chat transfer either). Signal seems like it’s meant for chats that don’t have much value, where losing messages isn’t a big deal. But the app, at least on iOS, will never stop pestering you if you don’t give it access to contacts. Something is broken between the vision for the platform and the implementation.


IMHO, anonymity takes a certain amount of unavoidable effort. If you're just chatting with friends, maybe you really don't need Signal. If you are whistleblowing a large corporation or a government, then it makes more sense?


A significant value of implementing security as "normal" or by default is that you don't look suspicious for having it.


Agreed. Think of all the other things we have come to take as normal.

Carrying car keys (or a fob)

House keys.

Wallets.

IDs.

They all require some mental effort.


> IMHO, anonymity takes a certain amount of unavoidable effort.

If anonymity were a factor, and not just ease of use, they’d never allow only phone numbers as the login.


After reading this thread, it seems like a necessary paradox. Some quotes in here claim that using some other number doesn't really alleviate that risk, since it is effectively transferred to the Signal servers. E.g., there's no way to get around the fact that you must be identified to communicate; it is a matter of who owns that ID ... Hence the paradox. (I may be misunderstanding here.)


Phone numbers are, in some countries including Germany, tied to your identity (as in you can’t even get a number without government issued ID), it doesn’t get much worse for an identifier if anonymity is the goal.


So ultimately this is a service-provider problem? E.g., signal with server issued ID?


It’s the law here, KYC for phone numbers. So even a pseudonym would be better.


If you want a backup just copy the sqlite database from the Desktop app. The encryption key is stored in a json in the same folder. Not as convenient as an iOS backup admittedly.
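A sketch of that, assuming the default Linux paths (they differ per OS, and newer Signal Desktop versions may protect the key with the OS keychain instead, so verify locally):

    # Locate Signal Desktop's SQLCipher database and its key.
    import json
    from pathlib import Path

    signal_dir = Path.home() / ".config" / "Signal"  # assumed Linux default
    db_path = signal_dir / "sql" / "db.sqlite"
    config = json.loads((signal_dir / "config.json").read_text())

    print("database:", db_path)
    print("key:", config.get("key"))  # historically stored in plaintext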


I've used Signal on Linux for several years and have never been unexpectedly unlinked.


Same here. I do sometimes click a link, and it crashes or restarts. But then my full history is there, and new messages come in. Only have to relink when I buy a new phone.


This happens to me exclusively after I've updated the software, but I never have to relink; just the crash after a link click. macOS handles it better by presenting me a "restart Signal" button after an update.


Do you let 30+ days elapse between opening Signal? It happens to me ALL the time on machines I rarely use.


No, I use it regularly. Seems like unlinking after an extended period of inactivity would be a good thing, though.


Are you talking about going 30 days without connecting or using the app? I've had the app on my machine for years and never once had to relink. I do use it almost every day though.


What OS? I've used the Signal desktop app on Windows paired to iOS for more than a year and have never once had to re-link.


Same on macOS + iOS. Years, in fact. Not once I had to re-link.


Yeah same. On my last PC I went a good two years or so without ever having to re-link. For my current Windows laptop, it's been about a year and I've never had to re-link.


I last used Signal desktop on Windows with Android a couple years ago for several months and never had to re-link.


It really depends on your usage patterns with Signal. Most of my Signal usage is on the iOS client. However, I occasionally use it on MacOS and rarely on Windows.

If I keep the MacOS client open and active every few days, it doesn't have to relink. If I don't open it for a while, I'll have to relink.


OS X. It's a feature of the app; if you don't use it for 30 days on that computer, it requires a re-link.


Same setup but paired to an Android device. More than a year, no re-linking that I can remember.


It's quite common on a Windows machine in our house. Not sure which version.


I've had Signal on a Win11 machine for about 4 months now and haven't had this come up.


I, however, get this regularly.


The annoying thing to me is that every few years they switch how they distribute the Signal app and don't provide a tool to migrate the old data. Yes, I could do it manually by finding and copying their config dir from the old destination to the new one, but that's annoying, and their whole selling point is that it'll just work for the kind of user who wouldn't think about that.

(I can't remember the exact switches, but if I recall correctly I was notified I needed to switch from a Chrome app to a Snap, then to a Deb, and finally to a Flatpak.)


It’s trivial to get Signal messages from desktops they’ve been synced to. If you have anything more than a casual-observer threat model, syncing to Windows/macOS means digital forensic types like me can get the database and parse it. Phones are actually more secure for Signal in this regard.


That's fine, I don't even need a chat history. I just don't want to have to unlink, scan a QR code, and relink every time my computer unlinks, which is nearly each time I use the app (since a given computer is usually over the 30-day mark).


Yeah I get this a lot too with Signal on macOS


I've never had this issue on macOS, and use it almost daily...


That's why you don't. I have to relink every time I try to use it because I use it so rarely. I forgot what the cut off is these days (30 days or 6 weeks).


> And although the attack was formally a success, there is no reason to get scared and stop using Signal.

This conclusion ("no reason to...") sounds strange and premature. Such attacks may not get older messages, but contacts of the person whose phone number has been taken over can still message the new device, which the hacker would receive and could use to launch further attacks on the contacts. Since practically nobody verifies "safety number" changes, the contacts of the phone number that has been taken over may not realize they're chatting with someone else. Isn't that reason enough to be scared? This problem exists for any app that relies on an external identifier, especially one like a phone number that's easier to take over (including through SIM jacking).

Signal may be secure for specific definitions, but your contacts may not be safe with such takeovers.


> By using end-to-end encryption, user messages are stored only on their devices, not on Signal’s servers or anywhere else.

How do we know with certainty that the messages are not stored anywhere else? Don't they go through servers to get to the end user?


Signal received subpoenas to produce all the data they have on particular users, and they provided just two data points: last connection date and account creation date [1][2]. So I'm pretty sure they do not store anything else.

[1] https://signal.org/bigbrother/central-california-grand-jury/

[2] https://signal.org/bigbrother/eastern-virginia-grand-jury/


And how do we know they don't collect other data but don't advertise it?

If they have a special deal with the government, the court wouldn't even know about it, or might be instructed not to ask and not to disclose anything.


Valid question, which I don't have the answer to, but since Signal is open source [1][2] it's open to a lot of analysis. Since we haven't heard anything damaging so far, I'm guessing it's _secure enough_.

[1] https://github.com/signalapp/Signal-iOS

[2] https://github.com/signalapp/Signal-Android


Another question then: can we verify (even if we have to jailbreak and check the binary) that the encrypted app delivered on iOS (and I presume Android) is built from the exact same source code?

E.g. could we build it ourselves in, say, Xcode, and compare a hash of the resulting app binary with the installed one?


The builds are reproducible, so you can compile it yourself and check to see that you get identical binaries.


Identical to what? Aren't iOS (at least) App Store apps encrypted, so even if I build one locally, I can't compare the resulting file with the encrypted app?


I don't use iOS, so I'm not familiar, but surely there is a way to download app packages from their store to verify them and generally exercise due diligence and do security research?


Not impossible, but that's definitely not trivial to do.

If it matters to you, you should build it from source.

If it's just out of curiosity... yeah reproducible builds are a thing, but generally that's not super straightforward in practice, I would say.


With Telegram you can. They provide instructions. (At least they used to.)

Not sure about Signal.


I looked into it, and it was technically interesting, but honestly much harder than compiling from source.


Reproducible builds are notoriously hard even without having to worry about things like jailbreaking and app stores.

The point is not just that everyone can make their own, but that anyone who is competent enough can prove that their client code is exactly what is used to produce the version that gets distributed.

This means that with Telegram it shouldn't be possible to hide a backdoor in the client in a way that is impossible to spot.


Well it's far from easy with Telegram, too.


You can audit your Signal client app (which is open source) and see exactly what it sends, and how.

Now realistically, most people can't do that, so they have to trust that somebody who can actually did it. Security experts have been and are looking into messengers like Signal: they have an interest in doing so as security researchers, because it looks good on their CV.

Of course there is trust somewhere, you cannot do without trust. But Signal is amongst the best you can find.


>You can audit your Signal client app (which is open source) and see exactly what it sends, and how.

See my other question. Audit how? Sure, you can examine the source.

But how can one tell the app blob in their phone is the same as the one produced if you build the code?


Signal on Android advertises reproducible builds:

https://github.com/signalapp/Signal-Android/blob/main/reprod...

(I haven't tested this myself.)

Edit: You can also build the client yourself and use your own version. No need to use the Google Play store version.
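
For what it's worth, the final comparison step is simple once you have both APKs side by side. A minimal sketch in Python (hypothetical file names; iirc Signal's own instructions use a diff script that ignores signing metadata, since the store APK carries Signal's signature, so a raw hash like this only matches signature-stripped or identically signed builds):

    import hashlib

    def sha256(path):
        # Stream the file so a large APK doesn't have to fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical file names, purely for illustration.
    local = sha256("Signal-local-build.apk")
    store = sha256("Signal-from-device.apk")
    print("reproducible" if local == store else "binaries differ")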


Just build it from source...


I know that part, that's why I asked "how can one tell the app blob in their phone is the same as the one produced if you build the code?".

The problem being that apps as installed from the App Store come (to my knowledge, could be wrong) encrypted. So you can't just compare an unencrypted local build with the encrypted installed app (and you can't decrypt the app or encrypt your build the same way, because you don't have the key; Apple has it, and iirc there's no access to it because it's held in the secure enclave in the iOS case).


Yeah that's harder. If it mattered to me (like for my own physical integrity), I would compile it myself.


Because this is end-to-end encryption, what's transmitted is only arguably "the message": we can't decrypt it without the symmetric key - and there is every reason to think we never will be able to - and only the sender and recipient have that key.

So from an information theoretic point of view, arguably no, the message was never stored anywhere else, even though Signal is indeed a store-and-retrieve arrangement. The encrypted message was stored briefly by Signal, but only its sender and intended recipient could decrypt it.

The keys are ephemeral, so if you have exact bit-for-bit copies of encrypted Signal messages I sent when I was arranging to play Red Dead Redemption 2 with friends, nobody knows how to decrypt those. I can't decrypt them, the people who received them can't decrypt them, even though we saw them at the time; and even though you have the encrypted messages, and even if you have our phones, the keys are gone, so that's that.
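
The property is easy to demo with the kind of AEAD cipher involved. A toy sketch (Python with the third-party `cryptography` package; this shows only the "no key, no plaintext" point, not Signal's actual double-ratchet code):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # stand-in for an ephemeral session key
    nonce = os.urandom(12)
    ct = AESGCM(key).encrypt(nonce, b"RDR2 tonight at 8?", None)

    # A server relaying this only ever sees (nonce, ct). Once both endpoints
    # ratchet forward and discard the key, nobody -- sender included -- can
    # recover the plaintext from ct.
    del key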


That's good, but I don't see how that means we know they aren't being stored. The NSA or another entity could be collecting them in bulk just in case the encryption is broken in 30 years.


So, the first time we standardised symmetric encryption was DES in 1977. DES was at that time already known to have keys which are too small (56 bits) and blocks which are too short (64 bits) to be safe against a powerful attacker (like, say, the US government), but it appeared otherwise to be a good design. The algorithm's pre-DES ancestor actually had one flaw, unsuspected by academics, which the NSA had in fact fixed in DES because they knew about it.

And sure enough, even today the most practical attacks on DES are in effect brute force on those too-small keys and too-short blocks - this brute force is practical although very expensive.

Now, in 2001 DES was superseded by AES, which is used for this purpose by Signal. In AES the keys are larger (128, 192, or 256 bits, Signal uses 256) and the block size is longer too (128 bits), thus fixing the only problem DES still had. We are done.

So, while of course nobody can prove it won't happen in 30 years, it is extremely unlikely.

You might think to yourself, surely computers get faster and smaller, so the brute force problem would still arise maybe in a few decades? Nope. In the late 20th and early 21st century humans experienced something that's only going to happen once, and it has distorted our perspective on the matter. The machines are not, in fact, going to keep getting faster and smaller, there are physical limits imposed by the universe. They will get gradually a bit faster than they are now, and we will make a lot more of them, but nowhere close to enough even if (for some insane reason) our civilisation decided its only goal is to decrypt my old Signal messages.
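
The arithmetic backs this up. A back-of-the-envelope sketch, deliberately granting the attacker a fantastical 10^18 key trials per second:

    trials_per_second = 10**18              # absurdly generous assumption
    seconds_per_year = 60 * 60 * 24 * 365

    for bits in (56, 128, 256):
        years = 2**bits / trials_per_second / seconds_per_year
        print(f"{bits}-bit keyspace: ~{years:.1e} years to exhaust")

    # 56-bit:  ~2.3e-09 years (a fraction of a second -- hence DES falls)
    # 128-bit: ~1.1e+13 years (roughly a thousand ages of the universe)
    # 256-bit: ~3.7e+51 years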


That's true of literally every possible messenger application.


What’s your alternative? Sneakernet?

Anything that goes over a network could be stored.


The article says they are not stored. That's what I'm questioning.

"user messages are stored only on their devices, not on Signal’s servers or anywhere else."


Sure, like I said, it's an information-theoretic argument. Imagine if Signal used a One Time Pad instead of AES; this might make it easier to see what's going on if you have some idea of how the One Time Pad works.

Suppose I promise you that qMsVOrgWDZTo0Fet9xLhIQ is the base64-encoded, 16-byte encrypted message I just sent, but I used an OTP. Do you in some sense "have" a copy of my message? No. It is completely useless without the key; it could have literally been any message, and without the key there's no difference.
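
A quick sketch of why, in Python: for any OTP ciphertext and any candidate plaintext of the same length, there exists a pad that "decrypts" to that plaintext, so the ciphertext alone carries zero information about the message.

    import os

    msg = b"attack at dawn!!"                      # 16 bytes
    pad = os.urandom(len(msg))                     # the one-time pad
    ct = bytes(m ^ p for m, p in zip(msg, pad))    # all an observer "has"

    # Pick ANY other 16-byte message; a pad exists that makes ct decrypt to it:
    fake = b"stand down, men."
    fake_pad = bytes(c ^ f for c, f in zip(ct, fake))
    assert bytes(c ^ p for c, p in zip(ct, fake_pad)) == fake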


Thanks for the explanation! That helps.


I think it is fair to interpret that as: Signal is not storing messages on their servers, also Signal is not storing them on someone else's servers.

Whether or not a 3rd party, outside Signal's knowledge and/or control, is storing messages is entirely out of their control.


The whole point of e2ee is that you don't have to care about that!


Since you aren’t installing something that can be checked against an audited open source build, you always have to TRUST the publisher of the software.

You just take their word for it. They could always phone home any kind of data using steganography!

I replied to Moxie’s opposition to decentralized open source software, with arguments like this.


There is always trust involved. The question is: what's your threat model? You can compile Signal yourself, but maybe you don't trust your OS, or your hardware, or your cleaning lady, or your wife.

Signal cannot do anything if your phone got compromised by Pegasus, but they never claimed they could.

That's exactly equivalent with decentralised software.


I'm 100% in agreement that we should see the source, but it seems to me there could be a "black boxy" way to check for this? As in set up e.g. an emulator and sniff all traffic in and out?


is this not exactly the point of end to end encryption?


"And second, the numbers themselves aren’t stored there in plain text, but rather in the form of a hash code."

Very strange. There is no additional security in hashing phone numbers. Not that I trust anything from this source, but anyway.


The Signal server knows your phone number. Not your contact list, just your phone number (hashed or not, I don't really care).

Your contact list is only ever shared with the secure enclave, which cryptographically ensures that nobody else can read it, and the code being run in the enclave is open source and authenticated (so you can verify that the enclave is not sending your contact list to the Signal developers).

If you trust the secure enclave, then your contact list is not shared with the server.



Do actually secure smartphones exist? Ever heard of Pegasus?


Pinephone because nobody has targeted it yet! (lol)


it seems like there very obviously is additional security in that


> However the data is stored, first, in special storages called secure enclaves, which even Signal developers can’t access. And second, the numbers themselves aren’t stored there in plain text, but rather in the form of a hash code.

My earlier comment lacked context. You can use salt, pepper, or both, but if these attacks are done by Signal developers it would most likely be easy to crack the hashes. In the case of a data leak it can help, depending on how difficult it is to figure out how the hashing works.


Wrong. That's exactly the point of the secure enclave.


Explain. It is generally not possible to use hashing securely on finite sets like phone numbers.


The secure enclave guarantees cryptographically that you are running the code you say you are running. That code (private contact discovery) is open source and authenticated by your client using remote attestation.

It's not just a hash, it's an SGX enclave.


Not at all. The numerals-only search space is too small; the rainbow table fits on an inexpensive thumb drive.
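
You don't even need the precomputed table; the whole space can be hashed on the fly. A sketch (plain SHA-256 for illustration; a salt doesn't help much here, because the input space is tiny and the salt must be available for lookups to work):

    import hashlib

    leaked = hashlib.sha256(b"+15555550042").hexdigest()  # a "protected" number

    # ~10^10 ten-digit numbers: hours on one CPU core, minutes on a GPU.
    # Demo over a tiny slice of the space:
    for n in range(5_555_550_000, 5_555_560_000):
        if hashlib.sha256(f"+1{n}".encode()).hexdigest() == leaked:
            print("recovered:", f"+1{n}")
            break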


it’s still useful if you use a private hashing algorithm


Does anyone know why Signal has decided to not let people sign up without a phone number?

That would avoid any such vector relating to SMS, albeit at the expense of making it more difficult to recover an account from a new device.


The problem is the social graph. You need to know the ID of all your contacts at some point, right? Either that's stored on the server (e.g. your Facebook friends), but then the server has access to it, or it's stored locally in your app, but if you lose your phone then you lose your graph. Signal has been working on a way to store your graph securely on their servers, but it's hard, and not there yet.

The great point about your contact list is that it is decentralised, and it exists already (so you don't have to exchange an ID with all your friends manually).


It feels like everyone keeps repeating the same dialog tree that 'but thou must store their contacts list' and I don't see why.

Solution? Store your contacts list on your device. Back it up with your offline chat backups (Android does external chat backups now; iOS should have them enabled too, plus iCloud backups to boot).

With a new Signal app on a new phone, you restore your chat backups and poof, your contacts list gets loaded back. Your new Signal instance reconnects with the other clients and poof, Signal's excuse for using phone numbers and storing your contacts list on their servers is demolished.

It seems so simple to me, but I feel like I'm failing to understand something about why this is so hard.


Because you forget that you need to build your contact list (if you don't use the phone numbers), which for 99% of users (including me) is a major pain.

Then your backup idea works, but it kind of sucks in terms of UX (at least from Signal's point of view), and therefore they decided to go with phone numbers first and work towards usernames later.

Don't worry, you haven't invented anything. It's just that your idea is not at the level of UX provided by WhatsApp/Signal, and the solution is more complex than you think.


They have mentioned a few times that it is to reduce spam.

See: https://github.com/signalapp/Signal-Desktop/issues/2383#issu... and in this paywalled article https://www.sueddeutsche.de/wirtschaft/signal-meredith-whitt... where they roughly say

>We are currently working with high priority on the fact that you can assign usernames and hide your mobile number. However, it will still be necessary in order to register. There are several reasons for this, including the fight against spam.


Signal alternatives that are open source and cross-platform (at least Android and iOS compatibility):

https://jami.net/

https://berty.tech/

https://cwtch.im/

https://simplex.chat/

https://status.im/


Signal imo should secure its desktop apps better. The sqlite database of messages is stored right next to the json file containing the encryption key. Admittedly you would have to get access to the hard drive before reading it, and I'm not sure other messaging apps do any better, but it's still an attack surface left open. This also means you can't trust disappearing messages: it's trivial to set up an auto-archiving script for all incoming messages. The feature is still better than not having it.
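
To illustrate (a sketch only; the path below is the usual macOS location, and both it and the "key" field name have varied across Signal Desktop versions, so treat them as assumptions):

    import json
    import pathlib

    # Hypothetical macOS path; Windows/Linux keep it in their own app-data dirs.
    cfg = pathlib.Path.home() / "Library/Application Support/Signal/config.json"
    key = json.loads(cfg.read_text())["key"]   # assumed field name
    print("SQLCipher key sitting in plaintext:", key)
    # db.sqlite in the same directory is encrypted with exactly this key, so
    # anyone who can read the directory can read the message history.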


I don't think there is any defense against archiving disappearing messages - someone can always take a picture of the screen.


I would like to see a deep analysis of Telegram. The founder Durov always talks as if Telegram is secure and encrypts messages end to end, while it doesn't.


No analysis needed. Telegram holds the keys to decrypt the messages and that's all you need to know.

It's why you are able to just login to your Telegram account on another device and magically get all of your message history.

So while the tech might be solid, the keys are still out there. They can be leaked. They can be subpoenaed, etc.


I like how Matrix handles this: You can either download and store locally a key that you enter into a new device to decrypt the encrypted messages stored on the server; or you have one of your other active devices decrypt its locally stored messages and send them to the new device (using some form of verification to prove you control both devices).


Until very recently (weeks not months), Matrix servers controlled group membership, and could add arbitrary accounts to your group without permission, thus allowing them to decrypt messages to the group. Matrix servers could also silently add "devices" to your account.

https://nebuchadnezzar-megolm.github.io/


Matrix servers still control group membership, and probably will for a while (ie, months).

The vulnerabilities that allowed such users and devices to steal keys have been fixed.


Control of group membership in Matrix is control of key distribution. That's generally how group secure messaging works. The vulnerabilities didn't allow unauthorized parties to "steal" keys; they added unauthorized members to groups, which causes authorized group members to negotiate key relationships with them.


>It's why you are able to just login to your Telegram account on another device and magically get all of your message history.

Secret chats are secure, and you cannot access them except on the device you start them with. It's one of the pain points for people that use them: they don't sync like normal chats.


You just said it yourself: "normal chats are not end-to-end encrypted". The normality on Telegram is messages that their servers can read.


Telegram's secret chats do not sync across devices. A secret chat is one device to one other device. If you start one on your phone, you won't even see that it exists on your laptop or tablet.

Secret chats by default wouldn't make sense for Telegram. It's not a secure messaging app anymore... It's a social media platform with a secure chat feature.


Exactly: it's not a secure messenger. It has an option of secret chats which is not as private as Signal.

If you want privacy, use Signal. If you want UX, use Telegram. Just don't pretend Telegram is private. Both are fine, but people need to know what they are doing.


Telegram isn't private. Its secret chats are.


> Telegram holds the keys to decrypt the messages and that's all you need to know.

That's an entirely different problem than TFA's (an attacker accessing and being able to impersonate an account by subverting a third-party 2FA middleman), which Telegram guards against: as soon as you have one device enrolled, the code is sent over Telegram, not SMS.

> It's why you are able to just login to your Telegram account on another device and magically get all of your message history.

Being able to log in and get your history to sync is not a telltale sign that history is not encrypted and thus visible server side.

It could be stored encrypted and upon login decrypted locally (how to achieve that is left as an exercise to the reader, see 1password, restic, borg, and many others that store with zero trust yet are accessible by multiple devices, or even multiple parties)

(side note: claims that multi-device messaging can't be done because of E2E are incorrect; e.g. iMessage does it, by having each message encrypted multiple times, once for each device of the recipient account)
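
A sketch of that fan-out shape, not iMessage's actual protocol: real systems wrap a fresh per-message key with each device's public key, while here symmetric keys stand in for the per-device keys (Python, using the third-party `cryptography` package):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    device_keys = {f"device-{i}": AESGCM.generate_key(bit_length=256)
                   for i in range(3)}              # recipient's enrolled devices

    msg_key = AESGCM.generate_key(bit_length=256)  # fresh key for this message
    nonce = os.urandom(12)
    body = AESGCM(msg_key).encrypt(nonce, b"hi from my phone", None)

    # Wrap the one message key once per enrolled device:
    wrapped = {}
    for dev, k in device_keys.items():
        n = os.urandom(12)
        wrapped[dev] = (n, AESGCM(k).encrypt(n, msg_key, None))
    # Each device unwraps msg_key with its own key and decrypts the same body;
    # the server only ever stores ciphertexts.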

> So while the tech might be solid, the keys are still out there. They can be leaked. They can be subpoenaed

IIRC it was advertised that Telegram's keys (presumably for data at rest) are stored split across two (or more) servers residing in different jurisdictions, so that subpoenas would only get at most half of a key or would require international cooperation.

But then, on that ground, Telegram just as much as Signal could be court-pressured to produce a client that wiretaps data right where it's decrypted and phones home, so E2E only saves you if you audit every client version to confirm this does not happen.

As always in matters of security, first step is to define your threat model, and who you want to secure against, as there's no such thing as perfect security.

> No analysis needed.

I would definitely like to see one done by an unbiased party, because everything I can find are blanket gut-feeling statements without reference.

EDIT: just found this, which is a bit light but still something: https://restoreprivacy.com/secure-encrypted-messaging-apps/t... and this: https://arxiv.org/pdf/2012.03141v1.pdf


Telegram has secret chats which are not the default


You can't really compare Signal to Telegram. Signal is a privacy-first chat app. Nothing more. Telegram is an entire platform with channels, bots, payments, games, and more. It's more akin to Discord or even WeChat. It can't really be e2ee and do all that. Instead, when you need e2ee for a specific conversation, it has Secret Chats. It's just another feature of Telegram, not its main purpose.


Telegram has e2ee in the form of "secret chat"


Which you mostly don't want to use if you use Telegram. The nice features of Telegram are the ones that are not e2ee!!!

Having secret chats is mostly for marketing, to pretend it's private. But it's mostly confusing.


When using encrypted chats is it not secure? I thought the biggest problem is that we simply don't know - but your comment seems to suggest it's been proven insecure even when using encrypted chats?


The problem is that people misunderstand the security model in Telegram, and believe that it's e2ee. Except that most features are not, only secret chats are e2ee, and people mostly don't use those (because the whole point of Telegram is the nice group features).


> only secret chats are e2ee

IIRC audio and video calls are E2EE as well (in addition, the four emoji that appear on a call allows both parties to check that no one is eavesdropping)


That's not an addition: e2ee requires a way to verify keys. Without that it's not proper e2ee.


The main reason not to use e2ee is being able to access, send, and receive messages concurrently on multiple computers.


Yes, and that's a choice. You can choose UX (Telegram) over privacy (Signal), and I am fine with it. Just don't pretend that Telegram is private.


Telegram has had bounties for years and ran specific challenges, and no one ever cracked it to collect the bounty.


Except that if you are not using secret chats (most people are not), then Telegram is not end-to-end encrypted, is it? Meaning that the server can read the messages.


Yes except for that.

Well, most people don’t use Telegram. So maybe their stuff isn’t encrypted either. So what? If you want to use it, it’s available.


If you want e2ee you can have e2ee by selecting it and using it. Simple enough.


No, it is not simple at all. In fact it is not obvious, and I had many issues trying to use e2ee messages on Telegram: messages fail to deliver, the same chat appears multiple times, and there are no secure group messages.


Let's be honest: if you want privacy, you use Signal. If you want the nice UX (at the cost of lower privacy guarantees), you use Telegram.

The problem is only when people fail to understand that the only private chats in Telegram are the ones that remove all the UX benefits, and instead believe that their "normal chats" and groups are private.


You say bounty as if it is just one... No, no, my friend. That is the public bounty. What the acronyms and scofflaws are paying is a completely different story.


wat

Look, here is the main criticism that was given: https://www.cryptofails.com/post/70546720222/telegrams-crypt...


Given the date and the reference to SHA1, it's about MTProto 1. MTProto 2 has been vastly improved in all the criticised areas.


Here's one major problem with Signal - you cannot delete contacts.

Following scenario:

1) X communicates with Y using Signal trying to hide from Iranian police

2) Y gets arrested, and whoever is found to have Y's phone number gets into trouble as well

3) X deletes Y from its contacts

4) Y stays in X's contacts on Signal no matter what

Now what should X do? Delete Signal? Theoretically the police could reinstall it and see who X had in their contacts.

There have been several issues opened for this problem on GitHub over the years. They all get closed by their bot after a couple of weeks.

I have several ghost numbers and even ghost user names on my Signal clients. Super annoying and cluttering my list of contacts. For me Signal is just one option to avoid WhatsApp. But boy do I prefer Telegram ...


How come the Linux desktop client of Signal is such a bloated mess, which can't be used standalone but only via the phone app?


The weird part about this - why does Signal let anybody access the same account with just SMS verification? I would think it would be better to send the verification message for adding a new device through the existing Signal account. Telegram does this already.


There will always be a tension between security and usability. A common situation is when a phone stops working (for instance, it fell into a puddle) or is lost or stolen. If it stops working, you can just pop the SIM and put it on a new phone (assuming the new phone accepts SIM cards of the same size, instead of requiring an even smaller one); if it's lost (or your SIM card is too big for the new phone), you can go to one of the phone company's stores, present your identity card, and get a new SIM card associated with the same phone number. Either way, you don't have access to the Signal account anymore, since it was on the broken/lost/stolen phone; the SMS verification (plus the optional password) is the only way to recover it.


I thought about that, and I'm not sure it actually changes anything.

So you want to register a new device with your Signal account and you don't have the previous device, either because you're a malicious attacker or because you lost or destroyed it. Covering the attacker case, presumably you already can't access the message history. I don't see how you could reasonably do anything different for the lost-device case, but I guess it's not that big a deal to lose message history. But if the message history is already gone either way, why not make it an entirely new and separate account?


Because of their threat model. Signal has very good e2e encryption, but somebody could take over the account over SMS (note: they would not get access to any encrypted messages, and the takeover would reset the key, so the contacts would know that a change happened and could verify the key if it matters to them).

Telegram, on the other hand, is mostly not e2e encrypted, so their cloud can read all of your messages.

Pick your favourite.


I wish they’d decouple accounts from phone numbers, but to this point specifically, they do have a feature where you can add a PIN on your account that makes it impossible to re-register for some period of time (7 days? I forget). That’s at least enough to neutralize the risk from SIM swaps.


>As such, the cybercriminals managed to pull off the attack by impersonating the victim of the attack for roughly 13 hours. If Registration Lock had been enabled, they could not have logged in to the app knowing only the phone number and verification code.

and if Signal did not require a phone number, this wouldn't have happened at all


Maybe not, but Signal would be vulnerable to a much bigger problem: its server would, like other secure messengers, end up keeping an effectively plaintext database of every pair of users that communicates.


I don't see how that would follow?

Let's assume that today Signal is able to deliver messages securely between users without maintaining a plaintext association between sender's and receiver's phone numbers.

Why would switching identifiers from phone numbers to some other identifier require that they change how that works? (Other than the obvious 'substitute out a phone number for some other identifier' thing)


Phone numbers are already attached to fairly complete contact databases for users. "Signal Usernames" aren't.


Couldn't this plaintext contacts database be encrypted on the server and only decrypted on the client?


Yes, and they are working on it. It's just harder than it seems, and it is not as important as people seem to think. I value much more the work Signal did on metadata than the username vs phone number debate.
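
The shape of a client-side encrypted contact store is simple enough; the hard part is where the key comes from. A hedged sketch, loosely in the spirit of Signal's PIN-based approach rather than their actual implementation:

    import json
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.scrypt import Scrypt

    contacts = json.dumps({"alice": "+15550001", "bob": "+15550002"}).encode()

    salt = os.urandom(16)
    key = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1).derive(b"user PIN")
    nonce = os.urandom(12)
    blob = AESGCM(key).encrypt(nonce, contacts, None)
    # The server stores (salt, nonce, blob) and can read none of it; a new
    # device re-derives the key from the PIN. The catch: PINs are weak, which
    # is why Signal rate-limits guesses inside a secure enclave.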


Now you're building Matrix. Look how that worked out:

https://nebuchadnezzar-megolm.github.io/


So it's possible, it just takes work, which Matrix is doing right now. Is Signal also working on this?


This protocol was deployed for years before anyone even thought to check whether group messaging was secure (we just found out now that it isn't). Is Signal working on secure group messaging? Well, they won the Levchin Prize at RWC for the work they're doing on it.


> Is Signal working on secure group messaging? Well, they won the Levchin Prize at RWC for the work they're doing on it.

I perhaps should have clarified but I was talking about group messaging without phone numbers attached


Yes, and I'm explaining why that's harder than it looks; for instance, see where Matrix ended up with homeservers.


I completely understand the difficulty, but so far I haven't heard of whether or not Signal is even working on getting rid of phone numbers.

Also worth mentioning that Matrix isn't just working on encrypting contact databases. They are also working on decentralization, which is a much more difficult problem. So I'm not surprised that they run into issues every once in a while (though Signal has had security problems in the past too [1]).

[1]: https://thehackerblog.com/i-too-like-to-live-dangerously-acc...


It sure seems to be a much more difficult problem, because they're working right now to recover from disastrous protocol breaks that allow servers to decrypt messages in groups. I'm not so interested in what novel stuff they're working on; they don't have the table stakes stuff nailed down.

This is secure messaging. It has to work, or it's just LARPing. Can the Matrix team honestly say that their system is ready to handle life-or-death secrets?


Hackers were able to get at least one person via the Twilio hack in OP's article (which wouldn't have happened if Signal was not reliant on phone numbers). So I wouldn't say Signal is great for life-or-death either. And afaik the Matrix vulnerability we are discussing was not actually shown to have been exploited


No, that's not what happened.


Do you mind elaborating? I think OPs article was pretty clear

> By default, Registration Lock is disabled, as was the case for at least one of the hacked accounts. As such, the cybercriminals managed to pull off the attack by impersonating the victim of the attack for roughly 13 hours


Or they could create a phone-number-like ID that you then share with others, just as you would a phone number. They would not need to store this.


That only works if you don't care whether anyone actually uses your messenger. In the real world, people will only adopt a messenger that remembers their contact list across multiple devices.


They thought about that, but it's harder than you think. They even explain why on their blog.


> and if Signal did not require a phone number, this wouldn't have happened at all

I’d change “require” to “allow”.

If, for example, Signal operated like it did now, but optionally let you sign up without a phone number, with a warning to users that not using a phone number would result in not being able to receive messages to their phone number, I bet most users would still use a phone number.


In some countries a phone number is a sacred and private thing and in other countries phone numbers are like umbrellas. Over time there is no connection between the user and the owner. Where I live phone numbers are not safe to use for identification. In a neighboring country a phone number is more or less equal to a SSN.


What happens if you do have a registration lock when your number gets transferred away? The new owner can't use Signal?

Do they complain with their mobile company about getting a "broken number" or with Signal?


The registration lock is active for 7 days after the last time Signal was used by that account, at which point it is removed.

The edge case would be someone who continued to use Signal with their old phone number, effectively blocking its new registration.


There is no way to get that number unless the person who no longer owns it agrees to relinquish it?


When they eventually move to use a new phone, they’ll need to know the code sent via text.

It’s basically a dumb idea to use the account while someone else owns the number; eventually you’ll lose access to it.


That's my biggest issue with signal.


As I am fond of saying: identity is the issue with end to end encrypted messaging. So this is interesting in that it is a direct attack on the weakness of using access to a phone number as proof of identity.


Something I’ve been hearing about on podcasts is how the Jan 6 insurrection trials have a bunch of Signal messages in evidence. Does anyone know how these were obtained?


They took them from someone's phone.


Sure, but phones are themselves encrypted. Did they brute force access to the phones or did the people give up the messages voluntarily?


Last I checked, beyond the phones themselves, iCloud backups are subject to warrants and are not end-to-end encrypted, and local backups aren't necessarily encrypted either. If you get an unencrypted Signal backup, it's possible to extract the user's Signal encryption key and rebuild the backups to extract the messages.

Phones themselves are only as secure as the known vulnerabilities of whatever phone hardware and OS version they run.


Misleading title. There was no "hack of Signal"; the authentication service was hacked instead.


This old article only proves how vulnerable Signal becomes by relying on the security (or lack thereof) of third-party service providers.

The phone number requirement should be removed if Signal wants to be taken seriously. Maybe the next hit will affect more than 1,900 users.


Imo Signal never actually transitioned from a protocol for secure messaging into an actual secure messaging app.


Surviving one attack != secure


[flagged]


For that matter how can you have any assurance of security on a device with an opaque binary blob running underneath the OS? If you are paranoid enough nothing is secure.


Not everyone uses Google Play services when running Android


This.

You have to trust something at some point. What you trust just depends on your threat model.


> You have to trust something at some point.

No, you can instead acknowledge that you can't trust any of it and behave accordingly. You don't have to settle on trusting something.


Not sure what your point is. You're saying that you can live without trusting anything at all?

Say you don't trust that drinking any kind of liquid is safe; how do you survive more than a few days? Do you prove it first? Based on what, if you don't trust anything?

Trust is all around us, we constantly choose to trust (or not) things.


This is not a new issue. Philosophers have pondered for centuries about the possibility of a godlike deceiver that is constantly fooling your senses. How can you be sure you aren't plugged into the Matrix right now? Even worse, their arguments against said deceiver are weak, some saying that your own eyeballs can be ultimately trusted, others suggesting that since God is good he wouldn't let it happen.

Ultimately you have to trust in something or it is literally impossible to function. Even if that trust is only in your own senses. In practical terms you have to trust a lot more to function in society. Maybe a good first pass is to trust in things that most other people trust in, so that if there is a deceiver there will be a lot of other people mad with you and a greater chance of not only punishing the deceiver, but detecting them in the first place. You can't fool all of the people all of the time.


Politicians prove your point wrong. Numbers of people are in fact a horrible metric. Look at religious people for example, or simply the fact that your eyes (or brain) are actually mostly deceiving you. But this is all horribly beside the point. There is no need to trust Google Play services, yet it opens up a ton more possible attack vectors /and/ it exposes you to states screwing you over, considering that Google bends to state authorities for various intents and purposes.


You miss the point. The point is that you have to trust something at some point. It's your choice what to trust, but you cannot live without trust.

Don't trust Google Services, that's great. Trust your favourite alternative instead.


You can live without trust. You can't live without taking risks. Some are sensible, some are not. Unnecessary dependencies are unnecessary risks.


Again, no. You have to trust your eyes before you walk on a bridge (maybe there is no bridge, and your eyes are being magically tricked into seeing one?), you have to trust that the bridge will not collapse while you are on it. Don't want to trust the bridge?

Trusting means taking risks, for sure. But there is still trust.


> You're saying that you can live without trusting anything at all?

I can live without trusting any communications on my phone to be secure from prying government eyes. I have no clue where you got "anything at all" from, sounds like a strawman.


Sure. I just don't get what your point is. Since you don't trust any communications at all, then let's use whatever has the best UX even if it is plaintext, and live with the consequences of mass surveillance?


In what way does a Play Services dependency make them insecure?


I've had my in-laws download malware onto their android devices multiple times, from the play store. Anything is a threat if you're paranoid enough with your threat model, debilitatingly so.


>I've had my in-laws download malware onto their android devices multiple times

I know in-law jokes are fun, but this just makes you sound like an evil arsehat. What in the world did they do to you that you had them do this not once but multiple times? /s

I'm assuming you probably meant to word that to read differently


Because you are unable to verify what the Google Play Services binary is doing. It could potentially bypass any security built into the app and allow Google to read your messages.

And the answer to their question is that it depends on your threat model and the amount of risk you are willing to accept. There is no such thing as perfect security or a perfectly validated platform. At some point you have to accept good enough and get on with your life.


How would it do that? While the Play Services app has more privileges than many other apps on a device, it's still running in an application sandbox, and shouldn't have permission to read the Signal app's data or attach to its process or anything like that.


Yet it does.


How? Provide evidence please.


Presumably it's being alleged that the app store operator could switch binaries, not sure.

Bigger issue is privacy with notifications going through Google, right?


The notifications Signal pops up, with contact name and message text, are generated from within the app itself, not sent via Google.

It does receive some Firebase messages from Google, but not the type that automatically pops up a notification. Instead these are silent ones that trigger the app to perform a particular action, such as fetching any new Signal messages from Signal's servers.


That's great to know, thanks!


> notifications going through Google

That has never crossed my mind. Do notifications on Android normally go through Google servers? As in could be spied or data mined on?


Yes, unfortunately the platform providers are the only battery-efficient way of doing notifications on mobile phones.


I suppose you're talking about push notifications to save battery life instead of an app that is constantly polling for new messages?

I don't think all notifications are push notifications, and those that are would be HTTPS-encrypted between the host and client. Any e2ee message would have to be decrypted on the device before showing a notification, so Google should not have access to the contents. But of course it comes back to whether you should trust anything software tells you.


I might be wrong here, see sibling comment: https://news.ycombinator.com/item?id=33123325


I just want to back up my Signal data online as a rar with a password, easy to restore on another phone or manually, without the rest of the joke steps.

Using PII is also a joke; "secure" implies "private" as well, else it is just "encrypted". When someone knocks on your door in a Ukrainian or African or … village and asks for the pass, you give it.

The non-private money sending is also a privacy joke. A monero-like solution would be good enough.

Fewer features than WhatsApp or Viber or Telegram, so why bother? It is really easy to copy each other's features, but each app's developers think that they are the smartest people in the room.



