> how we think about concepts like privacy, security, and trust
I was disappointed to see that a mobile number is needed and that this number is shown by default in groups. Mobile numbers are much more trackable than email addresses, in my opinion. And I do not understand at all why others should be able to see them so easily.
So I now prefer Telegram because at least it hides numbers in groups by default.
This is a fair concern, and I hope Signal addresses it at some point, but Telegram is no replacement. From Telegram's Wikipedia page: "The default messages and media use client-server encryption during transit.[19] This data is also encrypted at rest, but can be accessed by Telegram developers, who hold the encryption keys," and "the desktop clients (excluding macOS client) do not feature end-to-end encryption, nor is end-to-end encryption available for groups, supergroups, or channels." In addition, the Telegram server software is proprietary.
It really depends on your threat model: do you trust the people you are talking to but not Signal (though in practice this doesn't even work, as they are way too centralized, but whatever), or do you not trust the people you are talking to but are willing to trust Telegram? Clearly we should be able to have something where we don't trust either, as there is nothing about these phone numbers that is critical to Signal's operation (if anything, Signal is compromising basic, important things to use them); but between these two choices it isn't clear to a lot of people why Signal's choice is better, given the actual use cases and threat models most people have.
The fewer people you're required to trust, the fewer breaches there will be, even if no one is malicious and the breaches are merely accidental.
And ultimately, even if the people you are talking to and your messaging provider are 100% trustworthy and never make mistakes, they usually cannot resist a lawful government request for data. Signal just has virtually no data available to provide; Telegram could hand over entire chat histories, and could be required to provide access to in-progress chats.
To save time, he mentions that owning a SIM card, or a smartphone with a Contacts application that holds phone numbers, is already a portable social network by design.
Pushing his idea further, it's much easier for anyone to learn the penetration of Signal by checking against existing phone numbers. I can't imagine someone feeding a list of 200+ phone numbers into a querying service just to validate which of them are registered with Signal.
And while nicknames sound perfect in theory, they lack one specific thing: validation of the speaker's authenticity. By this I mean an impersonator can claim a username and pose as someone else. Given that a lot of people use transparent nicknames on the internet, this can pose a threat. So even when nicknames roll out as a Signal feature, it would be good to have some way to confirm that the party I'm in contact with is genuine. Checking against phone numbers can mitigate that risk of fraud.
You are defending why phone numbers are useful, but are completely choosing to ignore the point I am making about the tradeoff of who you trust.
To be even more explicit about it, in the hope you understand: the vast majority of people trust large companies (which is bad but frankly not insane) but absolutely do not trust random assholes/creeps on the Internet. The security model of Signal is saying you don't have to trust Signal, but you weirdly do have to trust all of the random people you interact with with your phone number.
And so, in the context of this thread, that explains why someone would claim Telegram is actually better than Signal: what people do with Telegram is join massive group chats that are either 1) public, or 2) so large that if you think what you say isn't going to be logged you are fooling yourself. So the value of end-to-end encryption in this context is essentially zero; but, being able to join some large group chat with a ton of strangers to talk about some open source project or whatever you are doing without any of those people now knowing your phone number--an identifier which is tied to a large number of "real world" concerns and is ridiculously difficult to change--is actually extremely valuable.
Honestly, even if you aren't quite in those sets, the tradeoff still isn't an obvious win for Signal. As an example, let's say you are in a group of people talking about a protest. Are you more concerned that the company relaying your messages will be served a warrant to monitor your chat activity (which generally requires some showing of probable cause for a specific action, and likely requires knowing about the existence of the chat in the first place), or that one of the people in your group chat is actually a traitor or even an undercover cop (who can get into any number of groups and pretend to be an ally while passively monitoring for things they want to shut down)? The latter is actually a much more realistic attacker model, and with Signal that person now has your phone number, which means you are screwed. Using Signal correctly here requires getting a burner phone, which is way more effort than is reasonable.
The use cases for the privacy and security model of Signal are thereby inherently limited to people you trust with your phone number. Like, it is sometimes difficult enough to get people to use text messages at all because they don't want to give out their phone number: Signal doesn't solve that, and so is confined to the subset of communication that people currently do over SMS(/iMessage) and can't really ever begin to carve into the market share of Telegram, or even Facebook.
And so, realistically, Signal does not, can not, and should not manage to displace Telegram, which I say with sadness as someone who has not forgiven, and likely never will forgive, Telegram for claiming security properties their system didn't have (like, I am no Telegram fan, and while I have the app I only use it a few times a year; that said, that is still more than I am willing to give Signal, for a number of reasons that are mostly unrelated to anything in this comment).
(And FWIW, I personally would not recommend usernames, and would in fact be much angrier about that than about phone numbers for various reasons; if Signal decides to roll out unique, choosable usernames I am honestly probably going to hate on it even harder because of it: you are arguing a strawman here :/. But claiming that phone numbers are fundamentally better is awkward regardless, given that phone numbers aren't even a good security layer due to the prevalence of number porting. This is just one of the many devastating things that Moxie is wrong about.)
>>You are defending why phone numbers are useful, but are completely choosing to ignore the point I am making about the tradeoff of who you trust.
You're swiftly jumping to conclusions about a comment that only describes a point of view, not my own perception of how the trust model should be established or of the capabilities current messaging services provide.
>>... but, being able to join some large group chat with a ton of strangers to talk about some open source project or whatever you are doing without any of those people now knowing your phone number--an identifier which is tied to a large number of "real world" concerns and is ridiculously difficult to change--is actually extremely valuable.
As someone who suffered harassment after having my phone number exposed on WhatsApp and being followed by the same person almost everywhere it serves as an ID (Signal, Telegram), I wholly share your concerns. But it's a privacy matter that almost none of the current messaging platforms really offer bundled with solid UX and data transparency.
What I'm trying to convey, based on how I read Moxie's rhetoric, is that Signal tries to be a WhatsApp alternative (given that the latter really can access messages in a group conversation when a participant reports a contact) by not harvesting user data (cheers, Telegram) and by providing a little more control over the conversation on both ends (self-destructing messages). Signal has a bigger focus on security right now, but the use case is to be accessible to a wider audience. From that point of view Signal really stands out. And it helps me personally to, sort of, separate communication spaces.
>>Honestly, even if you aren't quite in those sets, the tradeoff still isn't an obvious win for Signal. As an example, let's say you are in a group of people talking about a protest.
Look, if you're proposing that Signal needs to accommodate guerrillas, rioters, and protesters so they can organize and act with impunity, that's a privacy issue, and Signal isn't serving that purpose.
If you look at the past, the main attack vectors on messengers have been via an exposed phone number, insecure third-party cloud backups, or simply physical access to your device. Telegram addresses all of these while also allowing E2E encryption, just not by default for now, because of UX. At the same time, many other popular messengers advertise "E2E by default" while not being secure at all and having mediocre UX (no desktop clients, no seamless sync, no usernames, etc.).
They're laying the groundwork to change this. Part of the purpose of the PINs rolled out last month is to enable centralizing some data to allow for other addressing schemes.
A burner SIM that you set up and then throw away addresses this concern.
Telegram, with messages in plaintext on the server? Encryption that isn't open? Yeah, Telegram is a bit of a non-starter if you have these kinds of concerns, as far as I'm aware.
If that’s the case doesn’t it matter even less that signal requires it since it’s already known anyway?
Signal’s use of phone numbers as IDs means they don’t have to have any of your contacts sent to their servers.
As shown in the article they have no metadata and nothing to reveal beyond your phone number and when you signed up.
These other apps send your social graph to their servers, track and store metadata, don’t have encryption on by default, roll their own cryptography, or some combination of all of these things.
The phone number obsession on HN seems dumb to me - a meaningless thing for people to repeat and complain about that doesn’t actually matter so they can sound like they know what they’re talking about.
I don’t get it.
The only real criticism I have for signal is that they’re not federated so they’re vulnerable to shutdown. I think that’s okay though because we have Matrix working on that problem and having both is probably a good thing.
Assuming you’re using Signal for organizing something the government doesn’t want you organizing, if one member of the group gets rubber-hosed into unlocking their phone, the govt instantly gets a list of verifiably correct names of people involved. In contrast, with a service that lets you use usernames that maneuver would reveal nothing but those usernames (which are as pseudonymous as it gets).
One of the other problems with using phone numbers is that it provides an opening for adversaries. Now they know your phone number, which can be used in social-engineering attacks against any other online service tied to that number, whether for 2FA or for account-recovery/i-forgot-my-password functionality. 2FA by SMS is wrong and broken and nobody should use it, but they do.
Adversaries will attempt to social engineer customer service for your phone carrier into issuing them a new SIM or porting out the number, so they can receive verification SMS and phone calls.
Maybe I misread, but the GP doesn't seem to be talking about impersonating a user on Signal, but rather impersonating that user on other websites that depend on SMS 2FA sent to their phone number that is now visible through Signal.
I feel comfortable with giving my Telegram username out to random people on the internet and posting it on my website because it doesn’t mean anything outside of Telegram. I wouldn’t post my phone number publicly.
That’s fair, thanks - I think for secure communication with people you don’t know personally or trust, it’d be better to use something that doesn’t share your phone number.
And the social graph IS sent to servers by Signal. It's protected only by hashing (trivial to circumvent) and by the Intel SGX technology (a bit harder to circumvent, but I doubt that the US govt can't do it).
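To make the "trivial to circumvent" part concrete, here is a minimal sketch of why hashing phone numbers buys very little: the space of valid numbers is so small that it can be brute-forced. This is my own illustration, not Signal's actual contact-discovery code; the exact hash and truncation scheme here is an assumption.

```python
import hashlib
from typing import Optional

def hash_number(number: str) -> str:
    # Illustrative only: the real scheme (truncated hashes, SGX-backed
    # lookups, etc.) differs, but the tiny input space is the point.
    return hashlib.sha256(number.encode()).hexdigest()

# Suppose an attacker (or the server) holds this hash from a contact upload.
leaked = hash_number("+14155550123")

def recover(target: str, area_code: str = "415", prefix: str = "555") -> Optional[str]:
    # Exhausting one exchange block is 10,000 hashes; the full 10-digit
    # space is only ~10^10, well within reach of commodity hardware.
    for line in range(10_000):
        candidate = f"+1{area_code}{prefix}{line:04d}"
        if hash_number(candidate) == target:
            return candidate
    return None

print(recover(leaked))  # prints +14155550123
```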
I didn't want to claim that Telegram's crypto was better or as good as Signal's. But saying that Telegram rolled their own crypto while Signal did not is misleading as both designed their own protocols.
Also, Telegram uses SHA256 now in the places relevant for security, so that point was resolved.
My (limited) understanding is numbers are queried to see if accounts exist, but those queries are not connected to the users sending them (and they are obscured in transit).
Am I wrong?
Signal’s cryptography has also gotten a ton of attention, I don’t think the same is true for competitors.
Transit encryption is employed by other services as well. Whether there is de-correlation on the cloud backend I don't know. Maybe there is, but you can't really verify that and it's easy to correlate them again, especially if clients use ipv6 or non-CGNAT ipv4.
>If that’s the case doesn’t it matter even less that signal requires it since it’s already known anyway?
Well, if you're operating on that premise, which is to say, a premise of complete and total resignation and surrender, then from that starting point of course you haven't lost anything. I don't think anybody is joining you though in agreeing that that's a legitimate starting place to analyze privacy concerns associated with the phone number requirement.
This is such an uncharitable interpretation of what I was saying that it's basically a straw man.
If you're required to use ID to get a SIM (as K2L8M11N2 stated in the parent comment I replied to), then what I was saying follows - that the person is already tied to the phone number anyway.
In this context Signal revealing the only data they have (that a phone number signed up on X day) really doesn't matter or reveal anything new.
K2L8M11N2's other response to my comment is a helpful clarification: it's less about what can be compelled from Signal the company and more about what can be turned over if a user's device is compromised. In that context the name to number connection is more serious because they also have the content.
I'm sorry but absolutely nothing about this is a straw man or uncharitable, and I'll explain why.
>If you're required to use ID to get a SIM (as K2L8M11N2 stated in the parent comment I replied to), then what I was saying follows
Yes, and this is what I was responding to. You want that "if" to be taken for granted as an unchallenged starting premise to your entire argument. And that amounts to a massive privacy concession. And the fact that I'm challenging that premise, and bringing up the privacy concerns that are associated with that "if", that is the thing that you're describing as uncharitable. Even though you don't appear to dispute that it is indeed a privacy concession. And it only follows if you don't contemplate alternatives to using a phone number, which is what I took to be their point about the problems associated with a phone number.
>and more about what can be turned over if a user's device is compromised. In that context the name to number connection is more serious because they also have the content.
That's going to be true in any context where your number can be revealed, which is why it has unique disadvantages that usernames wouldn't have, under any conceivable hypothetical scenario. I'm glad that you benefited from their clarification, but that struck me as a truism about the nature of phone numbers versus the nature of usernames.
> "You want that "if" to be taken for granted as an unchallenged starting premise to your entire argument."
I don't want that 'if' to be anything. It was the premise, because the parent comment I was responding to was stating it as a fact for where they live.
My point is that signal revealing your phone number and when you signed up doesn't reveal anything new about you. The issue is the case K2L8M11N2 mentioned when they have compromised a device (and can now tie content to IDs via the phone number).
> "That's going to be true in any context where your number can be revealed...under any conceivable hypothetical scenario."
This is just false? Without access to the content on a compromised device, a phone number alone doesn't reveal much (that's the entire point of the E2EE). If it limits Signal's ability to hand over the social graph or any other metadata (which does reveal a lot), that seems like a win.
Obviously revealing the phone number still reveals more than a username would, and if you can get all the benefits of not having to upload your social graph to their servers or share metadata without having to use phone numbers that would be better - I think they're working on that.
People using apps that upload their social graph and collect their metadata so they don't have to use their phone number are probably making the wrong choice when considering the trade-offs.
"The social graph" (your phone book) is most likely already uploaded somewhere by third-party app or even Google/Apple themselves. Using separate contact list (even uploaded to some server) seems more secure to me than using your phone's one.
The phone number is much more valuable to any authority than other metadata because they are more likely to have access to cell service than to messenger services.
In the context of the post, Signal is much more vulnerable than even basic things like email or web chats, because police can effortlessly identify anyone in the group chat with a single request to the cell company.
Signal could have a great interest in buying up private data linked to phone numbers, adding in whom people chat with (via their phone numbers), and selling it again with that added data.
That wouldn't be a problem on phones where the radio part is screened off from the system (PinePhone?), so that implementing encryption can be effective. The SIM might be tied to me, but any eavesdropper would see only noise, because the outgoing data is encrypted by the system before entering the radio hardware and decrypted by the system after leaving it. They can know I generated some traffic at a certain moment, which is the same information they could get from the carrier operator, but that's about it.
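A minimal sketch of that idea, using PyNaCl boxes: the system encrypts before anything reaches the baseband, so the radio side only ever handles opaque bytes. The send_over_radio function is a hypothetical stand-in for the modem interface, not any real phone API, and the ad-hoc keys are just placeholders for a real messenger's session keys.

```python
from nacl.public import PrivateKey, Box  # pip install pynacl

# Hypothetical session keys; in a real messenger these come from the
# app's key agreement, not generated ad hoc like this.
my_key = PrivateKey.generate()
peer_key = PrivateKey.generate()
session = Box(my_key, peer_key.public_key)

def send_over_radio(ciphertext: bytes) -> None:
    # Stand-in for the screened-off baseband: it sees only ciphertext,
    # so an eavesdropper on the radio side learns that traffic happened
    # and roughly how much, but not what it said.
    print(f"modem tx: {len(ciphertext)} opaque bytes")

send_over_radio(session.encrypt(b"meet at the usual place at 6"))
```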
Yes, and I'm tired of hearing "burner sim" thrown around as a solution whenever some company stupidly demands a phone number. Getting a burner sim is WAY more effort than I'm prepared to expend to sign up to some service.
Stop asking for phone numbers!
My mother has my phone number. A few of my closest friends and family have it. That's it. I have like 10 actual people in my phonebook. My phone provides immediate 24/7 access to me, and there are very few people who are close enough to me that I want them to be able to reach me at any time.
Some Silicon Valley company with an inflated sense of importance is never on that list...
Furthermore, "burner sim" implies an accompanying burner phone and all of the opsec that entails (eg only turn it on away from home), lest that "burner" number be trivially linked to your identity through databases that the telephone companies are happy to sell.
Moxie should know better, but this blog post repeatedly boasts about privacy while not addressing this at all. Getting basic encryption into the hands of the masses is a laudable goal. But when the popular threat model changes (eg pocket computer compromise incident to arrest at a protest - an "evil raid" attack), you either need to adapt or you've become an attractive nuisance.
Specifically I'm envisioning an attack where say ten protestors are arrested and found to have a shared contact, and then the police turn that phone number into a real world identity and go after that person for organizing.
Signal has an option to prevent this by locking the number with your PIN. This capability introduces plausible deniability about whether a phone number assigned to a SIM is actually associated with a Signal account. Don't know if that matters legally or not.
Also the people doing shady things are generally hopping accounts regularly anyway.
The PIN only stops re-registration for a fixed amount of time, I believe 7 days; then the entity controlling the number would be able to reclaim the account. If the "attacker" maintained control, new devices that add the number from their contact list would get no alert; that is, those users would have to figure out for themselves that the number is controlled by someone else.
That’s 7 days since last use. So if you continue to use the app at least once every 7 days, it will remain registration locked.
Also, anyone who had communicated with you before the switch would see a “safety number changed” notification if the number became affiliated with a new device.
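In other words, the timing rule being described works out to something like the following. This is only a sketch of my reading of the thread and docs, not Signal's actual code or exact behavior:

```python
from datetime import datetime, timedelta

# Assumed from the discussion above: the lock lapses only after 7 days
# of inactivity, so using the app regularly keeps it in force.
REGISTRATION_LOCK_WINDOW = timedelta(days=7)

def can_reregister_without_pin(last_use: datetime, now: datetime) -> bool:
    """True once the account has been idle for the full window."""
    return now - last_use >= REGISTRATION_LOCK_WINDOW

now = datetime(2020, 6, 10)
print(can_reregister_without_pin(datetime(2020, 6, 5), now))  # False: still locked
print(can_reregister_without_pin(datetime(2020, 6, 1), now))  # True: 9 idle days
```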
Curious, where is the "last use" in the documentation or code? I ask because I have seen other issues with the PIN vs. the docs and haven't gotten around to testing the recovery mode.
EDIT: Found the related docs; it appears they have been edited since I last looked at them. For example, you can now disable PIN reminders:
And yes, "anyone who had communicated with you before the switch would see a 'safety number changed' notification if the number became affiliated with a new device" is correct, though so is my statement about new devices adding the number. To be honest, I have caused those alerts before; the other user had no idea what they meant, didn't say anything, and just clicked okay.
The "encrypted at rest" claim is completely meaningless since there's no way of verifying it. If their web client can read your messages by you reverifying your phone number, so can anyone else with access to their servers.
But that's not true. We can verify a Cloud Chat message is wrapped in a key generated on a user's device. You can look at the client code and prove this. We can also prove that client-server messages must be decrypted on their way back to a user's device. [1]
I wish it was different, but it isn't. Mtproto v2 seems OK, but it is not default, so hardly used. Server side encryption protects against very little.
I like Telegram, but not for security. Nobody should.
The way you use encryption matters as much as the quality of the crypto used. In Telegram's case, the usage is all wrong and not even better than, e.g., WhatsApp's. Opt-in E2EE is worse than having it be the default, and server-side encryption with server-side keys is bordering on pointless.
Thank you for reflecting with an opinion. I also believe that opt-in E2EE is worse than having it by default. We have precedent for E2EE chats by default and syncing those E2EE messages across devices using the latest RiotX for Matrix. I'd love to see Telegram adopt that strategy.
I use Telegram because it delights its users. When I have tried to bring friends and family to first Riot, then Signal, the experience I encountered was one of frustration, whether it was messages unexpectedly not syncing or delivering, the pain of cross-signing new devices for E2EE (RiotX), or the paucity of features. I have tried nearly every messenger that exists in an effort to seek out a usable compromise for laypeople and Telegram has been the only one to make that bar.
We use Matrix in my workplace because I can count on the employees being technical and patient enough to forgive having to input a multitude of varying passwords in succession just to get their session going.
Thank you for your work! I run my own federated homeserver that I really enjoy digging in with on the weekends. Between selfhosting email and Synapse I really feel a sense of reverent ownership over these methods of communication.
Don't get me wrong, Telegram is a very nice app, and I especially like the wide platform support. It's great at what it does, and that's many things, but unfortunately not secure-by-default communication.
> PINs will also help facilitate new features like addressing that isn’t based exclusively on phone numbers, since the system address book will no longer be a viable way to maintain your network of contacts.
As I understand it, the purpose of using telephone numbers is that it can use the contact list stored on your phone without having to store contact lists on their servers.
Contact lists can be useful information for an attacker. If Signal doesn't have it, it can't be taken from them.
There's nothing inherent in phone numbers here. Both iOS and Android also allow you to add e-mail addresses (and other identifiers) to your local contacts. I have yet to hear an argument as to why e-mail addresses or other identifiers can't be used in addition to phone numbers, or why that would be a complicating factor.
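As a sketch of why this shouldn't be a complicating factor: a contact-discovery lookup can be keyed on any normalized identifier drawn from the local address book, phone number or otherwise. This is purely hypothetical client-side code, not Signal's actual protocol, which, as discussed upthread, relies on more than bare hashes.

```python
import hashlib
from typing import Iterable, List

def discovery_tokens(identifiers: Iterable[str]) -> List[str]:
    """Turn whatever identifiers the local contacts app holds (phone
    numbers, e-mail addresses, ...) into opaque lookup tokens before
    asking the server which of them are registered. Illustrative only."""
    tokens = []
    for ident in identifiers:
        normalized = ident.strip().lower().replace(" ", "")
        tokens.append(hashlib.sha256(normalized.encode()).hexdigest())
    return tokens

# The contact list never has to leave the device in the clear; only tokens do.
print(discovery_tokens(["+1 415 555 0123", "alice@example.org"]))
```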
Telegram isn't really fit to be compared to Signal.
I own a cellphone number from an Eastern European provider and had been using Telegram solely as a news reader. None of the people I communicate with know the number.
Much to my surprise, I learned that Telegram cooperates with local authorities; I would never have found out if not for COVID-19. The messages I got addressed the precautions and regulations during the pandemic. While I appreciate the effort, I feel like it's a clear overreach on Telegram's part.
How do I know it was Telegram rather than the authorities themselves? The sender information clearly stated the sender was Telegram (the same as the chat created to announce what's new), and I got a push notification even though push notifications were turned off at the app level.
I can't imagine Signal compromising the trust of its userbase by sending any sort of material that comes from the healthcare department of whatever country my phone number happens to be from.
As mentioned elsewhere, it's all about the threat model. I have a group where others like to be anonymous, at least to people they do not know (yet). They very likely do not care about authorities seeing our messages, as they have nothing to hide from that perspective.
>They very likely do not care about authorities seeing our messages, as they have nothing to hide from that perspective.
`Nothing to hide` isn't the point of my message; what concerned me is Telegram's willingness to collaborate with third parties, be that for a good or a bad cause. I already own a tool for receiving unsolicited messages: email.
Because mobile numbers enable much easier use than emails. I talk to so many people on WhatsApp who don't even know what an email is, and who would certainly never use the app if it couldn't pick their contacts from the Contacts list. The goal of Signal has always been "privacy for the masses". The experts always had 20 solutions available.
No, we don't have 20 options available. We also need to talk to people who don't understand the problem and can't be persuaded to use one of those other options.
There is a difference between enabling phone numbers as an ID vs. requiring them.
Signal could easily support both. (I read somewhere that they plan to support alternatives to phone numbers in the future, they just haven't made it a priority. Well, I'll wait. Edit: actually, it's mentioned here: https://news.ycombinator.com/item?id=23435050 )
> Mobile numbers are much more trackable than email addresses, in my opinion. And I do not understand at all why others should be able to see them so easily.
This is why I refuse to have WhatsApp and Signal, especially when you live in a third-world country. It just takes ONE SCREENSHOT of you saying something controversial in a group chat; the screenshot goes viral among a certain radical group, and then your phone gets spammed with death threats.
Not to mention that you can and will go to jail if that radical group sues you, thanks to draconian laws (UU ITE) here in Indonesia. You don't go to jail because you said something that actually insults someone; you get jailed because you are TRANSMITTING something that can be interpreted as an insult.
It is still at a very early stage and discoverability/groups are half-baked (you can't add people to a group after creating it, and 10 is the limit), but it solves Signal's centralization problem.