I love Signal and use it as much as I can, but I'm thinking of switching to Matrix solely because the desktop client is pretty bad. It won't show me messages until it finishes syncing everything (so I can't even read old messages while the sync runs), and, what's worse, it skips messages and multi-device just doesn't work. My laptop just shows "Message could not be decrypted" until I delete everything and reset.
I'm not sure why it works so poorly after years of existence, but unfortunately I'm losing hope that it'll be fixed. I sometimes feel like the Signal team doesn't use their product, or they would have noticed this. Meanwhile, yes, Matrix took years to add encryption, but it works much better than Signal, even with quite a few small bugs.
> Meanwhile, yes, Matrix took years to add encryption, but it works much better than Signal, even with quite a few small bugs.
I'm not sure which Matrix client you use, but clients like Riot don't even let you opt out of sending read receipts unless you edit `/etc/riot/config.json` to enable experiments and then go into the settings to disable read receipts. Problems like this (and issues like this [0]) give me the impression that Riot isn't designed for people who need privacy.
(Yes, there are other Matrix clients [1], but my understanding is that Riot is the flagship interface.)
I'm a massive Matrix fan and have high hopes for it, but in experiments we've done with activist and journalist partners we've found the Riot.im client often gets a bit complicated for people to use. I think the main issue people have is related to keys. As a techie I love the options, but I find many don't like having all the options. Signal, of course, is a lot easier, as it hides many of those issues in the UI/UX.
The root problem is the requirement to verify that you are talking to who you think you are talking to. If you skip the identity verification stuff then you are inherently trusting a 3rd party. So if you are not exchanging your key fingerprints (safety numbers in Signal terms) then you are kidding yourself.
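For anyone wondering what comparing fingerprints actually involves, here's a rough sketch in Python. It's purely illustrative (this is not Signal's actual safety-number derivation): both sides hash the pair of identity keys and compare the resulting digit string out of band.

```python
# Illustrative only: the general idea of turning two identity keys into a
# short string both parties can compare, not Signal's real algorithm.
import hashlib

def fingerprint(my_identity_key: bytes, their_identity_key: bytes) -> str:
    # Sort so both parties compute the same value regardless of "direction".
    material = b"".join(sorted([my_identity_key, their_identity_key]))
    digest = hashlib.sha512(material).digest()
    # Render the first bytes as digit groups that are easy to read aloud
    # or compare against a QR code scan.
    digits = "".join(f"{b % 10}" for b in digest[:30])
    return " ".join(digits[i:i + 5] for i in range(0, len(digits), 5))

alice_key = bytes.fromhex("11" * 32)   # placeholder identity keys
bob_key = bytes.fromhex("22" * 32)
print(fingerprint(alice_key, bob_key))  # both sides must see the same string
```

If an attacker is in the middle, each side ends up with the attacker's key instead of the other person's, so the strings won't match.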
Exactly. That's the root problem, and it's a problem that won't easily go away: UI/UX and usability vs. security. To be honest, right now most people have voted with their usability thumbs.
Even without verifying safety numbers, you’re still better off on Signal than you would be on another platform that doesn’t even offer the option of verification. If you’re looking to MITM a conversation on Signal, you can only guess whether or not the recipients have verified each other, whereas on a platform like iMessage you know they haven’t, because it’s not an available option.
SS7 spoofing is not a hard thing to do. Who knows whether +14055551212 really is who you think it is when you initiate a conversation. I guess using multiple techs in series (voice, IM, social media, etc.) could protect the initiation.
PGPFone had a neat thing where it'd show each participant of a voice call a short string they'd read out loud, and then the crypto would use those for handshaking. MITM'ing voice, as part of a freeform conversation, especially between friends, is a lot harder.
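Roughly, the idea is that both ends derive a short string from the negotiated secret and read it to each other; an attacker in the middle would have negotiated a different secret with each side, so the strings wouldn't match. A toy sketch of the concept (the word list and derivation here are made up, not PGPfone's or ZRTP's actual scheme):

```python
# Sketch of a short-authentication-string check: both ends hash the shared
# secret and map a few bits to words they read aloud over the call.
import hashlib

WORDS = ["adroitness", "bodyguard", "crossover", "drainage", "eyeglass",
         "fracture", "guitarist", "headwaters", "indulge", "jawbone",
         "keyboard", "lockup", "merit", "necklace", "outfielder", "pupil"]

def short_auth_string(shared_secret: bytes) -> str:
    digest = hashlib.sha256(b"sas" + shared_secret).digest()
    # Two words = only 8 bits in this toy; real protocols use more entropy
    # and bind the value to the session so it can't be precomputed.
    return f"{WORDS[digest[0] % len(WORDS)]} {WORDS[digest[1] % len(WORDS)]}"

print(short_auth_string(b"example shared secret"))
```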
Out of curiosity, did you ever try Keybase? It always struck me that usability as well as security were their primary focus. And I think they did the whole key management / chain of trust thing really well.
Hopefully none of that changes now that they've been acquired by Zoom.
It's gotten better - I'm dual-running Riot/Matrix and Signal. Cross-signing has fixed the main issues affecting encrypted chat usability but there's still plenty of UI improvements to make.
I just opened my Signal desktop app that I had synced previously. It asked me to resync with my mobile device, which needs camera permissions to take a picture of a QR code. I had previously removed Signal from my mobile device. Lo and behold, my account no longer existed and I had to sign back up with a phone number. I then clicked sync, and most of my messages on my desktop are gone. I don't see how this is easy by any standard.
If I understand your description, you reset your account. Signal deletes the messages for safety when you reset: an attacker could trigger a reset by getting hold of your phone number via SIM jacking, or the government could intercept your verification text. It's a safety measure so no one can take your messages. Of course many people want to carry their messages along, but that's a risk if you've lost control of your number, so that's what Signal does. If I recall correctly, when I had to reset my own number they did say they were deleting my old messages.
Signal allows backing up messages (though the UI and workflow for it is still rather clunky), so you should be able to restore them even if you switch to a different phone number entirely.
No, I had removed the app from my mobile previously, not deleted my account. When I resynced, they had removed my account and the messages saved on my desktop disappeared.
That is the same thing. The messages were stored on your phone, never on any "account". The desktop was only ever a mirror of the phone. This is explicitly how Signal works. WhatsApp works the same way.
If you did not delete the Signal directory on your phone, then there should be some old backups with your messages there. These will be encrypted, so you will need the original password to decrypt them.
I, for one, have a bigger problem with it forcing the use of phone numbers as a sign-in method. They're an arbitrary identifier from a legacy system that there's little point in continuing to extend, because if your device is capable of anything more advanced than SMS, it's also capable of... well, this.
Also KaiOS and the like are making chat feasible even on feature phones.
Don't get me wrong, RCS will be a fine enough fallback (once it's E2E), but standardized chat is the dream.
> Don't get me wrong, RCS will be a fine enough fallback (once it's E2E), but standardized chat is the dream.
Is there a plan for RCS to be E2E? Given that RCS went under the GSMA umbrella in 2008, and it's 2020 and adoption is minimal, I don't have any hopes for a future update that supports E2E coming out any sooner than 2040, with handsets supporting it in 2050, and all endpoints supporting it in 2065; Google will have released about 30 more messengers by then, of course.
Tackling this point separately: the entire reason they do this is because they routinely experiment with side projects and then build the ideas that work well into the services that gain traction. As much as it comes with the drawback of being scattershot in general, it specifically creates a track record for failure with messaging because successful messaging products, as a rule, have network effect - something you can't build when you're playing with three different approaches simultaneously.
For an example of where this works really well, look at how all of their adaptive UI efforts feed into each other:
* The enhancements to multiwindow that were built for foldables became Android's desktop mode, to the point that it was built specifically as a test environment and now underpins DeX etc
* Desktop mode's only hardware requirement is a display output, suggesting in addition that Android apps as a whole are no longer bound to specific 1:1 relationships of UI and form factor. (This is, imo, a much bigger deal than we're making it out to be, and opens up possibilities ranging from hybrid game consoles to mobile content creation to better takes at mobile-powered VR.)
* The existence of a base OS implementation and the fact that it's controlled by the system launcher, a component the user can rip and replace, pretty much ensures that custom ROM communities are already toying with this
* Android supports PWAs - installable, natively-scalable webapps - meaning that when desktop mode inevitably stops being feature-flagged there will be examples of convergent apps that work on day 1
* Desktop support for Android apps enhances those same apps when used on ChromeOS
* Flutter, the toolkit built for Fuchsia - an OS designed from the ground up with this sort of scalability in mind - is capable of targeting all of the above
>As much as it comes with the drawback of being scattershot in general, it specifically creates a track record for failure with messaging because successful messaging products, as a rule, have network effect - something you can't build when you're playing with three different approaches simultaneously.
I think what Google needs to do is to separate the messaging protocol from the messaging software. The protocol needs network effects. The software doesn't. That's why shutting down Google Inbox didn't kill email, and it's why any new experiments with email software can benefit from the network effects that email protocols already have.
Hard agree. I'd also argue that any meaningful antitrust action we may eventually impose on big tech companies should force this.
If the one thing so far that's led the feds to threaten this is that they wanted to build a modern protocol for cross-service messaging, then there's no sane reason we couldn't have asked for that exact thing as a spec.
Not within the spec. Which was sort of the point I was (poorly) trying to make - that it's a huge caveat, but otherwise a decent fallback if and when that changes.
Google is adding an implementation into Messages, and it's honestly not a critical problem if OS vendors are supporting it at that level, but there's still too much we don't know about it imo. Will that be supported by iOS, if and when it supports RCS at all? Will it work for third-party clients, if and when Android gets APIs?
I'm not sure how much optimism I have that this will be anything other than a fragmented mess in the short term.
The only thing getting RCS any real traction is that Google seems to be pushing it in their SMS application, and is now running an RCS server for everyone (or something).
Which basically means, instead of having a federated mess as designed to replace the federated mess of SMS and MMS, we'll get a Google mess, maybe. But if Google was any good at making messenger apps, maybe enough people would use one of them that it wouldn't be killed.
Which just sort of loops back to how if Web-based chat had a spec with meaningful user traction we wouldn't have any real use for RCS in the first place.
Also how XMPP could have been that spec, if Google hadn't decided when launching (the first version of) Hangouts to go full Ayn Rand while doing it.
"even on feature phones"? It was perfectly possible at least back in the early 2000's. Where I'm from, we've gone through a lot of different IMs over the years, including XMPP (which I was a big fan of, but started to despise because it had such terrible support for mobile clients). Many J2ME clients had pretty advanced features like group chats and file transfers.
I knew Signal was against federation but I hadn’t realized they had pretty much banned third party clients. That would otherwise have been a really easy win for people that actually care about the system integration, performance, and architecture of desktop clients enough to shun electron “clients.”
The Signal team has always been open about the reason why they reject third-party clients: they claim that XMPP adoption was hindered by the inability of a user’s software to know if the software on the other end supports the same feature set. XMPP had grown into a large set of features that some clients supported and others did not.
If Signal introduces a new feature, it knows that all users’ devices will support that feature, because its own software is the only game in town.
Cases where a client is completely broken are never the problem: users will be forced to switch to a different client. It sucks, but it's no worse than the current state of affairs. A security-mandated change in protocol/behavior would fairly fall under that category.
You can always define backwards compatibility that only goes down to a certain lowest-common-denominator feature set, and no lower. For instance, I run a number of httpds that support TLS 1.2+ and specifically disallow SSLv3, TLS 1.0, and TLS 1.1. The population of browser user agents that don't understand TLS 1.2 is infinitesimal at this point.
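As a rough illustration of that kind of floor (shown here with Python's ssl module rather than an httpd config; the certificate paths are placeholders):

```python
# Minimal sketch: a server context that simply refuses anything below TLS 1.2,
# the same "lowest common denominator" policy described above.
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # SSLv3/TLS1.0/TLS1.1 handshakes fail
# ctx.load_cert_chain("server.crt", "server.key")  # placeholder cert paths
```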
They also time bomb their own software, so if you don't take automatic updates from them it just stops working.
[Yes, you could manually compile it and update it... which is what I did for the first year I used signal, until it expired on me with no warning while I was away on a trip and had no way to update it.]
Beautifully mediocre, but yeah it mostly works. (Never been able to use Signal so I can't compare, but Riot's encryption was completely broken until a few weeks ago, and now there is only the occasional "hold on, gotta fetch keys for 10 seconds" and slow scrolling up, etc. It can't compete with something like Telegram in terms of UX; it is infinitely better for privacy but it doesn't "work beautifully".)
It was a non-stop shitfest of buggy encryption and broken UX last I tried. A group of us tried using it for a bit and gave up because we kept losing messages, couldn't figure out which messages were "secure" in various ways, and had confusing horizon effects where only some people could see some messages.
That was a couple months ago. Maybe it has improved massively since then? I guess I'll try it again later this year.
It definitely improved sometime last month. Not bug-free by a long shot, but while I also gave up on it after trying it in February, I'm now using it instead of Keybase, which got bought by some shitcorp.
No idea what problems you have, but I have been using encryption heavily since it got enabled by default.
And I have never had any issue with it - even the transfer of encryption keys in the background to verified 3rd party clients worked like a charm.
But also, this experience may not have been possible before May (when they released a big update with E2EE by default).
I do get the occasional missing message in the desktop client, but what I notice most is that it gets unstable and weird when there is an update available (which I don't get notified about). Desktop and laptop with Ubuntu, btw; Signal installed via Snap if I recall correctly.
The desktop client was pretty terrible for me (on Linux) until a few months ago, but now it works perfectly. The only thing missing is SMS from the desktop, but from what I understand Signal won't ever add that because SMS isn't secure.
I really hope this improves, I had to reset my laptop's Signal Desktop just last week because all the messages were "couldn't decrypt" and I was missing messages on my desktop just today.
I work on calling for Signal, and the last few months we have been working on desktop calling. It's nearly ready and will hopefully be in public beta in a matter of weeks, and (again, hopefully) fully available in a few months.
Riot and almost all of the other clients I tried all phone home to some kind of metadata/identity or push server that is run by some private company I’ve never heard of (which isn’t part of Matrix/Riot).
Riot only ever talks to identity servers (which simply maintain a directory of matrix users by phone number or email address) if you actually try to look up a user by one of those identifiers, and if you actually opt in to it. We used to do it without opt-in when you searched, but this was fixed at https://matrix.org/blog/2019/09/27/privacy-improvements-in-s....
For push, whatever app you use needs to have a push server that talks through to Apple/Google if you use their push. For Riot, that server is run by New Vector (vector.im), the outfit which makes Riot.
It does not appear that your statement is accurate.
I just fired up Riot 1.6.2, the latest available from the site.
It makes three connections, one each to matrix.org, vector.im, and riot.im on startup.
Even after removing application settings and preferences, it connects to all three of those on startup, just sitting at the login screen, not signed in to anything. I didn't opt in, and I didn't look anything up; it was automatic (and silent!) on a blank install.
That's completely unacceptable, and counts as telemetry to three separate parties, whether intentional or not.
The local app should not send any traffic whatsoever when launched and sitting at the login/signup screen. It should make a connection to the homeserver chosen to log in to, and that server alone.
This is semi-unrelated, but indicative: I posted a bug about photos taken in landscape showing up as portrait (i.e. sideways) if the screen is locked, and it was closed as "we'd like to have this fixed at some point".
How is rotating photos properly not a priority for a messaging app? It's really annoying to have half your photos rotated sideways, and it screams "buggy app" to both participants of the conversation.
I would understand the urgency if this issue occurred when you open the app normally, but on the lock screen? Really?
At this point I'd say you're desperately looking for something to complain about. Why don't you just not use it and check your PC? There is obviously something wrong with it.
I wasn't talking about desktop with your photo issue.
And I don't know what you mean by screen lock if it's not the lock screen. However, as I said: the photos are properly oriented within the app so whatever your issue is, it can't be that urgent...
Stickers are important for people like my sister and wife, who are put off if the messaging app doesn't support them. I wouldn't knock their importance, but yes, good syncing should come first.
My 6 year old son just discovered the augmented-reality tricks in the LINE app (adding a mustache, hair, glasses, sound effects etc). He got my mother (68) to install it and now they use it pretty much for all their video conversations. This replaced FaceTime, despite FaceTime having better sound and picture quality (edit: and even some effects).
He's in the age (and COVID-19 environment) where he starts speaking to cousins and friends online. So I imagine LINE spreading pretty quickly in similar circles based on those features alone. That's the stuff that drives adoption, as silly as it might be.
That said, yes, the plumbing needs to be in working order, or no silly feature can cover over the leaks (or smell, if we're going with the metaphor).
Favoring mustaches over security is always the choice you can make. I don't care about adoption unless it falls below the level that makes Signal worth developing.
> Favoring mustaches over security is always the choice you can make
Unfortunately not. If most of my friends and family favour mustaches and effects (which seems to be the case, I would imagine as a general rule), and I favour security (the minority, as a general rule?), then I won't be able to talk to them securely.
If they don't want to talk securely to you, then they don't want to; why do you want to force them to? They will spill secrets to other people all the time, using Signal or not. They will talk to other people about things you thought were a secret between you. They don't patch their computers. They use trivial passwords on their computers, they reuse passwords on all websites, and they don't encrypt their drives or the micro-SD cards in their mobile phones. Why should they care about things you want them to care about? They won't.
I use WhatsApp with people who don't care about security, and I can only be contacted via Signal for business stuff.
> how we think about concepts like privacy, security, and trust
I was disappointed to see that a mobile number is needed and that this number is shown by default in groups. Mobile numbers are much more trackable than email addresses, in my opinion. And I do not understand at all why others should be able to see them so easily.
So I now prefer Telegram because at least it hides numbers in groups by default.
This is a fair concern, and I hope Signal addresses it at some point, but Telegram is no replacement. From Telegram's Wikipedia page: "The default messages and media use client-server encryption during transit.[19] This data is also encrypted at rest, but can be accessed by Telegram developers, who hold the encryption keys," and "the desktop clients (excluding macOS client) do not feature end-to-end encryption, nor is end-to-end encryption available for groups, supergroups, or channels." In addition, the Telegram server software is proprietary.
It really depends on your threat model: do you trust the people you are talking to but not Signal (though in practice this doesn't even work as they are way too centralized, but whatever), or do you not trust the people you are talking to but are willing to trust Telegram? Clearly we should be able to have something where we don't trust either, as there is nothing about these phone numbers that is critical to Signal's operation (if anything, Signal is compromising basic important things to use them), but between these two choices it isn't clear to a lot of people why Signal's choice is better, given the actual use cases and threat models most people have.
The fewer people who you're required to trust, the fewer breaches there will be, even if no one is malicious and those breaches are accidental.
And ultimately, even if the people you are talking to and your messaging provider are 100% trustworthy and never make mistakes, they usually cannot resist a lawful government request for data. Signal just has virtually no data available to provide them; Telegram could give them entire chat histories, and could be required to provide access to in-progress chats.
To save time, he mentions that owning a SIM card, or a smartphone with a Contacts application that holds phone numbers, is a portable social network by design.
Pushing his idea further, it's much easier for anyone to gauge Signal's penetration by checking against existing phone numbers. I can't imagine someone with 200+ phone numbers querying the service to validate whether each one is registered with Signal.
And while nicknames sound perfect in theory, they lack one specific thing — validation of the speaker's authenticity. By this I mean an impersonator can claim a username and pose as someone else. Given that a lot of people use transparent nicknames on the internet, this can pose a threat. So even when nicknames roll out as a feature of Signal, it would be good to have some way to confirm that the party I'm in contact with is genuine. Checking against phone numbers can mitigate the risk of fraud.
You are defending why phone numbers are useful, but are completely choosing to ignore the point I am making about the tradeoff of who you trust.
To be even more explicit about it, in the hope you understand: the vast majority of people trust large companies (which is bad but frankly not insane) but absolutely do not trust random assholes/creeps on the Internet. The security model of Signal is saying you don't have to trust Signal, but you weirdly do have to trust all of the random people you interact with with your phone number.
And so, in the context of this thread, that explains why someone would claim Telegram is actually better than Signal, because what people do with Telegram is join massive group chats that are either 1) public, 2) have so many people in them that if you think what you say isn't going to be logged you are fooling yourself. So the value of end-to-end encryption in this context is essentially zero; but, being able to join some large group chat with a ton of strangers to talk about some open source project or whatever you are doing without any of those people now knowing your phone number--an identifier which is tied to a large number of "real world" concerns and is ridiculously difficult to change--is actually extremely valuable.
Honestly, even if you aren't quite in those sets, the tradeoff still isn't an obvious win for Signal. As an example, let's say you are in a group of people talking about a protest. Are you more concerned that the company relaying your messages will be served a warrant to monitor your chat activity (which generally has some requirement of probable cause for a specific action, and likely requires knowing about the existence of the chat in the first place), or that one of the people in your group chat is actually a traitor or even an undercover cop (who can get into a number of groups and pretend to be an ally while passively monitoring for things they want to shut down)? The latter is actually a much more realistic attacker model, and with Signal that person now has your phone number, which means you are screwed. Using Signal correctly here requires getting a burner phone, which is way more effort than is reasonable.
The use cases for the privacy and security model of Signal are thereby inherently limited to people you trust with your phone number. Like, it is sometimes difficult enough to get people to use text messages at all because they don't want to give out their phone number: Signal doesn't solve that, and so is confined to the subset of communication that people currently do over SMS(/iMessage), and can't really ever begin to carve into the market share of Telegram, or even Facebook.
And so, realistically, Signal does not, cannot, and should not manage to displace Telegram, which I say with sadness as someone who has not forgiven and likely never will forgive Telegram for claiming security properties their system didn't have (like, I am no Telegram fan, and while I have the app I only use it a few times a year; that said, this is more than I am willing to tolerate Signal, for a number of reasons that are mostly unrelated to anything in this comment).
(And FWIW, I personally would not recommend usernames, and in fact would personally be much more angry about that than phone numbers for various reasons; if Signal decides to roll out unique choosable usernames I am honestly probably going to hate on it even stronger because of it: you are arguing a strawman here :/. But to claim that phone numbers are fundamentally better is awkward regardless, given how phone numbers aren't even a good security layer due to the prevalence of number porting. This is just one of the many devastating things that Moxie is wrong about.)
>>You are defending why phone numbers are useful, but are completely choosing to ignore the point I am making about the tradeoff of who you trust.
You're swiftly jumping to conclusions about a comment that only describes a point of view, not my own perception of how the trust model should be established or the capabilities current messaging services provide.
>>... but, being able to join some large group chat with a ton of strangers to talk about some open source project or whatever you are doing without any of those people now knowing your phone number--an identifier which is tied to a large number of "real world" concerns and is ridiculously difficult to change--is actually extremely valuable.
As someone who suffered harassment after getting my phone number exposed on WhatsApp and being followed by the same person almost everywhere it serves as an ID (Signal, Telegram), I wholly share your concerns. But it's a privacy matter that almost none of the current messaging platforms really offers bundled with solid UX and data transparency.
What I am trying to convey, based on how I read Moxie's rhetoric, is that Signal tries to be a WhatsApp alternative (given that the latter really can access messages in a group conversation when a participant reports a contact) by not harvesting user data (cheers, Telegram) and by providing a little more control over the conversation on both ends (self-destructing messages). Signal has a bigger focus on security right now, but the goal is to be accessible to a wider audience. From that point of view Signal really stands out. And it helps me personally to, sort of, separate communication spaces.
>>Honestly, even if you aren't quite in those sets, the trade off still isn't an obvious win for Signal. As an example, let's say you are in a group of people talking about a protest.
Look, if you propose that Signal needs to accommodate guerrillas, rioters, and protesters organizing and acting with impunity, that's a privacy issue, and Signal isn't serving that purpose.
If you look at the past, the main attack vectors on messengers are exposed phone numbers, insecure third-party cloud backups, or just physical access to your device. Telegram addresses all of these while also allowing E2E encryption, just not by default for now, because of UX. At the same time many other popular messengers advertise "E2E by default" while not being secure at all and having mediocre UX (no desktop clients, no seamless sync, no usernames, etc.).
They're laying the groundwork to change this. Part of the purpose of the PINs rolled out last month is to enable centralizing some data to allow for other addressing schemes.
A burner SIM that you set up and throw away addresses this concern.
Telegram, with messages in plaintext on the server? Encryption that isn't open? Yeah, Telegram is a bit of a non-starter if you have these kinds of concerns, as far as I'm aware.
If that’s the case doesn’t it matter even less that signal requires it since it’s already known anyway?
Signal’s use of phone numbers as IDs means they don’t have to have any of your contacts sent to their servers.
As shown in the article they have no metadata and nothing to reveal beyond your phone number and when you signed up.
These other apps send your social graph to their servers, track and store metadata, don’t have encryption on by default, roll their own cryptography, or some combination of all of these things.
The phone number obsession on HN seems dumb to me - a meaningless thing for people to repeat and complain about that doesn’t actually matter so they can sound like they know what they’re talking about.
I don’t get it.
The only real criticism I have for signal is that they’re not federated so they’re vulnerable to shutdown. I think that’s okay though because we have Matrix working on that problem and having both is probably a good thing.
Assuming you’re using Signal for organizing something the government doesn’t want you organizing, if one member of the group gets rubber-hosed into unlocking their phone, the govt instantly gets a list of verifiably correct names of people involved. In contrast, with a service that lets you use usernames that maneuver would reveal nothing but those usernames (which are as pseudonymous as it gets).
One of the other problems with using phone numbers is that it provides an opening for adversaries. Now they know your phone number, which can be used for social-engineering attacks to attempt to compromise any other online services tied to your phone number, either for 2FA or for account-recovery/i-forgot-my-password functionality. 2FA by SMS is wrong and broken and nobody should use it, but they do.
Adversaries will attempt to social engineer customer service for your phone carrier into issuing them a new SIM or porting out the number, so they can receive verification SMS and phone calls.
Maybe I misread, but the GP doesn't seem to be talking about impersonating a user on Signal, but rather impersonating that user on other websites that depend on SMS 2FA sent to their phone number that is now visible through Signal.
I feel comfortable with giving my Telegram username out to random people on the internet and posting it on my website because it doesn’t mean anything outside of Telegram. I wouldn’t post my phone number publicly.
That’s fair, thanks - I think for secure communication with people you don’t know personally or trust, it’d be better to use something that doesn’t share your phone number.
And the social graph IS sent to servers by Signal. It's protected only by hashing (trivial to circumvent) and by the Intel SGX technology (a bit harder to circumvent, but I doubt that the US govt can't do it).
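To make "trivial to circumvent" concrete: the phone-number space is small enough to enumerate, so a plain hash is essentially reversible by table lookup. A toy Python sketch (tiny number range, purely illustrative, not Signal's actual discovery scheme):

```python
# Why hashing phone numbers barely protects them: the number space is small
# enough to enumerate, so a precomputed table reverses the hashes.
import hashlib

def h(number: str) -> str:
    return hashlib.sha256(number.encode()).hexdigest()

# Build a reverse lookup table for an (artificially tiny) block of numbers.
rainbow = {h(f"+1405555{i:04d}"): f"+1405555{i:04d}" for i in range(10_000)}

leaked_hash = h("+14055551212")   # what a server or attacker might hold
print(rainbow[leaked_hash])       # recovers the original number
```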
I didn't want to claim that Telegram's crypto was better or as good as Signal's. But saying that Telegram rolled their own crypto while Signal did not is misleading as both designed their own protocols.
Also, Telegram uses SHA256 now in the places relevant for security, so that point was resolved.
My (limited) understanding is numbers are queried to see if accounts exist, but those queries are not connected to the users sending them (and they are obscured in transit).
Am I wrong?
Signal’s cryptography has also gotten a ton of attention, I don’t think the same is true for competitors.
Transit encryption is employed by other services as well. Whether there is de-correlation on the cloud backend I don't know. Maybe there is, but you can't really verify that and it's easy to correlate them again, especially if clients use ipv6 or non-CGNAT ipv4.
>If that’s the case doesn’t it matter even less that signal requires it since it’s already known anyway?
Well, if you're operating on that premise, which is to say, a premise of complete and total resignation and surrender, then from that starting point of course you haven't lost anything. I don't think anybody is joining you though in agreeing that that's a legitimate starting place to analyze privacy concerns associated with the phone number requirement.
This is such an uncharitable interpretation of what I was saying that it's basically a straw man.
If you're required to use ID to get a SIM (as K2L8M11N2 stated in the parent comment I replied to), then what I was saying follows - that the person is already tied to the phone number anyway.
In this context Signal revealing the only data they have (that a phone number signed up on X day) really doesn't matter or reveal anything new.
K2L8M11N2's other response to my comment is a helpful clarification, it's less about what can be compelled from Signal the company and more about what can be turned over if a user's device is compromised. In that context the name to number connection is more serious because they also have the content.
I'm sorry but absolutely nothing about this is a straw man or uncharitable, and I'll explain why.
>If you're required to use ID to get a SIM (as K2L8M11N2 stated in the parent comment I replied to), then what I was saying follows
Yes, and this is what I was responding to. You want that "if" to be taken for granted as an unchallenged starting premise to your entire argument. And that amounts to a massive privacy concession. And the fact that I'm challenging that premise, and bringing up the privacy concerns that are associated with that "if", that is the thing that you're describing as uncharitable. Even though you don't appear to dispute that it is indeed a privacy concession. And it only follows if you don't contemplate alternatives to using a phone number, which is what I took to be their point about the problems associated with a phone number.
>and more about what can be turned over if a user's device is compromised. In that context the name to number connection is more serious because they also have the content.
That's going to be true in any context where your number can be revealed, which is why it has unique disadvantages that usernames wouldn't have, under any conceivable hypothetical scenario. I'm glad that you benefited from their clarification, but that struck me as a truism about the nature of phone numbers versus the nature of usernames.
> "You want that "if" to be taken for granted as an unchallenged starting premise to your entire argument."
I don't want that 'if' to be anything. It was the premise, because the parent comment I was responding to was stating it as a fact for where they live.
My point is that signal revealing your phone number and when you signed up doesn't reveal anything new about you. The issue is the case K2L8M11N2 mentioned when they have compromised a device (and can now tie content to IDs via the phone number).
> "That's going true in any context where your number can be revealed...under any conceivable hypothetical scenario."
This is just false? Without access to the content on a compromised device, a phone number alone doesn't reveal much (that's the entire point of the E2EE); if it limits Signal's ability to hand over the social graph or any other metadata (which does reveal a lot), that seems like a win.
Obviously revealing the phone number still reveals more than a username would, and if you can get all the benefits of not having to upload your social graph to their servers or share metadata without having to use phone numbers that would be better - I think they're working on that.
People using apps that upload their social graph and collect their metadata so they don't have to use their phone number are probably making the wrong choice when considering the trade-offs.
"The social graph" (your phone book) is most likely already uploaded somewhere by third-party app or even Google/Apple themselves. Using separate contact list (even uploaded to some server) seems more secure to me than using your phone's one.
The phone number is much more valuable to any authority than other metadata because they are more likely to have access to cell service than to messenger services.
In the context of the post, Signal is much more vulnerable than even basic things like email or web chats, because police can effortlessly identify anyone in the group chat with a single request to the cell company.
Signal could have a great interest in buying up private data linked to phone numbers, adding the people users chat with (identified by their phone numbers), and selling it on again with the added data.
That wouldn't be a problem on phones where the radio part is isolated from the system (PinePhone?), so that implementing encryption can be effective. The SIM might be tied to me, but any eavesdropper would see only noise, because the outgoing data is encrypted by the system before entering the radio hardware and is decrypted by the system after leaving it. They can know I generated some traffic at a certain moment, which is the same information they could get from the carrier operator, but that's about it.
Yes, and I'm tired of hearing "burner SIM" thrown around as a solution whenever some company stupidly demands a phone number. Getting a burner SIM is WAY more effort than I'm prepared to expend to sign up for some service.
Stop asking for phone numbers!
My mother has my phone number. A few of my closest friends and family have it. That's it. I have like 10 actual people in my phonebook. My phone provides immediate 24/7 access to me, and there are very few people who are close enough to me that I want them to be able to reach me at any time.
Some Silicon Valley company with an inflated sense of importance is never on that list...
Furthermore, "burner sim" implies an accompanying burner phone and all of the opsec that entails (eg only turn it on away from home), lest that "burner" number be trivially linked to your identity through databases that the telephone companies are happy to sell.
Moxie should know better, but this blog post repeatedly boasts about privacy while not addressing this at all. Getting basic encryption into the hands of the masses is a laudable goal. But when the popular threat model changes (eg pocket computer compromise incident to arrest at a protest - an "evil raid" attack), you either need to adapt or you've become an attractive nuisance.
Specifically I'm envisioning an attack where say ten protestors are arrested and found to have a shared contact, and then the police turn that phone number into a real world identity and go after that person for organizing.
Signal has an option to prevent this by locking the number with your PIN. This capability introduces plausible deniability as to whether a phone number assigned to a SIM is actually associated with a Signal account. I don't know if that matters legally or not.
Also the people doing shady things are generally hopping accounts regularly anyway.
The PIN only stops registration for a fixed amount of time, I believe 7 days; after that, the entity controlling the number would be able to reclaim the account. If the “attacker” maintained control, new devices that add the number from their contact list would get no alert; that is, those users would have to figure out on their own that the number is controlled by someone else.
That’s 7 days since last use. So if you continue to use the app at least once every 7 days, it will remain registration locked.
Also, anyone who had communicated with you before the switch would see a “safety number changed” notification if the number became affiliated with a new device.
Curious, where’s the “last use” in the documentation or code? I ask because I have seen other issues with the PIN vs. the docs and haven’t gotten around to testing the recovery mode.
EDIT: Found the related docs; it appears they had been edited since I last looked at them; for example, you can now disable PIN reminders:
And yes, “anyone who had communicated with you before the switch would see a ‘safety number changed’ notification if the number became affiliated with a new device” is correct, though so is my statement about new numbers adding the number. To be honest, I have caused the alerts to happen before, the other user had no idea what they meant, didn’t say anything, just clicked okay.
The "encrypted at rest" claim is completely meaningless since there's no way of verifying it. If their web client can read your messages by you reverifying your phone number, so can anyone else with access to their servers.
But that's not true. We can verify a Cloud Chat message is wrapped in a key generated on a user's device. You can look at the client code and prove this. We can also prove that client-server messages must be decrypted on their way back to a user's device. [1]
I wish it was different, but it isn't. MTProto v2 seems OK, but it is not the default, so it's hardly used. Server-side encryption protects against very little.
I like Telegram, but not for security. Nobody should.
The way you use encryption matters as much as the quality of the crypto used. In Telegram's case, the usage is all wrong and not even better than e.g. WhatsApp's. Opt-in E2EE is worse than having it be the default, and server-side encryption with server-side keys is bordering on pointless.
Thank you for reflecting with an opinion. I also believe that opt-in E2EE is worse than having it by default. We have precedent for E2EE chats by default and syncing those E2EE messages across devices using the latest RiotX for Matrix. I'd love to see Telegram adopt that strategy.
I use Telegram because it delights its users. When I have tried to bring friends and family to first Riot, then Signal, the experience I encountered was one of frustration, whether it was messages unexpectedly not syncing or delivering, the pain of cross-signing new devices for E2EE (RiotX), or the paucity of features. I have tried nearly every messenger that exists in an effort to seek out a usable compromise for laypeople and Telegram has been the only one to make that bar.
We use Matrix in my workplace because I can count on the employees being technical and patient enough to forgive having to input a multitude of varying passwords in succession just to get their session going.
Thank you for your work! I run my own federated homeserver that I really enjoy digging in with on the weekends. Between selfhosting email and Synapse I really feel a sense of reverent ownership over these methods of communication.
Don't get me wrong, Telegram is a very nice app, and I especially like the wide platform support. It's great at what it does, and that's many things, but unfortunately not secure-by-default communication.
> PINs will also help facilitate new features like addressing that isn’t based exclusively on phone numbers, since the system address book will no longer be a viable way to maintain your network of contacts.
As I understand it, the purpose of using telephone numbers is that it can use the contact list stored on your phone without having to store contact lists on their servers.
Contact lists can be useful information for an attacker. If Signal doesn't have it, it can't be taken from them.
There's nothing inherent about phone numbers here. Both iOS and Android also allow you to add e-mail addresses (and other identifiers) to your local contacts. I've yet to hear an argument as to why e-mail addresses or other identifiers can't be used in addition to phone numbers, or why it would be a complicating factor.
Telegram isn't really comparable to Signal.
I own a cellphone number from an Eastern European provider and had been using Telegram solely as a news reader. None of the people I communicate with know the number.
Much to my surprise I learned that Telegram cooperates with local authorities; I would never have found out if not for COVID-19. The messages I got addressed the precautions and regulations during the pandemic. While I appreciate the effort, I feel like it's a clear overreach on Telegram's part.
How do I know it was Telegram rather than the authorities themselves? The sender information clearly stated the sender was Telegram (the same as the chat created to announce what's new), and I got a push notification even though push notifications were turned off at the app level.
Can't imagine Signal compromising the trust of its userbase by sending any sort of material that comes from the healthcare department of whatever country my phone number would be from.
As mentioned elsewhere, it's all about the threat model. I have a group where others like to be anonymous, at least to people they do not know (yet). They will very likely not care about authorities seeing our messages, as they have nothing to hide from that perspective.
>They will very likely not care about authorities seeing our messages, as they have nothing to hide from that perspective.
`Nothing to hide` isn't the point of my message; the willingness of Telegram to collaborate with third parties, be that for a good or bad cause, is what concerned me. I already own a tool for getting unsolicited messages — email.
Because mobile numbers enable much easier use than emails. I talk to so many people on WhatsApp who don't even know what an email is, and who would certainly never use the app if it couldn't pick their contacts from the contact list. The goal of Signal has always been "privacy for the masses". The experts always had 20 solutions available.
no, we don't have 20 options available. we also need to talk to people who don't understand the problem and can't be persuaded to use one of those other options.
there is a difference between enabling phonenumbers as an ID vs requiring them.
signal could easily support both. (i read somewhere that they plan to support alternatives to phone numbers in the future, just haven't made it a priority. well, i'll wait. edit: actually it's mentioned here: https://news.ycombinator.com/item?id=23435050 )
> Mobile numbers are much more trackable than email addresses, in my opinion. And I do not understand at all why others should be able to see them so easily.
This is why I refuse to have WhatsApp and Signal, especially when you live in a third-world country. It just takes ONE SCREENSHOT of you saying something controversial in a group chat; that screenshot goes viral among a certain radical group, and then your phone gets spammed with death threats.
Not to mention you can and will go to jail if those radical groups sue you, thanks to draconian laws (UU ITE) here in Indonesia. You don't go to jail because you said something that actually insulted someone; you get jailed because you TRANSMITTED something that can be interpreted as an insult.
It is still very early stage, and discoverability/groups are half-baked (you can't add people to a group after creating it, and 10 is the limit), but it solves Signal's centralisation problem.
What happens to Signal when the EARN It Act passes? I assume that eventually the Apple App Store and Google Play Store will just stop allowing it to be downloaded if they do not add the backdoor in? Is there a workaround that will allow people to use it still? I've heard people mention locating the servers in other countries, but wouldn't the various App stores be bound by US law and still not allow them?
Thanks for the link. There's a subtle threat in there, that they'll move out of the country if they have issues which I think a lot of tech companies would.
This bill is so stupid in that tech companies can relatively easily move.
The legal entities can move to other jurisdictions, sure, but it doesn't matter because app distribution still occurs primarily through USA-based Google Play and USA-based Apple App Store—both of which can easily geofence apps as they please (or as they're required).
This is one of the reasons I've started to appreciate Matrix a lot more lately.
Putting aside Signal officially declining to add the option to discover or manually add a server via the client, there's nothing stopping anyone from going to GitHub, downloading the code for the server and client, and editing the code however they see fit, as long as it follows the legal guidelines.
I don't think that the earn it act will affect Signal - they aren't a publisher by any reasonable standard so they don't need the 230 exemption in the first place.
As I don't know much of the details of the legislation, or more importantly its references or modifications, can you elaborate more on what "they don't need the 230 exemption" means in the context that this act would likely not apply to them? Are you implying that the EARN IT act focuses on publishers of content and thus it less likely apply to Signal?
Disclaimer: IANAL, and not really well informed on these subjects.
I think with sufficient funding for a legal department, Signal could work without the section 230 exemption. In practice, they don't have that money and would be forced out of business long before they were able to prove their case.
I recall reading something recently about how in a coming release, Android will disable sideloading. The sole permitted way to sideload will be to enable ADB and then install the app with adb install. Some techies will continue to do that, just like some people unlock the bootloader and install LineageOS on their device, but removing Signal from the Play Store would make it as good as dead for the general public. (Even Signal’s website discourages people from downloading the APK from them, and prefers that people use an app store instead!)
There is a concern with getting ordinary non-technical users accustomed to the concept of sideloading apps... It might be totally safe to download the official Signal APK and sideload it, but people will then think that's a suitable and acceptable way to install other things, and will then be more likely to be social-engineered or phished into loading other malicious APKs.
The ordinary non-technical user has no idea how to manually verify the sha256 checksum of an APK they've downloaded from the "official" software developer of an app.
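For reference, the check itself is tiny; the hard part is getting people to do it. A minimal Python sketch (the filename and expected digest are placeholders):

```python
# What "manually verifying the checksum" boils down to: hash the downloaded
# file and compare it to the value published by the developer.
import hashlib

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

expected = "0000...digest-published-by-developer...0000"  # placeholder
print("OK" if sha256_of("Signal.apk") == expected else "MISMATCH")
```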
"Given that these features are already functional once enabled, it may not be long before the sideload protections arrive for those who enrolled in the Advanced Protection Program."
Perhaps developers will one day be required to provide proof of ID to Apple or Google before being allowed to carry out the "dangerous" activity of installing unapproved software.
Some googling leads to this [1]. From what I read it seems to be an opt-in program (for now). Was initially very concerned when I read your post, especially because Google recently broke Magisk (likely forever).
Basically, Google has actually implemented remote attestation properly (using hardware), so Magisk can't hide unlocked bootloaders anymore unless someone finds a crypto flaw. It's slowly being rolled out to Play Services, but I believe CTS still passes for now.
Signal started out open source as TextSecure; I'm sure there'll be an open-source alternative if the commercial entity fails, though I hope it does not.
> Signal Messenger, LLC, is a software organization that was founded by Moxie Marlinspike and Brian Acton in 2018
Did you google it? You're simply wrong.
It is open source, as it was; I didn't dispute that. But they are liable for their users if this bill passes, and they will easily go bankrupt. If there's no commercial entity, then liability most likely falls to the developers--whose identities can hopefully be obscured while they develop Signal.
No; I didn't have to Google it because I've been paying attention to Signal for the better part of the last decade. I'm afraid it is you who are mistaken.
If you've googled it, surely you're aware of the Signal Foundation[1], the 501(c)(3) parent organization of the LLC.
And while OWS/Signal Messenger did not have formal non-profit status prior to the Signal Foundation, it was never acting as a for-profit entity[2]:
> in general, Open Whisper Systems is a project rather than a company, and the project's objective is not financial profit.
I've been paying attention as well, I've been using it since the early days, I also have an M&A lawyer by my side.
The Signal Foundation is responsible for all costs associated with the project--without it the project wouldn't function.
OWS was also largely a corporation, but it wasn't required for TextSecure to operate--a key difference.
> And while OWS/Signal Messenger did not have formal non-profit status prior to the Signal Foundation, it was never acting as a for-profit entity[2]:
A corporation, non-profit or not, is still a corporation. Based on my knowledge of full-cycle accounting for non-profits, they tend to make more profit; they just pay less tax.
Having a non-commercial parent doesn't mean your business is non-commercial. Signal Messenger LLC is the corporation associated with Signal Messenger, and Signal Messenger could offer private equity, since the Signal Foundation is not the exclusive shareholder.
I have an M&A lawyer right here if you want me to ask him.
In any case, I doubt they have enough funding to do this as a standard donation-based non-profit; it seems like there's a non-profit which is used to funnel money to the commercial entity, Signal Messenger LLC.
It's a standard structure, I'm sure, but it's definitely not your standard non-profit.
I'm more concerned about the colloquial sense of "commercial" in any case--whether the corp makes profits or not, not whether it pays tax or not. I'd say they're looking to transition into a fully non-commercial entity, but for the purposes of this thread they are still a corporation and can still be held liable for their users' actions--in this case the app could still operate without a formal legal entity, but it would be removed from app stores, I'm sure.
As far as I know, while the DevOps code is not open source, the server and app code are on GitHub; that is, you're able to roll your own version, however that is defined by the licensing. A recent attack on Signal by a security researcher used a self-compiled app as a proof of concept; Signal patched the issue.
I'm glad to see the discussion about what privacy should look like progress, with increasingly big actors treating data as something too delicate to play warden over until the next breach.
But it just feels cheap and detrimental to LARP as a tool for revolutionaries. I want a clear concept of privacy for a stable society to rely upon. That even the most trustworthy authority be kept out by design as a security principle. I don't want my messaging app to be opinionated, pick sides or declare itself part of the ongoing "progress". Will they sing the same tune when it's the other group taking to the streets? Because everyone has a different idea of the kind of revolution that is needed, and certainly my phone should not have a say.
I'm talking about the discourse surrounding privacy, not literally about my phone specifically. I hate to see it marketed as a tool of revolution instead of just common-sense practice.
Your use of "ideologically driven" is confusing. Of course they should be driven, by their ideology on how a messaging app should be. But how could their opinion on the protest drive them? Just some weeks ago a whole different crew was taking the streets fighting for their own "freedom reasons". Some years ago there was the sad tiki parade.
It's like setting up a gun shop in Syria and claiming your guns kill "the bad guys", whatever that means to the customer.
I'll choose the app that works best for me, and will call it out when someone feeds the meme that privacy is for rioters, unwittingly even.
Some people believe in first and fourth amendment rights.
Moxie, one of the original developers of the Signal protocol:
Tracking everyone is no longer inconceivable, and is in fact happening all the time. We know that Sprint alone responded to 8 million law enforcement requests for real time customer location just in 2008. They got so many requests that they built an automated system to handle them.
Combined with ballooning law enforcement budgets, this trend towards automation, which includes things like license plate scanners and domestically deployed drones, represents a significant shift in the way that law enforcement operates.
Police already abuse the immense power they have, but if everyone’s every action were being monitored, and everyone technically violates some obscure law at some time, then punishment becomes purely selective. Those in power will essentially have what they need to punish anyone they’d like, whenever they choose, as if there were no rules at all.
Even ignoring this obvious potential for new abuse, it’s also substantially closer to that dystopian reality of a world where law enforcement is 100% effective, eliminating the possibility to experience alternative ideas that might better suit us.
ACLU: For more than a decade now, Americans have repeatedly encountered illegal and unnecessary spying by local, state, and federal law enforcement on lawful and peaceful protesters.
But I am not upset in the least about the rioters or anyone else, I can assure you.
Guarding people's privacy and right to a voice (during protests or otherwise) is a good use of the First and Fourth Amendments. But let's not confuse form and content. The protests are not advocating privacy, and I'm not even going to voice my opinions about them. There's a difference between backing the content of the protests and defending the rights that incidentally enable them. The blog post in question ended with "it's your powerful voices that are out there organizing and advocating for change". I can only construe that as either explicitly siding with whatever ongoing protest there is, or an empty general statement (I assume it's the former, though it makes little difference to me).
I think my point got side-tracked through fault of my own. I don't require every single thing I use to be free of attached ideologies, because that's simply impossible. And everyone has every right to voice their idiotic opinions; God knows I'm doing that. But it saddens me that, more often than not, people who could choose to be content-agnostic instead leverage their position to fight the good cause. And there are a thousand conflicting good causes.
I love Signal. I've convinced a multitude of people to switch, and some use it day to day currently, but mostly to talk to me. So I feel my contacts do it out of respect and `compatibility` of communication.
What baffles me is the incompatible feature matrix.
First of all, for some reason iOS users get updates faster than Android users. I was exploring emoji reactions yesterday while my Android contact admitted the feature was not yet available for his device. I had to double-check with the Play Store to confirm.
I've found peace with the sync issues for the desktop client though; it got much more stable compared to 8 months ago. What still feels like a massive UX problem is the inability to forward messages on the desktop. Given that I have lots of people coming from different places who don't know each other but share the same interests, it's just painstaking to copy/paste the same URL five times in a row.
And at the same time, there's no support for Android tablets as secondary devices.
For a person deep in the Apple ecosystem it felt weird to learn that Android users don't share the same experiences I do. That makes the sales pitch to try Signal way less appealing for the Android folk.
Never tried Signal, but from what you're saying, it seems that they redevelop the core functionality for each platform.
That's interesting, because I've been looking for a way to share core logic code across platforms (mobile & desktop at least), and still haven't found something really user friendly. From my brief look (Rust & gomobile are the ones I've looked at), it seems that most dev environments support some kind of C-style interfacing, but it becomes much more clumsy as soon as you're trying to have it run on Java.
Has anyone found a solution they're comfortable with and would recommend?
Honestly there are no perfect solutions to this. There are cross-platform frameworks (React Native, Xamarin, etc.) that you can try, but you'll definitely hit a wall when trying to access low-level native APIs. Kotlin Multiplatform seems to be a step in the right direction. It has first-party support on Android and works seamlessly for Android apps, but it's too early to tell for the other platforms. In an ideal scenario, you can share the core business logic using Kotlin Multiplatform and then implement UI and platform-specific stuff using native APIs (or something like Flutter, which again isn't production ready).
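To make the Kotlin Multiplatform suggestion concrete, here's a minimal sketch of the expect/actual pattern for sharing logic across an Android and a desktop JVM target. Everything here (the file layout, `platformName`, the `OutgoingMessage` type) is a made-up illustration, not anything from Signal's codebase:

```kotlin
// commonMain/kotlin/Messaging.kt
// Shared business logic, compiled once for every target (Android, desktop JVM, iOS, ...).
expect fun platformName(): String

data class OutgoingMessage(val recipient: String, val body: String)

// Pure-Kotlin logic lives here instead of being rewritten per platform.
fun describeMessage(msg: OutgoingMessage): String =
    "Sending \"${msg.body}\" to ${msg.recipient} from ${platformName()}"

// androidMain/kotlin/Platform.kt
actual fun platformName(): String =
    "Android API ${android.os.Build.VERSION.SDK_INT}"

// desktopMain/kotlin/Platform.kt (JVM target)
actual fun platformName(): String =
    "Desktop JVM on ${System.getProperty("os.name")}"
```

The shared `describeMessage` compiles for every target, while each platform supplies only its own `platformName` implementation; the UI layers stay fully native.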
I've heard of people running all the logic in a self-contained local HTTP server (developed in whatever language) and then relying on a local socket connection + protobuf for communicating between the native and the cross-platform code.
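For what it's worth, here's a rough sketch of that pattern on the JVM, assuming the shared "core" sits behind a loopback-only HTTP server; I've swapped protobuf for plain text to keep it dependency-free, and the `/describe` endpoint is made up:

```kotlin
import com.sun.net.httpserver.HttpServer
import java.net.InetSocketAddress
import java.net.URL

// "Core" process: all shared logic lives behind a loopback-only HTTP server.
fun startCore(): HttpServer {
    val server = HttpServer.create(InetSocketAddress("127.0.0.1", 0), 0)
    server.createContext("/describe") { exchange ->
        // A real implementation would decode a protobuf request body here;
        // plain text keeps this sketch dependency-free.
        val recipient = exchange.requestBody.readBytes().decodeToString()
        val reply = "Sending message to $recipient".encodeToByteArray()
        exchange.sendResponseHeaders(200, reply.size.toLong())
        exchange.responseBody.use { it.write(reply) }
    }
    server.start()
    return server
}

// "UI" side: the platform-native frontend only ever talks to 127.0.0.1.
fun main() {
    val core = startCore()
    val conn = URL("http://127.0.0.1:${core.address.port}/describe").openConnection()
    conn.doOutput = true  // makes this a POST with the body below
    conn.getOutputStream().use { it.write("alice".encodeToByteArray()) }
    println(conn.getInputStream().readBytes().decodeToString())
    core.stop(0)
}
```

The appeal is that the "core" can be written once in any language that can listen on localhost, while each platform's UI stays thin.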
I had hoped some people here would chime in with this trick, but it doesn't seem like it.
iOS may get updates faster than Android and have emojis, but iOS can't back up or transfer your messages to a new phone. This basic feature has been an open request for nearly 3 years.
Not having a backup is seen as a disadvantage by many, but I honestly feel it's for the best. There are plenty of scenarios where the parties involved benefit from there being no backup.
Purchasing a new phone is only a small subset of the scenarios where a backup is useful. For example, in the last two years Apple support has twice requested that I reset my phone and reinstall iOS. Having to buy a second phone to temporarily store my data is not a viable alternative.
Or losing your phone.
Or accidentally breaking it.
Or having it stolen.
A backup protects my data in these situations where a transfer utility does not.
Backups disabled by default would be a sensible approach, but it's simply childish not to let users access their own data.
There were mentions on Twitter of multi-device support coming later.
While it's not the desired backup feature, it would at least be a means to duplicate data and back up via Android.
Lately I've been wishing Signal had a Bridgefy-type mesh mode that enabled peer-to-peer operation over Wi-Fi Direct or Bluetooth, either directly or via a mesh of such devices.
First I think it would be useful at protests, to preserve privacy (and whatever side of the political divide you are on, I hope we agree that covert government surveillance and tracking of people at a protest is wrong, here, in HK, or wherever).
But it would also be nice to have when outside the reach of cellular or wifi internet. Think camping, traveling, people living with intermittent power, or those who lost power in some sort of disaster or emergency.
I'm honestly not sure if this would be feasible, as I don't fully grok the Signal protocol, but it does support async messaging.
"PINs will also help facilitate new features like addressing that isn’t based exclusively on phone numbers, since the system address book will no longer be a viable way to maintain your network of contacts."
I suspect you are massively outnumbered by people who do use it because it integrates with their existing phone book, and is a drop-in replacement for the default SMS app. Without those things it'd be just another niche app for weird nerds.
The fact that you can reliably add Signal users using only their phone number is a feature for the users who don't care about that level of anonymity, though apparently not a feature the developers care about preserving.
If it didn't have this feature of being a drop-in replacement for SMS, then I would be using Matrix to communicate with my two privacy-minded friends, and plain-text SMS to talk to everyone else in my contact list. With this feature, I've been able to convince about half of my Android-using contacts into switching (iOS is a harder sell).
It's their negotiation of the bare minimum of invasive activity they've decided they must do to give us some threshold of network effects in an otherwise anonymous service.
Is it worth trying to move my friends from WhatsApp to Signal? As I understand it, they're both e2e encrypted.
I'm also trying to move my chats from SMS and Gchat to something encrypted, but am torn between WhatsApp and Signal. The former has more of a buy-in with my contacts already.
I realize WhatsApp is owned by Facebook, but isn't the whole point of e2e encryption that you don't have to trust the intermediate infrastructure? And if you enable the setting to warn if the key changes, there's no danger, right?
That is an impossible question to give one answer to. There are lots of considerations. For example:
1) Do you trust Facebook with messaging metadata? For example, do you want them to know that you messaged your friend Jane a specific number of times on specific dates, knowing that they can correlate this information with Facebook profiles and will give it up for ad and law enforcement purposes?
2) Do you trust Signal's infrastructure to be reliable when you need it?
3) What is your threat model? For me, I want to be kept out of all ad-related data sets, and I want to have conversations with friends without having to think about a paper trail.
4) Do you care that you will have to convince people to install yet another messaging app?
5) Do you trust that Signal and WhatsApp have correctly integrated the underlying crypto protocol? I assume that you would, considering its history. Still, I would trust signal more in this regard.
6) Do you think Facebook or Signal is more likely to cut a privacy corner in support of an ease-of-use feature? Do you care?
For my friends, the answer is that I use Signal. Everyone who doesn't use Signal falls back to SMS or phone calls. The problem here is that SMS and phone calls are both huge privacy holes. For my threat model, I don't care, but you might.
The important distinction here is that WhatsApp uploads your entire contact book to Facebook. They have up-to-date information on your entire real-life social graph (including people who are not on Facebook and/or never shared their phone with Facebook) and the groups you belong to. If you don't believe me, request a copy of your data in WhatsApp and see for yourself.
So, by using WhatsApp, you are basically snitching on your friends and uploading their data.
Think about the implications: one day, you or one of your friends will add a phone number to Facebook (of course, "for security purposes only, to recover your account"). From that moment, Facebook will be able to link an online identity to an offline one, and mine a trove of data: friends, groups, locations.
If you want a different way of looking at it, Facebook paid $19 billion for WhatsApp. That's how much it was worth to them. You don't spend 19 billion dollars just to watch e2e encrypted messages fly by.
I admire the way WhatsApp markets itself as the "encrypted" communications app, somehow hiding the whole problem with groups and contacts as insignificant. Another frequently seen spin is on Signal: that it "only hashes" the phone numbers and that it's "effectively the same thing". Good PR moves, both.
It is possible to use WhatsApp (on iOS, at least) without giving it contact access. I do this and it isn't too bad—you still see people's WhatsApp nicknames, so it's not like you're just looking at phone numbers. Of course, that does nothing to stop them from tracking who you actively communicate with.
There are specifically two points that make me not confident in WhatsApp keeping my messages secure:
* I don't have time to dig out a good source for it right now, but I remember seeing a credible claim that they already have a mechanism in place to instruct the client app to send cleartext messages back to a server. Even if they do not, it would be trivial to add one in an update.
* The client will keep nagging you to enable cloud backups of message history. If you or your counterparty does this, message history is stored in cleartext on one of the major cloud providers. Are you confident that the person you're talking to hasn't enabled this, maybe even by mistake?
Exactly! And it was worth $19 billion to Facebook. That's what they paid to acquire WhatsApp, presumably not with the goal of just watching encrypted messages fly by.
> isn't the whole point of e2e encryption that you don't have to trust the intermediate infrastructure
Only if the client app is open source. Otherwise there's no way to know FB isn't just copying your keys or doing something else nasty. That whole spiel on their site about how E2EE protects you/makes the world a better place/yada yada is completely moot since they use a closed-source client.
WhatsApp uses opportunistic encryption by default, trusting the server to provide it keys. Pretty much nobody turns on showing when someone's key changes, and even if you do and you've verified each other's keys, the server can still learn the plaintext of at least one message by changing keys right when you send a message. That's a very tiny risk, but it's just not what one would expect. It's as if your browser continued loading a page with an invalid HTTPS certificate and showed you a warning afterwards, and only if you had turned that warning on in the settings.
Even if you turn it on and you're not afraid of the server swapping the key when you send a message (which is fair enough), do all your contacts also have it turned on? And would they tell you immediately and out of band (otherwise the attacker can suppress or change the message) that they saw your key change?
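To make the difference concrete, here's a toy sketch of the two policies being contrasted: trust-on-first-use pinning that blocks sending when a contact's key changes until the user re-verifies, versus a lenient variant that sends first and warns afterwards. None of this is WhatsApp's or Signal's actual code; every name is invented:

```kotlin
// Toy trust-on-first-use (TOFU) key pinning, illustrating the policy difference only.
class KeyPinStore {
    private val pinned = mutableMapOf<String, ByteArray>()  // contact -> identity key

    /** Strict policy: refuse to send until the user re-verifies a changed key. */
    fun checkBeforeSend(contact: String, currentKey: ByteArray): Boolean {
        val known = pinned[contact]
        return when {
            known == null -> { pinned[contact] = currentKey; true }  // first use: pin it
            known.contentEquals(currentKey) -> true                  // unchanged: send
            else -> false                                            // changed: block, ask user
        }
    }

    /** Lenient policy: always send, only note the change afterwards (roughly the default UX). */
    fun checkAfterSend(contact: String, currentKey: ByteArray): Boolean {
        val changed = pinned[contact]?.let { !it.contentEquals(currentKey) } ?: false
        pinned[contact] = currentKey
        if (changed) println("Note: $contact's security code changed (message already sent).")
        return changed
    }

    /** Called once the user has verified the new key out of band. */
    fun reverify(contact: String, newKey: ByteArray) { pinned[contact] = newKey }
}
```

The strict variant is what the HTTPS-certificate analogy above corresponds to: fail closed, then let the user decide.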
But an even bigger reason I don't have WhatsApp is Facebook itself. Metadata has already been mentioned by others, so I won't repeat that.
* Message sync is basically non-existent. Messages on my phone and laptop aren't kept in sync at all. Delete one in one place and it doesn't delete in both.
* Let me edit messages, like every other messaging platform. I also want to be able to delete messages from the group. When I delete, it deletes locally but not for the group, and I don't think it does even between my own devices. This sucks because deleting a message implies, to anyone who has used a messaging system, that it's deleted for everyone, but it isn't. Oof.
* When you set messages to expire, it only applies to future messages; you can't make existing ones expire. I want to set this at the conversation level, not on a weird message-by-message basis with no way to change it globally afterwards.
* I want to be able to sign in without using a cell phone number. Let me sign up with anything else, don't tie it to a cell phone line that can be hijacked.
* Let me add emoji responses to messages. Like every other message platform.
* Bonus, be peer-to-peer somehow. Dunno, like Blockchain magic it or something. Don't make me rely on some server somewhere. Just makes me feel uneasy that there's a middle man with all my messages.
"...Today, we are launching the Signal Foundation, an emerging 501(c)(3) nonprofit created and made possible by Brian Acton, the co-founder of WhatsApp, to support, accelerate, and broaden Signal’s mission of making private communication accessible and ubiquitous. In case you missed it, Brian left WhatsApp and Facebook last year, and has been thinking about how to best focus his future time and energy on building nonprofit technology for public good.
Starting with an initial $50,000,000 in funding, we can now increase the size of our team, our capacity, and our ambitions. This means reduced uncertainty on the path to sustainability, and the strengthening of our long-term goals and values. Perhaps most significantly, the addition of Brian brings an incredibly talented engineer and visionary with decades of experience building successful products to our team."
Somewhat tangential, I got into an argument about Signal last night. The other guy claimed that Signal was insecure because:
* It's "Custodial E2EE"
* Needs a phone number
(I'm not going to bother with his complaints about the crappiness of the desktop client or convenience of the design because those are non-sequiturs to the security of the app)
I asked him to define "Custodial E2EE". His words: "They have ownership of my keys, use phone number auth to access them and I cant expatriate them"
I managed to suppress my xkcd-386 instinct and go to bed, but my intuition is still that he's quite wrong about that. I may or may not resume my argument with him; I got the impression that his disagreements were rooted in Matrix fanboyism, but I'd like to be equipped to refute such arguments in the future.
I can somewhat sympathize with the phone number argument, and I think it comes from a concern about metadata leaks or opsec. I think that concern ultimately stems from a wrong threat model, but I'm not sure how to refute it. I have, however, come across a number of tutorials which cover how to register a Signal account without using your own phone number, so I feel confident I can refute the argument that Signal must have your phone number, even if I can't refute the underlying wrong thinking.
Regarding the "Custodial E2EE" argument, I'm not sure where to begin. Anyone have any suggestions?
On the first point, their understanding of Signal's key management doesn't seem to be correct. The private key is held only by the client on the mobile phone, which is one of the reasons that the desktop client is rather clumsy - it is dependent on the mobile app for initial key setup. It's also just very slow compared to the mobile app.
On the second point, I think they are quite correct. Yes, this depends on your use-case, but I consider the ability to have multiple and disposable identities to be somewhat critical to any messaging system calling itself suitable for security-sensitive use. Unfortunately, Signal is designed explicitly to make multiple or disposable identities impractical. Above and beyond their desire to be a drop-in replacement for SMS, a reason for this is almost certainly to reduce spam, as the need for a valid phone number makes it difficult to register accounts en masse.
In general, this method of mitigating abuse means that Signal cannot practically be used anonymously, which somewhat conflicts with the popularity of Signal as a mechanism for e.g. contacting journalists.
In the vein of criticizing Signal, I would also throw out that it largely abandons the problem of key distribution, effectively implementing a TOFU model, and a lightly enforced one at that, with "safety number changes" being treated as pretty much normal. It is possible to verify Signal identities out-of-band, but it's not common, and the Signal app does not really provide much tooling to make it easier.
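For readers unfamiliar with what out-of-band verification amounts to in practice, here's a rough sketch of the general idea: both sides derive a short comparable string from the two identity keys and read it to each other over a separate channel. This is the generic technique, not Signal's actual safety-number algorithm, and the key inputs are placeholders:

```kotlin
import java.security.MessageDigest

// Generic out-of-band fingerprint sketch. NOT Signal's safety-number derivation;
// it only illustrates the shape of the technique.
fun fingerprint(myIdentityKey: ByteArray, theirIdentityKey: ByteArray): String {
    // Hex-encode and sort so both parties feed the keys in the same order.
    val hex = listOf(myIdentityKey, theirIdentityKey)
        .map { key -> key.joinToString("") { "%02x".format(it) } }
        .sorted()
    val digest = MessageDigest.getInstance("SHA-256")
        .digest((hex[0] + hex[1]).toByteArray())
    // Turn the first 15 bytes into five-digit groups that humans can read aloud.
    return digest.take(15).chunked(3).joinToString(" ") { group ->
        val n = group.fold(0L) { acc, byte -> (acc shl 8) or (byte.toLong() and 0xFF) }
        "%05d".format(n % 100000)
    }
}

// Usage: if the string Alice computes matches the one Bob computes and they compare
// it in person or on a call, no one has swapped identity keys in the middle.
```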
I wouldn't say that Signal is bad; it does a great job of implementing, effectively, the identity semantics of SMS (including the shortcomings) but with the addition of E2E encryption and TOFU. I would push back on anyone who claims Signal to be a "perfect" or "complete" solution for encrypted messaging, as there are common use cases it has actively decided not to address.
I'm also, to be honest, somewhat baffled that your starting position seems to be that no criticism of Signal could possibly be valid, when attributing the other's viewpoint to fanboyism. Don't take that too seriously, it just stood out to me on reading. :)
>I'm also, to be honest, somewhat baffled that your starting position seems to be that no criticism of Signal could possibly be valid, when attributing the other's viewpoint to fanboyism. Don't take that too seriously, it just stood out to me on reading. :)
Touché. That wasn't my position, but re-reading what I posted, I agree that it unnecessarily implies that position. Part of me wants to correct it, but I think I'll own my words and leave it there.
Been a big fan of Whisper Systems since the RedPhone project. Great to see them maturing.
However, given the topic title, it would have been nice to see some actual documentation on how Signal actually works rather than just claims that it doesn't work like the others.
>"this phone number is using signal" is still a pretty large metadata leak.
>Especially when state actors and probably a fair few non state actors can remotely compromise devices via the stuff in the baseband processor.
A more accurate phrasing would be "this phone number was used to activate signal". If you only care about messaging other Signal users, you only need to have a baseband connection exactly once, when you receive the text message to confirm the number. After that you can toss the sim card and put in a different number, or run without a sim card at all and just use WiFi.
You don't even need to have the sim card in the same phone you will use with Signal when you receive the confirmation text.
There have been multiple baseband RCE exploits published in the literature and demonstrated at Black Hat, and those don't include any that were put there intentionally.
If you are not using a SIM, smartphones are pretty useless.
I'm a long way from convinced that centralised servers have any role to play in reasonably secure E2EE; they certainly are not a requirement for other approaches, such as the mesh networking FireChat used before it got shut down and that Bridgefy is making use of.
> There have been multiple baseband RCE exploits published in the literature and demonstrated at Black Hat, and those don't include any that were put there intentionally.
You can't target a WiFi exploit against a phone number though, so that's irrelevant to the Signal situation.
>If you are not using a SIM, smartphones are pretty useless.
Maybe I spend too much time hanging around places with WiFi, but I almost never need to have a SIM card. I don't even have data on my current plan.
Signal is open source; I'm wondering, has anyone here ever built the service and one or more clients and run it on-prem, or even as a white-labelled competing service?
The anti-Signal rhetoric has already started. News articles about Antifa specifically mention that they communicate via Signal.
If Antifa is designated as a terrorist organization, then we'll see all the counter-terrorism tools brought to bear against them. If the state can't break Signal encryption, then you'll see renewed energy for anti-encryption / anti-privacy policies.
My intuition is that I doubt this would change much in terms of investment. Of course, without specific knowledge, I would suspect that there are already terrorist organizations in the world using Signal, such that resources are already being spent doing what you describe. I would suspect the same for any and all cryptographic applications.
Genuine question because antifa is a mixed bag - there isn’t really any consensus on whether what they do is good or bad. Seems to be all over the damn place.
There is no consensus on whether being against fascism is good or bad?
Being against fascism is not a group or organisation.
It’s like calling bird watching an organisation. Yes, there are bird watching organisations, as there are antifa organisations, but neither bird watching nor antifa is an organisation.
This is like asking if there's no consensus on whether "all lives matter". Or no consensus on being "pro-life". Phrases often have context that mean more than their literal dictionary definition.
Edit: The comment I replied to originally asked "There is no consensus on whether being against fascism is good or bad?"
Well, there isn't. "All lives matter" and "pro-life" represent specific political views, but they don't represent any specific groups or organizations. Just like antifa.
You're being obtuse for no reason. This is like saying ISIS isn't a specific group because it is comprised of transregional autonomous cells rather than one specific organization. Sure, maybe in the most literal application of the terms, but it's obvious what's being discussed from context clues. Decentralized movements are the status quo of activism these days; that doesn't mean you can't reference the movement as a collective when discussing how to combat it or whether it should be combatted.
I'm arguing that antifa isn't even a decentralized movement. Wildly different parts of the left, both in ideology and preferred tactics, subscribe to the antifa label.
I think a better analogy than ISIS would be "groups that say they uphold the ideals of Islam."
Antifa is a term people associate with far-left pseudo anarchist groups that engage in property damage, physical violence and harassment. Wikipedia has more context if you're interested[1].
> And how was the "is there not a consensus on antifascism" question invalid?
I understand what you're saying and I agree to some extent. And I'm coming at this as someone who's been adjacent to the left for a long time.
But even reading the Wikipedia article, it describes such a broad range of ideologies and tactics that I think the only thing really unifying the term is literally being anti-fascist. This has been my personal experience as well - I know a lot of people who identify as antifa, but they vary wildly in how that manifests.
It's fair to argue that what people associate with "antifa" does not accurately describe all the groups or people who use it to describe themselves. But that doesn't really change the connotation of the term.
You could similarly say: well, what is there to "all lives matter" except the simple claim that all lives do, in fact, matter? Which is wrong: people who say "all lives matter" as a rule have some amount of shared views, such as not thinking police abusing their power is a major issue and opposing the current protests. Similarly, people who identify as "antifa" tend to have shared views, for example being anti-capitalist, left-wing, and having a higher affinity for violent protest than protesters at large.
You are being downvoted because the current understanding here, AFAIU, is that there is no good identifier of the "they" where you say "they purport" because there does not currently exist an organized group that identifies themselves as "ANTIFA".
If you are aware of sources that indicate otherwise, would you mind sharing them?
There are definitely organized groups that identify themselves as "antifa" (e.g. [1]). I think what you're getting at, though, is that antifa is not a singular organization but rather a collection of autonomous activist groups and individuals who share a roughly similar set of ideologies and tactics.
It's easy to get caught up in Trump's BS because he makes lots of claims.
It's impossible to designate Antifa as a terrorist organization in practice because it's not an organization. It's a movement. It's also impossible to designate white nationalism or the alt-right as terrorist organizations.
If they can name actual organizations and name people participating in the movement, it might work against them, but they are not Antifa.
I'm surprised Briar isn't coming up more in these discussions. Maybe it's locked down too tight for everyday use, but for protesters and individuals at risk it seems ideal.
I admit I could be missing something though, as I've not used it heavily.
Agreed. I recently removed Signal and started using XMPP (again). It'll always be there (Signal probably won't), and getting a friend to install Conversations.im and register a name on a public server is not much harder than getting someone to install Signal. To me it somehow feels like a good time for XMPP.
Sign in to matrix.org with Riot Android / iOS / Web.
Like the sibling poster said, XMPP with OMEMO encryption is an option, but it requires setting up an XMPP server with ejabberd or Prosody, then installing Gajim on your PCs and Conversations (paid via the Play Store, free via F-Droid) for Android / a good iOS XMPP client that supports OMEMO.
Except for techies, ^ is waaaay too much for people who are just looking to have an alternative to Signal's lock-in. I can barely get people to install the signal app and click open :)
I currently recommend installing Riot iOS / Riot Android and logging in to the free Matrix.org instance, which does not use your phone number or anything.
In the future I hope to recommend whatever P2P Matrix phone app comes out, and you just share your ids with friends via QR code handshakes and boom, you are off to the races with nothing but an internet connection and a friend ID. Metadata is all local to phones, so nothing to see.
In the medium-term future, I also hope to recommend cwtch.im, which is the replacement for Ricochet, a metadata-resistant p2p E2EE messaging app.
In the far future, I hope to have something like a smartphone mobile ad hoc network-based p2p e2ee app where you scan qr codes to handshake and you either communicate over the internet (like p2p matrix) or over mesh network as long as you are within 10 hops of another device (like bridgefy or firechat). Bonus points if one can DIY a passive repeater that works with the mesh network.
Really nice design. In the future I see a market for "privacy tools that also happen to ..." kinds of software: for example, a privacy tool that also happens to browse websites, or a privacy tool that also happens to help you send messages (like Signal); in summary, privacy-aware services.
iCloud contacts is not e2e encrypted. Apple, as well as FBI/DHS and the US military intelligence apparatus can access your entire contacts list and infer your whole social graph.
This is why a messenger needs an e2e contacts sync, whether Moxie wants to be responsible for it or not.
Thanks, makes sense, though I guess I would expect a non-profit to communicate this. I don't believe Signal even links to the 990 I linked to on their own website; possibly they don't want the public to (easily) know salaries, assets, etc., but it's unclear.
True, but remember that Signal (the app developer) is an LLC. The 990s may eventually be posted on the Signal Foundation website if they ever finish it: https://signalfoundation.org/
Thanks, good point; I was aware of that, but others might not be. I've never seen a good explanation of their legal structure, both as it relates to the present and the future.
I have checked both URLs for 990s; the only way I found them was via a link in the footer of Signal's Wikipedia page.
You are wrong that all encryption is broken with a quantum cracker. But as far as I know Signal is not quantum resistant. It's certainly something to keep in mind - that whatever you say might resurface later.