Worried About the Privacy of Your Messages? Download Signal (nytimes.com)
268 points by JumpCrisscross on Dec 8, 2016 | 239 comments



I'm worried about the freedom of dissenting opinion in civil society from surveillance, not the privacy of my messages. Privacy is something I would easily give up for such freedom.

Signal seems to me to be one tool in a larger portfolio for maintaining freedom of information dissemination and information gathering.

It seems to me that speaking to a large number of people anonymously is not possible with any of the existing tools, Signal included, and it seems that this set of tools is the next set that is needed.

Candidates for this include implementations of "dining cryptographers networks" (DC-nets), though there are scalability and denial-of-service concerns.


An incomplete sketch of a browser plugin/feature that would allow posts to be signed by an anonymous source (hiding your IP isn't addressed; use something like Tor):

* Blocks of text that start and end with magic numbers are signed in-band in the style of "gpg --clearsign".

* The public/private keypair is automagically created on first use.

For legacy support with existing infrastructure (such as HN):

* When you submit a form with a <textarea>, the plugin provides a UI to sign the contents before submitting the form.

* When browsing a page that contains magic number wrapped text, the text is automagically verified and the key pinned.

* The structure of the signed text should allow the plugin to hide the magic numbers, signature, etc, so the text looks normal.

However, newer software would have other options:

* Define a mapping between the in-band data and a tag structure that holds the same data. This needs to be strictly defined, so it is possible to remove the tag structure and verify the original signed text. This gives full presentation control back to the website, while allowing individual posts on the page to be verified.

The big problem - as usual - is key distribution. In the latter case where it is easy to hide metadata with CSS, the public key can simply be included with the post. Unfortunately, in the legacy case I don't think there's a good way to include two pages of public key in each "--clearsign"d post.
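
Very roughly, the in-band wrapping and verification could look something like the sketch below. This is only a minimal illustration using Node's built-in Ed25519 support instead of gpg; the marker strings, key handling, and storage are all made up for the example.

  // Minimal sketch of magic-number-wrapped, "clearsign"-style post signing.
  // Uses Node's built-in Ed25519 (crypto module); markers are illustrative.
  import { generateKeyPairSync, sign, verify, KeyObject } from "crypto";

  const BEGIN = "-----BEGIN ANON SIGNED POST-----";
  const SIG   = "-----SIGNATURE-----";
  const END   = "-----END ANON SIGNED POST-----";

  // "Automagically created on first use"; a real plugin would persist this
  // in extension storage rather than regenerating it on every run.
  const { publicKey, privateKey } = generateKeyPairSync("ed25519");

  // Wrap a comment in the magic markers with a signature over the body.
  function clearsignPost(body: string): string {
    const sig = sign(null, Buffer.from(body, "utf8"), privateKey).toString("base64");
    return [BEGIN, body, SIG, sig, END].join("\n");
  }

  // Verify a wrapped post against a pinned key (trust on first use).
  function verifyPost(wrapped: string, pinnedKey: KeyObject): { ok: boolean; body: string } {
    const lines = wrapped.split("\n");
    const sigAt = lines.indexOf(SIG);
    const body = lines.slice(1, sigAt).join("\n");
    const sig = Buffer.from(lines[sigAt + 1], "base64");
    return { ok: verify(null, Buffer.from(body, "utf8"), pinnedKey, sig), body };
  }

  const post = clearsignPost("This is my dissenting opinion.");
  console.log(verifyPost(post, publicKey)); // { ok: true, body: 'This is my dissenting opinion.' }

The same trust-on-first-use idea applies for verification: the first key seen for a pseudonym gets pinned, and later posts are checked against it.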


I have considered, and even worked on, a Chrome extension for this exact purpose. Some hurdles I discovered:

* Websites do weird things with textareas. If it's a plain-ol HTML form, it's pretty easy to intercept and do what you want. It seems most websites intercept button clicks then pass your text around to twenty different JS functions and libraries before doing something useful with it. You wind up writing code to handle edge-cases more than anything else.

* To address your final comment, embedding the pubkey inside the text would defeat the entire purpose. A malicious actor (see the recent Reddit controversy) can just create their own pubkey, sign the modified comment with it, and no one would be the wiser.

Some things I discovered, and found worked well:

* Keybase has a fantastic library for generating PGP keypairs in-browser. It really added that extra bit of "magic" to the extension.

* Chrome extensions can get CORS exemptions on a per-website basis, and uploading to a standard SKS keyserver is just a POST request that can be done by JavaScript.

* Expanding on the last bullet point, verifying other users' comments was tricky. During development, I just downloaded all keys on the keyserver that were uploaded by my extension on a scheduled interval and stored them in localStorage. Unfortunately, I don't think this would scale.

If anyone is interested in picking up where I left off, I could upload the source to Github.
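
For reference, the keyserver step really is just plain HTTP (HKP). A rough sketch of the upload and lookup calls, assuming Node 18+ for global fetch; the keyserver hostname is a placeholder:

  // Rough sketch of the keyserver round trip; HKP submission is a form POST
  // to /pks/add with a `keytext` field, lookup is a GET on /pks/lookup.
  const KEYSERVER = "https://keyserver.example.org"; // placeholder hostname

  async function uploadKey(armoredPublicKey: string): Promise<void> {
    const res = await fetch(`${KEYSERVER}/pks/add`, {
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: new URLSearchParams({ keytext: armoredPublicKey }).toString(),
    });
    if (!res.ok) throw new Error(`keyserver rejected key: ${res.status}`);
  }

  async function fetchKey(search: string): Promise<string> {
    const url = `${KEYSERVER}/pks/lookup?op=get&options=mr&search=${encodeURIComponent(search)}`;
    const res = await fetch(url);
    if (!res.ok) throw new Error(`lookup failed: ${res.status}`);
    return res.text(); // ASCII-armored public key block
  }

From the extension itself, these two requests are what need the per-website CORS exemption mentioned above (host permissions in the manifest).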


Could you commit it to GitHub? I'm working on something similar and would like to check out what you've got / maybe pick it up. Feel free to send me an email (it's in my HN profile).


>If anyone is interested in picking up where I left off, I could upload the source to Github.

I'm probably not going to pick it up, but I'd definitely read the source.


  > It seems to me that speaking to a large number of people anonymously is not possible with 
  > any of the existing tools, Signal included
While it's still at the testnet stage, I have very high hopes for MaidSafe's SAFE Network. It aims to be a completely anonymous internet, providing both users and publishers anonymity. Upon release of the actual network, it will live in the wild, just like bitcoin, with no company in control of it. MaidSafe's business model is to be one of many developers consulting for companies who want to migrate their data storage onto the network. The network itself will be controlled by nobody.

It's peer to peer, but there are intermediate nodes which isolate content storage from those who view it, so neither the IP addresses of storage nodes nor the IP addresses of content consumers can be determined. Since there is no centralization, there's nobody for state actors to attack, legally or by force. All traffic is encrypted and even the ports are randomized, to keep state firewalls from determining what the traffic is and blocking it.

I think it's our best hope to distribute large amounts of data to the world and avoid the censorship that many countries already have and that even those of us in the west see looming over us.

You can learn more about it at https://safenetwork.org/ and https://safenetwork.org/documentation/ and https://safenetforum.org/


Data distribution is next to meaningless if it is not accompanied by the power to effect political change.


In a democracy, dissemination of information to the voters can theoretically effect political change.

In countries that are not democracies, I'm uncertain what to do with your complaint. Would you dismiss giving oppressed people access to literature, news, entertainment, or technical and scientific information, on the premise that the information is unlikely to allow them to effect political change? I don't believe that premise is true, but even if it were, their lives are still made better, which means it's far from meaningless.


Recent events have disproven the cypherpunk-era belief that freely available, secure communications will lead to a better (read: more rational) world. I don't dismiss the desire to give access. I think its utility has been naively overstated; these same tools can be, and have been, used by old guard oligarchs to maintain and extend their political and economic power.


I agree that bad actors and state actors will also use this technology. Every time I read a Podesta email I am reminded that such hacker-induced government transparency will be gone once they all move to such a platform.

But governments do a pretty good job of keeping their secrets now, so in the end, I think this will balance the scales towards private citizens.


> I'm worried about the freedom of dissenting opinion in civil society from surveillance, not the privacy of my messages. Privacy is something I would easily give up for such freedom.

I see these two subjects as being inseparable. Before I feel comfortable asserting a dissenting opinion in a public forum, I would much rather discuss the subject among my peers where I don't feel I will be immediately eviscerated for a poorly constructed thought. If I don't feel that I have any privacy, I'm more likely to be subject to chilling effects.


> Before I feel comfortable asserting a dissenting opinion in a public forum, I would much rather discuss the subject among my peers where I don't feel I will be immediately eviscerated for a poorly constructed thought.

While I certainly empathize with the anxiety of publicly expressing an unpopular opinion, I believe it is, especially in times like these, your moral duty to endure in spite of the repercussions and speak your mind. Even though I went to Stanford and had direct access to Zimbardo's famous prison experiment, the lesson from it didn't hit home until, many years later, I read this article: https://aeon.co/ideas/the-desire-to-fit-in-is-the-root-of-al...

Now, the phrase "All it takes for evil to triumph is for good men to remain silent" resonates with me.


This phrase does not mean "All good men should continually natter on". It is also your moral duty to present a cogent argument, lest we drown in a sea of mostly inane babbling and constant infighting, completely burying the points that need to be made.

Speaking out matters. So does when, where, and how you speak out. And that is something best discussed privately.

I'd have much more to say on how privacy and anonymity are inextricably linked, but I'll let the EFF say it: https://www.eff.org/deeplinks/2012/01/right-anonymity-matter...


It's not a moral duty, it's a Robb Report level luxury. One does not need to be particularly articulate or present a rational argument, especially when dealing with matters of morality. Sadly, rhetoric and appeals to emotion, empathy, and common sense are more effective if your goal is to sway others.

> This phrase does not mean "All good men should continually natter on". It is also your moral duty to present a cogent argument

What I was trying to say is there will be times when you are not as prepared as you'd like. It's not about presenting an airtight, cogent argument, it's about having the courage to say something like, "Officer, stop beating the shit out of this guy in a wheelchair. He clearly isn't resisting and you're being a dick."


>It seems to me that speaking to a large number of people anonymously is not possible with any of the existing tools

NNTP, mixmaster remailers, and mail-to-news gateways.

Old school. Those were the days.


If you speak to a large group of people anonymously it's not private.

If it's not private, what you say, how you say it and when you say it can be analyzed to identify you.

It's a people problem not a technical one.
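
To make "how you say it" concrete, here is a toy sketch of the kind of stylometric comparison an analyst might start from: crude function-word frequency profiles plus cosine similarity. The word list and sample texts are purely illustrative, and real attribution systems use far richer features.

  // Toy stylometry sketch: compare two texts by the relative frequency of
  // common function words, which tend to survive topic changes and pseudonyms.
  const FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "it", "is", "was", "i", "for", "on", "you"];

  function profile(text: string): number[] {
    const words = text.toLowerCase().match(/[a-z']+/g) ?? [];
    const total = words.length || 1;
    return FUNCTION_WORDS.map(w => words.filter(x => x === w).length / total);
  }

  function cosine(a: number[], b: number[]): number {
    const dot = a.reduce((s, v, i) => s + v * b[i], 0);
    const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0)) || 1;
    return dot / (norm(a) * norm(b));
  }

  // Higher similarity between an anonymous post and a known author's writing
  // is (weak) evidence that they are the same person.
  const anonymousPost = "I think that the proposal is wrong, and you know it.";
  const knownWriting = "It is clear to me that the plan was wrong for you and for the team.";
  console.log(cosine(profile(anonymousPost), profile(knownWriting)).toFixed(3));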


Would it not be trivial to algorithmically obfuscate this identifying information? As I write this, I'm thinking of Ender's siblings finding their voice under the pseudonyms Demosthenes and Locke.


> I'm worried about the freedom of dissenting opinion in civil society from surveillance

A good reason to support Wikileaks.


It was, back when WikiLeaks seemed to be politically independent. Now it's hard to tell whether I'd be supporting worthy efforts to promote free dissent, or whether I'd be supporting one man's (or a small group of people's) interest in influencing U.S. politics in ways I find objectionable.

I'm also not sure I see the connection between 'dissent' and the dissemination of stolen private emails.

I think I'll give to the ACLU instead.


There's Telegram and its 'channels.' However, many insist it's a honeypot.


Can a book (print or electronic) not be published anonymously or under a pseudonym?


I continue to be disappointed by headlines like, "Worried About the Privacy of Your Messages? Download Signal" which implies that only some people should be worried about privacy.

You'd think that the revelations about the NSA and things like the UK law that requires ISPs to collect and store your Internet browsing history would have more people "worried", yet it's still pretty much a lost cause to try to explain to smart friends why this stuff matters.


It's pure exhaustion in my case. I can (and did, for a while last year) switch to Signal, but even getting my wife to care enough to bother was more work than I was willing to put in at the time, and she's the one person in the world who's most likely to listen to me. There's basically nobody else in my social circle who'd bother if she won't.


That the NSA is able to spy on you is an existence proof that criminals can also spy on you.

The downside of becoming known to the NSA is that you can get caught up as an "associate" in someone's drug, criminal or terrorism investigation when you had no idea that the person was doing anything like that. You might even have no idea that the person exists, since the NSA is "allowed" to go out some number of degrees of separation from the supposed suspect.

The downside of becoming known to criminals is that you can fall victim to criminal schemes. You don't even have to be known by name or association; your device just has to be discovered, and owning it is the all too easy next step. You can become known by installing malware (see the articles on malicious phone apps), or by criminals seeing your communication on the open internet and becoming aware of you.

That's what I'd tell my friends, if any of them were still listening.


That's because most people don't have a clear idea of what the downside is if the NSA gets their information.

I don't have a clear idea of what the downside is if the NSA gets my information. So I haven't bothered to download Signal.

I don't mean this in the disparaging way that folks who use "If you have nothing to hide, you have nothing to worry about" do, but the truth is that most people assume that Facebook and Google know a huge amount about them already, and that if the NSA wants to find out about them, so what?

Maybe a better education campaign to convince those people who don't believe that they have anything to hide why they need apps like Signal would be an important first step.


The difficult thing is convincing people that something could go wrong with all the trust they're investing in the government and large corporations. The thing which I think has been most frustrating is that when demonstrable situations come up [1], people seem eager to dismiss them as "just a few bad actors" and not as evidence that it's a bad idea to continue investing trust. Watching people in interviews like those in the "Inside the Creepy Russian Internet" video [2], feels to me like they're resigned to the fate of having social networks know everything about them, not that they want them to know everything about them.

What I want people to consider, frequently, is whether or not we should fight these anti-privacy measures not because anything is wrong now, but because of who might eventually be given access to that information. We should fight to make sure that tools to oppress and enforce terrible regimes aren't freely available, even if we find the current leaders trustworthy. The current administration in the White House is trying desperately to dismantle the sweeping powers to exercise drone strikes without oversight [3], because the next administration doesn't feel as trustworthy to them.

I don't know that there is any downside to having the NSA know everything about you now. I don't know that it will be a bad idea within the next administration. But I know that it's a bad idea to let any random person have that trust, and I don't know that the election system has enough safeguards to prevent abuse by the next person elected. Or the next. Or the next.

[1] https://www.washingtonpost.com/news/the-switch/wp/2013/08/24...

[2] https://news.ycombinator.com/item?id=13102972

[3] https://theintercept.com/2016/12/06/after-8-years-of-expandi...


The NSA doesn't care what I'm doing, and if it does, I'm doomed anyway.

More concerning is stuff like the US Democratic National Committee emails showing up on Wikileaks. Being blackmailed, ransomed, or leaked is much more of a real threat for most people than some abstract harm that may be posed by the NSA or some ISP logging.


I've had success in framing it from the perspective of "Want to use an app that helps activists/journalists do their work more safely and securely, while also protecting yourself from criminals and those you wouldn't want owning your data?"

It seems to resonate much more with the millennial demo, as we can empathize with the struggles of journalists/activists/state-dissidents even though we may see our own data/life as in less need of securing, especially in light of the "Facebook/Google already have it so why bother?" sentiment that gets thrown around.


My messages are pretty dull. I'm not at all worried about their privacy.


Perhaps in the future an authoritarian government will single you out, together with others, for being politically “neutral” and thus fit for taking part in some kind of “neutral” policy enforcement corps.

Rare possibility? Yes. But it’s a possibility. «Nothing to hide, nothing to fear» arguments cannot stand before this sort of scrutiny.


All the privacy solutions on the market are varying degrees of bad (from a privacy/security/freedom perspective), by which I mean they're all flawed in their own ways.

Signal requires Google Play Services on Android. That means it is, put simply, not a privacy messenger. Yes, there's crypto, but it's also tied into Whisper Systems' infrastructure, and there's no federation. I use Signal reluctantly, and only on iOS.

Threema, which is popular in parts of Europe (and is what I use to an extent), is well established, but it's not open source and doesn't do voice chat or federation.

Wire is moving in the right direction with respect to openness and has lots of good features, but it is still not fully open source.

It's 2016 and our best crypto messenger options are worse than what we had 10 years ago when Skype was peer to peer, or Jabber with federation.

I can understand the reasons for not supporting federation, but I disagree. The Internet was built to be decentralised, literally to withstand nuclear war. A walled garden does not provide us with the redundancy or control the Internet offers.

There are other metadata related issues that pretty much every messenger suffers from but I'll leave this out of the scope of this comment.

What Open Whisper Systems and Wire need to do is open source the server components of their solutions, and try to remove the reliance on servers as much as possible. Only then will we have proper message privacy.


Chaps.

I think it's time we just admitted that everything we type/tap/say into any device, regardless of how it's being wrapped up in transit, is absolutely and irreparably insecure, and we're not going to be able to fix that anytime soon.

Even with really smart crypto or some slick app, what makes you think that your messages aren't just being read by your OS? Would you really notice? Why would you assume that the API this app talks to is trustworthy? There's no way to even tell if the code for the client on your device is the same as what's in that repo, as if it would make any kind of difference anyway, since code can be loaded from anywhere at runtime and you'd probably not even notice.

This whole thing is a fucking stage show; I can only assume we're here to create an illusion that we still have any kind of security. Why? Is it to get some low-hanging fruit/tech amateurs who will trust it and expose themselves? I don't know. But, HN, I don't know why WE aren't admitting this to ourselves or why we defend such obvious ruses.

Well, maybe some of you have signal stock I guess.

It's over, we lost, we have no security, and we will probably never be able to reverse that situation. Things are getting worse rapidly; we can't even know what any of our devices are really doing anymore (even the switches in your DC, the firmware on the UPSes, whatever), and apps like Signal are obviously, to me, impossible to trust, and I don't know why that's not obvious to anyone who has spent a moment looking at our situation.

Argh.


> I think it's time we just admitted that everything we type/tap/say into any device, regardless of how it's being wrapped up in transit, is absolutely and irreparably insecure, and we're not going to be able to fix that anytime soon.

We don't have to set the bar that high. Just lower it slightly and we can achieve great success.

Can we make the job of NSA and CIA harder? Not necessarily impossible, just to increase the costs they have to spend on whatever it is they are doing, much higher than it is now.

That would be nice.

We can do that with Bitmessage, avoid Signal, or use Telegram, which means outsourcing to Russia/the KGB, so the NSA will for sure need to spend more resources if they want to access your "secure chat" on Telegram.

EDIT: but of course, as you say, the problem is socio-systemic: any technical system which can or would be used to subvert the state is a threat to the state and will be neutralized, whether by takeover, sabotage, or lack of funding; the means the state has are enormous. The list of secure or safe systems we have is still empty. Only another state can go against the state.

For example, it was mandated that GSM had backdoors. Any system used for communication, especially mass/group communication, is not free and not allowed to exist. It's as simple as that. They even tried to go after the creators of secure systems; wasn't the PGP guy scrutinized?


I'm not sure there are many ways to make things harder when the complexity of the systems we have, even in our pockets, makes them so very hard or impossible to even observe, let alone protect.

There are too many places, even on a phone, where anything at all can be hiding, and we don't really have any way of ever knowing what's there without stepping through the binaries. That's even more difficult when we've got Dalvik and VMs involved. How many people alive do you think could work out exactly what your mobile does in any one second within, say, 6 months?

For all we know the Thompson compiler backdoor really happens, and so how can you even trust the binaries you're producing?

Do you know anyone who built, for instance, the openssl.so on their phone personally? Did that person build it on a machine they trust? How can they trust that machine? A single lib with a nefarious function among the few GB of binaries which ship with your phone is enough to completely negate any 'security' these sorts of apps could provide, even if they were trustworthy to begin with, which they're obviously not.

It's a charade, assume everything you do is completely observed, regardless of crypto/apps you use.

Even if you have a fully audited and trustworthy (in your view) OS -- then what about the Intel ME? Who says that the CA that issued the certs you trust and talk to isn't also feeding them out to the gov or whatever? Who says your HDD isn't talking to the zigbee thermostat in your house and sending your RSA keys to the Pentagon? Who says your home wifi access point isn't acting as a decrypting proxy?

We can't tell, and that's the point. There is no way to ever observe these levels of complexity and verify security, and we don't admit it.

We will never know if the code we are running on our various devices is doing anything extra or not, and so rationally, we should never trust it at all.

Not to say all sec is pointless, you don't want to be low hanging fruit, but above 'insta-pwn' levels of stupid there isn't much more available, even if you're told there is and we keep pretending like we have any control over what's going on anymore...


You're right, anything can be monitored, but we shouldn't take that as a reason that we should lay down and accept it. You should assume that any sufficiently motivated person/organization/state can get access to your information. But what we should absolutely do is make sure that state actors and companies have to demonstrate that they are motivated to take that information. That they have to devote resources to getting it. We shouldn't hand them broad access to everyone's information just because any one person's information is possible to obtain.

But regardless of that point, your other argument feels to me like a discussion against the problems of mono-culture: we should all endeavor to have diversity of implementation, if possible. The more we can work towards federated standards and the ability to quickly pivot when someone loses our trust, the more capable we are of defeating attempts to undermine that trust. If we only have 1 implementation of signal and everyone uses it, then all an "evil" party has to do is find flaws in that 1 implementation. If we have 2 implementations, then some users are protected. If we have 1,000 implementations, and they all inter-communicate, then we have some hope that only some users are affected by each compromise. I'm digging matrix.org as a federated protocol, and I'm hoping that it takes root because it has many clients; an open, federated protocol; and the ability for me to run my own server.

I would advocate that even though we're relatively mono-culture on Linux, supporting it at least gives Linux access to the source code, which would allow other implementations (anyone else keeping a close eye on Redox [1]? I know I am!). Support open protocols! Support federated protocols! I'll advocate that Signal is a great, easy-to-use system for secure communication, and that the ideas of Signal are percolating elsewhere (see "olm" [2], the implementation of Signal's crypto protocol for matrix). But I would like to see stabilization in Signal leading to federation.

[1] http://www.redox-os.org/

[2] https://matrix.org/git/olm/tree/


Indeed, another cost increase on the invoice of the NSA and other evil enterprises is to misinform/disinform: let them monitor what is useless and waste resources finding the needle in the haystack.

Basically we are in Intelligence vs Counter Intelligence game with the state.

You suspect your phone or device is listening? I know that Facebook is listening/monitoring, so I let it hear. Yes, I did go to school in Zimbabwe and have my Zimbabwean friends confirm it, and I did visit it; look, the GPS data in the pictures shows it.

The data that Facebook has about me is worthless shit disguised as marvelous, worthy data, but they do sell it well. If everyone did this, it would be obvious their data is not worth the ads they sell, and Facebook would crumble like a stomped-on paper plane.


I'm seeing the infosec community divide into several groups, and it's equal parts amusing and alarming:

* The "everything is fucked and will never get better" group, which I'd put your message into

* The "everything is fucked but you're not being targetted by Mossad" group - at least these people do threat modelling.

* People who are actively helping journalists in dangerous areas (see my first point)

I agree with lots of things in your message but I don't think it's helpful. Would you say we should drop back to HTTP because there are lots of ways to break HTTPS?

There are people doing fantastic work even on closed platforms like Android and iOS. People are looking in detail at what's running on it - mostly thanks to the jailbreaking and rooting communities. Everything will be OK, it's just going to take a lot longer than you'd like it to.


Maybe hiding our messages from Google is worthwhile even if we can't hide them from the NSA.


Yes indeed. And from criminals. While I don't want to innocently become part of a government investigation merely because I communicated with someone who communicated with a suspect, it's well worth the effort to secure (or avoid) communication that criminals can intercept and subvert. That's the day to day danger to the day to day person, and that's why we need a secure-by-design internet.

However, I despair that a secure by design internet is impossible in this civilization, because it will be designed in an environment that includes "stakeholders" like Google, Facebook and the NSA via NIST. The only thing that you can do, beyond waiting for a white knight service, is button down your end, try to convince the other end to do the same, and use niche and custom built tools.

There is no secure future for the internet as we know it, or as it might come to be in the current civilization.


Chap (presumably),

> But, HN, I don't know why WE aren't admitting this to ourselves or why we defend such obvious ruses.

What exactly makes Signal an obvious ruse? Many of us don't admit that evading surveillance is impossible because we work full time in commercial surveillance. This is the currently dominant business model of the technology industry. Many of us have also worked in defense / government. At the very least, the fact that we have a non-commercial FOSS alternative to data-collection-traps, which is gaining traction, is fantastic.

> Well, maybe some of you have signal stock I guess.

Signal is run by a non-profit.

You just sound like you're spreading FUD.


> It's 2016 and our best crypto messenger options are worse than what we had 10 years ago when Skype was peer to peer, or Jabber with federation.

I strongly disagree, Skype has always been a black box. The usability of Signal's crypto is a lot better than it has ever been for Jabber.

> I can understand the reasons for not supporting federation, but I disagree.

Yet you don't address any of the problems with federation.

> What Open Whisper Systems and Wire need to do is open source the server components of their solutions, and try to remove the reliance on servers as much as possible.

https://github.com/WhisperSystems/Signal-Server


> Skype has always been a black box

I saw a BBC documentary in the early-to-mid 2000s (sorry, can't cite it right now) about law enforcement and intelligence having a really tough time with suspects using Skype, only being able to collect metadata and not call content due to its split routing protocols.

Now, there's nothing to say the whole service wasn't backdoored at some point, but it appeared to be a real concern, at least to LE. A few security audits at the time also gave it a very positive report (ditto for citation). In the post-MS sale era, of course there's zero trust. Prior to that though, things seemed different.


Matrix.org/riot.im has working federation, end-to-end cryptography, works without Google Play and is completely open source.

The encryption is audited, but not yet enabled by default, as there are still some rough edges. But the apps have tremendously improved over the last few weeks alone and I am very happy with it.


Thanks for posting this, I'm going to try and take a look over the Christmas break.


Re: Google Play Services. Here's the way I understand this to work, so please do correct me if I'm wrong.

Signal uses Google Play Services to notify me that I have an incoming message from Signal. The Signal app is then woken up and retrieves my (encrypted) message. So, therefore, the only thing Google knows is that I received a Signal message, what time and if I have location services on, where I was at the time, yes?

On iOS, how is this any different? If I get a notification, iOS knows I got a notification from Signal, when, and where I was if Location is turned on.

What am I missing?

Note: I _don't_ like that either system has that much metadata about my conversations available to them, but I don't see how one is better than the other here.


> Re: Google Play Services. Here's the way I understand this to work, so please do correct me if I'm wrong.

> Signal uses Google Play Services to notify me that I have an incoming message from Signal.

More importantly:

a. That dependency, along with other "restrictions", sets an artificially high barrier to actually using the product independently. This is presumably so that they can maintain the pretence of being "open source".

b. It lets the developer keep track of application downloads and whatever else Google provides to developers.

Moreover, as the paper that someone has linked elsewhere says, if you are allowing Google services in your app (Google Cloud Messaging, to be exact), you're at the mercy of anyone with control of it. A relatively trivial attack via GCM, as the paper hints, would involve simply replacing your application with a backdoored version, and you'd be none the wiser. It is a massive attack surface that just cannot be ignored.


> So, therefore, the only thing Google knows is that I received a Signal message, what time and if I have location services on, where I was at the time, yes?

Considering that Signal compiles code from Google into their app, there’s an actual issue there: the Google code could just intercept and take control of any of the messages. And there’s a lot of code from Google in there that no one has audited yet.


What about on iOS? Couldn't the libraries that are used on iOS do the same?


The metadata is considered super important in modern surveillance. Your receipt of a message can be correlated with a sender, for example, over the course of many messages. Now they can start on a network analysis, start correlating with other people or events, etc.
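
As a toy illustration: given nothing but (user, timestamp) records for sends and receipts, sender/receiver pairs that repeatedly co-occur within a small time window stand out, with no message content needed. The events below are made up purely to show the idea.

  // Toy traffic-analysis sketch: correlate "A sent something at time t" with
  // "B received something shortly after t" across many observations.
  type Obs = { user: string; t: number }; // t in seconds; data below is made up

  const sends: Obs[] = [
    { user: "alice", t: 100 }, { user: "alice", t: 220 }, { user: "carol", t: 230 },
    { user: "alice", t: 360 }, { user: "carol", t: 500 },
  ];
  const receipts: Obs[] = [
    { user: "bob", t: 101 }, { user: "bob", t: 221 }, { user: "dave", t: 233 },
    { user: "bob", t: 362 }, { user: "dave", t: 501 },
  ];

  // Count how often each (sender, receiver) pair co-occurs within a small window.
  function correlate(windowSec = 5): Map<string, number> {
    const counts = new Map<string, number>();
    for (const s of sends) {
      for (const r of receipts) {
        if (r.t >= s.t && r.t - s.t <= windowSec) {
          const key = `${s.user}->${r.user}`;
          counts.set(key, (counts.get(key) ?? 0) + 1);
        }
      }
    }
    return counts;
  }

  console.log([...correlate()].sort((a, b) => b[1] - a[1]));
  // "alice->bob" dominates: a likely communicating pair, without reading any content.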


Oh, yes, I understand this. I don't understand how it is different for iOS than Android. Sorry if my initial post was not clear on that.


Matrix has the crypto and the federation. But of course it's still not hiding metadata.

The Signal Server is open source, as far as I know.


> The Signal Server is open source, as far as I know.

Yes, except for the voice call component.


I just tried out Signal earlier this week with my wife and a friend.

The friend had a crash shortly after opening it on iOS, but it went OK after that.

You can't paste or drag in images directly on any client, as far as I can tell.

The "desktop app" is a Chrome app (not even Electron). Not loving that.

There were some hiccups with contact syncing.

It decided it should be responsible for alerting me to receiving ordinary phone calls (!?) on my iPhone, for some reason, which was annoying.

We abandoned the experiment pretty quickly :-(

[EDIT] it also lacked nice-to-haves one might be used to from other messengers, like auto-emojification.


I agree that decentralization and federation are an important step for security and privacy, reducing the dependence on a single entity.

That said, based on OWS' writeup on the subpoena they received [1], the only user metadata they collect is the user's phone number (or a hash of it?) for identification, that user's last connection date, and the date their account was created.

[1] https://whispersystems.org/bigbrother/eastern-virginia-grand...


> It's 2016 and our best crypto messenger options are worse than what we had 10 years ago when Skype was peer to peer, or Jabber with federation.

Actually, Jabber with OTR is pretty solid. If need be, you can use throwaway addresses too.

> I can understand the reasons for not supporting federation, but I disagree.

I disagree and can't understand the reasons, which however I suspect to be rather shortsighted. Just imagine if email had not been what came to be called a federated system.


I don't know why SilenceIM isn't more popular. It's a fork of Signal's now-discontinued SMS encryption. Usernames are still telephone numbers (because SMS) and metadata is still leaked (to telcos, because SMS) but there's no reliance on any centralised servers or organisations, no requirement for Play services, and delivery is as reliable as regular SMS.

I don't know the real reason for Signal getting rid of SMS encryption but it's the only feature I'm interested in. I can get glorified internet messaging anywhere, but OTR encryption over SMS is a difficult problem due to the inability to conduct handshakes. We should not throw this work away.


What about WhatsApp, Telegram? I don't use them, but saying that there are no options today for secure, encrypted communication raises my eyebrow.


Telegram is a farce. They don't have end-to-end encryption by default, and their encryption protocol (MTProto) is home-made and basically a secret. They haven't opened it up to testing or open sourced it, so it's pretty useless at ensuring 'privacy'. Plus, recently, 15 million phone numbers linked to Telegram accounts were hacked because of an SMS verification loophole.

WhatsApp at least has end-to-end encryption (though that alone isn't enough), but it is not open source and is owned by Facebook. By default, it shares your number and your contacts' numbers with Facebook.

Neither Telegram nor WhatsApp are viable alternatives to anyone interested in privacy.


> though that alone isn't enough

Why? I'm not trying to nitpick, I just don't know much on the subject. Not being open source doesn't really concern me, nor does the fact that my number is known by Facebook. My telecom company knows it, my mother knows it, and all my friends know it, as do all the companies used by my friends.


>They haven't opened it up to testing or open sourced it

https://core.telegram.org/mtproto https://telegram.org/apps#source-code


> Neither Telegram nor WhatsApp are viable alternatives to anyone interested in privacy.

Are you suggesting that the application under discussion here is a viable alternative? If so, how?


Genuine question. For Wire: what is missing from the code they have available?

https://github.com/wireapp


Have they released the full server source yet? Last I checked they hadn't.


Yeah, on researching a bit after this comment, I found exactly that. Server source code is still a "to be released" without a firm date. Other criticisms are generally around the lack of a full crypto audit, which as far as I'm aware (i.e., not very far) is a criticism of most of the modern crypto-chat apps (with the exception of Open Whisper Systems when it was TextSecure).


Servers are closed source, will open source in 2017.


Check out Matrix/Riot.


There are a lot of people on this thread crapping on Signal for its various flaws and downsides. It's nice to be smarter and more informed than everyone, but for the rest of us we really need advice about what we SHOULD do if we want a baseline amount of privacy from mass surveillance.

I'm honestly asking here--I don't have the knowledge that many in this community do about the exact risk of Signal requiring Google Play Services, for example. If Signal isn't the answer, what should I and others use who need a usable solution that is reasonably accessible to non-tech-professionals?


Use Signal. It's the best you can get right now.

I understand all the concerns people have about Signal and I share most of them. But these discussions really are for people who want to build something better, not for people who look for something they can use now.


MicroG can replace Play Services if that is your concern.

previous discussion: https://news.ycombinator.com/item?id=12864429


The Tox project is specifically designed to meet your usability requirements.

There were also forks of Signal which fixed the privacy issues, however Moxie Marlinspike (the founder of Open Whisper Systems) ordered them to cease and desist.


I think it is important to note that Tox is still considered alpha. From their own wiki's FAQ:

"Tox is by no means complete. You may encounter bugs ranging from simple visual defects to segfaults on file shares. We cannot guarantee what works today will work tomorrow; Tox is an alpha program and code changes daily. Certain commits may break existing APIs, and we strive to give proper advanced warning to all client developers, etc. when such changes will be made. Additionally, Tox has not yet received a full security audit. While we believe Tox is secure against attackers who want to decrypt your messages, you may wish to use a more established solution if you are in a life-or-death situation."

If you're in a life-or-death situation, I think you really ought to be doing some intensive research into secure communications to find the solution that will work best for you.


however Moxie Marlinspike (the founder of Open Whisper Systems) ordered them to cease and desist

Would you be kind enough to provide links? I can't find any information about this claim via Google, and Signal is distributed under the GPLv3, which grants rights to fork and modify it: https://github.com/WhisperSystems/Signal-Android



Thank you!

tl;dr: You can fork the Signal code if you want, but if you do, Moxie Marlinspike asks you to change the app name and run your own servers. The server source code appears to be available at https://github.com/WhisperSystems/Signal-Server, but I've heard it doesn't include the voice component.

So, for example, if you think that Signal ought to support a feature like iMessage's "Invisible Ink" (https://mic.com/articles/146347/i-os-10-s-invisible-ink-feat...), which Moxie has specifically refused to support (https://github.com/WhisperSystems/Signal-Android/issues/5103), then you can't just fork the client, implement it yourself, and use the new client normally, because you'd no longer be able to talk to regular Signal clients, if I understand correctly. I mean, even if you figured out a way for two clients to tell each other whether or not the feature was available.

So Signal is open source, but not in a way that's useful if you want to change something.

I don't want to discourage people from using Signal. It's a great app. But I thought this was worth pointing out, assuming LibreSignal is representative of what will happen if people want to make changes to their client.


I didn't realize that open source licenses required the maintainers to run infrastructure for you.


The big issue with maintenance arises if the fork wants to continue using OWS servers and continue being able to communicate with 'normal' Signal users.


And I didn’t realize companies can prevent people from using or creating third-party clients.

Especially if the original client is under an open source license.


Open source requires the ability to fork and Signal isn't forkable.


The client and the server (sans voice) are forkable. They don't need to grant a license to use their infra. You can run your own, it just won't federate.


He did not order any cease and desist; he just asked them to use other infrastructure. If you don't like Signal, absolutely nothing prevents you from forking the client AND the server to run something new. I don't expect Debian to provide me CI/repo/mailing-list servers if I decide to fork them.


My understanding of the issues with the Signal forks was mainly the inclusion of the word 'Signal' in the name, resulting in brand confusion by more casual users.


WhatsApp integrated Signal's end-to-end encryption¹ into their communications platform.

Nearly everyone I know uses WhatsApp. This change made the platform much more secure for everyone. It's on by default and works transparently. Since then, at least one government was unable to compel WhatsApp to produce messages² from users under investigation.

I also have Signal installed but nobody I know uses it so its utility is diminished. I told some friends about Signal; they installed the app but won't use it because they can't message anyone except me. The only times people talked to me via Signal was during the temporary WhatsApp blocks ordered by my country's government.

¹ https://whispersystems.org/blog/whatsapp-complete/ ² http://www.forbes.com/sites/parmyolson/2016/05/03/whatsapp-f...


Or you could download https://wire.com/ which allows developers to build their own clients and still use their infrastructure. Also, it doesn't force you to use a phone number for registration. It supports audio/video calls. Also, if you're really privacy minded, it doesn't need Google Play Services. That way it can be used in CopperheadOS.


It doesn't have the scrutiny regarding cryptography and implementation that Signal has had.


They claim to have secure end-to-end encryption. In fact, I would say using wire.com is better than Signal because Signal does not make any guarantee of anonymity.

With the wire desktop app you can take steps to anonymize your connection to their server, although on a mobile device it's probably roughly equivalent to Signal.

Also, here is a handy chart that compares the differences between these two services (and other similar competitors). https://wire.com/privacy/


You're promoting Wire for anonymity? Check out their privacy policy: Wire maintains a server-side copy of your entire contact list, all the groups that you're in, the plaintext metadata for your groups (membership, plaintext group title, plaintext group avatar), etc.

They have broken voice encryption, and leak enough data to reconstruct the audio of your calls. They leak tons of plaintext directly back to themselves, like GIF searches, and rolled their own messaging crypto that experts say is broken.

They have been caught lying about what kind of encryption they provide, they lied about being open source for years, they lied about being based in switzerland. From what I can tell, the only people promoting Wire after all that are usually on Wire's marketing team.


Could you provide something that backs up these claims?


They did in another comment on this page. I do not see any evidence that worries me. Perhaps if you're a famous terrorist, you won't want to use it, because your GIF searches might expose your evil plans. But I only needed a way to talk to family and friends that was more private than Facebook and Google, while not sacrificing features and usability. I think Wire has done an excellent job. I've not found anything else that checks all the boxes.


Just wondering, but why not just use XMPP? You can choose any server that you like or trust, or run your own (on your own or third party infrastructure, up to you), and use OTR for end-to-end encryption if you feel you need to¹.

I have been using XMPP since 2000/2001. My current address is nine years old (and I control the server). I have a choice of clients on every platform that I use. All my contacts have the same set of choices regarding provider, accounts, and clients. As a bonus, not just my human contacts but also my infrastructure uses it for messaging and monitoring, so I can literally control some of my servers from my XMPP clients.

There must have been at least thirty major IM "solutions" going in and out of fashion during these 16 years. Individually, each might have been somehow "more convenient", but if you start counting the cumulative migration effort, I'm not sure the convenience argument holds much water.

¹ I used it in anger once myself, initiated by one of my contacts. It was an interesting experience.


For one thing, XMPP does not have the same features. Try Wire and see for yourself. There's audio, video, voice messages, multiple device encryption. Maybe XMPP has improved, but it didn't work as well for mobile use.


Is OTR really a practical option? Your message seems unclear about it.

Also, how do you get all your contacts to use XMPP and your server?


> Is OTR really a practical option? Your message seems unclear about it.

It depends on your threat model, like other alternatives.

I have never initiated an OTR session myself, but I have received one from a contact in anger (whistleblowing).

> Also, how do you get all your contacts to use XMPP

In my case, I have been using XMPP for the last sixteen years, so my contacts have "grown organically", to use an en vogue marketing term.

It appears that Mac computers come with an XMPP client preinstalled (and possibly pre-configured?). Linux, the same. On Android, there are a number of clients available. On Windows, I think there is Gajim, ...

As far as a stereotypical common user is concerned, they are quite happy to install whatever applications their peers are using, which is how they end up with so much cruft on their computers, phones, etc. :-)

> and your server?

Nobody needs to use my server. It is like email, you can use whichever provider you like.

I hope this answers your questions!


Yes, it is.

Convince them :-) Same as with any other.


Do you know how Wire makes money to sustain what they are doing?

Don't want to invest my time into something that will eventually sell out and do the opposite of what they stand for today.


Rolling out paid stuff next year. As someone said, we're VC funded, so we've had the luxury to focus on building a great app and see if people find value in it. This has been true: lots of interest from the B2B world, from groups, teams, and organizations where privacy and security are important.


> Don't want to invest my time into something that will eventually sell out

And what do you think the gentleman behind Signal is out there for?

His modus operandi:

* Start a company, call it something catchy, like Whisper Systems.

* Get media coverage and some users.

* Sell to Twitter

* Start a company, call it something catchy, like Open Whisper Systems.

* Get media coverage and some users.

* ...

Start to see a pattern? :-)


On Twitter, they said they plan to sell premium features. They are currently funded by a founder of Skype.


Last time I checked they had no revenue possibility, so I decided to skip it. I am curious about this too.


There's quite an interesting discussion here:

https://www.reddit.com/r/privacytoolsIO/comments/57n8ee/inst...

As I see it, there's some sort of trade off for all solutions for private instant messaging right now. Wire comes close to a good solution. The biggest problem for me is that the official Android client is quite buggy, UI wise...


Fair enough critique on Android bugs, we're fixing them at a steady pace. But if you know any good Scala developers in Berlin / interested in moving to Berlin then we're hiring https://wire.com/jobs


A problem for some people with Signal is that it requires you to use a phone number, and they may not want to disclose theirs.

You can set up an SMS gateway number just for this purpose, but you shouldn't need such workarounds.


> A problem for some people with Signal is that it requires you to use a phone number, and they may not want to disclose theirs.

That's true, although those of us with unlocked phones can purchase "throwaway" SIM cards with distinct phone-numbers fairly easily with cash, use the number for secure communications only, throw the card away, and then go back to our public lives.


That will expose your IMEI and not really protect you from people that would be tracking you based on metadata.

You gotta burn the phone.


Ah, ok, fair enough.

So other than full-on burner smartphones (i.e., throw away $600 every time you need to do something privately), what's a reasonable heuristic for conducting private business? Maybe collectively owned or rented burner phones, so that the metadata collector only ever knows, "Someone rented this device for encrypted communications" rather than who specifically is communicating with whom?


> what's a reasonable heuristic for conducting private business?

You need to do a threat analysis.

I did not immediately find any good introductory resources via a quick Google search, but try it yourself. Very very briefly, it involves identifying the threats and their possible consequences, then working on either removing the former or minimising the latter.

Be aware that this is not merely a technological process. It is primarily a social one.


My friends and I have used GroupMe (which is owned by Skype) for a few years now, and some of us wanted to migrate due to privacy. We tried out Signal, but there were some major usability problems if you intend to use it as a "group chat" replacement. My biggest issue with it is that there is no mute feature for iOS. Yes, you can turn off notifications at the iOS level, but I want the ability to have only @-mentions trigger notifications.

I made us switch to WhatsApp instead as I think the usability/privacy balance was better (for our use case).


I recently switched from Android to iPhone and was pretty surprised by how much lower quality the Signal app is for iPhone than Android.

Signal for Android was really amazing, and the switch made me think this could be one reason why so many people seem to speak both highly and poorly of Signal. The iPhone app just unfortunately seems to be way behind the Android app.

Not sure why they have prioritized development in that way.


I think it's because moxie uses Android.


This is the correct answer. Moxie dominates signal-android commits. He doesn't work on the iOS version.

The programmer working on the iOS version left early this year. If you look, it was pretty stagnant for a while with him, only really getting bugfixes. Now they have a new maintainer who's been playing catchup, and things really only started to get rolling in the past couple of months.

See: https://github.com/WhisperSystems/Signal-iOS/graphs/contribu...


Android owns close to 90% of the market.


I found a weird issue on a friend's phone when I suggested they download it for iOS. It says "This item is no longer available". They were able to download other apps from the store. When emailing Signal support, they said nothing was wrong on their end.

Still haven't figured out why my friend can't download Signal from the Apple App Store.


Maybe device management or policy?


No device management policy. :/


What is the device and its version? What iOS version is being used?

At the very least Signal requires iOS 8.0 or later.


Yeah thought of that. Made sure it was on the latest version and rebooted. Still didn't work.


Which country store were they on?


I'm glad that infosec has broken out of the era of endlessly debating the imperfections and faults of every solution but instead can clearly recommend practical advice for everyday users that is easy to understand.

Having the NYTimes advocate for crypto so clearly and to such a large audience will make a real and tangible difference to users - more than can be said for the years of online nitpicking by experts.


That said, Signal is not perfect. It lacks some features of other messaging apps, like the ability to send stickers.

That this is a deficiency worthy of mention reveals just how different the average person's values about what a messaging app should do are from mine.


I know multiple people who stay on Telegram because it has stickers and Signal doesn't. It's a real thing, to the point where I've considered diving into Signal's code and implementing them on top of the existing support for sending photos.

Missing feature #2: only the Android app can be paired with Signal Desktop; the iOS app is left out. What's up with that?


#2 has been implemented since September: https://whispersystems.org/blog/signal-desktop-ios/


Thanks for the update, I missed that!


I'm disappointed that this article doesn't mention iMessage. From what I understand, it's end-to-end encrypted and Apple doesn't have a key.


But we can't audit it because the source code is not available, which makes it a no-go for people that actually care about their privacy.


Except a pair-locked iPhone running iMessage is possibly the most secure and private device around - people that do care about their privacy (e.g. the grugq) recommend it.


Using Signal myself, I think it has the best trade-offs between security and user-friendliness, since I have been able to convince my friends to use it.


> Open Whisper Systems said it planned to add these features, noting that GIFs are already supported in the Android version of Signal.

And iOS for a while now! https://github.com/WhisperSystems/Signal-iOS/pull/886


Why isn't anyone talking about Wire? https://wire.com/

The little I've used it, I found it pleasant and effective. But it almost seems like there is a campaign to ignore Wire; nearly every article written about Signal and alternatives fails to even mention Wire.


That's like saying there's a targeted campaign to ignore Skype because these articles don't mention it. You might like Wire, but it's not an app that people concerned with privacy should be using.

They have done a bunch of shady things:

1. They lied about having end-to-end encryption in their app: http://www.pcworld.com/article/2855745/new-communications-ap...

2. They lied about being open source for years.

3. They lied about being based in switzerland.

They also have serious problems with their app:

1. The "encrypted" calls leak enough information to be able to reconstruct the audio.

2. Many features in the app, like GIF search, transmit plaintext directly back to Wire.

3. They rolled their own crypto, and experts disapprove of the choices they made.

Journalists who write articles like this and don't mention Wire are doing their job. They've consulted with experts and aren't spreading misinformation.


> 1. They lied about having end-to-end encryption in their app

That was before I joined but that got fixed in 24 hours. Not sure how the incorrect claim made it live in the first place.

> 2. They lied about being open source for years.

Wire open sourced its crypto protocol in March 2016. It never claimed to be open source before that. It further open sourced its client apps in July 2016 and will open source the server some time in 2017.

> 3. They lied about being based in switzerland.

Citation needed. Wire is registered and headquartered in Switzerland with an office in Zug.

> 3. They rolled their own crypto, and experts disapprove of the choices they made.

Link?


You can't list a bunch of serious claims, like the audio being reconstructible, without any supporting information. There was a feud between the Signal people and Wire, so there is a lot of false info trying to smear Wire.

GIF searches are obviously going to use a 3rd party service, and nobody should expect some kind of anonymous encrypted channel for GIF searches. That's ridiculous.

I've not seen any lying about being open source. They haven't released every piece of code, but I don't recall them ever claiming they did. https://github.com/wireapp

I've never seen any crypto experts who have audited Wire and said there's anything wrong with their choices, and you supplied no links.

Between all the options, including Signal, I personally think Wire is best, and nothing you've provided has any reason to change that.


I thought the same as you about GIF search until I saw what Signal is planning for the feature: https://whispersystems.org/blog/giphy-experiment/

By proxying the encrypted request through another server, Signal never sees the content of the request and Giphy never gets the identity of the requester. I'm not sure this is strictly necessary, but it certainly increases my confidence that the Signal team are serious about their work.
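
Roughly, the proxy is content-blind: it just shovels TLS bytes between the app and Giphy, so the party that knows who you are never holds the key to what you asked for. A toy sketch of that kind of relay (my own illustration with a placeholder port, not Signal's actual proxy code):

    import socket
    import threading

    UPSTREAM = ("api.giphy.com", 443)  # the only place the TLS session terminates

    def pipe(src, dst):
        # Copy bytes one way until the connection closes; never decrypt anything.
        try:
            while True:
                chunk = src.recv(4096)
                if not chunk:
                    break
                dst.sendall(chunk)
        except OSError:
            pass
        finally:
            dst.close()

    def handle(client):
        upstream = socket.create_connection(UPSTREAM)
        threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()
        pipe(client, upstream)

    listener = socket.socket()
    listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    listener.bind(("0.0.0.0", 8443))  # hypothetical relay port
    listener.listen(16)
    while True:
        conn, _addr = listener.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()

The relay sees the client's IP but only ciphertext; Giphy sees the search query but only the relay's IP.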


> GIF searches are obviously going to use a 3rd party service, and nobody should expect some kind of anonymous encrypted channel for GIF searches. That's ridiculous.

GIF searches aren't being transmitted to a third party service, they're being transmitted directly to Wire in plaintext: https://github.com/wireapp/wire-android-sync-engine/blob/4d5...

There's tons of stuff like that which leaks in the app. They store your entire contact list server-side, your plaintext group membership, group info like plaintext group name and plaintext group avatar, etc etc.

> I've not seen any lying about being open source. They haven't released every piece of code, but I don't recall them ever claiming they did.

Since their launch several years ago, they've had a "feature" matrix on their website that lists Wire as being open source (and their competitors as not being open source). That was long before their recent "open source" announcement (which still isn't even fully open source). When pressed, they said it was because they used some open source libraries. That's really shady.

> I've never seen any crypto experts who have audited Wire and said there's anything wrong with their choices, and you supplied no links.

Here's one example I saw recently:

https://www.cs.jhu.edu/~cwright/oakland08.pdf

They're vulnerable:

https://github.com/wireapp/wire-audio-video-signaling/blob/c...

Even worse, they also apparently include plaintext RTP headers with audio level information in them.


Thanks for trying, but it appears to me you're paranoid. The user directory in Wire is public. It is no secret that you're on there. They need information about whom you're connected to for the service to work. I'm not worried about GIF searches, or contact list. I'd prefer the contents of my conversations with family weren't archived on remote servers. Wire accomplishes that in the best way. It's a useless academic concern that audio level information would be in headers, and the paper you linked is of no relevance. I still see no reason why I'd not want to use Wire. For talking to family and friends in a reasonably private way we cannot get from using services from Facebook and Google, I think Wire is an excellent application.


> They store your entire contact list server-side, your plaintext group membership, group info like plaintext group name and plaintext group avatar, etc etc.

One has to remember that Signal also stores some social graph data, which is equally problematic.

I can’t recommend either.



Well, now I've downloaded and installed Wire alongside Signal on my phone.


Wire has a really great user interface and it works quite well! My wife and I communicate almost exclusively using it.


This looks like a really good replacement for Skype, which we currently use to manage communication between remote working team members.


Until its owner decides to sell it to eBay or MS, like he did with Skype.


Are you aware that the guy behind this signal app sold his previous "secure messaging" thing to Twitter?


Which is a risk with every service we use...


The NYT article even mentions Signal weaknesses (e.g. multi-device support) that are addressed in Wire, without mentioning Wire. Poor journalism.


Now gents, a number of you in the comments have wondered what other alternatives are out there. You may have seen that I specifically advise against Signal, and other users have also expressed concerns about a number of other applications, among them Wire and Telegram.

One that, to my knowledge, has not been mentioned yet, but which would appear to meet some common measure of functionality versus convenience expressed here, is the XMPP messaging application Conversations (https://conversations.im/).

I am not going to give it my personal recommendation because having tested it, I did not like a number of design decisions the developers have made, and I did not like their overall vision for the app. With that said, it's horses for courses.

On the end-to-end encryption front (all the rage these days, eh?), it appears that the Conversations devs have taken Signal's protocol and _done it right_. That means they actually specced it (https://conversations.im/omemo/) and had it audited (https://conversations.im/omemo/audit.pdf), along with a couple of important improvements such as eliminating the requirement for a trusted server or Google Play, which greatly reduces the attack surface.

Again, I personally do not like Conversations and I'm not going to use it myself--that's a personal preference thing, but kudos to the devs for doing a professional job, especially while everyone else is busy selling snake oil.

I say, go give Conversations a try, it may be the thing you were looking for.


One thing I'm very worried about with Signal is the high levels of permissions the app requires when you first install it.

It seems very counterintuitive. Worried about privacy & security of your messages? Oh, first you need to entrust all your personal contact info to us. Yes, they say they don't use it off the phone, but it's asked for, and that's based on trust.

On iPhone it requires access to your contact list. Surely it could run without this? Snapchat, for example, makes it easy to add friends in person by scanning a code on one phone from another. How is sending unencrypted text messages announcing your social graph the only way to use the app?

Why is the app itself asking for so much trust in the form of permissions?


I ditched Signal in favour of WhatsApp, given that I believe the tech behind it is the same and that a lot of my contacts are already on WhatsApp. The NYT article indicates that WhatsApp is still able to collect more information than Signal does.


The difference is you never know when Whatsapp may switch off that end-to-end encryption.

Second, Signal only stores account creation date and I believe account deletion, or some other super-basic info like that. But that's it. Meanwhile, not only does Whatsapp collect much more metadata, including the groups of which you're part, but it has also started sharing that data with Facebook recently.

I'm not saying you should stop using Whatsapp, though. I still think it's a better alternative to iMessage/Hangouts/Skype/Telegram. But at the same time you should slowly transition your most important contacts to Signal, too. One nice extra feature Signal has over Whatsapp now is Snapchat-like disappearing messages, so at least you may be able to convince some friends with that.


One important thing to note is that, as far as I know, WhatsApp backups are stored in plain text. So to have secure messaging, you must disable backups and you must trust that the other people you're messaging have also disabled backups.


You _assume_ that WhatsApp actually uses the Signal protocol and not cleartext (or some other broken lib).


For anyone wondering - this is an example of an ad posing as an article.


The mere act of using Signal is suspicious. But if usage is universal then that suspicion can not be acted on.

The suggestion that the motivation for this article is profit for the NYT or Moxie is quite destructive.


> The suggestion that the motivation for this article is profit for the NYT or Moxie is quite destructive.

No, that's literally how it works. Media need to sell copy (clicks these days) and companies need to get coverage. And that's perfectly OK if companies are acting ethically and journalists and editors are doing due diligence.

The person that you mention has good connections in the media and uses them to self-promote and promote the tat he sells.


>...tat he sells...

It's free.

And normally one would associate such vehement and repetitive insistence on counter-factuals with trolling.


> It's free.

You misunderstand. He is running a company; what do you think their exit strategy is? You may want to look at his previous company and "RedPhone", which I think his product was called.

> And normally one would associate such vehement and repetitive insistence on counter-factuals with trolling

I am sorry that you do not like to hear this.


I still haven't been able to register to Signal. It always fails with "error establishing connection with server". Does anyone want to suggest how I could get this working?


I had this issue as well until I cleared the data/cache for my Google Play services app.


Yes, I've seen people mention this. Unfortunately that didn't work for me.


I believe it also has a voice call option. Have you tried that?


Yes, the problem is after. Step two is confirming the number (by SMS or phone call). Step three is "generating keys". Step four is registering to the server, and it always fails there.


Not until this basic usability scrolling bug is fixed. It's been open for almost a year, and people rarely complain about it, so I've started to wonder whether people actually use it heavily.

https://github.com/WhisperSystems/Signal-iOS/issues/769


I wish I had friends/family that were worried enough about their privacy to use an end-to-end service with me lol


zkc, zero knowledge communications, was released today. It is just the first release, and a minimal tool with minimal features, but it's exactly the solid foundation I have been waiting for.

https://blog.decred.org/2016/12/07/zkc-Secure-Communications...

"zkc is a blending of what we consider to be the best parts of both of these projects, Signal and Pond" "The UI is text-based and emulates the appearance of irssi, in order to keep UI-related complexity low and avoid large GUI toolkits as a dependency." "intended to provide the highest level of communications security balanced with minimal complexity in its code, configuration and usage."


But downloading Signal after news like these will tag people as trying to hide something from the state.


I think it is a mistake to buy into the logic that using encryption means you have something to hide.

Regardless of the framing of the article, many people likely read this and thought, "yes, it would be nice to be confident my messaging is private even from mass government surveillance."

This does not mean they are drug dealers. It means that they have concluded wide-dragnet surveillance programs are worth opting out of, especially if it's very easy to do so.


>encryption means you have something to hide

For me it does mean that, but I fail to see why that is bad. Even a direct "not your business" does not prove any guilt. It is okay to hide things; we don't have to be an open book.


Not only tag them, but also get their identity and possibly use backdoors, because Signal is only on Google Play, which requires a gazillion pieces of identifying data to access.

Signal is a joke if your adversary is the NSA and its likes. You can't run Signal unless you give up who you are, and the fact that you are using it, and it's not very difficult to analyze _when_ you receive or send a message, because Signal requires Google's messaging system.

Signal should just really rename itself to "Google Signal, no really we are not evil for certain."


Most people only have the NSA as an adversary in the minimal sense that they're caught up in a very broad mass surveillance net. It's true that if you're a whistleblower with sensitive government information Signal is not a complete solution for you. But who cares? Almost nobody is in that situation, and people in that situation have access to other tools that actually do help.

Your last sentence is extremely unfair to Signal and dramatically underrates the privacy it provides to ordinary people. It's dangerous to suggest that because it requires a phone number and lives on an app store it provides no useful protection at all, because that implies it's no better than e.g. WhatsApp when in fact it has a much better featureset.


> Almost nobody is in that situation

Almost everybody might be in that situation, because the NSA is not the only group that has access to the data they collect. They call it "parallel construction", where the NSA's "customers" like the DEA simply have to fill out a form[1] to get surveillance data, and promise to lie about the source in court.

[1] https://www.techdirt.com/articles/20140203/11143926078/paral...


LOL, it's not better than WhatsApp; that is my point.

In fact, it's worse than cleartext SMS: when you use Signal you are a target, whereas when you use cleartext SMS you can still hide among the noise.


Not everyone's threat model is nation-state adversaries. Often it is simple theft, coercion, snooping friends/family, etc.

What messaging system would you recommend instead? Riot? Tox? Last I checked, neither was nearly as polished and feature-complete as Signal. In my use case, unless it is good enough to become a daily driver, many people won't even bother with it. What good is a messaging system that nobody else uses?


Use what the masses are using; anonymity means being in a large enough group and not being distinguishable from the rest.

Telegram, cleartext SMS, email.

Normal people do not send anonymous or secure messages, but if you need and want to, then you can't just install an app and be done with it. You first have to find and install I2P in a way that is not traceable to you (good luck; first get a degree in information security and become a Linux guru), then join an I2P-only IRC channel and begin messaging with your group of people about your plan to watch football and drink beer, after you have had the secret communication with Bitmessage over I2P.


I think it says more about you being so willing to accept the state's logic that if you use such apps you have something to hide and you'll be automatically put on lists, than it says about the people using it. That's exactly how the government wants people to think about this stuff, and it just wants you to give in to plaintext mass surveillance.

Why make it so easy for them?


How can I use Signal (on iOS) without giving it access to all of my contacts?


Download Signal? No, thank you.

The fact that the guy behind it is hyping it via the New York Times, a generalist publication, instead of validating the thing through professional cryptographers (which he isn't) and recognised privacy champions such as the EFF is very telling.

The thing has not been properly validated or verified (for a start, because there is no design document to validate against, and no published goals to verify against), it uses an ad-hoc encryption scheme from a non-cryptographer, it is not open source (see the F-Droid discussion for why it's not there), it uses hardwired servers controlled by a party or parties which are not known to be trustworthy, and apparently it requires Google Play services, which nobody who is truly concerned about their privacy is going to use in the first place (and definitely should not).

From the way this is going, it is becoming clearer by the day that this is just another start-up, their target market are unsophisticated but paranoid users and hipsters with no real need for privacy but who think they should make some kind of statement. Their plan is to hype it up (e.g., via the NYT), get enough users, then get bought by one of the so-called "social media" players. It is more attractive to them than Telegram because the latter is run by a Russian, which to the American public sounds sinister (Mr Brin and countless other great scientists and innovators notwithstanding), and their servers are probably based in Germany, which is a bit more of a problem since there are (still) some proper privacy laws over there, and which would cause some headaches to the acquiring party. Besides which, there is a good chance that their current investors come from those "social media", or are the usual Silicon Valley VC crowd, so things stay between friends, as it were.

So, in brief:

* If you want a new Skype, go for it.

* If you care about the privacy of your communications, you should avoid it.

* If you need to keep your comms private, you must avoid it.

Does anyone disagree? Feel free to reply and tell me why!


I'm probably just inviting myself to get trolled by replying to this, but this comment is just ridiculously wrong on so many levels.

> The fact that the guy behind it is hyping it via the New York Times, a generalist publication, instead of validating the thing through professional cryptographers (which he isn't) and recognised privacy champions such as the EFF is very telling.

Cryptographer Matthew Green on Signal's crypto and code quality (it was called RedPhone/TextSecure at the time of this writing): https://blog.cryptographyengineering.com/2013/03/09/here-com...

Version 1.0 of EFF's Secure Messaging Scorecard gave Signal 7/7: https://www.eff.org/node/82654.

> The thing has not been properly validated or verified (for a start, because there is no design document to validate against, and no published goals to verify against)

Signal has been analyzed, with favorable results, by academic researchers at least twice:

- https://eprint.iacr.org/2014/904.pdf - https://eprint.iacr.org/2016/1013.pdf

> it uses an ad-hoc encryption scheme from a non-cryptographer

Moxie Marlinspike and Trevor Perrin probably wouldn't call themselves "cryptographers," but almost anybody in the field would agree that they are experts on applied cryptography.


> I'm probably just inviting myself to get trolled by replying to this

I'm sorry that you get that impression, but I do appreciate your input.

> Cryptographer Matthew Green on Signal's crypto and code quality (it was called RedPhone/TextSecure at the time of this writing)

That's the application that they sold to Twitter, not the one being talked about here. I do not know how different the code bases are.

It is also around that time that the app had a gaping, amateurish hole in that it was simply leaking everything via logcat. And what does the guy do? Instead of addressing the issue like a professional, he goes on a complete tangent rubbishing F-Droid (https://github.com/WhisperSystems/Signal-Android/issues/53) and then making rather poor excuses as to why you should get your application from the Google store and not from anywhere else.

Excuses which by the way, have been evolving over time. I think he eventually admitted that he wants to keep track of how many users are using it (handy to show to your potential buyers).

He also has a history of lying, such as when he used fake WHOIS details to run his "Google anonymiser" thing. And of course, when he was shut down by the registrar, as you do when someone has given you false details, what did he do? He went to the press to whine about the registrar! After he entered a contract in bad faith, something which happens to be a prosecutable offence. That's the sort of person we are talking about here. I hope you will understand if his word does not exactly fill me with confidence.

> https://www.eff.org/node/82654.

That page starts with: "This is version 1.0 of our scorecard; it is out of date, and is preserved here for purely historical reasons."

And continues with: "the results in the scorecard below should not be read as endorsements of individual tools or guarantees of their security"

> Signal has been analyzed, with favorable results, by academic researchers at least twice:

Yes, I am aware of those. And that is not what validation and verification are, which, as I said, are impossible to do independently in the absence of publicly available design documents. The guy is trying to make it look like he's selling a "secure" communication platform, but if you presented that to a defence contractor (which I have some experience with) you would be laughed out of the building. Proper security is not done like this at all. For a start, you actually define your goals, i.e., what you intend to secure, against what threats, etc., etc. If you can show me a paper with that information I would be grateful.

Notably, you may have noticed that those papers, like Green's, are a protocol analysis, not an analysis of the entire solution. In that respect, you're back to the previous situation: the protocol might be ultra-secure, but if you're still leaking your plaintext on a different channel...

> Moxie Marlinspike and [...] probably wouldn't call themselves "cryptographers,"

At the risk of sounding elitist, what is his academic background? (I elided the other person because I do not know who he is).

> but almost anybody in the field would agree that they are experts on applied cryptography.

What do you base that conjecture on?


>He also has a history of lying, such as when he used fake WHOIS details to run his "Google anonymiser" thing. And of course, when he was shut down by the registrar, as you do when someone has given you false details, what did he do? He went to the press to whine about the registrar! After he entered a contract in bad faith, something which happens to be a prosecutable offence. That's the sort of person we are talking about here. I hope you will understand if his word does not exactly fill me with confidence.

I really don't see why someone should be on my shitlist for lying to godaddy dot com or whatever giant registrar, unless you consider fudging identifying details about something that really doesn't matter, especially considering he was very openly associated with the project, to be some sort of horrible moral offense. I especially find your taking massive umbrage at fudging personal information baffling, given how privacy-minded you otherwise seem.

>At the risk of sounding elitist, what is his academic background? (I elided the other person because I do not know who he is).

Combined with the above, the way you're hand-waving away the other of the two original developers of the protocol really just makes it seem like the position you've taken against Signal is mostly predicated on some sort of grudge against Marlinspike himself. Yes, trashing F-Droid was not a great thing to do and you might see him as someone with a strong penchant for self-promotion, but the way you keep on tying your criticisms to Marlinspike personally really muddles your case. For example, you object to him promoting Signal in a New York Times piece saying it is a generalist publication and posit he's just trying to drum up attention so he can find a buyer, which may or may not be true, but isn't one of the most important goals of a secure messaging application to get people to actually use it and to achieve widespread adoption? The main lesson I've learned from GPG mail is that a perfectly private means of communication is worth very little if I can't actually convince anyone to use it with me.


> I really don't see why someone should be on my shitlist for lying to godaddy dot com or whatever giant registrar unless you consider fudging identifying details about something that really doesn't matter,

I think I can see where you are coming from. You seem to compare this with, say, opening a GMail account under an alias, if I understand correctly.

However, holding domain names and, at the time, SSL certificates requires a different sort of accountability. I can elaborate on that if you wish, but I trust it won't be necessary.

> especially considering he was very openly associated with the project,

In the same way that Mr platinumrad or Ms at612 are associated with this discussion? By the use of an alias?

> some sort of horrible moral offense.

Yes. And please note he did not just lie to the registrar. When he got caught, he went and whined to some journo, who published a piece criticising the registrar without bothering to verify the information first. It was all presented as if the registrar was in the wrong, when they were following the rules, which are there to protect the public in the first place. This coming from some bloke who was saying "don't trust Google, trust me. Because."

> I especially find your taking massive umbridge with fudging personal information baffling given how privacy-minded you otherwise seem.

I value my privacy. At the same time, when I enter a contract, I do so in good faith and of course part of it is letting the other party know who I am.

> really just makes it seem like the position you've taken against Signal is mostly predicated on some sort of grudge against Marlinspike himself.

Yes, you are correct. My apologies if that wasn't clear. I question the ethics, motivation, and competence of this one individual, who happens to be closely associated with said project.

> Yes, trashing F-Droid was not a great thing to do

To put it mildly. On an incidental note and more generally, have you ever seen him do a mea culpa?

> [but] isn't one of the most important goals of a secure messaging application to get people to actually use it and to achieve widespread adoption?

I do not know. I would guess not (based on defence experience). But the main point is that him saying "oh sure, it's secure" does not make it secure. He seems to be taking advantage of the public's inherent credulity and lack of awareness of what "security" actually means and involves. We have gone through this discussion already, so for an example of what I consider a better developed and correctly presented security solution, please see the Conversations IM application.

> The main lesson I've learned from GPG mail is that a perfectly private means of communication is worth very little if I can't actually convince anyone to use it with me.

This is a different, and long discussion, but it is probable that the reason why you are seeing that is the other party having mentally (or formally) done a cost/benefit analysis and deciding that their information is not of such value to justify the extra effort to protect it. Rightly or wrongly.


I think that issue highlights the problem with unofficial repositories. Users remained vulnerable because their upstream provider didn't update quickly enough. It culminated in a user spamming the official issue tracker with an outdated and annoying bug report.

This isn't just unique to Android: there are multiple ongoing efforts at the moment in the Linux world to lessen frustrations with distribution repositories. Snappy, Flatpak, and AppImage intend to unify application deployment and allow users to install applications from anywhere. In most cases, this could mean pulling directly from the application developer themselves. GNOME and KDE will likely encourage this.

I know some Firefox developers who have grouched at the delay between official releases and when distributions finally deploy them, so this problem isn't exclusive to desktop environment developers.

Back to Android: Moxie had a point when he claimed that Android is fortunate to have a system that provides package verification back to the original developer. It doesn't matter where you get an APK from: the developer's website, Google Play, APKMirror, or BitTorrent. If you have the developer's public signing key, you can verify the authenticity of the APK.
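
For instance, here's a rough sketch of that kind of check using the third-party "cryptography" package (my own illustration: the APK name and pinned fingerprint are placeholders, it only reads the v1/JAR signing block, and it checks which certificate signed the APK rather than fully verifying the signature; apksigner does the complete job):

    import zipfile
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.serialization import pkcs7

    # Hypothetical pinned fingerprint of the developer's signing certificate.
    PINNED_SHA256 = bytes.fromhex("aa" * 32)

    def signer_fingerprints(apk_path):
        # SHA-256 fingerprints of the signer certs in the APK's v1 (JAR) signature.
        prints = set()
        with zipfile.ZipFile(apk_path) as apk:
            for name in apk.namelist():
                if name.startswith("META-INF/") and name.endswith((".RSA", ".DSA", ".EC")):
                    for cert in pkcs7.load_der_pkcs7_certificates(apk.read(name)):
                        prints.add(cert.fingerprint(hashes.SHA256()))
        return prints

    # Same check no matter where the APK came from: website, Play, a mirror, a torrent.
    if PINNED_SHA256 in signer_fingerprints("Signal.apk"):
        print("signed by the pinned developer certificate")
    else:
        print("signer mismatch - do not install")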

F-Droid represented a serious step backwards in Android security, back when they used to self-sign APKs. It wasn't possible any longer to cut out the distributor from the chain of trust. Fortunately, they reacted to Moxie's criticisms, and F-Droid now retains the original package signature when the build can be reproduced.

From a developer perspective however, encouraging or even tolerating unofficial installation channels for secure communication software is bad. If vulnerable users are in-contact with non-vulnerable users, they unknowingly put both parties at risk. If the ecosystem evolves to the point where this is common, the whole system is insecure.

What Android desperately needs is a high-quality, non-profit, privacy-friendly, charity- and grant-driven app store. It must entice open-source app developers. It cannot do self-builds, except for reproducibility. It needs crash-reporting, analytics, usage metrics, device-specific builds, localization options, and more. It requires dead-simple tools for command-line deploying.

Until then, in my opinion, F-Droid will never be accepted by app developers. F-Droid is for users only. Not for the same purposes, either: for the cautious user, F-Droid mainly shines as a locally-setup repo for self-deployed apps.

P.S.: Perrin & Moxie recently began documenting Signal Protocol: https://whispersystems.org/docs/


> From a developer perspective however, encouraging or even tolerating unofficial installation channels for secure communication software is bad.

What is your threat model?


> it is not open source

Huh?

https://github.com/WhisperSystems/Signal-Android

I'd even argue it's free software - the team has managed to create some confusion about distributing modified binaries - with regards to using the servers Signal operates -- but have clarified that it is indeed ok to build your own binary from the source they provide, and use their servers.

I'm not quite convinced about their argument for official app store distribution and updates, but I can understand the argument.

> it requires Google Play services

I'm fairly sure the iOS client doesn't depend on Google Play Services. Sticking with an app store does require trust in the provider though.

Whisper Systems has made some fairly clear choices, and while it's perfectly fine to disagree, I think it would be best to avoid FUD.

It certainly strikes me as one of the better options for pragmatic secure messaging, that allows for a fairly narrow and reasonable set of threats (Google/Apple/Microsoft (possibly more than one of each, depending on your platform), Whisper Systems themselves, probably most state actors).

The other reasonable option I'm aware of (that make slightly different trade-offs), is ChatSecure/zom.im (where zom.im is a "friendly" fork of ChatSecure).


> I'd even argue it's free software

Terminology. What you call free software I call open source. As you go on to mention, you can see the source but not use it in any meaningful way. In particular:

> but have clarified that it is indeed ok to build your own binary from the source they provide,

Exactly. Your own binary. From their source.

Build your own binary for someone else, and it's "malware", as the guy had the nerve to call F-Droid in that bug report (here: https://github.com/WhisperSystems/Signal-Android/issues/53). That sort of bad faith, coming from a known liar (see my other reply) is what I really cannot condone.

> and use their servers.

Yeah, similarly. Use a source other than theirs or servers other than theirs and they start whingeing.

That is not open source.

> I'm not quite convinced about their argument for official app store distribution and updates,

Possibly because every time it's a different excuse?

> but I can understand the argument.

Yes, so can I: they want to control the platform so that the users are theirs, so that they can sell it to someone else, like they did last time.

And I would be perfectly fine with that, if it wasn't done via lies, deception, and denigrating third parties, particularly the chaps at F-Droid who at least have the decency of using their real names (not to mention not seeing you as the product).

> Sticking with an app store does require trust in the provider though.

Agreed. How high is Google in your "trusted" list? Yes, I'm picking on Google because it's a bit of an easier target than Apple, but still.

> I think it would be best to avoid FUD.

I agree, and that's precisely why I feel the need to speak up. I challenge the honesty not of their enterprise (which is no different from that of Skype, Whatsapp, or any other player) but of the way they are pursuing their goal. See above.

> It certainly strikes me as one of the better options for pragmatic secure messaging,

I don't know. As mentioned elsewhere, XMPP meets all my requirements and is not vendor-dependent. But the availability of options depends on each user's definition of things like "pragmatic" and "secure" (and even "messaging" for that matter!)

From seeing what's out there though, it appears that modern versions of Whatsapp (which I don't use, I'm FOSS-only) offer essentially the same capabilities as this application though, including end-to-end encryption. And of course, essentially the same disadvantages. I could be mistaken here though.

> that allows for a fairly narrow and reasonable set of threats (Google/Apple/Microsoft (possibly more than one of each, depending on your platform), Whisper Systems themselves, probably most state actors).

I guess it also depends on each user's definition of "fairly narrow and reasonable". :-)


While you might claim that by running an AOSP derivative you need to trust Google less (and in turn trust something like F-Droid more, perhaps) -- if you want a chat/IM client on an Android device it's hard to see how Google isn't already one entity you need to trust (along with a list of hardware manufacturers).

As for your other comments - you may run your own server infrastructure from same or derived sourced, your own derived clients, distribute binaries etc - but you can't dilute the brand. Similar with Debian cloud images for example.

I'm not sure how that's "not FOSS".


[flagged]


We detached this flagged subthread from https://news.ycombinator.com/item?id=13132650.


1. The actual leaks that WikiLeaks have published have virtually all targetted the Obama administration, so I'm not sure how you suppose the leaks have previously been making the "other team" look bad. By most accounts, Obama and Clinton are on the same "team."

2a. I don't see how you could seriously suggest that WikiLeaks's behavior in this election is no different than what they had done previously.

2b. If you can find me an example when WikiLeaks published stolen emails from people within a presidential campaign, and then published those emails during the campaign for the avowed purpose of influencing the election, I'd be interested to hear it. I'd be even more interested if you could find evidence that I supported it.

3. True, they're clearly trying to reduce government secrecy by making it harder to keep things secret. But what does this have to do with my question, which was how this helps promote free dissent? And I think you're overlooking the fact that they are also undermining private citizens' secrecy when they see fit to publish their emails.


>> "I'm not sure how you suppose the leaks have previously been making the "other team" look bad"

The largest leak in US military history?? [1]

[1] https://en.wikipedia.org/wiki/Iraq_War_documents_leak


The Iraq war continued for 3 years under the Obama administration.

In fact, a few Americans even believe that Obama was in office during 9/11, and blame him for everything.

I'm also among the people that used to support Wikileaks when they were anti-establishment, but I'm now disgusted by them

(only marginally because of their involvement in Trump's election: https://www.google.co.uk/amp/s/amp.theguardian.com/media/201... )


What does what a few Americans believe have to do with the comment made?


That it's not difficult at all to "make the other team look bad", and that public opinion is easily swayed.


You must be living in some sort of alternate reality, because the leaks that gained WikiLeaks its infamy[1] occurred in 2010, during the Obama Administration. After that, there was another long series of leaks during 2011-2015 [2], which included the complete draft of the TPP (of which Obama was in favor).

Furthermore, WikiLeaks used to carefully comb through documents and leak only those that would not cause personal harm. They did no such thing with the most recent leaks during the election season. Some of the emails they leaked contained personal information of donors, including home addresses and social security numbers. That's grossly irresponsible. It was clear they had a partisan political agenda that went above and beyond anti-secrecy in government.

[1]https://en.wikipedia.org/wiki/Afghan_War_documents_leak

[2]https://en.wikipedia.org/wiki/WikiLeaks#2011.E2.80.932015


At what point did Wikileaks ever scan their releases for personal information? This is completely false. They never have, nor is it feasible; they just don't have the manpower to do so. That's a major argument against Wikileaks: it has released names of ops, contacts, and soldiers in the past, far before this election.

What people don't seem to understand is they don't have the manpower or time to go through everything they receive. The bad comes with the good, which is why there are legitimate arguments either way.


There are a lot of people who believe that Wikileaks is compromised and that Assange is captured. There are also some who believe this to be some conspiracy theory.

The people who believe the former are waiting on Proof of Life, and do not believe they have gotten it.


Who says that?

If it's you, just say so. If not, what is your point?


The subreddit /r/WhereIsAssange/ has been going on about it since he went offline around mid-October. Of course, the discussion there is mixed, and conversation is starting to turn towards him still being in the embassy.

The point being that it is possible that Wikileaks is compromised and that could explain the parent's observation of changed behavior.

Of course, this could just be noise too. -shrugs-


Any communication tool that runs on a closed-source OS like Windows or iOS can certainly not be trusted.


> It lacks some features of other messaging apps, like the ability to send stickers

That sounds like a feature more than a deficiency.


Signal crashed on me day in day out. That app is junk.


They take pull requests https://github.com/whispersystems

There are better ways to give feedback than just saying a free app is "crap"


This would be more helpful if you stipulated app version (or at least approximate era), hardware, and OS version. Just the basics.


I am really interested why Signal. Why not Telegram? I prefer Telegram over Signal because of its much better UI and richer functionality, e.g. bots. The Signal app seems clunky and buggy. That's my 2 cents.


Signal never shares message content with the server.

Telegram does so by default, you have to opt in to keep messages private from the server.

You also see a lot of people criticizing the design of the secure Telegram protocol and praising the Signal protocol. There were some stories about governments actively compromising Telegram's security. There haven't been any about Signal.

So if you are worried about privacy more than UI and rich functionality, the choice seems clear enough.


Yes, I'm seeing a lot of people criticizing Telegram on Twitter and in the media. But after doing some research I couldn't find any proof of their claims that Telegram has broken crypto.

You can use secret chats, those messages are not stored on the company server.


I also can't find any proof that the e2e crypto is broken, but crypto nerds sure love to shit on it. I'd love a source indicating otherwise.

The unencrypted messages are certainly not secure, and it's disingenuous of Telegram to market itself as a secure system without being really clear about the limitations.


It seems to be official now that Telegram has been hacked over and over again (article in German): https://netzpolitik.org/2016/bundeskriminalamt-knackt-44-tel...


That's not the e2e encryption protocol that they hacked, but the SMS-based authentication and the unencrypted chats that are stored on the server in the clear. The state has control of SMS, so it can impersonate you. This would seem to be a potential problem for any service that relies on SMS for authentication, but the attack would not work on e.g. Signal or WhatsApp, as messages aren't shared across devices there.


That only happens for non-private chats, and only if you skipped the 2-step verification setup [1]. I'm not affiliated with Telegram, but security is a thing where you have to be educated. You cannot just install 'everyone's most secure' messenger and be productive.

For messenger to gain a MASS of users it has to be 'enter sms-code and go chat'.

For messenger to be SECURE it has to be 'install password manager, set passcode, use 2-factor, use e2e, use self-destruct, always compare keys'. No one will enjoy that. But if you need, you just have it in your regular sexy messenger, not in another ugly alien app that no one knows about, no one has installed, no one's ready to go through all of it.

I think telegram guys got it just right.

  [1] https://telegram.org/blog/15million-reuters


> I am really interested why Signal. Why not Telegram?

And I would be really interested to know why people are downvoting a perfectly reasonable question.


Use Signal and give them access to all your contacts who may or may not be using Signal. Use their proprietary client and trust them on their pinky promise that they "can't" look at your messages. The deceptive pretense of privacy is worse than no privacy at all.

[Edit] - "Use their proprietary client and trust them on their pinky promise" was factually wrong.

But, they still expect me to trust the signed binary they send through the App store right? How is that anyway non-proprietary just because there is a Git repo somewhere that may or may not be the same code running on your phone? Can I run a client from the Git repo and still use all of their infrastructure?


It's an open source client[0]. It's not a "pinky promise".

There are valid criticisms of Signal (primarily around the use of the Google Play Services Framework), but your comment seems to be jumping to a lot of conclusions without any research.

https://github.com/whispersystems


But, they still expect me to trust the signed binary they send through the App store right? How is that anyway non-proprietary just because there is a Git repo somewhere that may or may not be the same code running on your phone? Can I run a client from the Git repo and still use all of their infrastructure?

Until I'm able to do that, it is still their "pinky promise".


> Can I run a client from the Git repo and still use all of their infrastructure?

Yes. You can.

They describe how in the very repo I linked. Your ardent unwillingness to spend the 30-45 seconds it would take to find this out before spouting unwarranted false criticism is quite strange. Do you have some personal issue with OWS?

I really didn't mean to be defending OWS here - I'd much rather see Signal leveraging non-Google APIs in some way and provided via F-Droid (though I have read their arguments against that w.r.t. performance) but I'm astounded at the wilful ignorance here.

If you're genuinely interested in promoting secure messaging, a desire to get your facts straight and answer your own questions instead of assuming the negative should be step 1. Otherwise, you're needlessly steering people away from a tool that could practically improve their privacy.


>> Can I run a client from the Git repo and still use all of their infrastructure?

> Yes. You can.

The thing is, lucideer, the "restrictions" on the use of the source code are engineered to raise the barrier to independent use, notably by preventing or discouraging redistribution. This means that only those who are able and willing to compile Android source can run their own binaries. Everyone else has to go with the binaries they distribute which, as the other poster has correctly argued, cannot be independently verified.

> Do you have some personal issue with OWS?

I do not know about him. But I do. Please read on.

> and provided via F-Droid (though I have read their arguments against that w.r.t. performance)

Oh, so it's "performance" this time? It's not something about "updates" like last time¹, or "features", or "metrics"?

Do you really not think, if you go through the discussions, that there's just too many excuses? Does it make sense to you? Do you not get the feeling someone's got something to hide, if you would pardon the pun? :-)

But I tell you what really got on my tits, it was this message: (https://github.com/WhisperSystems/Signal-Android/issues/53#i...)

"Please do not install software from F-Droid. It is an unverified build, exceptionally out of date, and should be considered malware."

You know what? I fucking trust F-Droid. And I, for one, am very grateful to everyone who collaborates to make that happen and stands firm in their commitment to open source, and especially to Ciaran, the founder, who gives so much to the community in spite of very challenging family issues (which are publicly known). Top bloke he is.

And then you get some lying, incompetent, manipulative², and possibly delusional individual accuse them of distributing malware. That is seriously not cool.

For those of you unaware, this is the person we are talking about (I suggest you read the comments too): http://www.gandibar.net/post/2010/04/07/The-googlesharingnet...

¹ Cooperative and competent (or at least willing) open source developers can set it up so that F-Droid auto-builds every time you tag a new release.

² And I say this because I'm sure someone will correctly argue that he did not call F-Droid itself malware--that would be too crass even for this guy. He's very careful in choosing his words.


I would love to see Signal provided on F-Droid, as I mentioned above, but I would temper my criticism quite a lot more than yours, for a number of reasons:

1. I haven't read/heard excuses based on updates/features/metrics/&c. as you mention, but their one performance excuse sounds reasonably plausible. Moxie has commented that he'd welcome a PR[0] even if it had bad performance (provided it only ran conditionally of course).

2. The tone in your Github link is a bit heavy-handed, and calling it malware is going too far, but I can understand the developer of software for which security is extremely critical advising strongly against using an outdated version that's being built and distributed by someone else. There is definitely no implication in that comment that F-Droid is malware, he's only referring to TextSecure.

3. I understand your Github link was just to provide an indication of Moxie's tone, but it is very old and the actual factual details in there are probably not very relevant today.

4. I trust the F-Droid software itself, but not necessarily the repository. The distribution of the outdated TextSecure above is as good an example as any - the idea of a 3rd-party building TextSecure and providing it through F-Droid may be all well and good in the spirit of Free Software, but it doesn't really instill trust when the actual author of the software isn't involved at all.

Finally, I'd never heard of the Gandi issues with Moxie's cert before reading that article. After reading it, I'm much more inclined to be suspicious of Gandi than of Moxie. The article is littered with red flags - the fact that their initial response seems to be to go after their customer rather than to question their business relationship with Comodo being a big one.

[0] https://news.ycombinator.com/item?id=12883410


> Everyone else has to go with the binaries they distribute which, as the other poster has correctly argued, cannot be independently verified.

Do you have reverse engineering experience on Android?

APKs use the zip format. Extract their contents and compare those, minus the META-INF directory, which contains digests and a detached PKCS#7 signature.

Apps whose code output isn't reproducible can still be compared with a varying amount of IDA analysis.
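
To make that concrete, here's a minimal sketch of the comparison (the file names are just examples, not anything Signal-specific):

    import hashlib
    import zipfile

    def content_hashes(apk_path):
        # Entry name -> SHA-256 of its bytes, ignoring META-INF/ (digests + signature).
        result = {}
        with zipfile.ZipFile(apk_path) as apk:
            for name in apk.namelist():
                if not name.startswith("META-INF/"):
                    result[name] = hashlib.sha256(apk.read(name)).hexdigest()
        return result

    def diff_apks(a, b):
        # Entries that differ, or that exist in only one of the two APKs.
        ha, hb = content_hashes(a), content_hashes(b)
        return sorted(n for n in ha.keys() | hb.keys() if ha.get(n) != hb.get(n))

    # Example: a store-downloaded build versus one you built yourself.
    for name in diff_apks("Signal-play.apk", "Signal-selfbuilt.apk"):
        print("differs:", name)

An empty diff means the two APKs ship identical contents and differ only in who signed them.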


Re-reading this post, I'm not sure why I typed IDA -- I meant baksmali. IDA is still useful for bundled ELF dependencies.



Thanks for that link. I'll go through the build process and play with the apk.

But even if I wanted to build my own APK and run it on my custom Android build, it wouldn't work, right? Because of the need for the Google Play Store?

Verifiable builds are at least a step in the right direction.


Why do you need the Play Store? You can install APK files without it just fine, if you have the appropriate dev options enabled on your phone.


Signal requires the Google Play Services to work, and includes several proprietary libraries from Google in their app, too.


And so? You're still trusting them and the various libs they're using to protect you... Did you read the code? Did you understand it?

Me neither. How a bit of software like this can obtain any kind of reputation, when we barely trust OpenSSL anymore, is fucking beyond my understanding...


This criticism applies equally to every piece of software you haven't written yourself. You have to trust someone at some point. OWS has demonstrated that they are more trustworthy than their competitors.


Sure, but the ease of someone abusing this is far greater than what we've been using before.

Can you honestly say that you trust signal over locally or even airgapped pgp'd text?


Nope! But they solve two completely different problems. And the metadata with Signal is actually much better than PGP+email. Have you looked at email headers recently?


You could just look into their APK.


Survey says: NOPE.

Realistically, the code in that repo probably isn't everything that would run on your device, even if you built it yourself.

I'd bet on some 'fetch JS from somewhere' code in there which could completely fuck up the whole thing, acting as a help screen or something, and which would be very hard to find...

There is literally no way this sort of thing can ever be trusted. Christ, we barely trust PGP anymore...



All well and good for us, but we're probably not the target here anyway.

And be honest, did you really bother to do all that?


The whole point of this sort of thing is that you can detect a mass attack. If only one person finds a problem, everyone gets the benefit.

It's much harder to mount a targeted attack, especially if you are not Google or don't have the full support of Google.

Nothing is perfect, but there is very little that is better.

Your only other option is not using the devices at all.


I think as long as they are GPL, you can use the infrastructure.

https://github.com/WhisperSystems/Signal-Android/issues/282


You can, but moxie will throw a temper tantrum about it.

https://github.com/LibreSignal/LibreSignal/issues/37#issueco...


You call that a temper tantrum? I thought he was being very reasonable.


We'll agree to disagree.

If he said "Please don't use our servers with your custom client, shit might break. You could run your own infra and keep it up to date with us and we'll federate", this would be reasonable.

Instead, if you read that thread a bit more he throws up his hands and says federation is a thing of the past because he had a bad experience with CM. Yeah, because we don't still use federated standards like email and telephony every day. Come on.

https://github.com/LibreSignal/LibreSignal/issues/37#issueco...


Yes, and because it's a federated protocol, it still sends every damn thing in cleartext by default.


Federation does not require things to be transmitted in cleartext by default. Don't know where you're pulling that from.


Federated standards have problems with spam and rolling out encryption-by-default. This applies to both examples you mentioned.


Just read the thread. I am mostly on your side; I understand his point, but it is weird to license your code GPL and then try to prevent derivative software from working.

This feels a bit like "bad intention". Alternatively, you can choose a less liberal license and then make your derivative product from that (Chromium -> Chrome).


But.. But!

Everyone in the geek press said it's "end to end" encrypted! That must mean it's totally safe! I'm sure everyone is reading the source code and building their own clients all the time, yea? Didn't some uber hacker somewhere with a big beard say that he read all the code and it's totally fucking solid and government proof?

Surely you don't mistrust all that, do you... I mean... You seem a bit dubious. You can't be trying to say that there are people out there who are actually not buying this horseshit, who are genuinely withholding trust in these sorts of 'services'?

Blasphemy! You think that there are people out there with a basic grasp of security and cynicism who might feel offended and depressed at how easy it's been to dupe everyone like this... again?

Nah. I'm sure we're all totally sold on signal. Seems legit.

I mean, the fact that all the users of such a service have pretty much zero control/visibility over the code that's actually running on their own devices and talking to this 'service', let alone what's going on on the servers (in some N. American datacentre, probably super secure...), might raise a few eyebrows, but it doesn't really change the fact that we're talking about END TO END encryption that someone said was totally 'secure'. And even the media have pushed a few stories about how evil criminals are using it to evade monitoring..

No, I'm not sure I'm on your side. This is clearly totally secure and no one could possibly abuse this kind of thing..

Right?.... Right?

starts crying and loads the rifle


The lack of federation is unnerving, but the main server-less alternative I know of is PGP or maybe OTR, and PGP is way too hard for most people IMO. Also, OTR doesn't trust the fingerprint from first messages, so even that introduces an extra hoop to jump through.


For E2E? There's essentially PGP (sucks, see the thread from the other day: worst user experience and questionable long-term security), OTR (most used, still annoying to have to re-establish sessions on restart, no multi-device), and OMEMO (based on Signal, should be the future for XMPP). You could include most proprietary solutions like iMessage, which usually lack authentication or do it unsafely.

OTR not trusting the fingerprint from first contact is the default for any authenticated standard: GnuPG requires that you set the trust on a key; Signal trusts it initially and warns if it changes.
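
That trust-on-first-use policy is tiny to express. A toy sketch (my own illustration with a made-up local store, not any client's real code): pin a contact's fingerprint the first time you see it, then warn loudly if it ever changes.

    import json
    import pathlib

    STORE = pathlib.Path("known_fingerprints.json")  # hypothetical local pin store

    def check_fingerprint(contact, fingerprint):
        known = json.loads(STORE.read_text()) if STORE.exists() else {}
        if contact not in known:
            known[contact] = fingerprint          # first contact: pin silently
            STORE.write_text(json.dumps(known))
            return "pinned on first use"
        if known[contact] == fingerprint:
            return "matches pinned key"
        return "WARNING: key changed - verify out of band before trusting"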

For all the above though, they aren't a packaged solution that is user friendly. Conversations is a great android client, but there's sadly nothing comparable on iOS (chatsecure, but it's alpha status still). The desktop situation is hit and miss too (I was a pidgin user and moved to bitlbee for OTR, still don't have an omemo solution... though my setup doesn't need it through a TLS secured ZNC).


It's fairly new, but OMEMO was just officially published (now that there are free specs available to reference and everyone can be sure there won't be weird legal trouble later), and Conversations (an XMPP client for Android) has just started using a new Trust model (https://gultsch.de/trust.html). It's early days, but I can see this being a good compromise between security and ease of use (like Signal). We'll see.



