This worries me. What exactly do you mean by encrypted in the DB?
If you actually mean hashed with a one-way function, it would be very easy to hash every possible phone number (there are not that many) and build a lookup table that deanonymises your users instantly. Even if you salt each hash individually, it would not take long at all to find the phone number for each entry.
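To make this concrete, here's a rough sketch of the attack using a toy 4-digit number space (the `+9594…` prefix and the range are hypothetical; a real national phone-number space is only ~10^8-10^9 candidates, which is still trivially enumerable):

```python
import hashlib
import os

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Toy candidate space; a real attacker enumerates every valid number.
NUMBERS = [f"+9594{n:04d}" for n in range(10_000)]

# Unsalted: one precomputed table reverses every stored hash instantly.
table = {sha256_hex(num.encode()): num for num in NUMBERS}
stored = sha256_hex(b"+95941234")          # what the DB would hold
assert table[stored] == "+95941234"        # user deanonymised

# Per-entry salt: no shared table, but each row only costs one fresh
# sweep of the candidate space -- seconds of work per user.
salt = os.urandom(16)
stored_salted = sha256_hex(salt + b"+95941234")
recovered = next(n for n in NUMBERS
                 if sha256_hex(salt + n.encode()) == stored_salted)
assert recovered == "+95941234"
```

Even a deliberately slow hash (bcrypt, scrypt) only multiplies the per-row cost; it can't fix a candidate space this small.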
If you do mean encrypted, then authorities could compel you to turn over the database and the key.
Why do you need to store the phone number in any way at all?
Perfect security does not currently exist. A trusted party has to store the information somewhere in order to authorize and validate users without spreading that information elsewhere.
You can't get around this problem unless you invent magic psychic computers. What is the point of finding every possible flaw in the security here? There is a gradient of complexity: the time it takes to break these things. Currently, everything that exists is susceptible to being broken, misused, or modified.
If you assume that your attackers know everything, and have the ability to immediately find and apply that knowledge, then yes, it can seem scary. But I tend to think that the more capacity a person has, the bigger the intellectual burden that comes with actually using it.
So it sounds like messages are associated with accounts, accounts are linked to phone numbers, and those phone numbers are easily recoverable from a centralized database. If that's accurate, then it's not simply imperfect anonymity; it's not even pseudo-anonymous. It's about as not-anonymous as it gets, which is fine for a casual messaging/chat app where no anonymity is expected. But users should not be misled into thinking it's safe to use it for anything they wouldn't say to a government official's face.
You are right; I was reacting to a pattern of argument, which is not what I should have been focused on. That just adds noise around more important issues.
> What is the point in finding every possible flaw with security here?
Because security is only as good as its weakest link, and an adversary who wants your sensitive data won't choose not to break in because your security is "mostly OK".
At the risk of being overly negative, if you have this attitude, you're not building a secure system, and, for your users' sakes, you shouldn't say that you are.
That was not my point; I overreacted to a pattern of argument that bothers me when my own mind repeats it.
I agree that security should be taken seriously, with care and caution, and that users who need to understand the faulty assumptions baked into a device's usage should be able to find that information easily.
Obviously there exist people who believe that the things they say in this app, which claims to provide "anonymous" communication in a country with a history of repression of free speech, aren't traceable back to themselves. We've established here that this isn't true.
It's fine if you think this is an unreasonable conclusion to draw based on the evidence; you're entitled to your opinion. But the question is, do you feel comfortable with the possibility of people going to jail or dying because they didn't understand the security tradeoffs that this app makes? I think it's because some of us see this as an extremely real possibility -- particularly given the frankly ineffective security practices described here -- that you're seeing so much backlash.
The OP was probably replying to the fact that larger nations are able and willing to lean on service providers to extract personal information and deanonymize private conversations. This is a normal mode of analysis for citizens trying to secure their conversations from the US government, for example.
The Burmese government, however, is not likely to be able to perform something like this, especially if the authors are not in Myanmar. In fact, there's a reasonable chance that the authors of the software are CSOs sponsored by other governments that would like to see changes in the regime in Myanmar (like the CIA's ZunZuneo app in Cuba); in that case it is not likely that the government there will be able to play effective whack-a-mole.
The danger here is that the government will ban phones, or create some sort of licensing regime for them, if this becomes a large enough civil-unrest problem. But it's equally likely that other countries would then respond with sanctions and human-rights complaints against the regime, on principles of 'freedom'.
I want to see you design a system so complex it cannot be understood.
Some people think cryptography is this utterly complex thing sitting on the edge of understanding. It isn't.
Evidence that you rolled your own crypto, authentication, or key-exchange mechanism is the first thing any attacker worth worrying about will look for.
Developers design systems that are easy to break out of ignorance and hubris. That's not to say you can't learn how to implement a secure system, just that if you did any research you'd know Rolling Your Own Is Bad: proper design is hard, and people with much more experience than you are aware of choke points in your design that you aren't.
Complexity = time to brute-force a crypto algorithm. I am using the word in the formal, traditional sense, where the cost of a brute-force solution (and of heuristics for intelligent solving) is literally measured in computational-complexity metrics (big O) or probability.
Try designing a system that can't be exploited and must rely on its algorithm's correctness such that only a brute force solution exists.
Few algorithms are correct in the sense that their computational complexity alone is what provides their security. Without a formal proof, an audit, and testimony from experts, I will not believe your hand-rolled algorithm is correct.
This is why you rely on proven algorithms and implementations and never roll your own. But you should not believe that using a correct algorithm alone should be enough to deem a system secure.
Its implementation might be exploitable in a way that sidesteps the security provided by your algorithm.
The idea that it might take an attacker five billion years to brute force your cipher text for a solution is nice, but if you're exchanging keys in an insecure way then that security goes right out the window.
Storing your salt in the same DB? Your search time for a solution is cut down, since the attacker can grab that as well.
This is why emphasis is placed on tearing apart a system that claims it is secure. Most of the time it isn't, and usually in ways that are easily identifiable.
Having good faith in the developer isn't an ideal when the well-being of many people might be at stake because of claims that cannot be backed up.
Perfect security may not exist, but much better designs than storing encrypted phone numbers in a central database have been thought of, and they are widely explained in the literature on the subject.
Yes, it is possible to match a phone number to a hash. (We are also looking at ways to make it more secure. If you have any suggestion, please let us know.)
However, hushes are not connected to the phone number; we create a unique token for the user when a hush is created.
We needed phone numbers for our upcoming features. The newer version, which won't need a phone number to log in, is on its way.
> We needed phone numbers for our upcoming features.
> The newer version, which won't need a phone number to log in, is on its way.
So which is it? New features need the phone number, but the new version won't require it? Are you going to ask users for permission to use their phone number to access the new features? Will users who prefer their anonymity be able to interact with users who surrender their identification?
1. Use a hashcash-style proof-of-work mechanism to register accounts. The server sends a challenge and the client computes a nonce under the difficulty target. This can be tuned to take an average of seconds to minutes on your median user's device.
This isn't a long-term solution, because eventually an attacker will realize they can use more powerful machines to create sybils and abuse your network.
I'm actively working on alternatives ATM.
2. Embed Tor. The Guardian Project folks will be able to give advice on embedding a Tor proxy in your app. There is documentation online as well. ChatSecure has done this.
This would reduce the amount of information you have in your possession for authorities to seize.
This would not protect your user base if authorities compel you to ship a malicious software update.
3. Also use SSL, and pin the public key of your SSL cert.
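The proof-of-work in step 1 can be sketched roughly like this (the leading-zero-bits test over SHA-256 is one common formulation, an assumption here rather than the exact hashcash spec; `DIFFICULTY_BITS` would be tuned so solving takes seconds to minutes on a median phone):

```python
import hashlib
import secrets

DIFFICULTY_BITS = 12  # toy value; tune upward for real clients

def meets_difficulty(digest: bytes, bits: int) -> bool:
    # True if the digest, read as a big-endian integer, has at least
    # `bits` leading zero bits.
    return int.from_bytes(digest, "big") < (1 << (len(digest) * 8 - bits))

def make_challenge() -> bytes:
    # Server side: issue a random challenge per registration attempt.
    return secrets.token_bytes(16)

def solve(challenge: bytes, bits: int) -> int:
    # Client side: search for a nonce whose hash meets the difficulty.
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if meets_difficulty(digest, bits):
            return nonce
        nonce += 1

def verify(challenge: bytes, nonce: int, bits: int) -> bool:
    # Server side: checking a submitted nonce costs one hash.
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return meets_difficulty(digest, bits)

challenge = make_challenge()
nonce = solve(challenge, DIFFICULTY_BITS)
assert verify(challenge, nonce, DIFFICULTY_BITS)
```

The asymmetry is the point: solving costs ~2^bits hashes on average, verifying costs one. As the thread notes below, that asymmetry favors attackers with more hardware than your median user, which is the scheme's core weakness.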
Hashcash is broken for any situation like this. The amount of hashcash time needed to thwart spam is more than any user is willing to wait while doing proof of work on a phone. Plus, spammers normally have their own cloud computing to fall back on, and it's cheaper than Amazon AWS.
Interesting observation. I wonder if you could hide this work behind an approval period, much like waiting for a beta invite. Framed like this, it just builds anticipation, and the user doesn't necessarily need to know.
But of course, this would require a client-side component with access to system resources. Workable for a native Android app, but perhaps not for desktop/browser...
This is amazing (that someone is doing this). If there's anything the global tech community can do to help this team, I suspect it would be freely available -- anything to make Myanmar people better able to communicate safely is a huge win.
Great idea, but I don't get why they are based in Myanmar. This is almost asking for trouble from authorities. For something whose direct purpose and sales pitch is to do something that an authoritarian government is against, I would have based the entire company outside the country.
Most of us have never been to a foreign country. We graduated from Burmese universities. (Only the CEO of the company studied in Singapore, for about 3 years.)
Because they are Burmese? They did register the company in Singapore, according to the article. Also, the government isn't exactly authoritarian anymore (we all hope). Nevertheless, it does seem risky.
It's a start if you don't have the funds to physically move to a different country. Besides, it's 2015; the Saffron Revolution pretty much ended around 2008, with the new constitution and all. It's not the Burma you know from 2008's Rambo. ;)
IIRC, those are "partially weakly anonymous". There are no pseudonyms in Secretly for the "main" posts; they are anonymous, even though the anonymity isn't even remotely strong. Machine-generated pseudonyms (identicons) are used only for discussion comments there. I don't know about Hush, though.
Either way, there's nothing revolutionary about this sort of app except the marketing statements. Anonymous BBSes have been around for decades, and this app has exactly the same concept, except for being a mobile app instead of a website or desktop one. The only relatively novel part here is location-awareness.
Please update the title to "This new messaging app from developers in Myanmar is kind of revolutionary", the submitted site uses that now. The original was very misleading.
How do you ensure that user accounts are not linked to personally identifying information?
We've seen that when folks implement weak anonymity technologies, disaster quickly ensues.