Tor Browser Exposed: Anti-Privacy Implantation at Mass Scale (hackernoon.com)
122 points by Jerry2 on Sept 21, 2016 | 62 comments



This piece is charged with the personal bias of the author (https://twitter.com/movrcx), who launched a hostile fork of the Tor Browser Bundle because of "untrustworthiness".

I recommend you read this instead, which provides a more level-headed and technically correct analysis of the vulnerability (which was real, even if not quite in the terms described by OP):

https://hackernoon.com/postmortem-of-the-firefox-and-tor-cer...


Never mind the fact that this hostile fork explicitly disabled code signing for extensions [0] in order to meet an arbitrary shipping date [1].

[0] https://github.com/IndependentOnion/rotor-browser/commit/916...

[1] https://github.com/IndependentOnion/rotor-browser/issues/7


Oh, but not just any kind of “untrustworthiness”: from what I gathered from Twitter at the time, he did it after the Appelbaum affair broke out, to provide a “non-SJW” fork of Tor Browser. And if I'm not mistaken, he initially forked the wrong repo; it was a running joke for a couple of days... Nice to see the farce still going strong.

EDIT: s/Tor/Tor Browser/


Actually, people thought he forked the wrong repo because he forked the browser, not the Tor node repo... and then he found this bug in the browser.

I suspect that wasn't planned, but regardless, he did fork the "right" repo.


Ah, right, thanks. But this thread got me thinking yesterday: if TP is untrustworthy, shouldn't he have forked the node code? What use is forking what I understand to be just Firefox with locked-down security-related settings, set up to proxy through Tor? So in that sense he did fork the wrong code, and the TP people were right.


Well, one explanation is that he thinks the threat model is such that the more likely risk is RCEs against the browser; if Tor directory authorities are doing untrustworthy things, that can probably be detected more easily than deliberate exploits.

It's also a very pragmatic plan: rebooting the whole Tor relay/exit node community is going to be a long-term project without clear gains at the end. If your replacement looks like Tor, in that you continue to invite volunteers to run nodes, and the previous set of nodes was compromised, what prevents those nodes from joining your network too? Arguably it's better to focus on forking what you can fix, which is apparently the browser.

But that's the charitable explanation; the less charitable one is that he forked the browser because that's where the Tor branding is... Fork the relay codebase and you can't take pretty screenshots of your new fork.


As I recall, the board of the Tor Project recently changed substantially amid some drama. Is this related to that?


I think it is, but I didn't really follow. The whole story is nausea-inducing. You have, apparently, a respected, turns-out creepy/selectively abusive guy (one of life's great archetypes) get called out for sexual abuse. Numerous people confirm, some deny. The alt-right and associated actors immediately jump on this and start hating on the Tor Project because of “SJW mob lynching” etc. Then accusations start that the Tor Project is deeply infiltrated by the CIA to, I guess, try to marginalise Appelbaum. Of course, everyone ignores the fact that Appelbaum himself practically acted as a CIA agent in Burma some years ago via Radio Free Asia. “The public” demands a purge of the “SJW/CIA-infiltrated” TP board. At that point I just stopped paying attention, for the sake of mental hygiene. From this thread I learned that the board did change in the end, so I'll google what happened, but I'm not in a hurry.


> a “non-SJW” fork of Tor

Hahaha, what?



What a dumpster fire... how does someone write that out and think it's a good idea to broadcast it to the entire planet?


Thanks for the background; I got the sense that the author had an axe to grind.

Still, I seem to recall that when the Tor Browser auto-update mechanism was deployed, the idea was that HTTPS with pinning was only the first step, and that going forward the updater would also check PGP signatures. It's a bit disappointing to see that hasn't happened yet.

Combined with reproducible builds and several trusted signers independently verifying the built binaries and signing the resulting package, this would add considerable security to the update process.
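
For illustration, a signature check on top of the existing HTTPS download could be as simple as verifying a detached GPG signature against a pinned keyring before applying the update. A minimal sketch (this is not Tor Browser's actual updater; the file names and keyring path are hypothetical):

    import subprocess

    def verify_update(update_path, sig_path, keyring="./pinned-signers.gpg"):
        # Accept the update only if its detached signature validates against
        # a pinned keyring; never fall back to the default user keyring.
        result = subprocess.run(
            ["gpg", "--no-default-keyring", "--keyring", keyring,
             "--verify", sig_path, update_path],
            capture_output=True,
        )
        return result.returncode == 0

    # e.g. a .mar update archive plus a detached signature shipped next to it
    if not verify_update("update.mar", "update.mar.asc"):
        raise SystemExit("signature check failed; refusing to apply update")

With reproducible builds, the pinned keyring could hold keys from several independent signers, and the check could require k-of-n valid signatures rather than just one.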


From /r/TOR:

"Old news. This was fixed in 6.0.5.

https://blog.torproject.org/blog/tor-browser-605-released

Interesting note: The author is part of the rotor browser fork that is going nowhere so far. Doesn't look like the reported issue has been fixed there. In fact, no commits since before this blog post."

https://www.reddit.com/r/TOR/comments/53u1cd/tor_browser_exp...


> Old news. This was fixed in 6.0.5.

6.0.5 was released five days ago. There is no universe where this qualifies as old news.


The highest-scoring response to that comment is

> I'd honestly suspect the author was COINTELPRO were it not for the fact that so many of his statements and code are literally laughable.

So there's that!


Tor is not, nor has it ever been, trustworthy. Hell, you can still try active deanonymization for yourself: https://github.com/Miserlou/Detour

This didn't use to be a problem, as it was essentially run as a sandbox project for the academic anonymity community. It was very up front about its capabilities and limitations.

Unfortunately, in recent years, the US government has been bankrolling more "privacy" software development through its propaganda arms (OTF, RFA, etc.), and the Snowden revelations have led private foundations to follow suit.

As such, the organization doubled down on rebranding itself as a "human rights" _tool_, as this is what grant-giving organizations love to promote (free speech in Iran, activist publishing, etc.). This, combined with overly enthusiastic do-gooders gaining more and more prominence in the Tor organization, has led to the dangerous situation of promoting inherently insecure software as a security solution to vulnerable people. This is a general problem in the scene (remember when those activists in South America got vanned for using CryptoCat?), and one that I've been guilty of myself in the past.

I really hope the new board steers them back to the academic realm and slaps a big red USE AT YOUR OWN RISK warning on the tin. Unfortunately, I think the opposite will happen.


I'm not sure where this narrative is coming from. Tor was developed as an anonymity network by the US Naval Research Lab. It was designed for use by military and intelligence. Tor was never just some academic experiment.

Tor is still a valid tool. No, it wasn't designed to foil NSA-level surveillance, because it was built by the US. But this vulnerability isn't even in Tor itself; it's in the Tor Browser.

The Snowden leaks contain slides where the NSA clearly laments the use of Tor, so saying that it has never been trustworthy is simply not true.


It was "designed" for military use in the sense they were computer scientists working for the Naval research, but the community felt like an academic one. For instance, it was originally published at USENIX. The majority of discussion of Tor was related to papers on Anonbib, not about code itself. TBB didn't even exist until much later. I've never seen any evidence that it was ever used "in production" by the military before it was made public.

Re: the NSA Tor slides - they're really not as damning as you suggest - http://i.imgur.com/cnOeVQf.png - and they also predate the FBI being caught using remote code execution exploits against Tor users.


> but the community felt like an academic one

How do you know what the development environment 'felt' like? Are you Roger Dingledine or Nick Mathewson?

And I doubt that you will find any evidence that it was used before being made public. Using Tor before it was public would be like screaming, "HEY, I'M HIDING SOMETHING! AND I'M US MILITARY OR INTELLIGENCE!". The whole point of releasing it was to gather a userbase. Otherwise, Tor wouldn't be very anonymous at all.


I was working on a related anonymity project at the same time, funded by the same grant-giving organization. I lurked and participated, I have met RD and NM IRL, and I have been a hidden service operator for many years. That being said, I am only at the far periphery of the project. But I have been out here for a while, and I stand by my original statement.


Well despite our differences in opinion, I respect your contribution to anonymity and privacy. And I'm a bit envious that you've met RD and NM.


They're just normal guys who are quite approachable - go to the right conferences and you'll meet them.


Almost all the research out of NRL feels academic. It's fundamental research by professional scientists. I don't know what else you'd expect.


Of course, this is exactly my point. I think Tor was healthier when it was academic, not activist.


> Tor is not, nor has it ever been, trustworthy. Hell, you can still try active deanonymization for yourself: https://github.com/Miserlou/Detour

Is it even possible to protect against end-to-end time correlation attacks without massively increasing latency?


I'd call that an "open problem."


I'm fairly certain it isn't possible. If you can detect, with high confidence, when an entity sent a message and when an entity received one, timing attacks can be carried out. The traditional means of preventing this involve sending dummy traffic, introducing unpredictable latency, or using some sort of broadcast to conceal the exact receiver. None of those things is practical for a low-latency proxy.

It's been a couple years since I studied this, so I might be wrong.
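
To make the idea concrete, here's a toy sketch (illustrative only, assuming numpy; real attacks are statistically fancier) of how an observer who sees both the entry side and the exit side of a low-latency circuit could match flows on timing alone:

    import numpy as np

    def timing_signature(timestamps, window=0.1, duration=60.0):
        # Bucket packet timestamps (in seconds) into fixed windows.
        bins = np.arange(0.0, duration + window, window)
        counts, _ = np.histogram(timestamps, bins=bins)
        return counts

    def flow_correlation(entry_ts, exit_ts):
        # Pearson correlation between the two timing signatures:
        # flows on the same circuit score near 1.0, unrelated ones near 0.
        return np.corrcoef(timing_signature(entry_ts),
                           timing_signature(exit_ts))[0, 1]

Dummy traffic and unpredictable latency blur exactly this correlation, which is why a low-latency design like Tor can't afford those defenses.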


A more dangerous side effect of branding Tor as a "human rights" tool: if it's a "privacy" tool, people in oppressive regimes can legitimately use it (e.g. people working for such regimes), whereas if it's a "human rights" tool, its mere presence on a computer is evidence of guilt.


I haven't lived in a lot of oppressive regimes, but usually this is not how it works. I lived through two close examples.

When I was a kid, a friend of my grandmother was arrested because a neighbor said he was a communist. He was tortured for a week until his party, a center/right-wing one close to the government of the day, got him out. Oppressive regimes usually don't need evidence.

A friend of mine was imprisoned for two years, accused of terrorism. The proof? A copy of War and Peace (not even a photocopied book) and a Guns N' Roses poster. And this was in a "democracy"... so stupid evidence is also used, and anything can count as evidence: a book about Cubism was once treated as a book about Cuba's ideology.

For years, Tor has been seen by "regular"/"normal"/"common" people as a tool for drug dealers or child molesters. The switch to a human rights tool doesn't seem to push it much further into illegal territory.

Anyway, oppressive regimes do whatever they want. Tor can avoid some of the spying, but if the state is already seizing your computer, you are screwed with Tor or without it.


I think you're missing his point. Many people in oppressive regimes, including the oppressors, are interested in a bit of extra privacy. The regime might even be able to get around it, since a lot of "privacy" tools are bullshit. A tool saying it's specifically designed to defeat oppressive regimes is practically an attack on them. Even possessing it says you're fighting the regime. So, eliminating all such tools, or the people using them, is a logical response for oppressive regimes.

And people on Pieter's side of this issue probably think it also logically follows to prevent that mental connection from happening in government officials' minds. Just gotta change how it's branded.


I understand the point; what I am saying is that things don't work this way, neither in my personal experience (which I described) nor in the areas where I work (I work for a human rights organization, which doesn't recommend any software, btw). There is a difference between how you think people should react and how people really react.

As concepts go, "human rights" triggers fewer alarms than anonymity. First, while I don't know which regimes you consider oppressive, there is a big probability that they themselves think they comply with and/or promote human rights. Iran, for example, has an Islamic Human Rights Commission and proudly promotes it. Israel would be another country that fits this example.

Then there's the fact that anonymity can suggest the thing that scares them the most, which is not human rights defenders but spying. Tor is already on a bad list for this reason, the same as any anonymity software. The biggest threat those countries face is still military intervention or terrorism. A friend was arrested while taking pictures in Palestine; when questioned, he was asked whether he had Tor or I2P installed, or PGP or any encryption software on his laptop. They didn't take his laptop away, but that was before the switch to the "human rights" brand.

Then there is another angle we can take: Tor as circumvention. Another friend, when visiting Sudan, got a pamphlet saying not to use Tor, VPNs, or proxies when he applied for the visa. The hotel made the same requirement. This was 4 years ago. The reason was not that Sudan has been on the list of the worst human rights offenders, but that you could access immoral content with it.

So, while I understand the point, it doesn't seem to be backed by reality.


> This, combined with overly enthusiastic do-gooders gaining more and more prominence in the Tor organization, has led to the dangerous situation of promoting inherently insecure software as a security solution to vulnerable people.

What is the inherently secure alternative available to these vulnerable people?

> I really hope the new board steers them back to the academic realm and slaps a big red USE AT YOUR OWN RISK warning on the tin.

What causes you to believe that activists and vulnerable people would stop using Tor if this warning were in place?


Are you asking what they should use instead? Because there is no alternative. My point isn't that they should be using some other specific piece of software, or even that they shouldn't be using Tor, just that they should be aware that Tor is old software with many known vulnerabilities, a poor security track record including active RCE attacks by governments and private companies, etc. If you go to the homepage, you'll see none of that.

This might not deter people from using Tor entirely, but at least they'd understand the threat model a bit better so they could change their behavior accordingly, for instance maybe not using it at their own home, not using it for Facebook, etc.

As much as we might want to believe that we can code our way out of anything, human rights demands more than software.


> Because there is no alternative.

Just out of curiosity and for purely educational purposes (I admittedly know very little about the topic), how about something like I2P or Freenet? Are these at all viable alternatives?

What other choices do we have for "anonymity" outside of the usual VPN/Proxy?


Anonymity isn't binary. Perfect anonymity is, and always will be, completely impossible. Practically speaking, we can only talk about reducing risk in different scenarios.

I like I2P, as it has a reduced threat footprint compared to Tor since it's all internal, but it's also a smaller network, probably more vulnerable to Sybil attacks, with less audited code, etc.

Again, my point is that there is no perfect solution, it all depends on the situation and the requirements.


Most of the issues seem to revolve around accessing clearnet resources through Tor. Hidden services within Tor still seem secure (whatever that word means these days).


Yet again, Tor gets blamed for a Firefox vulnerability. Surprise, surprise...


To be fair, it's the Firefox that comes as part of the popular official Tor Browser bundle.


The fact that Mozilla is so slow in implementing real per-process sandboxing in Firefox, and that it doesn't even plan to rewrite most of the browser in Rust over the next few years, makes me think that maybe Tor should just bite the bullet and rebuild on top of Chromium, while vigilantly watching out for anti-privacy features they'd need to remove.


They can't use Chromium because the Chrome team refuses to maintain certain features required by Tor, such as routing all traffic through the SOCKS proxy [0]. They would have to literally create a new browser.

0: https://bugs.chromium.org/p/chromium/issues/detail?id=80722#...


They're targeting the wrong level of abstraction. I think people should use Whonix or Tails instead of Tor Browser.


>I think people should use Whonix or Tails instead of Tor Browser.

I'm not sure I follow. Tor Browser is the default browser of both operating systems.


Whonix and Tails are OSes, meaning they can override what the browser tries to do in terms of network access. If Chromium prevents a plugin from directing all traffic to SOCKS, just take control of Chromium's network settings from the underlying OS (Whonix, Tails).

Note: This just answers the question about browsers or other pieces of software that don't allow control of their network components. It doesn't address a vulnerability in that software.


If it is, it's for their anti-fingerprinting features and not for Tor. What I'm saying is that running Tor on a machine that also connects to the public Internet is a bad idea. Use full-system Tor through iptables at the very least, and ideally use Tor on the router or host machine so it's transparent to the guest.
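
For reference, the usual full-system setup is roughly the following (a sketch based on Tor's documented TransPort/DNSPort options; the ports and the tor daemon's username vary by distro):

    # torrc: accept transparently redirected TCP and DNS
    TransPort 9040
    DNSPort 5353

    # iptables: let the tor daemon itself reach the network directly,
    # exempt loopback, then force everything else through Tor's ports
    iptables -t nat -A OUTPUT -m owner --uid-owner debian-tor -j RETURN
    iptables -t nat -A OUTPUT -o lo -j RETURN
    iptables -t nat -A OUTPUT -p udp --dport 53 -j REDIRECT --to-ports 5353
    iptables -t nat -A OUTPUT -p tcp --syn -j REDIRECT --to-ports 9040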


The same MitM update attack can be leveled against all Firefox users, not just Tor Browser users?


Yes, but being the man in the middle is much easier when the target willingly places you in the middle of his connection...

The whole situation can be worked around by using a custom prefs.js that disables auto-updating add-ons (various other attacks can be prevented by tweaking settings in about:config, such as the WebRTC-related ones), and there are various websites providing privacy-oriented prefs.js files. A better workaround would be for the Tor Browser maintainers to ship such a file with it, and a real solution would of course be for Mozilla to fix things on their side.
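
For illustration, a minimal user.js along those lines might look like this (these are standard Firefox pref names, but treat it as a sketch, not a complete hardening profile):

    // stop extensions from checking for and installing updates on their own
    user_pref("extensions.update.enabled", false);
    user_pref("extensions.update.autoUpdateDefault", false);
    // disable WebRTC, which can leak the real IP address
    user_pref("media.peerconnection.enabled", false);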


It doesn't look like it:

1. You'd need to be able to MitM all of them (which is where controlling a ton of Tor exit nodes comes in handy)

2. You'd need to know an extension lots of people have (I'm guessing NoScript is default in Tor Browser)


Using Tor may actually be less secure than using a normal browser.

At least when I connect to Microsoft, Google, Facebook, etc., I don't expect to get hit by a drive-by JS exploit, and Google does help with "safe browsing".

With Tor, you're one HTTP website (or non-HSTS website) away from a drive-by virus, with no way to tell that you're connecting through a dangerous exit node.


The Tor Project runs several scanners for this behavior. Arguably, unless your ISP, your ISP's ISP, the coffee shop, etc. are all 100% on top of their game, this could happen in any one of those environments too.


So they shut down a node, the node operator notices and restarts it.

The problem is that people actually make money from malware. It's not bored college kids showing off skills. It's pros.

So think like a pro. You use a zero-day to hack into Verizon to feed malware, get noticed, and your hack gets reversed after an hour.

You open a Tor exit node on a VPS, use it to feed malware, profit. They close it, you re-open it on another host. They play whack-a-mole, and you rake it in.


The thing is that it takes a significant amount of time and bandwidth to get flagged as an exit and included on circuits. So your set of hosts is going to be pretty limited to start with; most hosting providers are pretty hostile to Tor exits as it is, and are going to shut down an exit hosted in their IP space because they don't want to deal with the abuse complaints. In contrast, the exit scanner can be among the first users of a new exit node. You could try to detect the scanner, but the nature of Tor is that this isn't feasible.

In any case, you can solve the problem of distributing software over Tor by setting up a hidden service. The Tor devs have been making noise for a while about creating an "onion service" that isn't hidden, but has the same guarantees as a hidden service (an improved version of exit enclaving).


Last time I checked, the Tor Project's handling of malicious exit nodes was one guy who didn't really care all that much about reports of active MITM attacks that were actually robbing people of money (well, Bitcoins anyway). He just flat-out refused to pull the nodes doing it.


Did you check prior to the multiple years of work on the exit scanner? Just lurking on tor-relays and tor-talk shows a lot of responsiveness on the part of the dirauths.


When you connect to Microsoft, Google, Facebook, etc., you're connecting via HTTPS, so there's nothing malicious a Tor exit node could do in these cases.


The security-conscious should be running NoScript anyway.


But then you'd become fingerprintable by standing out from the crowd.


By default, Tor Browser runs NoScript.


>The entire security of the Tor Browser ecosystem relies on the integrity of a single TLS certificate that has already been previously compromised.

Seriously? That seems like a really weird - to say the least - decision to make about something this important...


It's the TLS certificate for addons.mozilla.org. Since "Tor Browser" is a lightly modified Firefox that hasn't had its automatic add-on update checking disabled, and Mozilla's add-on signing process is an automated rubber stamp, that's a problem.

To be clear, I don't think it's so much a problem on Mozilla's part; perhaps manual review would be a good idea, but I doubt they have the resources. The problem here is that Tor Browser has claims made for it that aren't supported by the amount of work that's actually gone into making it secure. That would appear to be entirely on the people who run the Tor foundation, or whatever nonprofit structure it is that they use.


Is it really so easy to control a significant portion of Tor exit nodes? I seem to remember there are automatic systems and members of the project checking for suspicious nodes.


Yes, it's happened at least once before with Carnegie Mellon.


> Using Tor

This is the joke right?



