>The most intersting fingerprinting vector I found on Tor Browser is getClientRects. Is strange that reading back from a canvas has been prevented but simply asking the browser javascript API how a specific DOM elements has been drawn on the screen has not been prevented or protected in any way.
This isn't as strange as he makes it sound, it is done to prevent the link color history attack [1]. Most of the other CSS properties aren't allowed on :active or :visited modifiers.
Only because JS has been used as lipstick on the html pig. Html should do a lot more by default, which would make javascript unnecessary in 90% of the case. A few features that should really be html based:
- form/input server-side validation (you would specify a URL as an attribute of the input / form to which what-if data would be posted)
- input auto-complete (same thing, URL in attribute of the input)
- adaptive design (they should rethink CSS with various formats in mind)
With these 3 things alone I think you can pretty much create a fully working JS-free website. You would only need JS if you really need to build a SPA (which should be the exception: online trading platforms, etc).
The fact that now even a blog article is not viewable without JS is a joke.
Is there a way to take another approach of preventing dynamic data from getting back without an explicit opt-in from the user? This will never happen of course on the regular internet, but for hidden services or some other static-page-only-unless-opt-in surely even if fingerprinting information can be obtained, can we block it from getting back to the host?
It seems like a losing battle to me. You'd have to prevent Javascript inserting links into the DOM (it could stick parameters in the URL), inserting images (similar), loading any assets from anywhere programmatically, any AJAX requests, any redirects, setting any cookies... and probably more besides.
...and eventually you'd have some chap like the OP here who will come up with a clever way to exfiltrate information somehow anyway.
Lots of ideas, many of which I've had as well, but I am missing conclusions. On the demo page it tells me my CPU benchmark and some scrolling measurements. Great, but how unique was that now? And how are you going to make the data points into a fingerprint? Because next time I scroll, I will totally scroll a millisecond differently.
He needs to collect data first in order to be able to say something about that. Panoptoclick [1] can report on uniqueness because they have test data from thousands of clients. Perhaps these fingerprints can be added there for the exposure (and because they will work to identify non-tor browsers as well).
> And how are you going to make the data points into a fingerprint?
The two "scrolling deltas" arrays are very different in nature, you could easily drop all the zeros and boil it down to "all 3" or "not all 3". That would give a nonzero contribution to the number of bits of that form an overall fingerprint. Similarly for the CPU benchmark, a phone is not as powerful as a desktop, so a result of "500" on one and "2800" on another are very likely different machines. So bin it to the nearest 500 and you'll have another non-zero contribution. Repeat for client rectangles and so on.
Good points but not even an attempt is being made at using the data. He could have tried his laptop, his phone and his mom's tablet or something, at the very least, though that would probably still be hugely overfitting the data.
I don't believe that any of these will link different Whonix instances on the same host machine. Using Tor browser in the same OS that you use for general work is not secure. Even sharing the same host machine is insecure, where anonymity really matters.
The author is missing the point of the Tor Browser.
They don't try to make fingerprinting impossible. They want to make the outcome uniform across all users.
See "Strategies for Defense: Randomization versus Uniformity" in their design docs [1].
And (as other said) uniformity is increased when using an anonymous/privacy enhancing operating system like Tails or WHONIX underneath.
This isn't as strange as he makes it sound, it is done to prevent the link color history attack [1]. Most of the other CSS properties aren't allowed on :active or :visited modifiers.
[1] http://dbaron.org/mozilla/visited-privacy