Absolutely it’s important - just because one hole is still open doesn’t mean another shouldn’t be closed.
And FF and Safari should continue their work to close any fingerprinting opportunities. Fingerprinting is becoming less effective over time - for example, fingerprinting on iOS is pretty unsuccessful.
While it has some native anti-fingerprinting protection (including automatically deleting third-party cookies every week), the main deterrent is homogeneity: you can be sure that the browser/device is Safari on an iPhone 12 Pro Max... and that's it. Other platforms leak far more: a website can learn which GPU is in the system (WebGL and Canvas), the screen resolution, the list of fonts the user has installed (indirectly, by testing them), the webcams and sound cards on the system (WebRTC), how many logical CPU cores there are (WASM), and whether the device has a battery (Battery API). That laundry list of API abuses means it is possible to individually identify desktop users and, to a certain extent, Android users.
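To make the "laundry list" concrete, here is a minimal sketch of a few of those data points as a fingerprinting script would read them. It only uses standard browser globals (`navigator`, `screen`, `Intl`); every access is guarded so the snippet also runs, mostly empty, outside a browser.

```javascript
// Hedged sketch: a handful of the fingerprinting signals mentioned above.
// In a real browser each field yields a data point; outside a browser the
// guarded fields simply come back null.
function collectFingerprint() {
  const nav = typeof navigator !== 'undefined' ? navigator : {};
  const scr = typeof screen !== 'undefined' ? screen : {};
  return {
    // Logical CPU core count (also observable indirectly via WASM threads)
    cores: nav.hardwareConcurrency ?? null,
    // Screen resolution
    resolution: scr.width && scr.height ? `${scr.width}x${scr.height}` : null,
    // User-agent string narrows down OS, browser, and device
    userAgent: nav.userAgent ?? null,
    // Locale and time zone, readable with no permission prompt at all
    language: nav.language ?? null,
    timeZone: Intl.DateTimeFormat().resolvedOptions().timeZone,
  };
}

console.log(collectFingerprint());
```

A real tracker combines dozens of such fields and hashes them into one identifier; the point of the comment above is that on iOS most of these fields return the same values for every user.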
> Time zone is one possible fingerprint data point.
I totally forgot that. Oops.
Now for the meat of your comment: ...and how many have tested their protections so that their testing site recognizes that your device is not unique?
A very good counterclaim was posted in the comments:
I strongly disagree with your findings, Ben. Namely, you list fingerprinting techniques available to browsers, but fail to mention how Safari (and Firefox, to some extent) makes those methods less precise. Instead, you say
Note that this isn’t a comprehensive list, it’s just examples. When a website analyses all of the data available to it, things get very specific, very fast.
So let me point out where you were wrong about Safari in particular:
• Fonts installed. Safari reports a very limited subset of fonts, which does not vary. It is the same for every Safari user.
• Plugins installed. Unsurprisingly, Safari lists just one: PDF reader. Native plugins are not reported.
• Codecs supported for video. The uniqueness-checking site reported just H.264 and FLAC. Audio formats are not reported at all. There's no mention of H.265 and VP9, which work in my Safari beta version, and no mention of the whole plethora of audio formats that are supported.
• Screen resolution is not the real screen resolution. I'm on 27'' 5K iMac and the screen is reported as 2048 x 1152.
• Media devices attached are reported as "audioinput" and "videoinput". This has nothing to do with the actual media devices available.
And incorrect reporting goes on.
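The media-device point above can be illustrated with a short, hedged sketch of the probe such a testing site performs. It uses the standard `navigator.mediaDevices.enumerateDevices()` API; in Safari (before the user grants camera/mic permission) the entries come back with generic kinds and empty labels, which is exactly the "audioinput"/"videoinput" behavior the commenter describes.

```javascript
// Sketch of the media-device probe: enumerate attached devices and keep
// only the fields a fingerprinter would look at. In Safari, labels are
// empty and kinds are generic until the user grants permission.
async function probeMediaDevices() {
  if (typeof navigator === 'undefined' || !navigator.mediaDevices) {
    return []; // non-browser environment (e.g. Node): nothing to probe
  }
  const devices = await navigator.mediaDevices.enumerateDevices();
  return devices.map(d => ({ kind: d.kind, label: d.label }));
}
```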
As you can see, fingerprinting through the browser leaves Safari users very poorly segregated. As long as you're running the latest OS with the latest version of Safari, you are part of a very broad chunk. You can't be identified through browser fingerprinting alone.
This means that the only unique data that you can get are:
a) Language settings. There is no way to work around this (unless you consistently lie that you solely use English)
b) Time zone. There is no way to work around this (unless you consistently lie that you solely use UTC)
These things can be predicted from the IP address anyway, so they are not perceptibly meaningful. In other words, advertisers can literally give up on fingerprinting Safari and rely instead on IP addresses, which can be tied to a family (or, in some IPv6 cases, a single device).
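For reference, the two signals singled out above are readable by any script, with no permission prompt, through the standard `Intl` API:

```javascript
// Minimal sketch: locale and IANA time zone, the two data points the
// comment says remain exposed even in Safari.
const opts = Intl.DateTimeFormat().resolvedOptions();
console.log(opts.locale);    // e.g. "en-US"
console.log(opts.timeZone);  // e.g. "Europe/Berlin"
```

Both values reflect OS-level settings, which is why "consistently lying" (English locale, UTC) is the only way to blend in.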
I've got a question: if it's OK to lie in these reports, why do they even exist? I thought these reports existed to advertise client capabilities so that the server can serve the right content.
Disclaimer: this is a genuine question. I'm a hardware guy and I don't know how the web works nowadays.