Mozilla didn't mention this in the article, but the study they referenced had an astounding statistic proving their point about default settings.
> Chrome and Safari are the two most prevalent browsers in our data, with Chrome being associated to about 43% of the ad transactions and Safari to about 38%. About 73% of the ads shown on a Safari browser do not have a cookie associated, whereas on Chrome this is the case about 17% of the time.
> The difference is probably due to different default tracking settings across the two browsers, with Safari impeding, by default, third-party tracking cookies being set on the user’s machine (the user has to explicitly allow the usage of third-party cookies)
[Googler, but this is just my own musing] Here's a theoretical question: if all third-party tracking cookies were blocked, wouldn't that strengthen Google's position in the ad market and weaken all of the third-party ad networks?
Google gets most of its revenue (~70%) from its first-party sites, and stuff like AdSense could be made to work without cookies; given Google's size in the market, people would switch to whatever ad-embedding format it required.
But smaller ad networks don't have that power, and they don't have huge first-party sites either. So in a way, if Google jumps on board this bandwagon in Chrome, they could be accused of doing it to strengthen their own position, the same way adopting Apple's extension/ad-blocking restrictions in Chrome led people to accuse them of trying to sabotage ad blockers, rather than of trying to rein in a toxic hell stew of malware from overly permissive extensions.
I am not a Googler and have been a long time critic of any business model that is not “I give you money and you give me stuff.”
That being said, I defended Google’s choice of implementing ad blocking extensions using an approach similar to Apple’s because I inherently don’t trust random third party extension makers that can intercept all of my web browsing. I also don’t trust VPN providers to protect my privacy but that’s a rant for another day.
I'm very careful about what gets installed on my work Windows laptop and I rarely use my home computer. I'm usually on my iPhone or iPad. There, I install any random app because I know the permission model only allows so much.
I generally agree, though I do trust uBlock Origin with the coveted blocking webRequest API. It has some pretty powerful features that I don't think could be implemented otherwise, block-list limitations aside. I kind of hope Firefox does not get rid of the blocking webRequest API for that reason alone.
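For anyone unfamiliar, here is a minimal sketch of what that blocking webRequest API looks like in an extension background script. The `browser` global comes from the WebExtension runtime, and `blockedHosts` is a made-up example list, not uBlock Origin's actual logic:

```typescript
// Provided by the WebExtension runtime; declared here only so the sketch
// compiles standalone (real projects would use webextension-polyfill types).
declare const browser: any;

// Hypothetical blocklist -- uBlock Origin's real matching is far more involved.
const blockedHosts = ["tracker.example.net", "ads.example.org"];

browser.webRequest.onBeforeRequest.addListener(
  (details: { url: string }) => {
    const host = new URL(details.url).hostname;
    // Returning { cancel: true } drops the request before it leaves the browser.
    // This synchronous, per-request decision is what a purely declarative rule
    // list (the declarativeNetRequest model) cannot fully express.
    return blockedHosts.some((h) => host === h || host.endsWith("." + h))
      ? { cancel: true }
      : {};
  },
  { urls: ["<all_urls>"] }, // match every request
  ["blocking"]              // requires the "webRequestBlocking" permission
);
```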
(Usual disclaimer: Googler here, opinions are my own.)
What's AdSense going to switch to that works without cookies? And whatever that is, what makes you think Firefox and other non-Chrome browsers won't block it?
This step is overdue and I applaud Mozilla for doing it. But:
Why doesn't Firefox block all 3rd-party cookies by default? That would be a huge win for privacy.
Yes, some sites would break. But if Apple can do it with Safari, Mozilla can do it with Firefox.
When sites break, people leave Firefox. No amount of explaining or media changes that: the number one (by far) reason people leave a browser is because a site is broken.
Further evidence: Windows 7 wasn't even that much of a change from Vista. The main differences were 1) software had adjusted to deal with UAC and 2) new laptops were more powerful. But most of the good new architectural features had premiered with Vista.
I think they did improve the UAC situation with Vista SP1, if memory serves.
The main UAC change in Windows 7 was just that it didn't pop up when you changed settings. But UAC on settings wasn't the problem (in fact, macOS basically has this and it's a non-issue); the problem was UAC on legacy Windows apps that had only been tested on Windows 9x, or on administrator accounts on NT systems.
I second this. I had a secondary computer that I switched between Vista and 7, back and forth, for various reasons, and it was much, _much_ slower on Vista than 7.
Most of the anecdotal evidence was comparing a laptop running with years' worth of bloated software to a fresh install. No surprise that Windows 7 came out ahead.
I've been using this setting (blocking all 3rd-party cookies) for years, and the number of sites that are unusable is rather small. I usually go to another site when it happens. I think in the last 2 or 3 years there were only three occasions where I bothered to change the browser settings because I absolutely had to access the broken site.
I think Disqus is one of the notable services that is semi-broken due to this.
I strongly agree. There is no legitimate functionality that requires 3rd-party cookies. Some sites will break at first until they are fixed, and we could have a whitelist for a while, while making it very cumbersome to be added to that list. New sites would simply need to work around the lack of 3rd-party cookies and accept that user tracking is technically illegal.
The surveillance internet is something that was allowed to evolve through a lack of foresight from the standards authors in their desire to build a rich web. We should put an end to it as soon as possible. The browser should not leak unique identifiers in any situation, and all standard browsers should be as fingerprint-resistant as the Tor Browser.
In my view, if you leak data that is critical to the privacy of the user, it's as bad as, if not worse than, an improper implementation of TLS. It takes only two modestly skilled, collaborating webadmins (one "shameful" site and one used with authentication) to possibly destroy a person's life, reputation, and family. A practical attack on SSL 2.0 is orders of magnitude more difficult.
Let's put responsibility for privacy back where it should be: on the browsers and web standard authors.
I'm genuinely curious about your assertion. Let's say I have an application for my business and I want to embed a report on one of my pages from a separate application. This other application is a SaaS application, so I can't just "mount" it at mydomain/some-sub-path. How do I have secure reports without third-party cookies? Store cookie-like stateless tokens in my wrapping app and send them in the URL?
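To make that last idea concrete, here's a rough, hypothetical sketch of the token-in-the-URL approach, assuming the SaaS vendor supports verifying an HMAC-signed token (every name here, including REPORT_SECRET and reports.example-saas.com, is invented for illustration):

```typescript
// Hypothetical sketch: the wrapping app signs a short-lived token with a secret
// shared with the SaaS reporting app, so the embedded report can authenticate
// the request without any third-party cookie. Names are illustrative only.
import { createHmac } from "node:crypto";

const REPORT_SECRET = process.env.REPORT_SECRET ?? "dev-only-secret";

function signedReportUrl(reportId: string, userId: string): string {
  const expires = Math.floor(Date.now() / 1000) + 300; // valid for 5 minutes
  const payload = `${reportId}:${userId}:${expires}`;
  const sig = createHmac("sha256", REPORT_SECRET).update(payload).digest("hex");
  // The SaaS side would recompute the HMAC over the same fields and check `expires`.
  return (
    `https://reports.example-saas.com/embed/${encodeURIComponent(reportId)}` +
    `?user=${encodeURIComponent(userId)}&expires=${expires}&sig=${sig}`
  );
}

// The wrapping page would drop this URL into an <iframe src=...>.
console.log(signedReportUrl("q2-revenue", "user-42"));
```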
That could work well, because if the message shows a domain that looks like the content they want to see, they will allow it; but if it's a random tracker domain, they will block it.
It's also important to not include [Whitelist INC i.imgur.com], only [Whitelist INC i.imgur.com FROM example.com], because trackers will adapt to exploit the first one as soon as it's common enough to be noteworthy.
If browsers break 3rd party cookies and they are not fingerprintable, the arms race stops: there is no way to track a client across multiple sites, other than weak correlations with IP, interests, etc.
If it's inside your business, you can use subdomains like site1.domain.com and site2.domain.com and set the cookie's Domain attribute to domain.com; the cookie will then be visible across all the sites of your domain, so you won't have a problem.
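A minimal sketch of what that looks like on the server side, using domain.com as a stand-in for a domain you actually own:

```typescript
// Minimal Node.js sketch: a cookie scoped to the parent domain is sent to every
// subdomain, so site1.domain.com and site2.domain.com share the session without
// any third-party cookie being involved. "domain.com" is just a placeholder.
import { createServer } from "node:http";

createServer((_req, res) => {
  res.setHeader(
    "Set-Cookie",
    // Domain=domain.com makes the cookie visible on *.domain.com;
    // Secure + HttpOnly keep it off plaintext connections and away from page JS.
    "session=opaque-session-id; Domain=domain.com; Path=/; Secure; HttpOnly; SameSite=Lax"
  );
  res.end("ok");
}).listen(8080);
```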
Technically, ad-tracking is not an invasion of privacy per se, and it does not identify a person. It is always connected to a client, and it is illegal to connect the client's browsing data to a real person.
So from that perspective, how is ad-tech a problem for privacy?
Plus, users do not have an alternative, at least on iOS. They all pass through WebKit.
On a desktop, if Firefox breaks, people will just switch to Chrome or something else. Plenty of people want things to "just work" out of the box.
Maybe the better solution is to have a guided install process. When installing Firefox, ask the user whether they want it to "just work" or be "privacy conscious". Warn them of the tradeoffs of both and let them decide once.
Well, there is a real danger the web would split into two groups: those who care about privacy and use Firefox, and so are a threat to several business models that are en vogue in SV right now, and those who use Chrome and are therefore better candidates for milking. Website owners would do everything to discourage people from using their websites with Firefox. Instead of seeing your favorite website in Firefox, you would see "Welcome to the Oath family" or similar bullshit.
I suspect we're just training people to click the Whatever button. As that's most “normal” people's response (I am not normal), the courts will eventually rule that a first-use roadblock doesn't produce informed consent, so doesn't excuse you under the GDPR. Hopefully, then, websites will stop performing a superficial impression of caring about privacy.
(Explicit informed consent is the GDPR's last-resort excuse, which you only need if your use of personal data falls outside all of the GDPR's automatically-acceptable justifications for using and storing personal data.)
I get what you mean and yes, it would definitely be a good move in terms of showing the way to sane defaults, however I would have a hard time calling it a "huge win" considering Firefox + Safari make up <20% market share. Maybe I'm just being cynical.
As of right now their UI to allow third-party cookies is pretty bad. Chrome shows an icon in the address bar for it. In Nightly there is a section in tracking prevention, but it's not in normal Firefox yet.
Safari does not block all 3rd party cookies by default last I checked. It does something a lot more complicated that still allows 3rd party cookies in various cases.
It's just cookies recognized as tracking cookies, isn't it? These companies are earning money by tracking you. They have a strong incentive to bypass these mechanisms.
(Maybe put "Edit:" in your message, for a minute I thought this was your reply, implying that the parent comment should retract their statement or something, but given the replies to your comment, I'm guessing you want to indicate that you changed your mind.)
When it comes to the greater public, default settings might as well be the only available setting. Apart from a few 'techies', most people will never even touch the default settings, out of the naive belief that "the default setting is what's best for me".
As an alternative approach I would suggest empty settings to begin with, forcing the user to think about their preferences on first use.
That would only work if every browser implemented it. And for the average user, choosing between a blank-slate approach where they have to parse through terminology they don't understand and an alternative offering "sensible" defaults, I suspect most would just pick the easier latter option.
Perhaps there's a middle ground. Give users a range of options (say 3 to 5) that aggregate the settings, ranging from "I don't really care about privacy" to "I wear a tinfoil hat to bed", along with pointers to where and how they might wish to delve deeper into more detailed settings. It can't be that hard...?
If you put a big scary decision as the first thing users see, many will just close the browser because they don't know what they should pick. When they open a different browser that doesn't present them with that choice, they may conclude that it's not a problem on that other browser.
"Today marks an important milestone in the history of Firefox and the web. As of today, for NEW USERS who download and install Firefox for the FIRST TIME, Enhanced Tracking Protection will automatically be set on by default, protecting our users from the pervasive tracking and collection of personal data by ad networks and tech companies."
This gives me the impression that Mozilla is trying to pull in a bunch of new recruits. Also, does this mean upgrades or repeat downloads will not have this privacy by default?
Probably. This is generally the correct way to handle an upgrade while minimizing breaking changes to the user. Changing a user's settings during an upgrade will erode users' trust in doing so in the future, even if it's "good" for them.
So they're killing the Google relationship? I mean something called "Enhanced Tracking Protection" would have to disable any sending of data to Google (or anyone, except the server as required to get the data requested), surely?!?
Google might not be happy about this move, but historically Google has paid Mozilla to be the default search engine in Firefox, which doesn't require Firefox sending any data to Google (apart from actual search queries, obviously).
That's exactly what Firefox telemetry data is. It collects things like the % of requests over HTTPS, the % over IPv6, etc., and sends anonymized stats to Mozilla.
But at the end of the day, I'm just taking someone's word for it that this is all they send, and assuming that it won't change over time in a browser that regularly updates itself.
It'd be a lot more acceptable if there was an option to show me "This is the exact telemetry payload we want to send to Mozilla." And even then you are taking someone's word for it that there isn't some other piece of data hidden in a hash or something, or that the browser isn't secretly sending data.
I'm not quite paranoid enough to do the full monitoring of all network traffic, but how do I reasonably know what is going on without listening to traffic/watching memory at all times? In the end, I'm trusting a faceless corporation that is attempting to put on a facade of trustworthiness.
The only trustworthy computer is an unnetworked one.
> But at the end of the day, I'm just taking someone's word for it that this is all they send, and assuming that it won't change over time in a browser that regularly updates itself.
Actually you aren't. It's quite easy to build your own Firefox and use that, or use say Ubuntu's build if you trust them more. You wouldn't want to audit the Firefox source code yourself, but if Mozilla was intentionally sneaking backdoors into the source code, someone could (and probably would) eventually find that out and you'd be able to verify it by examining your source archives... which means Mozilla would have to be stupid/crazy to try that.
Firefox is open source, and even the telemetry server code is on GitHub, so if you wanted to really dig in, you could.
At the end of the day, if you want Mozilla to make rational decisions about which features to implement, they need to collect some data. That fact alone can't be a problem for anyone who wants Firefox to be a competitive browser.
You don't know there isn't a government official placing a secret camera across the street from you, watching your computer screen through the window, either.
Eventually, you have to just get on with your life and trust people.
I'm not particularly bothered by the idea that Firefox can see how often private browsing is used. If you don't know what features people use, you can't prioritize development resources. Mozilla doesn't know who is using private browsing, or what they are looking at.
I can certainly respect that it bothers you—but may I ask what browser you intend to switch to? I'm skeptical that you'll find a better, usable option. That's a sad state of affairs for sure, but also the way of things right now.
The thing that absolutely pisses me off is how I try to be actively aware of what settings I disagree with and disable things I don't like - and then an unseen update resets things to default.
HOW MANY TIMES MUST I UNCHECK WHAT TO SYNC TO MY ACCOUNT? YOU WOULD THINK THAT IS SAVED PERSISTENTLY.
I think my quality of life on Firefox would be improved greatly if a notice popped up saying some of my settings were reset to defaults because of breaking (or minor) changes. Like they give a crap.
This is a slightly difficult problem, but it has been solved in other places. When I update my Linux computer, I sometimes get a message saying that upstream has changed a config file that I have also modified, and it asks whether I want to keep my version, keep upstream's version, or open it in an editor.
Not sure why the downvotes; that would sure piss me off. I'm very conscious of what I want synced, and it only takes a single bug like that to throw away all the effort to keep things separated.
- Collects a bunch of telemetry data via several mechanisms and ships them to Mozilla HQ
- Provides Mozilla with remote code execution privileges on your machine via the shield (or normandy, or whatever they are calling it these days) mechanism, which can install and uninstall extensions and certificates, change browser settings, etc
- Uses Google as the default search engine, and search suggestions leak private data to Google
- Uses Google Location Services for their geolocation thingy, which - unsurprisingly - phones home to Google
- Ships closed source third party add-ons
- Comes with a bunch of "about:config" settings configured in sub-optimal ways, privacy wise - battery API enabled by default, accept all cookies by default and so on
Sure, Chrome is worse, but bringing that up is like arguing that your pile of manure is better because it doesn't smell as bad: in the end, you are still arguing about shit.
There are some valid privacy complaints about Mozilla but I think they are severely overblown by a lot of people.
Mozilla is very up-front about exactly what telemetry data they're collecting and what it's used for; there's even a pop-up when you first install the browser telling you what's collected and how to disable it if you want to. And then, when Mozilla makes decisions based on telemetry, like removing features that 2% of people use, the people who disabled telemetry complain that Mozilla is ignoring their opinions.
The optional syncing service is end to end encrypted so Mozilla can't see the data you're syncing.
Shield is a valid complaint, I am not a fan of it being opt-out.
Search suggestions are disabled by default in private browsing mode and probably a feature most people want anyway. Your query gets sent to the search engine when you hit enter either way.
The battery API was completely removed from Firefox two and a half years ago, so that particular complaint is very outdated. Firefox has also been blocking tracking cookies by default for a while now. Stricter cookie policies would just annoy the vast majority of users.
Didn't you already trust Mozilla to execute their code on your machine when you installed the browser, in the first place? And to do it remotely with auto-updates.
There is a big difference between them being able to open a connection to my machine at their whim and execute code, versus me downloading their software or an update at a time of my choosing, especially since, if I am very security conscious, I can wait until an update has been audited or tested.
With a remote code execution engine, someone could hack into their backend and then start running malicious code on thousands or millions of machines. If they compromise a software update, at least there is a chance it can be caught before it gets to me.
There's a config-flag to turn it off. You could even deploy that enterprise-wide.
That said, every auto-update system is essentially an RCE system. For highly exposed and security-sensitive applications like browsers, the auto-update is a net win in many deployment scenarios.
No browser is really "privacy-focused". Performance, security, stability and Web compatibility are all table stakes for Web browsers. If you aren't competitive at those, it doesn't matter what else you do, your product isn't viable. And telemetry data is really valuable for achieving all those; without it, you'll waste a lot of resources fixing the wrong things. Mozilla certainly can't afford to do that.
Once your browser is competitive at those table stakes, only then can you give it a "privacy focus" to differentiate from Chrome.
If you're security-conscious then you'll install updates immediately, before you get compromised by whatever attack it might be fixing.
In reality no-one outside Mozilla is auditing updates (other than black-hats reverse-engineering security fixes to catch the people who don't update immediately). I don't think the situation for other browser vendors is any different.
In that fecal analogy, Mozilla is a fresh cup of coffee from squirrel-digested coffee beans, while Chrome is a neck deep swim in human sewage.
The privacy points you raise are significant but at no point discredit the very real privacy efforts made by Mozilla and in no way make it comparable to Google.
I get annoyed when I remove most of the default search providers, and then an update brings Google back. I specifically removed some of those providers so my searches would not be sent to those services for suggestions.
Any unwanted, unrequested connections to 3rd parties counts as a privacy concern. If I don't explicitly click a link or enter a URL in the address bar I don't expect traffic to be sent anywhere or any content at all downloaded.
There's literally an article on the front page talking about how Apple is really a privacy as a service company based on all of their marketing talking points about privacy.
What's your Firefox version? I just checked and "browser.urlbar.suggest.searches" still defaults to false on mine, and it doesn't show search suggestions.
As I said in the other comment, on my 67.0 installed on Ubuntu, from Europe (if that makes any difference), I have no prompt on a new profile and the suggestions are enabled by default. Not sure if that varies by region, operating system, or something else, to be honest.
I'd like some clarification on that. Does Chrome send automated telemetry reports about my browsing to Google when I am not logged in with my Google account? Does Chrome give remote code execution privileges to Google (yeah, via the Updater, but that does not really count)? I searched but found nothing on the net about telemetry.
I never use Firefox because it has these horrible defaults, and keeping up on all the about:config switches I need to toggle to be able to use it is just too much.
Maybe it's just me, but Google's privacy notice seems a lot more reasonable. And if it is correct, they ask whether they may collect usage statistics during installation, while Firefox does not.
Why does "Updater" "not really count" as "remote code execution privileges"?
Either the vendor can remotely change your software and its configuration without user intervention, or it cannot. For any software that supports unattended updates, it can. End of story.
I don't like how the opening line of the article exploits the fact that the average person does not know the cost of an average online ad to make it appear as if tracking has basically no value.
>... data about you was transmitted to dozens or even hundreds of companies, all so that the website could earn an additional $0.00008 per ad.
For the reader to be able to accurately understand how much money this is, they need to know the percentages.
Very roughly (this varies widely based on the country and website), the average online ad only costs ~$0.0005, so that insignificant $0.00008 is around 10-20% of it ($0.00008 / $0.0005 = 16%). If the article had presented the exact same information but instead framed it in terms of revenue available to pay employees at an online company dependent on advertising, it would sound very different while really conveying the same concept.
Edit: I read the linked study, and the data they used had an average cost per ad of $0.001, putting the difference around 8%. This is smaller than I would have predicted. I would still rather they had led with this number.
My personal problem with the model is not how much they make, but rather the intentionally hidden relationship they develop with the user.
If the relationship is - you get the article, we get to show you an ad that gives us a chance to sell you something and make money on it, which is ~$0.00008 - that would be clear.
Even if the relationship was - you get the article, we get the above plus we'll collect some bits of information about you that we explicitly list. The ad itself will give us ~$0.00008, and the collected data another ~$0.000007 - that would be ethical imho.
But the real model is - we give you an article, and in return you sign a blank document that allows us to collect all the possible data and try to maximize the amount of money we can make on it. You step into it today, but we are not comfortable putting any price point on this agreement because we bank on the idea that in the future we'll make more as we increase our grip over understanding of user behavior and improve our ability to monetize it in any, potentially unethical way.
The reason companies hide the nature of the relationship is that their business models are built around the assumption that the data collection will generate increasing amounts of revenue in the future.
And since there's no way for you to understand the relationship, or to step out of it, you're entering it at an information disadvantage, and there's no turning back.
I hope you can see how this approach is by design hostile to users and the Internet as a public plane.
I'm amused by this because when I called Mozilla out a few days ago, I got a bunch of downvotes [1]. Plus one of the top comments was a subtweet of mine.
https://weis2019.econinfosec.org/wp-content/uploads/sites/6/...