> "large scale blocking of cookies undermine people’s privacy by encouraging opaque techniques such as fingerprinting"
"If y'all had just handed over your wallet when we asked, we wouldn't have had to hurt you. So really, this is your fault."
More seriously, what Google is saying here is that privacy can't be mainstream. It's no different from the arguments that advertisers gave when they rejected DNT. They're willing to let a few privacy-minded people avoid being tracked, as long as that group never becomes a majority. But it was never their plan to get rid of tracking; every compromise they ever offered came with the catch, "as long as it doesn't affect our bottom line."
It's just another Equifax settlement. "Yeah we offered it, but we never assumed that so many of you would take it. Be reasonable."
Heck that. If you genuinely believe that people should have a choice about privacy, then you have to accept that maybe the majority of people will decide they don't want cookies. You have to take a step back and consider that browsers that block cookies by default are just reflecting what their users already widely want: people want their browsers to protect them from tracking by default, without configuration. You're not offering users a choice, you're just mad that you can't use dark patterns to make users unsafe-by-default.
"First, large scale blocking of cookies undermine people’s privacy..."
Oh yeah, users and their privacy first!
"Second, blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding..."
Also we make less money when you do this... But users, privacy and synergy first!
I think the second point is the real issue for Google. But of course they care about users first! I feel cynical as hell, but the whole thing reads so slimy to me. Like a used-car salesman twisting an obvious negative into a positive. "That's not a giant rust hole, that's just to reduce weight for efficiency!"
I think a great setup for privacy is temporary containers. In fact, using the Temporary Containers Firefox addon with domain-level isolation works virtually everywhere except on Google / Gmail / Youtube...
That's because when you try to log into one of those, it tries to simultaneously log you into the other domains/Google products linked to your account and per-domain sandboxing prevents that.
- ClearURLs: Prevent URL-based tracking, which Google and Amazon use everywhere
- VPN: Prevent IP-based tracking, to some degree
- uBlock Origin: Block ads and some nasty JavaScript
I find it quite hard to prevent fingerprinting. The privacy.resistFingerprinting setting in Firefox breaks tons of things. So does blocking JavaScript. And uMatrix is time-consuming to use and not suited to inexperienced users.
Any alternatives worth considering aside from Tor Browser?
I use Multi-Account Containers in Firefox plus the Cookie AutoDelete extension which integrates with it. Does the job for me. Here's my worry, though: the more broadly used cookie-blocking tools are, the greater incentive websites will have to use other, less-transparent methods to attempt to infer your identity without a cookie. These can be nefarious hacks that work like a cookie, or they can be statistical methods that estimate your identity with some degree of confidence. But one way or another, this would swing the pendulum back in the direction of sites knowing a massive amount about their users without users' awareness.
This will be an arms race and seems to be occurring over the course of decades. There will end up being extensions to regularize the timing of your keystrokes and mouseclicks because they give away your identity, VPNs will be integrated into the browser, anything to combat the information leakage.
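A timing countermeasure like that can be sketched simply: quantize event timestamps to a coarse grid so page scripts can no longer observe fine-grained typing rhythm. This is a minimal illustration, not an existing extension; the 100 ms grid is an arbitrary choice.

```python
# Sketch of the countermeasure described above: snap event
# timestamps to a coarse grid so the fine-grained rhythm of
# keystrokes and clicks (a behavioral fingerprint) is hidden
# from page scripts. The 100 ms grid is an arbitrary example.

GRID_MS = 100

def regularize(timestamps_ms):
    return [round(t / GRID_MS) * GRID_MS for t in timestamps_ms]

# Three keystrokes whose raw spacing would reveal a rhythm:
print(regularize([103, 187, 342]))  # -> [100, 200, 300]
```

The tradeoff is latency and lost precision for legitimate uses (games, editors), which is why this kind of defense tends to be opt-in.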
DNS filtering like Pi-hole may be useful on your network for ad blocking on Android/iOS; it's not much more useful than uBlock Origin on your desktop, though. Blocking at the DNS level is very effective.
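A Pi-hole-style resolver boils down to a blocklist check before the upstream lookup; a minimal sketch, where the domain names and the upstream stub are illustrative:

```python
# Minimal sketch of DNS-level ad blocking in the Pi-hole style:
# answer blocked domains with an unroutable address so the ad
# request never leaves the network. Domain names are examples.

BLOCKLIST = {"ads.example.com", "tracker.example.net"}

def upstream_lookup(domain: str) -> str:
    # Stand-in for a real upstream DNS query.
    return "93.184.216.34"

def resolve(domain: str) -> str:
    # Block a listed domain and any of its subdomains.
    parts = domain.split(".")
    for i in range(len(parts)):
        if ".".join(parts[i:]) in BLOCKLIST:
            return "0.0.0.0"  # sinkhole address
    return upstream_lookup(domain)
```

Because the block happens at name resolution, it covers every app on the network, but it can't strip first-party ads served from the same host as the content, which is where in-browser blockers still win.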
> Also we make less money when you do this... But users, privacy and synergy first!
In practical terms, the point is important. Even if Google doesn't change their behavior to find other ways to invade your privacy, there are lots of other companies whose financial incentives point solidly that direction.
With enough people standing to gain enough money, someone's going to cross that moral line. Is this cynical? Absolutely. Correct? Almost certainly. Is it something we should take seriously as a result? I think so, though obviously opinions may differ on the subject.
Few publishers are confident in their ability to jettison a targeted-advertising-based business model for an alternative model that produces similar financial results.
Interesting to me is that Google's Privacy Model suggestions of
"Identity is partitioned by First Party Site" and "Third Parties can be allowed access to a first-party identity"[1] come off as more coherent and effective than Apple's ML based Tracking Prevention policies[2].
Issue is Google's third suggestion, "A per-first-party identity can only be associated with small amounts of cross-site information", is clearly advertiser-focused, not privacy-focused.
I hope all can rally around the partitioning concepts and leave the others behind.
Google’s proposal sounds simple because it is vague: it does not address all the ways websites can track you, and it does not address all the possible compatibility problems with partitioning. WebKit’s policy (and, more importantly, our code) is what things look like when you actually try to ship tracking protection. It's easy to be simple if you don't have to spell out details or actually ship.
I don't believe it's very vague. This sentence in particular was succinct and represents a 90% solution to the problem:
The identity "Me while I'm visiting nytimes.com" is distinct from the identity "Me while visiting cnn.com".
To me, this means all client-triggered requests while my address bar ends in "nytimes.com" will have a distinct set of cookies that is only drawn from while I'm on nytimes.com.
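That reading amounts to a cookie jar double-keyed on (top-level site, request host); a toy sketch of the idea, noting that real partitioning schemes also cover LocalStorage, caches, and subdomain matching, which this ignores:

```python
# Toy sketch of cookies double-keyed by the top-level site: the
# same third party gets a separate cookie store under each first
# party, so it cannot link "me on nytimes.com" to "me on cnn.com".

class PartitionedCookieJar:
    def __init__(self):
        # (top_level_site, request_host) -> {name: value}
        self._jars = {}

    def set(self, top_level_site, request_host, name, value):
        self._jars.setdefault((top_level_site, request_host), {})[name] = value

    def get(self, top_level_site, request_host):
        return self._jars.get((top_level_site, request_host), {})

jar = PartitionedCookieJar()
# A tracker sets an ID while I'm on nytimes.com...
jar.set("nytimes.com", "tracker.example", "uid", "abc123")
# ...but sees an empty jar when the same tracker loads on cnn.com.
assert jar.get("cnn.com", "tracker.example") == {}
```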
I prefer a rules-based solution to an ML-based solution. ML is more opaque and it's harder for developers to know when their solution will run afoul of the classifier.
Are there significant compatibility problems this would cause?
It says “identity”, not cookies. Maybe it means cookies are partitioned. Maybe it means some new form of identity token is created (Google has proposed them at various times) and is partitioned, but cookies are blocked. Maybe it means a new token plus cookies remaining as they are, based on the “fingerprinting is even worse” argument. Maybe there are different settings. Who knows?
Anyway, the reason we moved away from cookie partitioning is that many third-party embeds got confused when presented with non-empty cookies that weren’t the user’s canonical cookies for that site. Bugs were reported to us by embed providers, including Google web properties. This is a bummer because partitioning is elegant and has been WebKit’s go-to approach since 2013, when we partitioned all storage besides cookies (LocalStorage, the HTTP cache, etc). A move that, incidentally, was opposed by Chrome engineers back when we shared an engine code base, though they are coming around now.
(I should mention also that partitioning cookies made many social embeds stop working at all until we provided the Storage Access API and got embed providers to adopt it, which is why ITP originally had a 24-hour user interaction exception. Fortunately the web has adapted now.)
I should also mention that WebKit’s ML solution is deterministic. It’s trained offline. And it doesn’t substitute for enforcement actions; it substitutes for a block list of known trackers, which other tracking prevention systems use. One downside of strict rule-based systems is that they may allow malicious players to game the system. So there has to be an out for changing the rules or ratcheting up enforcement.
Apologies, you're farther along than I was giving credit for. Storage Access API looks very much like the last 10% I was envisioning.
I do still wish the ML stuff weren't there, though. As a developer for a small company that's trying to become a big company (and develops on these boundaries), I fear we'll build something that works one day, but the next will be classified and the rules will change.
I'd rather have a policy that forces big and small sites to follow the same rules. Between storage access and link decoration detection, WebKit seems quite close to being able to detect cross-site tracking as it happens rather than via classifier, and I'd hope that could one day replace the ML component.
> "If y'all had just handed over your wallet when we asked, we wouldn't have had to hurt you. So really, this is your fault."
Seeing as Google doesn't engage in fingerprinting and is actively taking steps to block it[1], I don't think this is a fair characterization of Google's position.
Google does engage in fingerprinting in several of its products, but even if it didn't, I don't think I would change my analogy.
Google's position here is, "obviously you're going to have to let us track you some way, and our way (cookies) is better than fingerprinting." It's still blaming privacy-conscious users for the actions that advertisers have taken, because those unreasonable privacy-conscious people couldn't be content with just blocking some of the trackers.
There's an underlying tone to this article that reads as, "you all messed this up because you didn't just leave well enough alone. Now we'll come in and fix it, but you're gonna have to compromise with us when we do." From my perspective -- no, users didn't break anything. This isn't our fault, and we should be allowed to broadly block cookies without being tracked in more pervasive ways.
But, if it makes you feel better you can change a word in my characterization to read:
> "If y'all had just handed over your wallet when we asked, they wouldn't have had to hurt you. So really, this is your fault."
>This is why Chrome plans to more aggressively restrict fingerprinting across the web. One way in which we’ll be doing this is reducing the ways in which browsers can be passively fingerprinted, so that we can detect and intervene against active fingerprinting efforts as they happen.
That seems like a 180 from their stance two years ago[1]. Their statement is low on details, so I'll believe it when I see it. Right now the only browsers with meaningful fingerprinting protection are Tor Browser and Firefox.
The initial statement was low on details because it was just a statement of intention. Specific steps they're taking to combat fingerprinting are visible in their public bug tracker[1]. Also, let's not forget the privacy budget[2] system they just announced in the OP. That seems rather promising.
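The privacy budget's core idea can be sketched as a per-origin entropy cap; the entropy values and the cap below are invented for illustration and are not Chrome's actual numbers:

```python
# Toy sketch of the "privacy budget" idea from Google's proposal:
# a site may read fingerprintable surfaces only until the total
# identifying information (in bits of entropy) it has consumed
# crosses a cap. The entropy values and cap here are made up.

ENTROPY_BITS = {"user_agent": 10.0, "screen_resolution": 4.8, "canvas_hash": 15.0}
BUDGET_BITS = 16.0

class PrivacyBudget:
    def __init__(self):
        self._spent = {}  # origin -> bits consumed so far

    def request(self, origin: str, surface: str) -> bool:
        cost = ENTROPY_BITS[surface]
        spent = self._spent.get(origin, 0.0)
        if spent + cost > BUDGET_BITS:
            return False  # over budget: deny, or serve a generic value
        self._spent[origin] = spent + cost
        return True

budget = PrivacyBudget()
assert budget.request("https://ads.example", "user_agent")       # 10.0 bits spent
assert not budget.request("https://ads.example", "canvas_hash")  # would reach 25.0
```

The hard part, which the sketch ignores, is assigning honest entropy costs to each surface and deciding what the browser serves once the budget runs out.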
The cynic in me is 100% sure they calculated the impact of these changes to their ability to track users and found the impact negligible. Still, it probably does make tracking more difficult for other advertisers, so there's that.
Since Google has cookies dropped by google.com properties that they can use for user tracking, I'm surprised they haven't worked harder to reduce the effectiveness of fingerprinting in Chrome. They could make user tracking more difficult for ad companies that don't have cookies dropped by sites as popular as google.com while accurately marketing the Chrome changes as improving user privacy.
HN guidelines would prefer that you don't comment on downvoting, as it makes for boring reading. If you don't have anything substantive to add (such as, if "others have pointed out" your thoughts already), just vote and move on. :)
Although the words "I'm downvoting you" aren't useful, I would argue that saying why you're downvoting someone is really important. Helps prevent the feeling of shooting the messenger.
I disagree. No one has posted any evidence that Google engages in fingerprinting users beyond simple user-controllable mechanisms like cookies. (Though several have indeed made that assertion with no evidence.)
Google Analytics does not fingerprint. Hits are identified with a user either by a first-party cookie (called the Client ID), or with a user-id value provided by the site owner. The site owner could provide their own fingerprinting value as a user-id, but the fingerprinting is not done by analytics itself.
(There are some cases where identifiers are pulled in from other systems, but none of them can be used as the basis for unifying users.)
GA records a lot of data about the user such as screen resolution and browser version. It doesn't do full fingerprinting to identify unique users but it uses many common fingerprinting techniques.
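A toy sketch of why that matters: individually benign attributes, hashed together, yield a stable cookie-free identifier. The attribute values are illustrative, and this is a generic fingerprinting sketch, not a claim about what GA itself does:

```python
import hashlib

# Sketch of how individually benign attributes add up: each one
# narrows the anonymity set, and hashing the combination yields
# a stable identifier with no cookie involved.

def fingerprint(attrs: dict) -> str:
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

device = {"screen": "2560x1440", "browser": "Firefox 68",
          "timezone": "UTC-5", "language": "en-US"}

# The same device produces the same identifier on every visit.
assert fingerprint(device) == fingerprint(dict(device))
```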
Like collecting any data at all? I mean “what browsers and what versions do I need to support” and “what resolutions do I need to design for” are pretty benign questions.
Re-Captcha doesn't fingerprint users. It uses statistical models to guess whether a particular user is a bot or not; it doesn't de-anonymize users or identify them in any way.
Analytics doesn't either, it uses regular cookies. (See sibling comments.)
Cookies, browser characteristics, user behavior, etc. Just because a certain bit of information could theoretically be used for fingerprinting users, doesn't mean any system that uses that information is doing fingerprinting.
Agreed, but it's incredibly naive to assume they are not. Google's product is its users, and it's a known fact they manipulate results based on what they want you to think or what they think you want to see. They are completely shady on privacy.
I don't know why HN seems to be so stubborn about things like this. It's clear as day they fingerprint users, but seems like the majority of users here refuse to believe a corporation or government has done anything wrong unless they literally admit it in a press release.
IP address, the client/user ID, type of interaction, timestamp and a few other bits of data like your device hardware.
GA isn't nearly as invasive as people think. The real problem is data being combined on the backend and correlated across websites by ad networks, not the analytics itself.
Considering that we're talking about a company which owes probably nearly all of its not inconsiderable fortune to predicting people's behavior via mass surveillance, this comment seems to miss the forest for the trees a bit.
It would greatly reduce the amount of fingerprinting, but not completely eliminate it. I've written about this before, but there's more than just Javascript we need to worry about -- we'd want to strip down CSS a bit, rethink how asset caching works, and potentially send less information in request headers.
But it would certainly be a lot easier to do all of that without Javascript.
The problem is that one of two things would happen:
A) the majority of web apps would move to native platforms, where fingerprinting resistance is even worse. That would include a good many news sites, even though most news sites aren't really web-apps, and don't need Javascript to operate. They'd still do it.
B) the majority of users would not use those browsers. We're playing a kind of weird game with fingerprinting right now -- we want to block it without breaking most websites. This is why adblocking is fantastic: it observably makes the web better for even an average user, so there's an incentive for users to adblock en masse, and publishers can't really leave the web. Adblocking is a way to fight surveillance apparatuses without making sacrifices.
It's a tricky situation to be in, and there's a lot more that could be said about the merits of different strategies.
That being said, if you don't care about an en-masse policy and you're just looking to increase your own privacy, it would definitely be a good idea right now to use something like uMatrix to disable at least third-party JavaScript by default. You'll have to do a lot of per-website tweaking, but if you're at all technical it's honestly not that bad.
> maybe the majority of people will decide they don't want cookies
At the same time, a lot of people don't understand what cookies actually do (the good things they do, I mean) and thus having them disabled by default will deepen their belief that they're only bad.
I'm not sure what I'd do if cookies were blocked by default. What an absolutely inconvenient reality that would be as a web developer. You'd have to embed a token in the page on each request, like WebForms, or something.
Nobody has suggested or enacted blocking all cookies by default. I'm not sure what you're referring to here. This is mostly about third-party cookies and/or Safari's and Firefox's default privacy blocklists.
You seem to be conflating first-party cookies with third-party ones. The originating party of the cookie, with respect to the site the user gets the cookie from, is the crucial point in this context.
Nobody in the comments section has identified the most nefarious part of their plan. Their goal of developing an "open set of standards" is an attempt to control how other browsers protect user privacy.
This is Google saying, "oh looks like everyone else is doing things that hurt our business, so let's get everyone to work together on open standards, where we can block or delay attempts that run counter to our business model".
This is a study any publisher can replicate today. Compare an experimental group with setRequestNonPersonalizedAds [1] against a control group. I think you'll see similar numbers.
My non-expert [1] understanding is that by default Google's ads are both contextually targeted and personalized. If you recently visited a mattress site and they are trying to bring you back for a sale, you'll get mattress ads everywhere via remarketing, a kind of personalization. On the other hand, if you're visiting a mattress site for the first time, you'll probably also get mattress ads, but this time because of the context of the page.
If the publisher requests non-personalized ads, that turns off the personalization, but not the contextual targeting, so the answer to your question is kind of yes, kind of no: it removes personal targeting from the mix, but contextual targeting is always on.
The lack of any reference in TFA to the source of that study - specifically, the failure to name-drop a source that could lend it authority - made me immediately assume that it was probably a study run by the very company putting this borderline propaganda out. Glad to have it confirmed.
Probably going to take some heat for this answer, especially as many of the comments here are very negative toward Google, but I actually think that was pretty balanced. Sure they're (obviously) protecting their own interests, but realistically SO much of the web relies on advertising. I mean, how many startups did we watch over the last 20 years with no real source of revenue (but some of which were actually fun/useful) end up doing the whole "we'll just use ads" thing? And ultimately as a product owner (web or otherwise), you're not going to spend money on ads (and thus fund a whole load of the Internet) if you a) can't make a return on your ad spend, and b) can't understand what and where performs best.
Don't get me wrong, I hate the super spammy intrusive ads, and sometimes retargeting does feel like I'm being eavesdropped on, but I can't be the only one that prefers to see relevant ads? I'd much rather see ads for sneakers and watches (which I'll frequently click), than I would for some other junk that doesn't interest me. I genuinely don't mind ads if I know it's supporting the thing it's displaying ads on.
And as for the cookie stuff - it's a complete mess now. I get that we as a community should probably be spearheading SOME sort of privacy to bubble down to those that don't 'get' it, but talk to the bulk of my non-tech friends and it's a non-issue. All those cookie popups are clicked without being read (and they never will be read; that behaviour has been established) - the brightest button gets the click and they can move on and get to actually looking at what they wanted. Literally nobody reads, cares or understands. Even with the "they're selling my data" / "they're tracking me" scandals - maybe five minutes of outrage, and then, assuming you can still share cat memes and holiday photos, who cares? Yesterday people were all over Instagram sharing an outraged message that "Instagram have no permission to do what they want with my content" - and yet no one left IG. Everyone is still back today, business as usual.
I think at least _exploring_ this whole thing in a way that's balanced towards the entire internet ecosystem is way more thoughtful than a knee-jerk "block everything, privacy everything" response. It's a pretty emotive topic, and close to a lot of our hearts, but we definitely need more talk. Just my 2c!
Levelheaded and well-written. I personally am vehemently against all forms of tracking/targeted ads not due to ad preferences, but due to concerns over how much data is being collected and how little control we have over it.
I too would prefer non-spammy targeted ads to 'dumb' ads - given that the data being collected and stored is used solely for this purpose. But that's never the case.
Targeted advertising at first glance seems like a win-win-win: you're exposed to products aligned with your interests, the manufacturer makes a sale, Google takes a small cut. Thing is, the advertising giant's 'win' carries significantly more weight than yours or the manufacturer's:
You get your product, you're pleased with it. Win!
The manufacturer makes a sale. Win!
The ad/tracking agency receives a juicy set of click/keystroke/impression/conversion data. They ingest it into their massive database and it's matched with a wealth of other information already collected on you. This is all combined and fed through a state-of-the-art machine learning system. This third party now has a full psychological profile of your interests, hobbies, lifestyle, connections, etc. They can use it to serve you better-targeted ads. Win!
Or they can use it to influence you politically. Or they could sell it to the NSA. Or they could be hacked by a foreign nation state.
Targeted advertising is one thing. Blanket surveillance + dragnet data collection + Google ML + PRISM is quite another.
I think this response somewhat misses the point. The basic problem is that people instinctively feel that highly-targeted ads are creepy. People have a level of targeting that they feel is acceptable, and exceeding that level feels like an invasion of privacy. It doesn't take much imagination to come up with scenarios in which, say, highly targeted ads for divorce lawyers would feel unacceptably creepy to most people. That negative feeling can often outweigh any positive benefit people get from the targeting of ads.
One difficulty with this entire debate is that privacy is a social problem as much as a technical one. Trying to come up with a rigidly-defined model that separates benign ad targeting from privacy violations is very hard if not impossible.
>That negative feeling can often outweigh any positive benefit people get from the targeting of ads.
If targeted ads really improve (in amount and in quality) the free content available on the internet because it drives revenue, then that's the real benefit.
I understand that it feels creepy at first, but personally that feeling faded away. As things stand, I'm consuming online services from Canada and I don't understand what's so bad about the current level of targeting.
Most of my discomfort is based on potentialities, the what-if's, but I don't think I would demand complete privacy from corporations before anything severe happens.
People distrust corporations immensely, but I'm more worried about politically motivated organizations, such as the government.
How I Learned to Stop Worrying and Love Big Brother
Btw, you should be aware that the government has full access to whatever private information corporations have about you; that's the whole point of what Edward Snowden revealed.
> Don't get me wrong, I hate the super spammy intrusive ads, and sometimes retargeting does feel like I'm being eavesdropped on, but I can't be the only one that prefers to see relevant ads? I'd much rather see ads for sneakers and watches (which I'll frequently click), than I would for some other junk that doesn't interest me. I genuinely don't mind ads if I know it's supporting the thing it's displaying ads on.
None of this fundamentally requires the surveillance apparatus that most ad scripts come with.
I think that's exactly Google's point. They're trying to build an alternative system which allows ad personalization, but blocks the sort of intrusive tracking prevalent in the modern-day advertising industry.
That's false, and can be trivially shown to be so with a simple hypothetical counterexample:
Imagine a system where ad servers sent 10 ads to your browser instead of one, and then your browser decided which ad to show you based on some settings you previously selected in your browser settings page. There you have it: personalization without privacy invasion. Easy.
Google's system will undoubtedly be a bit more complex than that, but there's no reason why it would necessarily have to invade your privacy just because it allows ad customization.
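The hypothetical above can be sketched directly; the interest profile lives only on the client, and the server never learns which ad won. The ad data is made up for illustration:

```python
# Sketch of the hypothetical: the server sends a batch of
# candidate ads, and the browser picks one locally against an
# interest profile that never leaves the machine. The server
# learns only that the batch was fetched.

def choose_ad(candidate_ads, local_interests):
    # Score each ad by topic overlap with locally stored interests.
    return max(candidate_ads,
               key=lambda ad: len(set(ad["topics"]) & local_interests))

ads = [{"id": 1, "topics": ["cars"]},
       {"id": 2, "topics": ["sneakers", "watches"]},
       {"id": 3, "topics": ["cruises"]}]

# Preferences stored client-side, never uploaded.
assert choose_ad(ads, {"sneakers", "hiking"})["id"] == 2
```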
The system you suggest doesn't come near privacy. Sure, there is some uncertainty over which ad was shown, but as soon as it is clicked you have revealed that too. And it's clicks and page visits that build a visitor's profile, much more than ad views.
Obviously if you click on an ad then yes, the advertiser will know you're interested in that product. I don't see how that has anything to do with ad personalization though.
And with this system, the advertiser wouldn't have the ability to build a profile on the user, since the idea is to replace cookies with this system, not to merely supplement them.
Au contraire, the proposed schemes increasingly hand control over to browser vendors (Chrome) and Big Tech (hint). They also do nothing to dismantle the user-manipulation and propaganda machine that has been built on top of the mass privacy grave.
Newspapers have the right model (in terms of privacy): I'd rather online businesses whose revenue model is dependent on ads do that instead.
Like they did with AMP, Google could offer to penalise websites in their search results that excessively fingerprint users? Show a red warning banner in Chrome? Not these workarounds that warrant new mitigation techniques because third-party cookie blocking won't be enough now.
True, it does hand control to browser vendors. But I think I'd rather my browser have control over the information used to personalize ads served to me rather than that information being available to advertisers directly.
Google is taking steps to stop excessive fingerprinting[1][2], so it's not like they're just adding new systems for ad personalization without closing off the existing, more intrusive ones.
Assuming this proposal is adopted in some sense, how much time until Firefox (the alternative big name browser in this case) - or more specifically Mozilla, the browser vendor - does something that Google doesn't like while adhering to the specs? This is putting the power in the hands of browser vendors after all.
Letting Google - the advertising conglomerate that "vends" its own browser - use its industry sway to publish standards that give power to browser vendors is concerning because Google already has its finger in that pie also.
They've already begun to track users through Chrome itself now with account tie-in in the actual application.
Google can now play friendly and claim to be keeping web advertising at bay, while siphoning increased amount of data directly from the user's browser - no JavaScript required.
Google Chrome, the most popular browser, is developed by Google, the biggest advertiser, so preferring one over the other makes zero difference.
Of course there's a difference. Instead of hundreds of advertising companies having access to my data, this way there's only one.
And that's even ignoring the difference between my ad preference data being stored on a company's servers, versus stored locally on my machine by my browser.
I like your optimism, and I don't mean to patronise you, but I agree with your points only if I keep the short term in mind. In the long run, however, I feel one would be disappointed in what the ad industry will turn this privacy sandbox into. The refusal to acknowledge DNT and the dark patterns in use all throughout the interwebs stand testament to this.
Advertisement has, unfortunately, degraded into a plague spreading scareware, spyware, malware, scamware and what not. It gets worse with each passing day as more unsuspecting users discover the internet for the first time. The surveillance apparatus that the industry has built has made robots out of humans, mere instruments to be toyed with. Not all parts of the advertisement industry are worthy of our data, but they have access to it anyway, because they've got the money to burn for the data that companies like Google and Facebook scramble to gather.
These companies need to be forced to come up with alternative business models or ads that do not require extensive surveillance, or I am afraid given the unprecedented data they have amassed, they will attract, in the long term, all sorts of power-hungry designs and the outcome is going to be a disaster for everyone else but the power-grabbers. The result isn't going to be pretty. Sorry to go all Orwellian on you.
> These companies need to be forced to come up with alternative business models or ads that do not require extensive surveillance
That's exactly what Google is trying to do here. They're replacing the existing ad system with one that does not require surveillance to achieve ad targeting.
Agreed - this and allowing _enough_ tracking that publishers can actually understand where ad budget is best spent, without jeopardising an individual's privacy.
> I can't be the only one that prefers to see relevant ads?
I'm sure that you're not the only one. However, I'm not in that camp at all. I don't view "relevant ads" as being a benefit to me. All they do is clearly demonstrate that I'm being spied on.
>but I can't be the only one that prefers to see relevant ads?
Of course. Everyone wants relevant ads.
But we want ads relevant to the site / subject we are browsing. Not to our past purchases, internet usages, exact location and what not. That's just too invasive.
That's not true. If I'm reading a generic newspaper website, I don't want to see ads relating to the article or the newspaper. I'm more than happy to see ads for sneakers and watches.
> but realistically SO much of the web relies on advertising
That appears to be mostly clickbait, blogspam, transparent marketing pretending to be content, and other forms of media that are driven to be as vapid as possible in order to maximize the number of ads they can show. I would promote adblocking specifically in order to destroy that part of the web.
> I genuinely don't mind ads if I know it's supporting the thing it's displaying ads on.
I do. I'm pretty sure that there are other people who would rather not consume content if they can't do it without having ads shoved in their faces constantly. In fact, ads bother me more IRL than online, since it's harder to avoid them.
> All those cookie popups are clicked without being read (and they never will be read; that behaviour has been established)
Says you. Since the EU cookie popups are not required if you don't use third-party cookies, having the popup is basically admitting you are selling out your users. As for the GDPR tracking opt-in popups -- most of the ones I've seen violate half of the GDPR requirements. Once the regulators start fining publishers for those violations, we might see an improvement.
> I think at least _exploring_ this whole thing in a way that's balanced towards the entire internet ecosystem is way more thoughtful than a knee-jerk "block everything, privacy everything" response.
Tbh, I don't really care about the for-profit content publishing part of the "internet ecosystem". Most of the interesting stuff I read is on personal sites with no ads or tracking anyway, so I don't even care if half the "publishers" disappear. I've yet to meet a single consumer who likes ads.
> Second, blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding, which jeopardizes the future of the vibrant web.
Herein lies the issue. Users blocking cookies are trying to avoid being tracked and "seeing relevant ads". Essentially, he is trying to create web standards in order to do the thing that users specifically don't want them to do. Google's business model requires Schuh to twist logic in a manner that makes it seem like it isn't so.
Is "I don't want to see relevant ads" really the user's ultimate goal? Or is their goal actually something more like: "I don't want advertisers tracking me", and not seeing relevant ads is just a side-effect of that?
It sounds to me like Google's goal here is to give users what they want (less tracking, better privacy) without hurting their ad revenue. That seems like a reasonable compromise to me, assuming they can pull it off. At the very least, it's an improvement over the status quo.
I left out of my comment that there's significant overlap between users who block cookies and those who use adblockers. Many users simply don't want to see ads, but are subject to intrusive tracking nevertheless.
However, you are right that most people would probably be fine with personalized ads if it was possible to show them without identifiable tracking.
Personalizing content without identifiable tracking is a really hard problem, and I'll be surprised if anyone ever manages to solve it. Solving it will also most likely involve compromises that hurt Google's bottom line, so I think the financial incentives just aren't there for them. Given those two assumptions and Google's track record, I think it's absurd to assume that any standard they create will meaningfully improve user privacy.
Also, I fear their new standards would most likely just circumvent the tools available for users to avoid tracking, as the tracking would be baked into the browser itself.
This isn't a foreign concept, either. It's how everyone started out. Google just thought they could charge more for ads if they also began tracking users so they could display higher-value ads that are supposedly relevant to a user regardless of the content of the page they are on. It turns out that they were right, and they are able to make more money doing this. Unfortunately for the rest of us, privacy is collateral damage.
Those are valid concerns, though at first glance it doesn't seem to me like any of Google's actual proposals[1] have those issues. In fact, FLoC seems like a rather promising way to achieve "personalizing content without identifiable tracking". Perhaps that particular problem isn't as hard as you thought?
FLoC is still surveillance machinery that comes with the usual bells and whistles like mass propaganda and behaviour control. They admit as much in the concerns section of the readme.
The concerns section explains that broad interest categories (traits shared with thousands of other people) are revealed to ad networks. That's it. No "surveillance machinery", nor is there any mention of "mass propaganda and behaviour control".
> A flock could be used as a user identifier. It may not have enough bits of information to individually identify someone, but in combination with other information (such as an IP address), it might.
...
> A flock might reveal sensitive information. As a first mitigation, the browser should remove sensitive categories from its data collection. But this does not mean sensitive information can’t be leaked. Some people are sensitive to categories that others are not, and there is no globally accepted notion of sensitive categories.
...
> This API democratizes access to some information about an individual’s general browsing history (and thus, general interests) to any site that opts into the header. This is in contrast to today’s world, in which cookies or other tracking techniques may be used to collate someone’s browsing activity across many sites.
> Sites that know a person’s PII (e.g., when people sign in using their email address) could record and reveal their flock. This means that information about an individual's interests may eventually become public. This is not ideal, but still better than today’s situation in which PII can be joined to exact browsing history obtained via third-party cookies.
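That "in combination with other information" caveat is easy to quantify with a back-of-envelope entropy calculation. The numbers below (user population, cohort size, bits leaked by coarse IP geolocation) are illustrative assumptions of mine, not figures from the FLoC proposal:

```javascript
// Singling out one user among N requires about log2(N) bits of information.
// Each signal that splits the population into groups of size k leaks
// roughly log2(N / k) bits. All numbers here are illustrative assumptions.
const population = 250e6;                       // assumed browser user base
const bitsToSingleOut = Math.log2(population);  // ~27.9 bits

const cohortSize = 2000;                                // assumed flock size
const cohortBits = Math.log2(population / cohortSize);  // ~16.9 bits

const ipGeoBits = 14; // assumed: city-level IP geolocation, ~16000x narrowing

const combined = cohortBits + ipGeoBits;  // ~30.9 bits
console.log(combined > bitsToSingleOut);  // cohort + coarse IP can be enough
```

The point isn't the exact numbers; it's that a "not individually identifying" signal still spends most of the bit budget needed to identify someone once it's joined with anything else.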
The goal is to keep every one of the 130+ AdChoices companies[0], and the countless other adtech companies, from knowing every website you've visited.
The issue is that, at least in theory, seeing less relevant ads means you're less likely to click them, so the average CPM for a website goes down, leading to less revenue and a higher chance that the company runs out of money for the things it does, or even for the cost of operating its servers.
> Is "I don't want to see relevant ads" really the user's ultimate goal?
"I don't want to see ads, period" is my goal, and that of many other users. I don't want the distraction, mental pollution, bandwidth usage, loading delays...
> I don't want the distraction, mental pollution, bandwidth usage, loading delays...
This argument concerns me, it's not like ads can take over your mind. I understand your position, but I'm of the opinion that it's not nearly as distracting as you seem to think. If a banner ad is there, great. It's not hurting me, it's sustaining the web, and if it is relevant, I might click on it.
My only gripe is pop-up ads, extreme overuse of banners, auto playing videos and performance. Oh my, performance. No wonder Google's had to make V8 fast, and it's still not nearly enough. If there was only one/two ads on screen at any time, I think that'd be a decent balance.
> This argument concerns me, it's not like ads can take over your mind.
Advertising's main goal is to take over your mind.
They want you to think of something and immediately put a brand on it, to spend your money on them above anything else, to think you need something that you don't, to feel inadequate for not having something and a lot more.
The fact that you (and many others) think ads aren't that bad means you've conformed to what they want: to think that "it has to be this way and it's not that bad".
> Advertising's main goal is to take over your mind.
If it is, then it's doing an awful job. At the end of the day, I am responsible for my own actions, not banner ads.
That being said there are some ads that I do find disgusting which I refuse to let touch my computer, Taboola ads spring to mind. Just because I don't mind the concept doesn't mean I don't think there are huge issues with the ad industry, especially around ethics.
> The fact you (and many others) think ads aren't that bad means you've conformed to what they want.
So because I (and many others) don't agree with you, I've been warped by the ad industry? Who are "they"?
I never said they had low ROI, just that ultimately I am responsible for what I buy, regardless of how many ads I've seen. If I see an advert for something I'm already interested in, it's an extra datapoint to take into account if I ever purchase something. I don't consider this a takeover of the mind, that's just blatant hyperbole. If I like the product I'm getting, does it matter if I found them through ads?
Having worked a little bit with people in the ad industry (advertisers, not publishers, and I stress very limited work), they don't care if 90% is a 'waste' - because of the bidding system, that waste, along with other things like click fraud, is priced in. This only works if you can measure it.
Without tracking, digital ads would be worth less even if ROI was 10% higher (which I'm still sceptical of).
I know my tastes and try my best to not be influenced by marketing, but I wouldn't dare say "I was never influenced by an ad".
The very things you want or want to avoid are many times dictated by marketing. Maybe you hate those super advertised brands and choose to not buy from them, but the very fact that you do that means they influenced you.
Marketing is evil and influences everyone's lives. The fact that you tell yourself that "I still have a choice" doesn't make it any better. No matter how you put it, the bottom line is: marketing's only goal is to influence people's minds. Long gone are the days where marketing was simply a way to give awareness to a product.
FWIW, they could just be really effective at marketing... the effectiveness of ads. And nothing else. There are days when I wonder if adtech really works, given that most of the "proof" I've seen amounts to "it positively correlates... except when it doesn't".
> Or is their goal actually something more like: "I don't want advertisers tracking me", and not seeing relevant ads is just a side-effect of that?
I think this is a distinction without a difference. If you're seeing relevant ads consistently, the only way that's possible is if something is tracking your behaviour and/or interests.
> Is "I don't want to see relevant ads" really the user's ultimate goal?
I don't think ads are appropriate even when they are relevant to the page I am reading or the search I am performing. When I want to buy something I need to see trusted reviews and discussions. Then I buy.
I don't trust self interested messages trying to make me spend on what THEY want to sell. I want to make sure I am making the most informed decision, and that requires searching for relevant forums, discussions or reviews.
On the other hand I want to keep irrelevant information from entering my brain. My brain is my own, I don't share it with ad companies. I see massive tracking of my interests as perverse, and the danger of being influenced against my interests as disgusting.
That's why I can't listen to radio or TV and always use ad blocking while online. I almost never browse on my phone where ad blocking is missing.
Maybe I am hypersensitive because I lived the first 14 years of my childhood under communism, where thought policing was the rule. I fundamentally distrust the state and any agency that wants to make me do something I didn't initiate on my own.
> Maybe I am hypersensitive because I lived the first 14 years of my childhood under communism, where thought policing was the rule. I fundamentally distrust the state and any agency that wants to make me do something I didn't initiate on my own.
So you distrust all people whose interests aren't exactly aligned with your own, for fear that they might want to convince you to try some new thing? You avoid going outside for fear of seeing a billboard, make purchases randomly in stores, ignoring any branding, etc.? If someone mentions a neat hobby or a food you haven't tried before, do you run the other way and vow never to attempt that experience?
> If someone mentions a neat hobby or a food you haven't tried before, you run the other way and vow never to attempt that experience?
The important keyword here is 'someone', not a huge corporation or state actor. I would open myself to people, especially if they don't have anything to gain from influencing me and are purely passionate about the subject they're engaged in. Much of what is good on the internet comes from such people doing what they love for the sake of their passion/hobby (such as Wikipedia, parts of YouTube, arXiv, HN and specific subreddits).
I think he's talking about the freedoms that ads take away. You don't ask to see the ad or the category of products it advertises, but someone found a way to show it to you anyway through some deliberate automated system, trying to steal your attention from something you are actually focused on doing at the moment and focus it on something else, distracting you and wasting energy and time.
How is that any different from a billboard, or a person suggesting you try a new food?
The only difference is that you're calling the web ads automated, which isn't always true (especially in the context we're discussing, which is non-personalized ads), and not clearly distinct. Why is an algorithm on a computer vying for your attention worse than an algorithm used to place billboard ads?
They're all vying for your attention in a way you didn't explicitly request. The idea that internet ads are somehow different, and not just {better/worse/more effective depending on your perspective} doesn't follow for me.
Billboards are pretty different. It's like if websites had a separate page with ads where you go if you are curious or bored, not doing anything important.
And people suggesting something is on a whole other level. They have empathy, know how to behave, when they can approach you, can learn when not to bother you, etc.
I agree that that is the crux of the matter, but I'm not sure that the two sides are irreconcilable.
Google and site owners want to do targeted ads. I'm not sure that users hate targeted ads specifically. I, for one, am not a site owner myself, but I'm sympathetic enough to their need for revenue that I'm OK with targeted ads.
I hate stuff like auto-play video (and am thrilled that Firefox has well and truly conquered that issue) and I'm pissed that many companies sell or leak my data but I'm not looking to reject targeted ads outright.
Yes, I have allowed autoplay on the video sites, but it is now click-to-enable on all those others sites that are just using it for ads.
Go to about:config in Firefox, and change "media.autoplay.allow-muted" to false.
Chrome has shut down auto-play on any video with sound (excepting video specific sites) and they have a flag "Autoplay policy" which was partially successful, but not as effective as the blocking in Firefox.
You could get to it through about:config (so.. a little difficult for the common user to reach)
media.autoplay.enabled
media.autoplay.allow-muted
They then changed to
media.autoplay.default
media.autoplay.block-webaudio
media.autoplay.enabled.user-gestures-needed
And removed the previous parameters, which caused videos to start playing automatically again until you configured the new properties (again, only through about:config).
Finally, they're now providing a menu option for achieving the same, which deserves to be applauded.
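For anyone who'd rather pin this down than chase pref migrations through about:config, the current pref names above can go in a `user.js` file in the Firefox profile directory. The values below are my assumption of a block-by-default setup, not Firefox's shipped defaults:

```javascript
// user.js — place in your Firefox profile directory; applied at startup.
// Pref names are the current (post-migration) ones discussed above.
// Values shown are an assumed block-by-default setup, not official defaults.
user_pref("media.autoplay.default", 5);                      // 0 = allow, 1 = block audible, 5 = block all
user_pref("media.autoplay.block-webaudio", true);            // also block WebAudio autoplay
user_pref("media.autoplay.enabled.user-gestures-needed", true); // require a click/keypress first
```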
> large scale blocking of cookies undermine people’s privacy by encouraging opaque techniques such as fingerprinting
The safest assumption is that fingerprinting is and will be used regardless of whether local storage (of which cookies are one flavor) is forbidden -- the two are not mutually exclusive.
I closed my accounts with Chase and Citi because I kept getting locked out of my accounts for suspicious activity. Disabling PIA VPN helped but I never could figure out what else about me trying to avoid being tracked looked suspicious.
Apparently they take a 'only criminals would clear cookies' approach.
I’ve always enabled fingerprint-blocking, first with extensions, now in the browser. The only places I had problems logging in are some very few online shops. Never a bank, and I had a bank with a password limit of 6 characters (about 5 years ago)
This reads as "save the cookies" to me. From a company who specializes in tracking of all forms including location from our phones and parsing all of our emails to enumerate our recent shopping orders on 3rd party sites.
What if websites just return to contextual ads and proper opt-in consumer studies (as traditional groups like Nielsen still do) instead of tracking everything their users do online and offline for aggressive targeting?
It doesn't matter. That's the beautiful thing about being the owner of Chrome, which kind of is "the web" now. Open specs and industry standards are basically what you propose and accept. And you get the data anyway regardless of these standards since you own the OS (Android), the browser (Chrome), and the search engine. So anyone else getting less data equals you being ahead of all others even more equals your risk of being disrupted is even lower.
At the end of the day, Google has a fundamental crisis. Their entire business depends on your lack of privacy from them. They would be happy to help keep you private from everyone else but they definitely need to invade your privacy to keep their business alive.
I wonder if they could survive off of search page ads and website contextual ads? No user targeted ads. If they could, they could squash the invasive marketing advertisers with lobbying for privacy policy and then regain a total monopoly as no other tracking adtech is viable anymore.
I personally believe fingerprinting should be illegal, and that any company that relies on it is inherently unethical. I think Google could survive without it.
I am by no means a Google fan either, just recognizing they have some power here and that fighting for ethical policy might actually be a good business move.
Contextual ads would be perfectly fine and unlikely to be any less relevant than user-targeted ads. However, any ad company can easily offer this. Google's raison d'être is that they track everyone on the Internet and claim that, by doing so, they can offer the best ads. If they admit that's not true, you might as well go with any other ad company.
In my read of the situation, Google's ads dominate because of their network. Other ad suppliers can't put their ads into Google search results, and it would be hard to build a network of display advertisers as big as Google's.
I wouldn't mind a shake up in the display network side of things. "Carbon" ad network is a great example of what can come out of a niche, well managed ad network and people choose them because they are present where the eyeballs for their demographic are. Like putting motor oil adverts on a racetrack. You don't have to track people around for those types of networks to work.
Oh yeah, there were multiple networks like Carbon, The Deck, etc. The only kind that doesn't suck.
It's weird that tracking-based ads have taken over. Logically, contextual makes a lot more sense. But apparently statistical data is telling ad companies that tracking is worth it :(
What's the exact age range you're looking for? 13 years is too old, it's high-school aged, but 3-5 years is too young? So we're looking at what, only products released from '07-'12?
Personally, I don't trust the fox (Google) to guard the hen house (my privacy). Sure, they might keep other foxes away, but I'll still be short a few hens at the end of the day.
Going to try to respond to this in good faith. I know there is some intrinsic bias what with Google being an ad company, but I'll try to ignore that and assume they wish to protect users within Chrome, advertising division be damned.
On point one, regardless of cookie blocking or not, these more aggressive means of tracking users are going to happen. If they (ad programs) can eke out extra data points on users, that's more value that translates into revenue for them so of course they're going to do it. They're saying they're going to do what they can to mitigate fingerprinting, face value, that's great, but it's like anything: a game of cat and mouse where when one opportunity closes, they'll chase another.
Point two is going to be a hard sell. They're asking users to care about advertisers when users have had issues with advertisers before (massive string of redirects to a "congratulations, click here for your free Amazon gift card/Xbox/whatever" page that screws with the tab's history to the point you can't even get back to where you were before requiring closing the tab anyone?). I don't disagree that advertisers are a necessary evil on the Internet (sites want to make money, we're usually not paying them, so forth and so on), but at the same time, some networks are historically bad for user experience: not in a "ah man, that banner ad is annoying" way, but in a "it just hijacked the page and now I'm trying to do a bunch of stuff to get back to the content/away from the site" way.
I don't think Google has its customers' best interests at heart in any of this. They try to make it seem like it at face value, but to deliver real privacy they would have to give up the data their own business needs.
An example from the article- "First, large scale blocking of cookies undermine people’s privacy by encouraging opaque techniques such as fingerprinting."
I'm pretty sure fingerprinting can happen either way. It's just easier for Google to control cookies and prevent users from blocking cookies because cookies will provide more exact data on what Google wants.
They'll sell the point that they will keep your data safe from 3rd parties... but they will use it for their purposes however they please.
What's funny is, despite all the criticism this is getting on Hacker News, this move is probably going to end up doing more to improve the online privacy of the average user than every ad blocker and tracking blocker has combined over the last decade or so.
Google, the world's largest ad provider and browser vendor, is essentially proposing to tear down the web's current free-for-all cookie-based ad targeting and conversion tracking system, and replace it with a privacy-respecting one built on standardized APIs implemented by the user's browser. If they're even moderately successful at this, it's going to be a huge improvement to the current status quo of online advertising from a privacy perspective.
For those who haven't, read the Chromium blog post on this effort[1] and the associated early proposals[2]. They're a lot more concrete about what exactly this new vision entails.
And for those cynics who are skeptical of this effort just because of their preconceptions about the company proposing it, consider this: Google has every reason to want to improve the reputation of online advertising. Better, more privacy-respecting ads mean fewer people blocking ads, which means more revenue for Google. Last year Chrome implemented a built-in ad blocker for a similar reason[3]. Personalized ad targeting doesn't have to mean compromises to user privacy, and Google intends to prove it.
Even Privacy Budgets (a genuinely interesting proposal) is not particularly useful right now, because in order for a Privacy Budget to work, you first have to identify and build blocking tools for all of the potential fingerprinting vectors it encompasses -- that's the actually hard problem that Safari and Firefox are working on right now.
The only reason Google is at the table at all is because they have to be. They've disclosed multiple times in their earnings reports that they consider adblocking a threat to their business, and adblock usage is pretty steadily rising. Right now adblockers are a minority, but if nothing changes they will eventually be a majority. Google's only here to stick a foot in the door and say, "but you're not gonna block all tracking, right? We'll come up with some advertising tools that are actually palatable."
But that's the problem: palatable isn't good enough any more. Given that Google is likely only here because they feel threatened, and given that other browsers like Firefox are putting in the majority of the work right now to actually block fingerprinting in the real world outside of proposal documents -- as a user, why should I be excited about compromising with the advertising industry?
Why should I be excited about blocking most tracking, when I have two browsers who are committed to blocking all tracking?
Google is really misreading the room here. Since when has the advertising industry ever showed restraint about anything? They're not going to stop fingerprinting just because an extra API exists. We can't bribe the industry into not tracking us with concessions, we just have to buckle down and block fingerprinting. There's no advantage to compromising; there's no reason to be excited about a theoretical new, better way to serve targeted ads when you can just block ads instead.
The fact that these proposals are compromises is precisely what makes them so potentially effective.
Instead of playing the usual game of cat and mouse where browsers move to block tracking and ad providers respond by coming up with creative ways to circumvent those restrictions ad infinitum, Google is creating a mutually acceptable compromise where ad companies don't have to fight against user privacy in order to make a profit.
It sounds like you believe you've got the upper hand in this fight, so you're more inclined to push for unconditional surrender rather than a peace treaty. That's understandable, and if you want to go that route I'm sure there will be other browsers and browser extensions which will continue to pursue that strategy. Just as, as you say, there will probably be advertisers who continue to pursue the "gather as much data on users as possible" route (though Google plans to fight those advertisers themselves to maintain the truce). But that's not the only valid solution to the problem, and I'm glad to see Google offering an alternative.
I do think that blockers have the upper hand, but even if they didn't, I wouldn't trust a compromise solution like this.
As a reminder, we're talking about an industry that uses wifi signals to track customers' positions through stores, that has embedded trackers in TV sets, that has bought credit reports on customers. The advertising industry is addicted to data; that's not going to change just because Google says so.
If Google's compromises here made me think that advertisers weren't going to use fingerprinting any more, then I'd be more open to the idea. But even Google isn't pretending that proposals like FLoC will mean that browsers won't have to block fingerprinting.
And post DNT, I don't even trust that it would only be occasional bad actors. Advertisers have made it obvious that they're willing to compromise on privacy only if it doesn't cost them anything at all. The moment people en masse start using private options, they'll back out of the arrangement the same way they always have, under the excuse that they have to stay competitive with advertisers who do fingerprint. Give it a year, and we'll be seeing think pieces about how FLoC just doesn't provide enough granularity, so we have to use FLoC and data-point X to be competitive.
So if we compromise with the advertising industry, and we still have to block fingerprinting to prevent bad actors, and it's the same amount of work to plug the same number of holes, what do we actually get out of this arrangement as users? It's not just that it's a compromise, it's a compromise that has no value to anyone except advertisers.
Seeing as how Google themselves (possibly the world's largest advertiser) seems willing to compromise, I don't think it makes sense to assume no other advertising company will be willing to follow them down that path. And even if no other advertisers were participating, isn't the fact that Google themselves are participating already a huge win? They are a pretty big player in the advertising industry after all.
As a user, what you get out of this arrangement is that a large number of online advertising agencies will stop trying to build systems that track your online activity, because they won't need to track you anymore to run their business effectively. They can get what they need in a way that doesn't hurt user privacy.
Yes, there'll be bad actors, but as you said that's a problem we'd have anyway. At least this way the number of bad actors will be reduced, and the economic advantage gained by such misbehavior will be minimized. Why would an advertising agency bother spending a ton of money trying to develop new fingerprinting techniques to fight cookie blocking if doing so doesn't result in any significant revenue increases over just using FLoC?
In effect, Google is fighting a two-pronged battle against online tracking: they're increasing the cost companies need to spend to track users, while simultaneously decreasing the economic benefit gained from doing such tracking in the first place.
It's (almost) funny to see how Google "reframe" the concept of privacy to preserve its business model !!!
- "large scale blocking of cookies undermine people’s privacy by encouraging opaque techniques such as fingerprinting"... so it means: one way or another, it's OK to gather as much information as possible to show relevant (targeted and well paid) ads. But there are good ways (cookies) and bad ways (fingerprinting).
- "blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding". And what about delivering "standard" ads? Newspapers showed non-targeted ads for years and nobody complained about that (I mean... except marketers maybe). If publishers stopped relying on targeted ads, maybe they would at last find some really new business model (like shared subscriptions) instead of just lazily using Google/Facebook products?
- "If this funding is cut, we are concerned that we will see much less accessible content for everyone", meaning...? Oh... fewer things to show on Google's search engine and with AMP. As Google is not producing content by itself but using (stealing?) content produced by others, it's quite understandable that their bottom line is tied to the content publishers providing free content. Actually, maybe Google should pay content providers to be able to include their content in its search engine? (ok... like the link tax)
It seems to me that Google doesn't understand that users DON'T WANT TO BE TRACKED AT ALL. It has gotten too creepy over the years. It's not a matter of being tracked by one trick or another; it's about not being tracked AT ALL.
But admitting this would mean the end of Google Ads. And Google seems to have lost any creativity to find new solutions to replace that product, which users can't bear anymore. Sad... "do no evil" is really gone forever :-(
You're right. But, Google does understand that their users don't want to be tracked. They just don't care.
The real solution is to "Get off Google". If everyone reading this did that, would it impact them? No, they would be fine. But I think it's better to do something rather than nothing.
I still use some Google products because it's far too inconvenient to get rid of them (notably, Android and YouTube), but I got rid of pretty much anything else that "normal people" rely on: search, maps, Chrome (or any Chromium-based browser), analytics, calendar, drive, assistant, Gmail etc.
Part of the problem is that tracking at scale really does add convenience (automatic home to work detection, traffic pattern detection, real-time routing around events.) But then using that information for other purposes (advertising your nearest fast food restaurant) isn't seen by users as compelling compared to the user's purpose of sharing that information. That's one reason why GDPR and its enforcement has been so weighted towards the _purposes_ of data collection and enforcing adherence to stated purposes.
The problem, of course, is that incentives are misaligned and the company's purpose in collection and use is often different from the user's purpose. And the laws are still written to the service provider's purpose, not the users' purpose. It's left to the company to align them.
I use Waze, Google Maps, and Apple Maps. Each has a perceptibly different balancing of those purposes.
OSM with a good theme is pretty good... But it's the other features people use Google Maps for. If you use Android, you might want to try Locus, it can do online and offline maps, online routing from multiple providers and a ton of other stuff.
This seems like a cop-out to me. “What can I, as one person do? Just don’t bother since the impact is tiny.”
I think people leave out the word “only” in statements like yours. Systemic problems can’t be solved with only individual action but the actions of many individuals can very much set examples, serve as encouragement to others, shift norms, and make the political space for regulated or coordinated action more available.
We should not exempt ourselves from individual action, especially when it is inconvenient, simply because our small change won’t single-handedly solve the whole problem. This goes for browsers, “free” email, privacy, and environmental impact.
True. Customer boycotts have worked in the past. The #DeleteUber campaign was quite a big PR nightmare for Uber, and its effect could be measured in their market share vs. Lyft.
Same with buying power. You can either buy from a small business and make a difference for a family, or you can keep shopping at big box stores. Sure, it feels like a small impact, but it isn't small at all for the small business. Those things add up.
They were created by the decisions of a few powerful individuals (not in some conspiratorial way, I just mean the actions of a few company heads / marketing gurus etc.), not by the individual actions of some billions of users. They will have to be dismantled the same way.
No. They were created by the individual actions of some billions of users who (like me) subscribed to those problems (not regarded as such at the time) due to - I will agree - the actions of a few company heads / marketing gurus. But just as nobody was pointing a gun at my head at the time, nobody is pointing a gun at my head now: "Thank you for your services, I think I'll be leaving now."
True, but that is not the way the problem was created. It is a part, but nothing compared to the conscious choice of installing Chrome or signing up for Gmail etc. The massive adoption of Google products was not achieved the way you describe. To be clear, I agree with you: what you describe is true and indicative of the situation. But I just want to point out that individual action can solve the problem, just as it was the main drive that created it.
The problems were created because incentives aligned for people to make decisions that lead to the problem. The only way to revert it is to change the incentives. And that can't be done by just me or you.
Yes, but you aren't going to get enough people to change their decisions to have an impact without changing their incentives.
Right now people are incentivized to use these "free" services because they feel like that's what is in their best interest. And to be honest, they are probably right in the current state of the world. I would be significantly worse off if I refused to use any Google services. I literally wouldn't be able to do my current job.
Making any type of change isn't going to come from us, but from regulators who force change.
This conversation was started about addressing the systematic problem of Google tracking users despite their wishes. You responded saying that could be solved through individual action. Now you are changing the topic to be about Google tracking you.
Where by "Internet" I mean "running 3rd party code talking back to a server over a global network". One way is going back to the original WWW design of hyperlinked documents, removing cookies and javascript altogether. Tracking becomes limited to the set of hyperlinked documents accessed, which could be proxied by a 3rd party for anonymization purposes. However, there are a few downsides.
* No authentication to remote servers. Forget web email, shopping or banking. On the plus side, forget social media as well.
> One way is going back to the original WWW design of hyperlinked documents, removing cookies and javascript altogether.
While it doesn't seem likely to be viable even for a single person (primarily because of the government and banking services), it sounds rather nice to me, and such a sentiment pops up here and there.
> No authentication to remote servers.
No to certain types of custom authentication, but standard HTTP authentication doesn't require executing freshly loaded code. Even custom forms-based authentication doesn't require that, or the use of cookies.
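To make the point above concrete, here is a minimal sketch of how standard HTTP Basic authentication works: the credential travels as a request header the client constructs itself, with no script execution and no cookie involved (header format per RFC 7617):

```python
# Build an HTTP Basic Authorization header by hand.
# No JavaScript, no cookies: just base64("user:password").
import base64

def basic_auth_header(user, password):
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

print(basic_auth_header("alice", "s3cret"))
# {'Authorization': 'Basic YWxpY2U6czNjcmV0'}
```

The browser does exactly this internally when a server responds with a `WWW-Authenticate: Basic` challenge, which is why no client-side code needs to run at all.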
> No business model for content publishers.
This is probably meant to be exaggerated; surely there are content publisher business models not reliant on JS-powered advertisement (e.g., subscriptions, tracking-free advertisements, donations, perhaps merchandise).
> No rich web UIs, for example maps.
I find this example rather interesting: it's a common and old application of pushing the limits of web browsers, and a seemingly common example of the need to do so (perhaps because maps are useful to many people, and the need in a specialized UI for those is hardly controversial). But it also seems quite sensible to me to use a dedicated program (which doesn't run on top of a web browser, that is) for mapping.
Thank you. The business model point is likely faulty reasoning on my part: observing that in the current world the most effective business model is advertisement, out-competing alternatives, then incorrectly concluding it is the best in all possible worlds.
I think that's an unfair reading of the blog post. They seem to suggest that advertisers can do targeting client-side, using cookies, but without sending the cookies out. There are issues with this, of course (enforcement, bandwidth), but it sounds like a good compromise.
There is enforcement though. I think the idea is that once a standard for privacy-respecting ad personalization is in place, Google will be free to start taking more aggressive measures towards blocking more invasive tracking systems (like cookies and fingerprinting) without hurting their own advertising business.
The next evolution of the internet is most probably the internet of value. Google is in the position to push this and be the largest player in that new internet but reading their ideas on how to fix the current internet I don't think they are ready for real change.
I would like to suggest a great talk about this topic from David Schwartz.
https://youtu.be/FUtlTgWyX5w?t=5223 Use this URL for the web monetization part; he comes straight to the point: what's broken, why, and how to fix it, in less than two minutes. He then explains more technically how the internet of value can be / is being created. Very interesting for anyone who knows the basics about IP and the different layers that make our internet really usable.
If people want fewer ads and less tracking, then more content will be paywalled. This is bad for a few reasons. For one, people who can't afford to pay their way through paywalls will get their news from Facebook or sources backed by people like the Kochs or Breitbart.
Awesome, we killed advertising and fixed privacy but now what have we gained? A walled-garden on the truth?
>Technology that publishers and advertisers use to make advertising even more relevant to people is now being used far beyond its original design intent - to a point where some data practices don’t match up to user expectations for privacy.
The whole thing reads like Google is just discovering the abuse that they themselves have perpetuated and bred into a real monster.
> Second, blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding,
Cookies and tracking have nothing whatsoever to do with relevant ads. Publishers can put ads relevant to their content on their pages with no client-side participation whatsoever.
Tracking is important for ads that target the user. The targeted advertising business may account for huge amounts of money spent, but I would argue that it is a race to the bottom and adds no value to the overall economy. The outright failure of the targeted advertising industry would arguably be a good thing.
Why can't publishers like the NYT simply sell ads directly to businesses, and display them without using adsense and tracking? Like newspapers used to do back in the day...
Let me guess, everything about "Privacy Sandbox" is meaningless when the company that proposes it also holds a major stronghold on the web thanks to Google Analytics, Google SSO, etc. Google themselves will likely barely be impacted (and why should we expect otherwise, google is not going to shoot themselves in the foot)
> When faced with a tradeoff, we will typically prioritize user benefits over preserving current website practices. We believe that that is the role of a web browser, also known as the user agent.
Except without advertising half of the internet collapses. And to anyone suggesting that a paid model will just surface, I don't really want to live in a world where only the rich have access to tools as useful as Maps, Translate or Youtube.
"Based on an analysis of a randomly selected fraction of traffic on each of the 500 largest Google Ad Manager publishers globally over the last three months, we evaluated how the presence of a cookie affected programmatic revenue. Traffic for which there was no cookie present yielded an average of 52 percent less revenue for the publisher than traffic for which there was a cookie present."
(Disclosure: I work for Google on ads, speaking only for myself)
Wouldn't this data be tainted by the fact that those deleting your cookies are the sort of people who actively adblock or protect their privacy, and hence are naturally unlikely to click on ads?
If the default for everyone was to not target users, the actual revenue value would probably be a wash, because ad targeting is useless.
Or perhaps without cookies, only lower value ads bid? There are a lot of factors and it's not clear how they interact or if they'd be relevant if cookies didn't exist for any user.
Perhaps, but many are using other privacy protection methods. Like I use Privacy Badger, which aims to largely leave ads but kill tracking cookies. I don't ever click ads though, so people like me are who fill out much of the "no cookie" market.
If everyone didn't have user tracking, the metrics on the effectiveness of stalking everyone would be very different.
Google AdSense started off as contextual. Maybe the drop is so significant in this study because Google's ad network no longer knows how to be contextual and is crappy without tracking.
The half of the internet that would collapse without ads is the half people aren’t willing to pay for, and that is expensive to keep up. So, what would be left would look more like a library, and a bunch of paid publications and services.
> Except without advertising half of the internet collapses.
This is a lie adtech industry tries to spread, but it was never true. Among a top couple of million of websites almost none are funded through adtech, as it doesn't pay much to websites at all. Adtech first and foremost is about making money for advertising companies, on all websites it can convince to join in aggregate, and for a couple of big platforms, but definitely not about supporting websites.
There are a lot of good arguments that the bad brought by the internet outweighs the good.
I also don't think all the nice tools you described would go away. People would still provide these services for free, although it would be different people than today.
Why would you have to be rich? If something can be supported by ads, it can be supported by a less than a penny from each user. You don't have to be rich to afford a penny.
And I stated this in a separate child comment, but the internet has proven repeatedly that services can be provided ad-free. No ads anywhere. They're either supported by donations, by a subset of users who pay for advanced features, or run by volunteers on their own dime.
Listing three Google properties supposedly without competitors (which they do) does not prove that it's impossible (in fact, the hundreds and thousands of other web properties actually prove that ad-unsupported services can and do thrive).
I am not rich, but I would pay a monthly subscription for a Google-like service that isn't as crippled or now politically inclined as Google is.
A service that gives me the same results as any other customer. If I want my results to be bubbled, let it be by tags I choose, such as a "Programmer" bubble or "Medical Professional", and maybe even public libraries and schools could pay for access to such engines. If I remember correctly, many schools pay for curated search engines as it is.
I would also pay money for a solid social network and/or IM service. If I don't see certain people on it, even better. My favorite thing about the internet is meeting new people anyway.
Don't get me wrong, I would also love to have the option between a free and a paid model, but that's beside the point, which is that there will always be some people who can't afford the paid one, and for them, access to these tools far outweighs whatever privacy implication there is.
Furthermore, we all enjoy our accurate and free traffic predictions every day, but you do realize that it wouldn't be possible without user data, right?
Google’s annual revenue is $136b (5/6th is from ads) and there are ~2.5 billion active android users. So, google would still be wildly profitable if they charged each user $3.7 a month, and eliminated their ads division.
This ignores revenue from iOS users, and the cost savings from shutting down their ads and tracking businesses.
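The arithmetic in the comment above can be spelled out as a back-of-the-envelope calculation (all inputs are the comment's own assumptions, not audited figures):

```python
# Back-of-the-envelope check of the per-user subscription price that
# would replace Google's ad revenue, using the comment's numbers.
annual_revenue = 136e9    # claimed annual revenue, USD
ads_share = 5 / 6         # claimed fraction of revenue from ads
android_users = 2.5e9     # claimed active Android users

ads_revenue = annual_revenue * ads_share
per_user_monthly = ads_revenue / android_users / 12
print(f"${per_user_monthly:.2f} per user per month")  # roughly $3.78
```

Which lands close to the quoted "$3.7 a month" figure, before even counting iOS users or the cost savings mentioned.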
No, it is not; Google is just proposing extra protections around existing mechanisms. That will likely continue to allow Google nearly unlimited access to data.
That's not how I read it. To me, it just sounded like trying to standardize all the different approaches everyone is using, so we don't end up with a mess like the User-Agent field currently is, because everyone tries their own hack on top of hack on top of hack trying to play a cat and mouse game.
Trying to standardize is already what Mozilla and Apple (which openly stated their proposal was based on Mozilla's) are attempting to do.
The only thing Google is doing by proposing this is attempting to remove or lower the impact on themselves.
Considering this in the beginning:
>Recently, some other browsers have attempted to address this problem, but without an agreed upon set of standards, attempts to improve user privacy are having unintended consequences.
> They proposed an option for allowing ads without them being a privacy nightmare.
Non-personalized ads won't be lucrative enough to fund the web. Let's stop this equivocation. When people say that preventing tracking will kill advertising, they mean that it'll kill personalized advertising. When you respond to these concerns by pointing out that non-personalized advertising would still be possible, you're using this one word, "advertising", to refer to a different concept. Word games make for bad arguments.
How is showing advertisements for your product not advertising? TV advertisements don't monitor every show I watch and channel I surf to, but they are still called advertisements.
You don't need to sniff my metaphorical underwear to serve me advertisements, and the definition of advertisement surely doesn't depend upon harvesting user data.
> How is showing advertisements for your product not advertising?
Quoting my own post, which you either didn't read or didn't understand: when people say that preventing tracking will kill advertising, they mean that it'll kill personalized advertising.
> You don't need to sniff my metaphorical underwear to serve me advertisement
Without personalization, ads aren't profitable enough to fund free-to-use web services.
> definition of advertisement
Nobody is talking about the definition of advertisement.
And how do you suggest a service such as Maps or Youtube stay free? Do you realize the amount of money that goes into maintaining those services? Would you be okay to returning to using paper maps? Or do you propose the rich get richer by having better tools and the poor are stuck with primitive tools?
How has openstreetmap.org remained free, if it's impossible to do without ads? Or torrents? Or GitHub? Or webarchive? Or any of the millions of other dual-licensed services or volunteer run services?
It will still get done, even without ads. The internet has, oddly enough, been the one to prove that to be the rule, not the exception.
I haven't tried it yet, but Peertube looks like an interesting alternative to Youtube. Mastodon and Matrix are similar exciting projects too. I host a Matrix instance that I and my extended family use to keep in touch, and we can seamlessly interoperate with anyone on any other Matrix instance. A few kinks to work out still (I wish the server software was way less bloated) but overall I think it's a really promising approach.
Google doesn't use webkit any more if that's what you're referring to.
But absolutely Google doesn't believe that. Even if the Chromium team truly believed in privacy (and their track record suggests otherwise), there still is a huge conflict of interest and daddy Google is not gonna let them interfere with their revenue.
That I know, but it is important to note the difference between a company funded by ads and ones that are not.
Especially given this right at the top:
> Recently, some other browsers have attempted to address this problem, but without an agreed upon set of standards, attempts to improve user privacy are having unintended consequences.
If the EC or DoJ forced a spin-off of Chrome from Google it would fix that incentive right away. Chrome would be in a position to extract a large TAC payment to be the default search engine (like Apple) and they could realistically focus on being the best browser for privacy.
The post does not even give a high-level technical sketch of how they want to achieve it. They just threw the phrase "privacy sandbox" at us, and that's it. If they are talking about standards, there must be some technical solution they have in mind. What is it? Does anyone know?
And will we be stuck with whatever solution they come up with, even if it sucks, just because it's Google and they can pressure everyone into following them?
It's hard to take initiatives like this seriously. Google tracks users on an industrial scale that no other online company can match. Their digital fingerprints can be found in every corner of the web. Just why are they so obsessed with tracking users to death? I'm beginning to believe that tracking online behaviour is in the DNA of the company and that it simply can't rein itself in. It escapes serious scrutiny, not least because the tech community would rather rush to its defence than scrutinise its practices and behaviour.
It has, in my view, a narrow view of privacy (shared by large swathes of the tech community) which is essentially: privacy = security. Of course, you can't have privacy without security, but security by itself does not equal privacy. Or put another way, Google's attitude is essentially: we'll track you to death, but we guarantee that information will never leak from Google. The response from large swathes of the tech community: that sounds fine.
This statement is fraudulent, and Justin Schuh should be embarrassed to have it associated with his name. I'm tired of being lied to about this company's obviously fake "values".
> Second, blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding, which jeopardizes the future of the vibrant web.
I think you could argue that targeted advertisement has jeopardized the future of the vibrant web.
This is just their ordinary dishonesty. When Google wants to actually be fraudulent they do it the Google way, they hire thousands of genius engineers to work on their ad software and then have none of them complete the endpoints for refunds and then have all of them ignore it, even willfully, until $75 million is stolen and a judge is sniffing around... twenty years later.
The question is, does that $75 million of fraud include the twenty years of missing banned-Adsense-account revenue Google paid $11m to conceal what they do with... or is that a distinct other fraud.
I have a better idea: bake an ID into browsers, then make the website/script request the ID, and the user can choose whether they want to identify themselves or not. A basic ID would only be a public key. Then the browser can make a "signup" request where the site can get more info about the user. The user ID can then be used for a passwordless login. Kinda like Google or GitHub ID, but built into the browser, standardized, and easy for developers to integrate. For extra security, client certificates can be used (but current implementations need to be overhauled), and second-factor hardware keys can be used for additional security. Ad publishers do not need the same level of security as a bank ID, so there needs to be something very basic and easy to implement as well.
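A rough sketch of the challenge-response login such a browser ID could enable. Everything here is hypothetical: a real design would presumably use an asymmetric key pair (so sites never hold a user secret), but since Python's standard library has no asymmetric signing, an HMAC over a per-origin derived key stands in for the signature:

```python
# Toy model of a browser-held identity with passwordless login.
# All class and method names are illustrative, not a real API.
import hmac, hashlib, os

class BrowserIdentity:
    """Hypothetical identity the browser would hold for the user."""
    def __init__(self):
        self._master_secret = os.urandom(32)

    def site_key(self, origin):
        # Derive a distinct key per origin so sites cannot cross-link users.
        return hmac.new(self._master_secret, origin.encode(), hashlib.sha256).digest()

    def sign_challenge(self, origin, challenge):
        return hmac.new(self.site_key(origin), challenge, hashlib.sha256).hexdigest()

# Site side: store the derived key at "signup", verify challenges at login.
browser = BrowserIdentity()
origin = "https://example.com"
registered_key = browser.site_key(origin)   # shared once, with user consent

challenge = os.urandom(16)                  # fresh random challenge per login
response = browser.sign_challenge(origin, challenge)
expected = hmac.new(registered_key, challenge, hashlib.sha256).hexdigest()
print(hmac.compare_digest(response, expected))  # passwordless login succeeds
```

The per-origin key derivation is the interesting part: each site sees a different identifier, so the ID could double as a login mechanism without becoming a cross-site tracking handle.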
"Some ideas include new approaches to ensure that ads continue to be relevant for users, but user data shared with websites and advertisers would be minimized by anonymously aggregating user information, and keeping much more user information on-device only. Our goal is to create a set of standards that is more consistent with users’ expectations of privacy."
These ideas don't work together, Google. First, I don't want to see ads at all when I look up information. Encyclopedias didn't have advertisements in their table of contents. Second, you can't show relevant ads for users while keeping user data anonymous, especially when your algorithms and methods of doing so are closed-source.
>>Recent studies have shown that when advertising is made less relevant by removing cookies, funding for publishers falls by 52% on average
Has anyone ever done a study of some sort of natural heuristic advertising? Instead of tracking people, track what a publisher publishes and through that, you can make some very plausible (and specific, I'd think) assumptions about the people who consume those publications.
Like, if I'm reading arstechnica, you don't need a tracking cookie to tell you that I am likely interested in:
- Video Games
- Consumer Electronics
- Some light-medium scripting/programming
- Generally am technology minded
Yet they don't mention Android, where the security is so lax you can also be fingerprinted and have data sent to a remote server without you ever knowing, and where Chrome on Android doesn't even allow you to use ad blocking tools?
This is a weak attempt at responding to the fact that the tracking Google has been a part of and enabled is not being accepted by certain vendors anymore.
Let's build a cross OS, cross browser and cross platform set of standards, not just for the web but computing as a whole.
Also, didn't Google propose changes to the extension manifest that would prevent tools like uBlock from working? Not sure what's going on with this...
1. Fails to mention that the reason why many people block ads is because publishers use overly aggressive ads or ads that advertise harmful content (scams). Publishers who use these practices should be losing money.
2. Pretends that if cookies are used then companies will stop trying to fingerprint.
3. Sounds like they are trying to prevent blocking cookies entirely; I'm not sure how that is giving more power to the users.
This is like when Nintendo and Microsoft demand loot boxes include the 0.00000001% odds of getting what you want. They do this because they are looking for a way to save loot boxes because childhood gambling is immensely profitable and some children are so generous they may even spend thousands. They seek a safe middle-ground where most children can still be exploited and their parents defrauded but the mechanic itself is not made illegal and they are not breaking any law.
Google wants to save internet-wide tracking before it is irreversibly thwarted or outlawed for being generally abusive to society, because it put a hundred billion in their savings account and there's more coming.
Looks like Google has their calendars messed up and meant to post this on April 1.
"Privacy is paramount to us" and then blaming abuses of my privacy on me? Funny stuff Google; you've really outdone yourselves with your April Fools post this year!
> Recently, some other browsers have attempted to address this problem, but without an agreed upon set of standards, attempts to improve user privacy are having unintended consequences.
Haha, nice, blame it on the ones that actually try.
> First, large scale blocking of cookies undermine people’s privacy by encouraging opaque techniques such as fingerprinting.
Fingerprinting was used in addition to cookies. Banning 3rd party cookies is just a first step.
> Second, blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding, which jeopardizes the future of the vibrant web.
Here's the problem. Google only wants a solution if it works with its current business model, which is ads. Trusting Google to protect our privacy is like trusting a burglar to protect your home whilst you're away.
Want to not be tracked? Install Firefox or Safari, turn on ad blockers, use a different browser for Facebook (or use Facebook Container on Firefox), and refuse any "consent approvals" when you visit a website. Also, advocate for laws like the GDPR (it has made privacy a bigger concern for companies in the EU).
Why was this downvoted? Save the fact that you just shouldn't use Facebook at all, I think this is a great comment. It actually references text from the post, unlike many of the other comments.
It must have been a few Google bots...
I especially like this text they wrote: "attempts to improve user privacy are having unintended consequences"
Unintended consequences? Oh no! Google's evil business model is being messed with!!!
In general, if Google talks about open standards, don't let it blind you. Somewhere in there _will_ be some loophole through which Google will benefit from any standard they propose. And in the very unlikely case that we cannot find that loophole, their benefit will be that some competitor is at a disadvantage.
When Google talks about privacy, you can be 100% sure it is some kind of deception to make you trust them more, or to shift how you perceive things in a way that benefits Google.
They are riding that "It's open! It must be good!" wave quite hard and too many people are fooled by such a tactic.
I couldn't resist laughing out loud. I think if this guy Justin Schuh had presented this in person at any non-adtech gathering, he would have been laughed out of the room.
Maybe if he presented it to the Hacker News crowd, for which the popular opinion seems to be that anything less than 100% privacy is unacceptable.
You know this, but plenty of people, including me, prefer free content to paywalls at every corner. That Google is ramping up efforts to improve privacy while maintaining the viability of free content providers is very positive.
What's interesting anyway is what mechanisms they'll implement. I'm intrigued about the fingerprinting budget: how well will it really work?
I for one don't believe 100% privacy is even possible, and you've got to keep some kind of reality principle to decide what's "acceptable".
But anyway, as someone else said, I don't trust the fox to guard the hen house - even though again there's some kind of reality check that says if (one of) the most powerful actor starts doing something, there _will_ be some traction, so yeah, let's see concretely what it is about...
About the privacy budget: if I understand correctly, they want to count API calls that return device/environment-specific data, probably giving some set of weights to each call, and stop accepting requests after a certain threshold. I'm a bit skeptical about that kind of complexity, but why not.
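As a sketch of that reading of the proposal (the API names, weights, and threshold below are invented for illustration; they are not Chrome's actual numbers):

```python
# Toy "privacy budget": each fingerprint-prone API call carries a
# weight (roughly, bits of identifying entropy it could leak); once a
# site's total crosses a threshold, further calls are refused.
API_WEIGHTS = {
    "navigator.userAgent": 2.0,
    "screen.resolution": 3.0,
    "canvas.toDataURL": 8.0,
    "fonts.enumerate": 6.0,
}
BUDGET = 10.0  # hypothetical per-site entropy budget, in bits

class PrivacyBudget:
    def __init__(self, budget=BUDGET):
        self.budget = budget
        self.spent = 0.0

    def request(self, api):
        cost = API_WEIGHTS.get(api, 1.0)
        if self.spent + cost > self.budget:
            return False  # browser would deny the call or return noised data
        self.spent += cost
        return True

site = PrivacyBudget()
print(site.request("navigator.userAgent"))  # True  (2.0 bits spent)
print(site.request("canvas.toDataURL"))     # True  (10.0 bits spent)
print(site.request("screen.resolution"))    # False (would exceed budget)
```

The hard part, and presumably the source of the complexity being doubted above, is assigning realistic weights and deciding what the browser does once the budget is exhausted.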
I've got a better idea: let's continue to get rid of fingerprinting and third-party cookies altogether, then place ads on topically relevant sites (i.e., let's return to content-based ads). This will also greatly counter the web becoming a few portal sites to other people's content. It solves two problems at once: that of surveillance and that of monopolization.
> Second, blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding, which jeopardizes the future of the vibrant web.
1. Other media - TV, radio, print magazines - seem to have no trouble delivering advertising without using tracking cookies.
2. The web was pretty vibrant before adtech became the monster it is today.
> Browsers would need a way to form clusters that are both useful and private: Useful by collecting people with similar enough interests and producing labels suitable for machine learning...
This still doesn't address the spreading of mass propaganda (show a feed of immigrant crimes to FLoC 43A8C before the elections, because, well, they happen to be xenophobic) or user control. They propose that browsers send random FLoCs to avoid forming flocks around sensitive categories, but I'm pretty sure the default isn't going to be random.
> Fundamentally, we want to limit how much information about individual users is exposed to sites so that in total it is insufficient to identify and track users across the web, except for possibly as part of large, heterogeneous groups.
So, this is a glorified version of Do Not Track but with budgets and involves more telemetry to be shared with browser vendors and/or websites? How is this even in the conversation? How about browsers simply block attempts to fingerprint a user or allow extensions that do? Penalise websites known to fingerprint from search results? Display a big red banner before the user navigates to that website? Proxy such websites, if the user so agrees and send only content that's relevant (like some kind of an advanced read-mode)?
> Preventing fraud is a legitimate use case that the web should support, but it shouldn’t require an API as powerful as a stable, global, per-user identifier. In third party contexts, merely segmenting users into trusted and untrusted sets seems like a useful primitive that also preserves privacy. This kind of fraud protection is important both for CDNs, as well as for the ad industry which receives a large amount of invalid, fraudulent traffic.
I like the concept of issuing tokens and then redeeming them later... but it seems like a lot of elaborate crypto to only really prevent ad-fraud (the advanced fingerprinting techniques they talk about elsewhere cannot prevent it?). It opens up another attack surface, whilst also not being truly anonymous to the first-party that issues the tokens.
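A toy version of the issue-then-redeem flow might look like the following. To be clear about the hedge: real trust-token designs are described as using blind signatures precisely so the issuer cannot link a redemption back to the browser it issued the token to; the plain MAC below is exactly why this naive version is not anonymous to the issuer, which is the objection raised above.

```python
# Naive issue/redeem token scheme (NOT the actual Trust Tokens
# protocol: no blinding, so the issuer can link issue to redeem).
import hmac, hashlib, os

class TokenIssuer:
    def __init__(self):
        self._key = os.urandom(32)   # issuer's secret signing key
        self._spent = set()          # prevents double-redemption

    def issue(self):
        token = os.urandom(16)
        tag = hmac.new(self._key, token, hashlib.sha256).digest()
        return token, tag

    def redeem(self, token, tag):
        valid = hmac.compare_digest(
            tag, hmac.new(self._key, token, hashlib.sha256).digest())
        if not valid or token in self._spent:
            return False
        self._spent.add(token)
        return True

issuer = TokenIssuer()
token, tag = issuer.issue()          # browser earns a token when "trusted"
print(issuer.redeem(token, tag))     # True: first redemption accepted
print(issuer.redeem(token, tag))     # False: double-spend rejected
```

The blind-signature machinery in the real proposal exists to fix precisely the linkage this sketch leaves in place, at the cost of the "elaborate crypto" complained about above.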
> Since the ads industry today uses common identifiers across advertiser and publisher sites to track conversions, these common identifiers can be used to enable other forms of cross-site tracking.
> This doesn’t have to be the case, though, especially in cases where identifiers like third party cookies are either unavailable or undesirable. A new API surface can be added to the web platform to satisfy this use-case without them, in a way that provides better privacy to users.
This is an all-out battle against content-blockers. Block third-party cookies, and still you're in their cross-hairs... but wait... this is more privacy-friendly, so will you, the end-user, please suck it up.
--
I hope Raymond Hill is reading and already thinking of ways to mitigate this. I see this as a push by Chrome in response to the ecosystem Brave is building and the direction Apple is going, simply taking control of the privacy conversation by doing what's best for them, all around, without pulling any punches. That's not to say Apple isn't doing the exact same thing! BigTech can't be trusted with privacy, and this post kind of proves it?
"It is difficult to get a man to understand something, when his salary depends on his not understanding it." Indeed.
This is disgusting. I was actually excited, thinking this was the development of a new concept; instead it's a plea for folks not to use ad blockers so they can make money from tracking and ads. If this were April 1st, I would laugh at how hilarious it reads, but it's not.
The chromium blog entry mentions a browser that blocks cookies. Which browser are they referring to?
> We’ve seen this recently in response to the actions that other browsers have taken to block cookies - new techniques are emerging that are not transparent to the user, such as fingerprinting.
Firefox allows control over cookies, and there are a number of extensions which serve this function as well. Does Chrome not give an option re cookies?
It would be a start if users could select in the browser whether they want to receive personalized content/advertising, be tracked, or whatever. All those new GDPR-induced consent layers definitely make the web a whole step worse (and I don't blame the GDPR for that).
Every time I see a website distinguishing 'essential' and 'functional' cookies I start freaking out because, in my opinion, those websites are just lying. In most cases, their 'functional' cookies are marketing or tracking cookies, never relevant to the core functionality of the website.
And there are still too many websites out there which either try to trick you into clicking the 'Accept all' button or simply don't understand that the law requires an opt-in, not an opt-out.
Pushing the UI for cookie-use-case selection into the browser would at least end this UI-layer hell. But I am sure some browser vendor will spoil the party and go its own extra way, similar to what happened with the DNT header... Maybe we just need another law: 'Do Not Track means Do Not Track!'
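For what it's worth, a browser-level signal like this already exists in the DNT header; the problem has always been that sites are free to ignore it. A minimal sketch of what honoring it server-side could look like (the `DNT: 1` header is real; the toy WSGI handler and the `analytics_id` cookie name are my own assumptions for illustration):

```python
# Sketch: honoring a browser-level tracking preference (the DNT header)
# in a toy WSGI app. Only the header name is real; everything else is
# a hypothetical example of a site respecting the signal.
def app(environ, start_response):
    dnt = environ.get("HTTP_DNT")  # "1" means the user opted out of tracking
    headers = [("Content-Type", "text/plain")]
    if dnt != "1":
        # Set non-essential cookies only when the user has NOT opted out.
        headers.append(("Set-Cookie", "analytics_id=abc123; SameSite=Lax"))
    start_response("200 OK", headers)
    body = b"tracking disabled\n" if dnt == "1" else b"tracking enabled\n"
    return [body]
```

The technical part was never hard; a law with teeth behind the header is the missing piece.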
Any update on the whole adblock thing that was popular here a little bit ago, with them deprecating the ability to drop web requests? Seems kinda paramount to this initiative.
As expected, it was pretty overblown. Realistically though we won't really know for another year or two. The specs keep on changing, and even if it's finalized, it won't be enforced for a long while.
How am I going to trust Google on the word 'privacy' when, just yesterday, my Android smartphone force-updated 3 apps while I was on 4G, even though my preference is set to no automatic updates at all, and WiFi only?
This is about as insincere as Google's lazy, abandoned efforts to bring end-to-end encryption to email following its exposure as a participant in PRISM.
Google talking about enhancing privacy on the web reeks of the sulfurous lies of Satan speaking through his children on earth.
Google has 0 incentive to actually introduce privacy, let alone anonymity or Constitutional rights to free speech.
But it has ALL the incentive to undercut and take out the actual efforts to redistribute the web: efforts to introduce real freedom and privacy, and to let humans exercise the rights they already have, rights that government has no authority to curtail or allow the curtailment of in any way (as outlined in the Bill of Rights), regardless of how the tyrants keep trying to tighten the noose they have slipped over our necks at this point.
No need to try to flee or put your efforts into redistributing the internet, citizen, see, we are providing all you need here on FAANG Ranch.
This entire article is just insulting.