I never understood the intent behind this move - is it greed for more ad money through lockdown? The company is literally a monopoly and still it wants more. It seems like the company is bowing to shareholder supremacy.
In my opinion it's not really a conspiracy. People at Google are so far up their own ass that they don't understand the importance of content blockers.
Personally, I just can't deal with websites unless uBlock is modifying them.
Exactly, the experience on the web is much better with ad blockers!
EDIT: I think my first comment triggered a bit of discussion. I also don't think there is any conspiracy (conspiracy is a big word :) ) - I didn't intend to portray it that way. What I did want to convey, from a user's vantage point, is that the priorities don't seem to align.
Depends on what you consider a "conspiracy". The word "conspiracy" sounds unnecessarily controversial and mysterious. It seems plausible and probable to me that there are economic motivating factors at play.
The initial proposal was for a static list of blocked URIs, with no API to add an item to the list.
That really speaks to a conspiracy to me. What plausible reason would there be for that? You would have had to push a new version of an extension to change the list. There's no way that omission was an "oopsie".
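For anyone who hasn't looked at it: under Manifest V3's declarativeNetRequest API, the block list ships as static JSON inside the extension package. A minimal sketch of what that looks like (the ruleset name and the filter pattern here are illustrative, not taken from any real list):

```json
{
  "declarative_net_request": {
    "rule_resources": [
      { "id": "ruleset_1", "enabled": true, "path": "rules.json" }
    ]
  }
}
```

And `rules.json` itself, where `urlFilter` uses the Adblock-style `||` and `^` anchors that the API supports:

```json
[
  {
    "id": 1,
    "priority": 1,
    "action": { "type": "block" },
    "condition": {
      "urlFilter": "||ads.example.com^",
      "resourceTypes": ["script", "image"]
    }
  }
]
```

Under the original proposal, changing either file meant shipping a new extension version through the store review process.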
I don't think they are far up their ass. They don't give two shits on a popsicle what tech users think. They have an absolute monopoly on almost everything on the internet. Apps, ads, search, phones, browser. Where we gonna go if they kick us out?
> I never understand the intent behind introducing this move
I think the reasons provided are valid (namely, faster blocking with less data flowing through extensions), whether or not you think they are good enough to outweigh their disadvantages.
I think you should read this part of the article too:
"Their study --which analyzed the network performance of ad blockers such as uBlock Origin, Adblock Plus, Brave, DuckDuckGo and Cliqz's Ghostery-- found sub-millisecond median decision times per request, showing quite the opposite of what the Chrome team claimed."
What are they really optimizing for? Sub-millisecond times per request are barely noticeable, and on any given day that's much preferable to ads tracking you everywhere on the internet!
I read the article the quote was pulled from, as well as participated in the discussion and responded to the author when it was posted here. Just FYI, Safari (which already has this feature in the form of content blockers) is able to block ads just fine, and anecdotally the performance is slightly, but noticeably better.
The question still remains: what are they optimizing for if we already have sub-millisecond response times?
The only evidence you offer is anecdotal. I'm having a hard time imagining perceptible differences at sub-millisecond response times, so either you or the study is in error.
And just to reiterate other comments: yes, Safari is probably able to block ads just fine now, because they modeled the engine on the capabilities needed right now. However, it comes at a huge cost to innovation and creates yet another barrier to entry guarded by a large corporation.
Taking all of that into account, it seems unnecessary. The looming threat of ulterior motives from Google remains ever-present.
To be honest, I'm much more in favor of this API because of its privacy features than performance. I think it's a little bit faster, but it has a much bigger benefit on the trust side because extensions no longer run arbitrary JavaScript in every page I visit.
It's very improbable the few popular ad blocking extensions could slide in something sinister and avoid public scrutiny for a long time. You have much more reason to be worried about your privacy from Chrome than from uBlock Origin, since Chrome and Google provably violate it.
JavaScript code will take a millisecond to decide whether to block a request, and some hard-coded browser function will do it slightly faster, but not by much, because most of the work is done by the same regular expression. So we are talking about fractions of a millisecond here, and you claim to notice that?
Safari pre-compiles the blocking list into some internal representation that is faster than a raw regular expression, and the matching operation is performed many times per page. So I do think it's possible to notice a small performance benefit?
To filter URLs, you have to parse them, check whether the domain is blocked using a hash table, and then search for thousands of substrings in the path and query parts of the URL. If you use a regex for that, most of the filtering already runs in native code. I guarantee you that this gives you a tough-to-beat baseline with almost no room for improvement.
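The baseline described above can be sketched in a few lines of plain JavaScript (the domains and substrings are illustrative, not from any real filter list, and this is nowhere near a full blocker):

```javascript
// Sketch of the filtering baseline: parse the URL natively, check the
// host against a hash set, then scan path+query for blocked substrings.
const blockedDomains = new Set(["ads.example.com", "tracker.example.net"]);
const blockedSubstrings = ["/adframe/", "utm_pixel", "doubleclick"];

function isBlocked(rawUrl) {
  const url = new URL(rawUrl); // native URL parsing
  if (blockedDomains.has(url.hostname)) return true; // hash-table lookup
  const pathAndQuery = url.pathname + url.search;
  // substring search is native code under the hood
  return blockedSubstrings.some((s) => pathAndQuery.includes(s));
}

console.log(isBlocked("https://ads.example.com/banner.js")); // → true
console.log(isBlocked("https://example.org/index.html"));    // → false
```

Real blockers replace the linear substring scan with a trie or token index over tens of thousands of patterns, but the hot path stays in native `Set` lookups and string matching either way.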
Looking at the WebKit implementation, the authors are shipping their own regex engine for whatever reason. I doubt that it beats the battle-tested re2 engine by large margins, if at all.
I don't think the content blocker API allows the full feature set you'd find in a standard regex library. This might mean WebKit can roll their own regex engine that's better optimized for this subset?
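That's my understanding too: Safari's content blocker `url-filter` only accepts a small regex subset (`.`, `*`, `?`, `+`, character ranges, and `^`/`$` anchors), which is what makes a specialized, pre-compiled matcher feasible. A rule looks roughly like this (`ads.example.com` is illustrative):

```json
[
  {
    "trigger": {
      "url-filter": "https?://ads\\.example\\.com/.*",
      "resource-type": ["script", "image"]
    },
    "action": { "type": "block" }
  }
]
```

No backreferences, lookarounds, or capture groups, so the whole rule list can be compiled once into a single automaton instead of being evaluated rule by rule.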
Yeah, the user experience with current ad blockers is that browsing is way faster than without one. So anything Google states here is dodgy at best.
That justifies adding the new API, it does not explain removing the old capabilities. They could nudge extension developers to try the new API by adding some "extension X may slow down page loads" warnings somewhere if they don't use it.
It's also likely that the net gain in performance by blocking resources with heuristics (that wouldn't be blocked by a simple pattern) outweighs any loss. At least for many websites.
The only argument that made sense to me was that the current APIs make it quite difficult to reason about what the extension actually does once installed and running.
By dynamically installing rules downloaded from the web a nefarious ad blocker could, for example, not just block ads but also hide certain political content from search results.
By requiring the list of rules to be hard-coded in the extension, it's easier to see what exactly the extension will do once installed.
For me though, this benefit does not outweigh the cost.
That's an interesting thought, but extensions can still hide, inject, or replace content. They just can't avoid downloading the original by any criteria other than URI patterns.
Sure, but the idea, as I understood it, was that by locking down the APIs so they can't change behavior post-install, any such patterns would be visible on inspection.
For example, extensions are already rejected if they contain minified scripts as that also obfuscates what happens. This could be seen as going one step further.
Ah, yes. The separate set of Manifest V3 proposals that disallow external and/or obfuscated code. Probably a good idea at a high level, but it does break good stuff like Tampermonkey (10 million users... ouch).