(I’m responsible for Cloudflare’s L7 security products)
While we can’t comment on the specifics of any customer configuration, we do not block or challenge Firefox by default—either with our Bot Management products or with any other L7 security controls.
You can confirm this by signing up for a free zone and making a request from Firefox.
You're saying there's not a simplistic "block Firefox" rule, right? But the user agent is surely one of the weighted features going into your ML stew. So it's plausible the poster is seeing that Firefox tips some calculation over the edge and causes blocking for them.
That is, you're not saying "Firefox doesn't change the scoring at all", right?
To add, without knowing the homepage firewall rule g2 has set up, we won't know exactly what sort of rule is triggering this, although the most likely signals they're using are either bot scores[0] or threat scores[0].
Customers can create firewall rules to challenge requests using any type of request metadata. User agent is one of the fields that can be used in manual firewall rules.
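As a sketch of what such a rule looks like: `http.user_agent` is a real field in Cloudflare's firewall rule expression language, but the rule below is an invented example of what a site operator might have configured, not anything confirmed about g2's setup.

```
# Hypothetical customer firewall rule (Cloudflare expression language):
(http.user_agent contains "Firefox/")
# Action, configured separately in the dashboard or API: Managed Challenge
```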
This is almost certainly a firewall rule put in place by the operators of that site. My own sites which are protected with Cloudflare do not exhibit this behavior when using Firefox.
Privacy preserving extensions like uBlock Origin, Canvas Blocker, Decentraleyes etc. used in Firefox also trigger the CloudFlare wall of harassment. Some of the actions of these extensions prevent browser fingerprinting and CloudFlare gets easily confused when this happens and unnecessarily triggers a lot of challenges for a user. You can easily get sucked into the blackhole of captchas - sometimes even solving 5+ captchas isn't enough to convince them that you are a real human.
I used to work there, and there wasn't a global "do this for this browser" (except for a little bit to reduce annoyance specifically for the Tor browser).
It is almost certainly a site operator firewall rule based on the browser user agent. This may even be accidental: they may have been hit by an aggressive bot using a UA string that matches Firefox, and the site operator may not even realise what they've done (if they use Chrome, which is likely).
Is there a meme-like name, along the lines of "self-own", for when you go online and have a tantrum about someone doing something to you, only to find out it's your own doing that caused the issue?
I think it's two-fold: the rise of tools like curl-impersonate (https://github.com/lwthiker/curl-impersonate) and the very consistent Firefox TLS fingerprint across platforms. Unlike Chromium (where you could differentiate a Linux, Mac or Windows computer from its ciphers, and so for example challenge only Linux clients), Firefox has NSS, and NSS is used everywhere the Gecko engine is used, while Chromium, although it has BoringSSL for modern ciphers, also uses the underlying TLS stack of the operating system (whether it's Microsoft's SChannel, Apple's SecureTransport or Linux's... NSS). The only time Chromium uses a pure BoringSSL implementation is on Android (Conscrypt).
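The TLS fingerprinting described here is typically done JA3-style: concatenate the ClientHello's version, cipher suites, extensions, curves and point formats into a string and MD5 it. A minimal sketch of the computation; the numeric field values in the usage below are made up for illustration, not a real Firefox handshake:

```python
import hashlib

def ja3_fingerprint(version, ciphers, extensions, curves, point_formats):
    """Compute a JA3-style hash from ClientHello fields.

    JA3 joins each list with '-', joins the five fields with ',',
    then takes the MD5 of the resulting string.
    """
    fields = [
        str(version),
        "-".join(map(str, ciphers)),
        "-".join(map(str, extensions)),
        "-".join(map(str, curves)),
        "-".join(map(str, point_formats)),
    ]
    return hashlib.md5(",".join(fields).encode()).hexdigest()

# Two clients that differ only in their cipher list get different
# fingerprints -- which is why OS-level TLS stacks are distinguishable.
fp_a = ja3_fingerprint(771, [4865, 4866, 4867], [0, 23, 65281], [29, 23], [0])
fp_b = ja3_fingerprint(771, [4865, 4866], [0, 23, 65281], [29, 23], [0])
```

This is why a browser whose TLS stack is identical on every platform (Firefox/NSS) presents one fingerprint everywhere, making it easy to match — for good or ill.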
Why jump to attacking them? Occam's razor applies, and there's no proof Cloudflare is explicitly targeting Firefox specifically, unless you somehow have internal access to their source and know there are checks explicitly looking for FF there.
It's pretty simple to check. Go to https://www.g2.com/ in various browsers. That's what I did when I read the article. Here are my results:
Firefox: Javascript challenge
Chromium: No javascript challenge
Edge: No javascript challenge
How could you call this anything but an attack against firefox?
edit: Seems cloudflare has responded[1] and they claim it is a setting set up by the site author. So this attack is done by g2, using cloudflare, then.
Went to g2 from Germany. Got challenge, but was fast. Maybe 0.5secs. No challenge after closing and going there again. Maybe a geolocation thing, because of botnets, ddos, war, whatever?
I think you mean Hanlon's razor, but that adage is a farce. Every scoundrel in the world knows the trick of pretending it was all a mistake. Skepticism is warranted.
I can't see why Occam's razor wouldn't apply: don't postulate more entities than are necessary to explain the facts. Specifically, there is no reason to postulate some kind of anti-Firefox elements within Cloudflare when it's simply explainable as a mistake.
(There's a reason Hanlon's razor was given that name, I'm pretty sure, and that's because it's not too dissimilar from – essentially it's a much-more-situation-specific subset or variant of – Occam's razor. But I don't think "attribute to stupidity" was what the parent comment was suggesting.)
It doesn't matter which razor you want to apply. These guidelines work well for natural/random phenomena, but if you are talking about a potentially malicious group of sentient actors, you have to assume that they will abuse your willingness to brush off ambiguous acts as benign. For an organization that controls as large a percentage of web traffic as Cloudflare, I don't think we should be giving them the benefit of the doubt. Even if this is a customer rule, Cloudflare are still the ones making it easy and cheap to implement such rules.
I think that definition is right, but only where 'simplest' factors in the probability of each premiss, not just the aggregate number of premisses. And yeah, on that reading it's trivially true, but something's being trivially true doesn't mean humans as psychological beings don't need reminding of it.
I would take a more measured stance. I don’t think Cloudflare is explicitly targeting Firefox. I don’t even think they are aware of what this does in Firefox. (You can definitely design systems that don’t hardcode something but still clearly only affect it.) The likely explanation to me is that Firefox blocks cookies or something and Cloudflare marked this as suspicious. It shouldn’t, but it does.
It's probably not a deliberate attack, but consider the incentive structure and the goals here.
Cloudflare is a site protection service. Their goal is to keep malicious traffic from griefing their customers. To do that, they have to distinguish malicious and non-malicious traffic. That becomes a game of trust, and if the user agent is masking its signal (or, more generally, making its signal less distinct) in the interest of privacy for its user, there are fewer trust hooks for Cloudflare to use to distinguish legit Firefox-shaped traffic from attacks.
Ironically, attempting to keep oneself anonymous can make one less trustworthy. The server is always at liberty to refuse to serve.
I mean, with that market share, popularity among open source fans... it's an easy group to target and filter oddballs.
I'm a Firefox user and I'm used to this treatment at every step of the way, no matter if it's about software, airports, opening a bank account so I can receive a salary, etc. Fundamental things everyone wants to do are being made hard to do the right way. It's always anti privacy, anti self repair, anti longevity/sustainability, anti user freedoms, anti whatever we ideally want in this world. Of course using Firefox is now suspicious.
Really? I'm a firefox user too and whilst I occasionally bump into some situation as you describe where I need to hop onto Chrome, I genuinely can't remember the last time this happened. Nope, actually - sometime last year, with some poorly-coded gig ticket purchase thing, if I remember right. I was able to check the code and amend some stupid niggle in WebDev tools to get around it.
Not saying I should have to accept that, but that's what it was.
I certainly do not see it at the frequency the GP is talking about. But it's there. Every once in a while there is a site that insists on throwing you into infinite captchas, on requiring some extra information, or just refusing to work, because Firefox won't spy on me for their benefit.
I also really do not experience it "regularly", but different people visit different sites.
That said, the one problem I get most often is this Cloudflare one.
Yeah, I find 99% of the web is fine but the exceptions are jarring. Fedex tracking is totally broken in Firefox. Quite a few sites still struggle without 3p cookies, which also affects Safari. Slack has some issues. It's also come up a few times while looking for a Google Docs replacement—several options are effectively Chromium only, which is ironic when you're only looking at them so you can avoid Google products.
Often, just changing the user agent string so that Firefox appears to be Chrome will let those sites work. That is, they work fine in Firefox, but user agent string checks reject that browser and/or OS.
There are several plugins for Firefox that make this easy.
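For reference, this can also be done without an extension via Firefox's `general.useragent.override` pref (a real pref); the Chrome UA string below is just an illustrative value, not a recommendation of a specific version:

```
// user.js — make Firefox report a Chrome-on-Windows user agent.
// Remove the pref (or the line) to restore the real UA.
user_pref("general.useragent.override",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36");
```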
I've used Firefox and Mozilla for what, 20 years? And I've never bumped into a situation like this. Some very clearly shit internal IT systems had pathetic IE requirements, but never one that said "sorry, go get Chrome".
They don't always say "go get chrome" but one example that still happens to me is logging in to Outlook email at work, if I use Firefox I get a hobbled "basic HTML" interface, if I masquerade as Chrome on Windows I get the full UI and it all works perfectly.
Google drive is another one, where some things stop working depending on the user agent I'm presenting.
For a while now I can't even access WhatsApp Web [1] through Firefox. It gets stuck on the loading page and keeps refreshing itself forever. I have to resort to Chrome whenever I need WhatsApp on my laptop.
If you have time, please file a bug report on Mozilla’s webcompat.com with the steps you took and a screenshot. Mozilla’s webcompat developers will try to debug the website, reach out to the website’s web developers with a proposed fix, and until the website is fixed, add a site-specific intervention or UA override in Firefox itself, if possible.
Looks like there are already a couple bug reports about whatsapp.com, but not the forever-refreshing page you see:
I don't think I can be of much help but WhatsApp web has been working for me on Ubuntu with all Firefox versions since WhatsApp web has been a thing. Did you try to start Firefox in safe mode, no addons and/or a fresh profile?
I'm a firefox user too and whilst I occasionally bump into some situation as you describe where I need to hop onto Chrome, I genuinely can't remember the last time this happened.
Sadly I find it happens more and more often, even with important things like financial services.
What I haven't figured out yet is whether it's really the Firefox engine that is the problem or just that Firefox also provides better tools for blocking obnoxious trackers and the like. Maybe because I have those tools turned on by default more sites using obnoxious trackers break in Firefox than other browsers. In most cases that is a feature not a bug but it's frustrating when essential functionality on sites providing essential services gets broken as collateral damage.
What I find most common is that the website has arbitrarily blocked firefox based on the user agent. The site works fine if you tell it you're using Chrome, so it's neither an engine nor a tracking issue.
Just in the last month Firefox didn't work for me on my health insurer's site, my bank's site as well as when trying to request a mail-in ballot for a regional election. Obviously not Mozilla's fault, but for me at this point the browser isn't that usable any more because that's important services I need to have access to.
On large international sites the likelihood that Firefox works flawlessly is pretty high. But a lot of regional service providers appear to kind of have given up on it.
About once or twice a month I come across a site which hasn't been tested properly on Firefox and a flow of some sort doesn't work. The most recent was accessing scientific papers via a journal and academic society login. The redirect was broken on FF. Oh, and Facebook. Although that seems broken in various ways on all the browsers. Pretty remarkable.
Google Cloud Console works only in Chrome for me. I'd been accessing it with Vivaldi for years, then a few months ago I started getting the "you broke my internet" sad robot icon. Firefox loads the page but most of it is blank. I played with the controls of my privacy plugin with no benefit. It could be my configuration.
It's common enough that you recognize that you need to use a chromium browser, that's entirely too common. As a recent example, UPS tracking was/is broken on Firefox.
I just followed your link and it says, "Your browser is not supported. We recommend using the latest version of Safari, Microsoft Edge, or Chrome."
So apparently Firefox is subversive and scary enough to make Apple refer me to Microsoft or Google to avoid it. I guess that's what passes for "Think Different" at Apple these days.
I tried spoofing Safari's and Chrome's User-Agent string in Firefox and business.apple.com still blocks me because I'm not using a real Safari or Chrome. So they appear to be using feature detection of some API that's not available in Firefox. I see that the site also blocks Apple's own Safari browser on iOS.
Looks like this site has been blocking Firefox users since at least 2018, according to the bug report on Mozilla's webcompat.com issue tracker:
UA spoofing used to work some time ago, as I was definitely using it that way.
It seems they recently changed something, because now I can't even load the login page properly anymore, due to X-Frame-Options disallowing the embed of the auth iframe.
My personal experience is that I do fallback some sites to testing in Chrome quite often, maybe once a week. But the number of times where using Chrome actually fixed the issue is maybe once a year? Really, it's just that I stumble on quite a number of broken websites.
Appreciated :). I see you're from nearby, might you happen to be looking for dev or security work? (For anyone else reading this, remote is also possible. Contact in profile, to not spam the thread.)
I'm not actually from Aachen myself, but nearby in NL and I work in Aachen nowadays and wasn't sure what username to choose.
If someone else would want it, I'm also fine passing it on. Dunno if there are guidelines on that actually but I'd consider an HN username like a domain name: finite, first-come-first-serve in principle, and hoarding is not cool.
A workaround is to install the Privacy Pass extension to bypass the captchas [1] [2]
It's an open source extension available for Chrome and Firefox. It lets you privately prove you're human, and it is in the process of going through IETF standardisation, so hopefully someday you won't need to install an extension for it. After you complete a captcha once, you won't need to do it again for a long time.
I'm not happy about installing extensions just to view some websites, but it'll make things less painful
Deanonymizing yourself just to appease cloudflare is not a valid solution. Any website should work in any browser out of the box. If they don't, the website is broken.
"The blind signing procedure ensures that passes that are redeemed in the future are not feasibly linkable to those that are signed. We use a privacy-preserving cryptographic protocol based on ‘Verifiable, Oblivious Pseudorandom Functions’ (VOPRFs) built from elliptic curves to enforce unlinkability. The protocol is exceptionally fast and guarantees privacy for the user. As such, Privacy Pass is safe to use for those with strict anonymity restrictions."
> You have to trust a whole lot of companies to get onto and use the Internet.
Obviously, I wouldn't dream of asserting otherwise. My point is that for the vast majority of the population, a paragraph of technogibberish about cryptography doesn't fundamentally change anything, you're still reliant on trust. To most people, that paragraph is worth about as much as a basic promise. The worth of that statement is derived from whatever trust is had in the corporation and the ability of academics and regulators to stay on the ball and keep corporations in check.
If somebody who isn't a cryptographer has decided not to trust Cloudflare and not to trust the rest of society to keep a company like Cloudflare in check, then that whole explanation isn't worth much. It boils down to saying "Just trust me" in response to somebody who just said "I don't trust you."
It's worth a bit, but it doesn't assuage all my concerns. Even with trust in the EFF to be both well informed and earnest, I think there is still reason for doubt. I've read it claimed many times that cryptography is easy to fuck up in subtle ways, and that these fuckups can go unnoticed for years. Furthermore, subtle flaws can be deliberately engineered into cryptographic schemes and probably concealed from notice for many years. The more novel a cryptographic scheme seems, the more reason there is to doubt that it's been inspected and verified from all angles. I've never heard of VOPRFs before today, they don't seem to have a wikipedia page and the articles I've found about them with a web search are all very recent.
Furthermore, there is the matter of Cloudflare itself, specifically its size and scope. Concentrations of data are magnets for intelligence agencies. The more data a company has access to, the less I trust them to keep it safe.
> When it's the default state of things, what use is pointing it out going to bring?
I believe that appeals to math can obscure the role of trust. This is demonstrated by the formation of an industry of scammers exploiting the phenomenon. Millions of people don't understand cryptocurrencies but buy in anyway, confidence bolstered by their lionization (but not comprehension) of math.
I think it's an illusion worth drawing attention to.
Sorry, can I get a layman's translation? What prevents websites from using Privacy Pass to track user behavior? (Beyond determining who is and is not a bot.)
Basically, you fill a captcha once, and that gives 30 anonymous one-time-use tokens which are stored on the browser. The cryptography used ensures that there's no way to associate the one-time tokens between each other or back to the original captcha. Redeeming the token proves that you've already filled a captcha, and will bypass the captcha for that session.
Cloudflare is the one putting up the captcha-wall and deciding whether to forward your request to the destination site. Your browser sends Cloudflare a token, then if Cloudflare accepts the token, it forwards your request. The destination site does not see the token and so cannot use it to track you.
Since Cloudflare does see the token, it's reasonable to consider whether Cloudflare could deanonymize you across different sites. Privacy Pass uses cryptography that claims to prevent that.
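The unlinkability claim rests on blind evaluation: the server applies its secret key to a blinded token, so the value it signs at issuance cannot be matched to the value redeemed later. A toy sketch of that idea over integers mod a prime — Privacy Pass actually uses elliptic curves, and every parameter here is invented purely for illustration:

```python
import hashlib
import secrets
from math import gcd

# Toy group: exponentiation mod the Mersenne prime 2^127 - 1.
# Far too weak for real use; chosen only so the arithmetic is visible.
P = 2**127 - 1

def hash_to_group(msg: bytes) -> int:
    """Map a message to a nonzero element mod P (toy hash-to-group)."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % (P - 1) + 1

def blind(x: int):
    """Client: pick a blinding exponent r and send x^r instead of x."""
    while True:
        r = secrets.randbelow(P - 2) + 1
        if gcd(r, P - 1) == 1:          # r must be invertible mod P-1
            return pow(x, r, P), r

def evaluate(blinded: int, k: int) -> int:
    """Server: apply its secret key k without ever seeing x."""
    return pow(blinded, k, P)

def unblind(evaluated: int, r: int) -> int:
    """Client: strip the blinding; the result equals x^k mod P."""
    r_inv = pow(r, -1, P - 1)
    return pow(evaluated, r_inv, P)

# Issuance: client blinds a random token, server evaluates, client unblinds.
k = secrets.randbelow(P - 2) + 1        # server's secret key
x = hash_to_group(b"token-seed")
blinded, r = blind(x)
token = unblind(evaluate(blinded, k), r)
```

At redemption the server can verify `token == x^k` for the revealed `x`, but because it only ever saw `x^r` at issuance (with `r` random), it cannot link the two events — which is the property the quoted VOPRF description is claiming.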
The Internet is a network with social effects. Whether "this didn't work" means "the website is broken" or "the browser is broken" has always been more about end-user experience and the wisdom of crowds than a more concrete definition.
A website broken only on Firefox works for 96.5% of users. I have personally had to make the hard judgment call (as a fan of Firefox!) to not spend 25% of our engineering debugging time on a problem only 3.5% of users encounter.
It seems as though capitalism has little room for craftsmanship as a virtue. The only value becomes the dollar value, people see no shame in shoddy workmanship so long as it's profitable.
The way I see it, Firefox's reputation for craftsmanship is unearned. The only time it crosses my desk as a site developer is people filing browser-specific bug reports for it. Its engine does not, in general, benchmark as performant as either Chrome's or Safari's on our site. It's certainly not beating them by enough percentage points for me to suggest people switch to it for performance.
Mozilla has had more time to work on this problem space than their competitors, and they don't have the technical advantage to show for it. They may have been the technologically better choice in the Browser Wars era of Internet Explorer, but nowadays? They're falling down on the technical merits, not just the network effects.
It's free and it's widely available. If they were better than the alternatives more people would switch to them but they're not.
But that's the issue. It's not my problem. My problem is maximizing the user experience for most of my users, and that involves squashing usability bugs common to all browsers and adding features that have been requested, not keeping up with the Gecko quirks-du-jour.
(Speaking of "quirks-du-jour", the problem eventually "solved itself." The next major iteration of Firefox fixed a rendering regression and resolved the bug. We "solved" the problem spending zero eng-hours on it; you can't beat that for efficiency. But that's the challenge Mozilla faces as an also-ran: burden's on them to keep up with the competition and make their rendering agent on-par with other agents for both performance and strangeness, because they lack the market clout to make developers bend to their flaws and oddities. No matter who the front runner is, there are always flaws and oddities.)
See, it is your problem when you offer something to the general public but then serve only the de facto monopolist instead of web standards. Because with each small compromise we each contribute to the problem until it reaches a breaking point. All the while those on the margins suffer, some with no real alternative.
For example, in poorer areas where they cannot afford a computer that runs Chrome (which has no LTS/ESR).
Some people simply have no sense of community or civic duty. Everything is all about them, all the time. If they stand to profit from antisocial behavior, they won't hesitate.
The disconnect, as I see it, is the assertion that supporting Firefox is supporting "community" or "civic duty."
Most of the Firefox alternatives are standards-compliant also (specifically, the two big ones definitely are). And I don't see as many rendering regressions with them as I do with Firefox. So who truly benefits if I devote my team's engineering resources to chasing down Mozilla's bugs?
There might be some benefits to an engine multiculture; with so many engines derived from Chromium or Webkit, one could make a technical argument that maintaining Gecko as third choice has merit. I find that argument to be weak. Gecko has been around for longer than the other two and it isn't remarkably better (and seems to fall on its face quite often relative to alternatives). What if it's just a tech stack whose time has come and gone? How many resources are we wasting propping up an old stack that could be used to build, perhaps, a fourth option? Or solve existing problems in the other two? There's this vague hand-wavy assumption that Firefox represents "the open way of doing things" (odd when it's also maintained by a corporation, like the alternatives), but I don't see it as particularly more open than the other options.
I don't think I'm doing a disservice to the community by refraining from using jQuery and I don't think I'm doing a disservice to the community by refraining from going out of our way to support Firefox.
The monoculture is ruled by Google. Well funded forks are rarely breaking from upstream. So in practice supporting only one browser, or even just Blink / WebKit, yields more control and power to a single company.
>Deanonymizing yourself just to appease cloudflare is not a valid solution. Any website should work in any browser out of the box. If they don't, the website is broken.
I totally agree with you. I think maybe an upper limit per IP (maybe a bit higher for Tor IPs) would be needed to prevent DoS-type attacks.
Firefox user - have seen this many times. Always assumed it's either NoScript or Firefox's built-in tracking protection, meaning there's some pre-existing cookie not set that Cloudflare places from other site visits, or because some script is getting blocked somewhere else.
Seeing the Cloudflare challenge as a user can happen in any browser depending on the site and its configuration. A site can choose a security level that requires all visitors to receive a JS/captcha challenge, and sites can make custom firewall rules to require JS/captcha challenges on any of hundreds of different attributes of a particular request using CF's web application firewall tools. One of those attributes is user agent, for instance.
They aren't, they're commenting on company general policy and making the obvious deduction from that.
It's (literally) basic logic
1. Some site flags Firefox.
2. Not all sites flag Firefox.
3. Either every site flags Firefox or individual sites flag Firefox.
4. It is not true that every site flags Firefox (equivalent to 2)
C. Individual sites flag Firefox (disjunction with 3, 4)
I can confirm this on pure (native) FF as well. As someone who has run multiple large web properties this hurts to see. I've observed firsthand the impact of what adding friction to certain browser segments does to utilization, if large corps decide that FF is a "risk" it will do a number on their already struggling traction, and we already live in far too much of a browser monoculture.
I use Firefox Focus on my phone (opens links from apps in a private session - generally a great idea!) and always get this 5 second delay (which is often actually 10 or more seconds). I never considered that it's a Firefox-only thing!
This setup program is signed with an EV certificate from DigiCert and hosted on an https site. No other hoops left to jump through except this awesome Catch-22 implementation, which leaves no actionable solution.
I've been de-googling all of my services and software over the last couple years including a switch to firefox.
I have noticed cloudflare challenging me more and more often. I assumed it was related to privacy extensions like noscript, ublock, and privacy badger.
I also got perma blocked by cloudflare (no option to override to get access, not even their captcha), because I dared to disable web timing APIs in Firefox at some point in the distant past. (I felt those have no legitimate uses, and I still do)
I'd be curious to see if this is the case for other sites that use Cloudflare bot protection as well. There are a bunch of ways to tune the service so maybe they are just extra cautious?
> Open-source browsers are an important part of the web and should not be treated differently than their closed-source counterparts.
One way to interpret that is they should all have the same suspicion rules for lack of popularity applied to them. One way Cloudflare's rules could be causing this is if there's some threshold for fingerprints-per-second under which any UA is considered sus, and Firefox's market share is so low that it tends to fall under that threshold.
In which case, what lwthiker is asking for is special treatment for the browser, because they believe the Mozilla project's browser has special value to the web ecosystem. Which they are allowed to believe, but let's be clear about when we're seeking special treatment vs. being treated like any other user agent.
I'm not sure what the cause was, but many sites failed to resolve for me in Firefox (and derivatives) for a month or two straight recently. archive.is was one of them. I think it may still happen with default settings, but I solved it by turning off DNS-over-HTTPS (which I think is a stupid feature anyway)
Basically archive.is has an issue with the way Cloudflare DNS does things. This will also affect DNS over HTTPS as Cloudflare is the default DoH provider in Firefox.
I just close the tab if Cloudflare pops up and I don’t visit the site again. I don’t trust or like Cloudflare and I suspect they themselves initiate DDOS’s even though I have no actual proof.
Does it have to do with the main browser you use? I am running Debian, Firefox is main browser, while chromium is used occasionally. I got captcha for Chromium, but not Firefox.
I hit the “checking your browser” quite often, as well as hitting captchas. I assume this means my adblocker / tracking blocker (in Safari) is doing a good job.
I got HCaptcha-ed while using Chromium + uBlock Origin (no other privacy extensions / settings as far as I think). Happens both in normal and incognito mode.
Tested on Android with Fennec (Firefox without telemetry), a FOSS browser, and Midori. Only Fennec gets the challenge. Even such an obscure browser as Midori is OK.
Cloudflare uses a lot of heuristics to determine whether to show their challenge. In fact, getting served a challenge says more about the amount of bot traffic that the website is getting than about how bot-like you look.
It might say "more" about how much bot traffic the site is getting, but it is also making some very clear statements about user agent. Those don't contradict at all.
We're regressing to a state reminiscent of the dark IE years.
Except back then when you suggested alternative browsers, people were generally receptive once they saw the practical utility of features like tabs. Now when you suggest alternative browsers, people complain about tens of milliseconds more latency and insist on using Chrome for the speed. It's hard to blame them though, since the practical advantages of Firefox are slipping away as Mozilla focuses on more abstract advantages, like privacy, freedom, etc. Noble causes to be sure, reason enough for me to continue using Firefox even if it were a hundred times slower. But I think most people are looking for practical advantages; Firefox usage continues to decline and I don't have much hope for these trends turning around anytime soon.
Liking it is beside the point; it isn't available for me to use unless I replace my phone and computers. But yes, I think you're right: Firefox has become so thoroughly marginalized that Safari seems like the last meaningful resistance to the Chrome monoculture.
The reality is people want the freedom to use Firefox or Chrome on iOS; they don't want a monopoly. Imagine if Microsoft had not allowed you to install a third party browser unless it used their engine and accepted their rules; then we would all still be using IE6, but with different themes and superficial features.
Safari needs to implement the standards and offer decent performance; add on top of that very good integration with the OS, and it should win on Apple OSes.
Also, Apple users, please demand that Apple sacrifice a bit of its profit and offer web developers some way to test their websites/code on Safari (including betas) for free, either by providing test virtual machine images or some web-based service. The Safari beta not only requires Apple hardware and its OS, it requires you to update to the latest version (and you may not want to be forced onto the latest version).
The thing is, the lowest common denominator folks have to support, Safari, is what forces the entire web to follow web standards. As soon as developers can demand users switch to Chromium on iOS, you'll see a massive uptick in websites which only work in Chrome.
Effectively, in this case allowing competition will permanently destroy the open web.
The issue with Safari browsers is not that libraries we use support Chrome-only features, but that Safari can have buggy implementations or small differences, and we can't test Safari. For example, a WebGL-based library I was using (for advanced stuff, not rendering static text) stopped working on iOS because Apple changed the WebGL stack, so we gave those users an ugly but functional 2D fallback. Because this was a third party library and not something I had experience with, I could not entertain the idea of attempting to debug the issue or create a fix.
Then later Apple decided to change the iPad browser to pretend it is a desktop; this broke my checks, and instead of the 2D version, iPad users got the broken 3D version. But finally the broken WebGL was released on Apple desktops/laptops, so now everyone using Safari gets the 2D version.
What we need is a way to test Safari, both stable and beta, so that I can find bugs, the people who made the WebGL library can find bugs before the browser is released, and so on.
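The brittle device sniffing described above ("is this an iPad pretending to be a desktop?") can usually be replaced by capability detection: ask whether a WebGL context can actually be created. A minimal sketch, assuming a hypothetical canvas-factory parameter for testability (not from the original library):

```typescript
// Sketch of capability detection for WebGL: test whether a context can
// actually be created, instead of sniffing which device the browser claims
// to be. The canvas-factory parameter is a hypothetical seam for testing.
interface CanvasLike {
  getContext(contextId: string): unknown;
}

function hasWorkingWebGL(createCanvas: () => CanvasLike): boolean {
  try {
    const canvas = createCanvas();
    // Also try the legacy "experimental-webgl" name used by some older engines.
    return (
      canvas.getContext("webgl") !== null ||
      canvas.getContext("experimental-webgl") !== null
    );
  } catch {
    return false; // any failure means: serve the 2D fallback
  }
}

// In a real page: hasWorkingWebGL(() => document.createElement("canvas"))
```

This survives OS changes like the iPad's desktop masquerade, because it never asks what the device is, only what it can do. (It doesn't catch a context that exists but renders incorrectly; for that you'd still need a small render-and-readback smoke test.)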
What non-standard API that only Chrome implements do evil web developers want to use?
> What non-standard API that only Chrome implements do evil web developers want to use?
Two things:
1. Feel free to look at the wide litany of garbage security and privacy holes Google has added to Chrome that are only supported by Chrome forks, which Mozilla and Safari have both publicly declared they will not support.
Mozilla has defined 24 items as "harmful", and that's a good start at bad ideas Google has shoehorned into calling "the web". Who wrote... basically every single harmful spec? Oh, the adtech company that is allowed to also make a web browser.
2. You're also just not understanding the problem. It doesn't actually matter what web developers want to use and don't want to use. Web developers are, and I'm sorry to insult a bunch of HN users here, really lazy. If they can demand people switch to Chrome, they will. In fact, they often do even when the website works fine everywhere else.
Next time you hit a Chrome-only user-agent check in Firefox, tell your browser to lie about the user agent and you'll probably discover the site works fine. I constantly see companies tell us their products "aren't supported" unless you use Chrome, even when they work fine in Firefox and the issues we call about aren't caused by the browser at all and can be reproduced in Chrome.
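The kind of check being described is usually just a brand-string match, which is exactly why lying about the user agent works. An illustrative sketch (the UA strings below are examples, not from any particular site):

```typescript
// Illustrative only (hypothetical UA strings): a brand-name sniff like this
// blocks Firefox even when the site works fine there. Feature detection,
// i.e. testing for the capability you actually need, is the robust approach.
function isSupportedByUASniff(ua: string): boolean {
  return ua.includes("Chrome"); // brittle: support tied to a brand string
}

const firefoxUA =
  "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0";
const chromeUA =
  "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) " +
  "Chrome/120.0.0.0 Safari/537.36";

console.log(isSupportedByUASniff(firefoxUA)); // false: user told to "switch to Chrome"
console.log(isSupportedByUASniff(chromeUA));  // true
```

Spoofing the UA string flips the first result to `true` without changing anything about the browser's actual capabilities, which is why the site then "works fine".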
Right now, the only reason far more web developers aren't pasting Chromium checks into every single page, is they have to support Safari anyways, because they can't lose the iOS market. But as soon as it's possible for web devs to force users to install Chrome to continue on iOS, they will. And at that point, Google really doesn't need to pretend it cares about web standards anymore anyways, because alternative engines will already be dead.
Beware what one wishes for: in the end they will get Chrome everywhere anyway, because Firefox doesn't really matter, and there are no web standards other than what Google is pushing from ChromeOS as standards: Project Fugu, Houdini, WebUSB, Web Bluetooth, ...
If your problem is with bad developers not following the standards, then why is your solution to screw the users? Find a solution for the actual problem: bad developers and greedy companies. One such solution for preventing IE6-like cases is to remind Apple that users should be able to install alternative browsers.
I test my stuff in Firefox and Chrome, but I can't check Safari or Safari Beta; so if you really want more support for Safari, you need to ask Apple to sacrifice a bit of profit and give developers virtual machines or some web service to test their code with.
But if you are honest with yourself, you know the exact reason why Firefox is not allowed to be a real option on iOS, and the answer is simple: profits. And sure, a company is obligated to maximize profits.
>The users have the option not to buy Apple devices.
And Apple has the option not to ship in the EU when there is a browser-selection law, as there was for Microsoft Windows.
If you are honest, you would know that most users only have 2 mobile OS options. They don't have granular options like "I want an OS with the privacy of iOS, the freedom of Linux, and the design of GNOME"; users have a choice between 2 big piles of shit and need to decide which one stinks less.
We have the freedom to ask our representatives for new laws, bad behavior needs to be stopped regardless of market share.
And Apple will always have the option to follow the law or stop selling in the EU. We demanded that the industry use the same charger for all devices (before that, even the same fucking company could ship different chargers) and we got it; we can do the same for browser options and payment options if our democracy wants it. Meanwhile, Apple could "protect" US citizens by not letting them install alternative browsers and stores, and in 10 years we can compare the numbers and see who was right, though I expect the answer will be "the US is special: it's big, and the density, and the diversity; something that works in the EU will not work in the US".
That ship sailed when developers collectively decided that browsers should be complete reimplementations of the entire operating system, and anything less was hopelessly broken. With a barrier to entry so high that trillion dollar companies don't find it worth the investment, it's no surprise that the last independent implementation is struggling to keep up.
> Mozilla focuses on more abstract advantages, like privacy
Mozilla marketing focuses on that. Mozilla development still uses opt-out telemetry, experiments on users without consent, and still has Google Analytics on its websites.
It might be, but I know that I had 200TB of real traffic via Cloudflare this month, while I paid for 8TB on my upstream. Running services of that scale would be utterly impossible for me and probably a lot of other people without their help, so it's definitely empowering.
Are you saying that 95% of traffic is malicious or 95% can be cached?
In either case, 200 TB works out to a 1 Gbps link at roughly 60% average utilization.
Something a cheap (<100 EUR) server from Hetzner can sustain, for static content of course.
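The bandwidth claim above is easy to sanity-check. A back-of-the-envelope calculation, assuming decimal terabytes and a 30-day month:

```typescript
// Back-of-the-envelope: what average bandwidth does N TB/month require?
// Assumes decimal terabytes (1 TB = 1e12 bytes) and a 30-day month.
function avgMbps(tbPerMonth: number): number {
  const bits = tbPerMonth * 1e12 * 8; // TB -> bits
  const seconds = 30 * 24 * 3600;     // seconds in a 30-day month
  return bits / seconds / 1e6;        // average megabits per second
}

const avg = avgMbps(200);
console.log(`200 TB/month ≈ ${avg.toFixed(0)} Mbps average`); // ≈ 617 Mbps
console.log(`≈ ${((avg / 1000) * 100).toFixed(0)}% of a 1 Gbps link`);
```

Of course this is the average; real traffic is bursty, so the peak can easily exceed the link even when average utilization looks comfortable.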
8TB is also not that much at all. As an example our household uses about 2TB per month, both working from home + watching netflix sometimes.
Considering CDNs are duplicates of websites located around the world to reduce lag, has anyone ever flagged how CDNs could be used for nefarious means, or do people just trust CDNs blindly? When is a sock puppet not an avatar on a web forum but an entire website radicalizing individuals in secret?
I have got this on gitlab.com every morning when I log in for at least a year. (We are a paying customer.) I use Firefox with Cookie AutoDelete.
I know that the internet is full of idiots and criminals. If they protect their service, it's to my benefit. It costs me maybe 2-3 seconds every morning, but then there will be thousands of requests during the workday. If each of them were 0.1 seconds slower because their servers had to deal with nonsense, my user experience would be much worse.
(I have no idea whether keeping cookies or using a different browser would avoid the visible challenge. I just don't care.)
Edit: I would really hate it if I had to do free Google captcha labor. Or fill the AWS one which always takes me 3 attempts to get it right.
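The trade-off described a couple of comments up (one visible challenge per day versus a small per-request slowdown) can be put into rough numbers, using that comment's own figures:

```typescript
// Rough comparison using the figures from the comment above: a one-off
// ~3-second challenge versus ~1000 requests each slowed by 0.1 seconds.
function dailySlowdownSeconds(requests: number, secondsEach: number): number {
  return requests * secondsEach;
}

const challengeCost = 3; // seconds per day (the morning challenge)
const slowdownCost = dailySlowdownSeconds(1000, 0.1); // 100 seconds per day
console.log(`challenge: ${challengeCost}s/day vs slowdown: ${slowdownCost}s/day`);
```

Under these assumed figures, eating one challenge up front is over 30x cheaper in total wall-clock time than a pervasive per-request slowdown.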