There's going to be more and more of these as browsers fully accept and cement themselves into their role as operating systems and inevitably expose more bare metal functionality.
The only way to stop it is to not use a browser that thinks it's an OS. That means not being able to use websites that use new OS features like web components, webgl, etc. It means not using these features as web dev unless you're forced into it by getting paid. Browsers that treat the web as a document instead of an application will have far, far fewer remote exploits.
Is there any everyday-usable browser left that is not architected as an OS and isn't supposed to work like one?
While I see your point, it seems to me that ship sailed many years ago, perhaps from the point where Chrome's OS-like kernel/module architecture was accepted as something to strive for, and browsers were deemed secure and fast enough to be pitched as an alternative to native applications.
I am able to use Dillo for a lot of everyday tasks. The limiting factor really poses a different question: are there any everyday usable websites left that are not architected as applications and aren't supposed to work like one?
The answer is: some, with degraded functionality and layout if you don't support yesterday's CSS and JS.
In addition to the onus on users to make a choice, there's an army of developers out there who can advocate to avoid unnecessary use of exotic features.
Sometimes they're appropriate (e.g. ability to drag-and-drop or CTRL+V paste a photo to an image sharing site), but as a user my personal inclination tends toward traditional interfaces (which as a happy bonus can be more responsive) and I've encountered an overwhelming number of sites that abuse capabilities for no good reason (e.g. those which immediately prompt for your location).
A little taste and restraint would be very welcome on the modern web.
Another way to stop it is to start writing software like this in safe-by-default languages like Rust and others, instead of C++ with its numerous footguns.
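For what it's worth, here's a minimal sketch (mine, not from the thread) of the kind of footgun in question: a use-after-free via container reallocation that C++ compiles without complaint, and that Rust's borrow checker rejects at compile time.

```rust
// Illustrative example: holding a reference into a vector's heap buffer,
// then growing the vector. In C++ this compiles and is undefined
// behavior; in Rust it is a compile-time error.

fn main() {
    let mut v = vec![1, 2, 3];
    let first = &v[0]; // shared borrow into v's heap buffer

    // The C++ analogue compiles silently and is undefined behavior:
    //   std::vector<int> v{1, 2, 3};
    //   int& first = v[0];
    //   v.push_back(4);      // may reallocate; `first` now dangles
    //   std::cout << first;  // use-after-free
    //
    // In Rust, moving the push up here is rejected with error E0502
    // ("cannot borrow `v` as mutable because it is also borrowed
    // as immutable"):
    // v.push(4);

    println!("first element: {}", first);

    v.push(4); // fine once the borrow held by `first` has ended
    println!("len is now: {}", v.len());
}
```

The point is that the whole bug class becomes a compile error instead of a runtime CVE; no reviewer vigilance is required.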
Well, they tried to write a new browser engine in Rust but gave up and got laid off. A few pieces got integrated into Firefox but the browser is still wildly insecure (cf. the article).
Turns out "Rewrite it in Rust" is actually really hard when you have millions of lines of code. Even Google probably can't rewrite Chrome from scratch.
Maybe if we just try harder, communism will work. You gotta draw the line somewhere.
When you use those features only when you're a web dev, not only are you still using them anyway (which defeats the purpose by driving demand), you also increase supply by creating new websites of that kind.
Also, if you don't use the websites "that think they're an OS", you end up unable to function in today's world to an increasing extent. E-governments are all about web applications, often with "bare metal functionality" such as legacy Java, ActiveX or Flash applets. They should be long gone, but given that somebody pumped millions into them, it will take years for them to go away.
Hell, even regular JS is bare metal today, with all the complexity of JIT. I'm getting the impression that suggesting we abandon this realm is naive, and that a better approach is to look at it from the perspective of "OK, it happened. How can we make it more secure?".
After all, becoming an OS isn't an excuse to do less. In fact, browsers now have more responsibility to keep their security philosophy up to date.
Hard disagree. I'm not giving WebGL a pass, nor wasm, Web Notifications, WebRTC, WebUSB, HTTP/3, WebSockets, DoH or whatever bright idea they had last month, just to read a newspaper.
I actually like and use webrtc, but only for actual RTC, otherwise it's a shitshow and disabled.
Some things are indeed useful, but I don't see how you go "OK, it happened, time to make it secure" with an ever-expanding scope and attack surface. Note: "more secure" is not enough, we need secure.
I got the impression from your post that you're willing to join the crowd that's never willing to turn on JS etc. If it's just about reading the newspaper, you can pick a less invasive data source. But, say, for e-government, you really don't have a choice, and given that all those things are already standard and can be used for good purposes, I guess we really have no option other than isolating those features the best we can.
If your bottom line is "features should only be available when there's a legitimate use case for them", perfect. The problem is when there's a major website and you don't know why it isn't working, because you turned off the entire JS stack and it can't even tell you that.
What are "all those things" that "are already standard"? And why should a rando government or other site requiring an API mean it should be available to all websites everywhere?
Note, I didn't even mention javascript nor disabling it altogether, and I don't wish to imply we shouldn't secure any and all APIs/features.
I'm saying
(1) we will never secure all APIs/features;
(2) they are ever growing so it would be futile even if we could secure the present ones;
(3) even if all the APIs are "secure", they will be misused against users, so they should not be available by default like they currently are;
(4) yes, I do think static/simple sites should be usable without JS.
I used to think of Stallman's browsing habits as silly, but there may come a time where I will visit the web-at-large only from other people's or dedicated-use devices.
"There's going to be more and more of these as browsers fully accept and cement themselves into their role as operating systems..."
That sounds like job security for those employed in "computer security".
I use a text-only browser. Nine times out of ten, that's all I need to get the content I want.
I do not use Windows but "Nessie" looks interesting. Someone posted about this browser a few months ago and commenters criticised it for not being open source. It appears the source is now available:
It treats HTML as a dynamic thing to be filled in by executing some third party code. As opposed to being text in a document. There's no text to fall back to. Just blank stretches. This is treating the web like an application instead of a document.
I often use my browser with JS disabled by default, and it boggles my mind how developers use JS for things that could be done perfectly well with HTML and CSS. For most websites used as websites (reading the content), not apps, JS is completely redundant. Moreover, it looks like most JS code is there for all the possible trackers and ads. JS is great (or: decent enough), but for apps, not for HTML documents.
In my view, HTML6 should cut a lot of clutter and drop backward compatibility for a lot of stuff.
HTML is a language that is too ambiguous to parse, and that's not a good thing for browsers. HTML should be adapted to mobile so it can work faster, use less memory, etc. Something with vector graphics should be a better norm.
And I'm wary of the idea, because it would probably end up as wasm on webgl, with websock and input via webusb, while banning HTTP, HTML and CSS, yet somehow still require javascript.
This is entirely unsupported by the evidence. Many exploits are rendering bugs, or otherwise bugs in libraries written in unsafe languages that inevitably suffer memory corruption. All of which are quickly fixed.
Perhaps you like browsing pure html documents to view websites. Everyone else prefers current gen browsers despite the rare risk.
This website seems to be a great resource in finding exploited zero days for each browser, but using it for any kind of security comparison is likely not a good idea. Generally, "number of CVEs" is a flawed comparison metric for almost every kind of question you're going to ask, and CVSS is not particularly good at matching severity in the sense that most people might care about it.
This suffers from the same flaws as most CVE statistics. You take some data that you have, assume it's a representative sample and then make some claims based on it. The problem is: It's not a representative sample.
If I understand this correctly, this is looking at CVEs where exploitation has happened, as announced by the vendor. It's bad statistics, because you cannot assume all vendors treat these things equally (one vendor may be very open about known exploitation while another may try to hide as much as possible). Creating such statistics also creates an incentive for vendors to be more secretive when such things happen, so it's not just bad statistics, it's also bad for security.
Do you mean that there are simply too few 0-days to make this claim?
Because in that case I disagree: a 0-day is an important event; every single one counts. It's like nuclear bombs: statistically, it indeed makes no sense to talk about "number of deployed nukes per capita" or the like, but it still makes sense to say "the US has employed the most nuclear bombs" and "Japan has seen the most nuclear attacks on its citizens".
Sure, but it would be silly to imply that therefore Japan is the most dangerous place to live due to the risk of nuclear bombs, like how the website is subtly implying that Chrome is the least secure.
I would also point out that the website is only tracking vulnerabilities from the last 2 years. I bet IE's number would go way up if you included IE6 (not that that would mean anything). I suppose you could also say that "Japan has seen the (tied for) least nuclear attacks on its citizens in the last 2 years", but I'm not sure what the point would be.
You can't really have 0-days in software that has been EOL for years... That would make the term useless.
And I'm pretty sure Chrome actually is the most "dangerous" browser wrt 0-days. If they're going to exploit them, it's surely going to be the one with the largest market share.
But yes, the quantity of bugs doesn't necessarily translate to the actual security of the browser.
For anyone in a position to be the victim of a targeted attack (like an activist under a repressive regime), the unexploited 0-days can be important. It can be argued that for personal decision-making by the general public, the number of exploited ones is more meaningful.
They could have had 0-days at the time. That would help determine whether Chrome is just having a bad year due to luck (though I think this entire thing is BS).
Agree, and it's also based on the number of 0 days _that have been found_. If more researchers are looking for them and disclosing them for Chrome (and it makes sense, since so much of web traffic goes through Chromium-based browsers and also because Google has a lot more people doing that), then you'll see more.
Chrome is also changing very rapidly. I imagine Firefox's ESR branch may be safer just because less change means less new code in need of review over time.
Most software sees many more zero days; usually you'll look at these kinds of things over tens or hundreds of exploits. It only takes one zero day to hack someone, of course, but by their nature they are perishable items and often single-use.
Yes, I think you have missed the point, as the severity of the problem is orthogonal to the issue of whether we have enough information to rank browsers. If they were not orthogonal, then severity of consequences would be a parameter in formal statistical methods for evaluating our confidence in the ranking.
Let's take your analogy a step further. Does it follow that the US is the most likely to be the next user of nuclear weapons? Does it follow that Japan is most likely to be the next target? (I'm sure someone is thinking that at least one of those outcomes is quite likely, but note that I wrote "does it follow...", not "how likely is it..." because the issue is what one can deduce from the evidence presented in the above post.)
While this is interesting in itself, it does not make any claim valid or invalid.
Proving that A is worse than B does not prove that A is bad or that B is good. Or, if you are more mathematical: a < b does not prove that a < 0 or that b > 0, where 0 is neutral.
And as I point out above: this was not meant as a comparison but as an analogy. If my analogy offended anyone, I sincerely apologise.
Patients have died in ransomware attacks. Not half a million patients, or even a quarter million, but still: a browser zero-day can be a threat to human life.
I am using the numbers from the Wikipedia summary. I also checked to see if those numbers include long-term effects which is unclear from my cursory review.
I don’t know how many people are in hospitals with life threatening conditions but I suspect it is more than a quarter million, or even half.
Based on personal experience in deskside hospital IT, browser vulnerabilities are a significant vector. We spent a lot of time re-imaging machines.
There have been ransomware attacks at hospitals including deaths. A coordinated attack using this vector could easily cause as many deaths as nuclear weapons have.
Chrome bugs are currently selling for $500k and Firefox/Edge bugs are selling for $100k. It's kind of shocking that we got to this point, but for comparison, a full Chrome exploit sells for the same amount as a full exploit for IIS or Apache. Firefox and Edge sell for the same price as a full exploit in Wordpress.
If you track prices over time, which that site doesn't show, you'd see that "number of users" isn't much of a factor. There are a bunch of popular apps with shitty security practices whose bugs are cheaper than those of less popular apps with better design practices.
This is a great way to measure "attacker utility". That is, it's not only a measure of how rare some of these bugs are, but how useful they are to attackers.
Chrome bugs have been significantly more expensive than IE bugs since long before Chrome overtook IE in popularity. For a while, Firefox bugs and Chrome bugs were neck and neck, because even though it was easier to find exploitable bugs in Firefox, the Intelligence Community was buying up Firefox bugs like mad in order to exploit Tor users, since Firefox is the underlying tech behind the Tor Browser.
Another one to bookmark for heap corruption, use-after-free and type confusion coding error examples that expert C and C++ developers hired by top multinationals, with their PhD level recruiting processes, never make.
Nah, just tired of being told that only newbies make security errors in C and the languages that are copy-paste compatible with it.
But there is a positive side to it: where would the anti-virus, hardware memory tagging and security consultancy companies be without languages like C? There is a market to keep alive out there.
I think it's more about being tired of people saying that anything not written 100% in C/assembler as a <20k binary, or taking more than 300k of memory, is a bloated mess, that everything was better in the past, and that we should all use C and C++ for everything.
(I will use C/assembler if I'm coding something for a C64, an Amiga, or a 4k/64k demoscene intro, but I'm not going to put a server written like that on the internet for people to exploit.)
Wasn't CVE-2020-15999 a bug in Freetype, which Firefox also uses? Why isn't there a Firefox CVE on the list from the same time? Was the bug not exploitable in Firefox for some reason, or is this list just incomplete?
(Never mind that comparing counts of CVEs is a ridiculous way to compare security of products. CVE counts seem more indicative of the amount of research targeting the product than of the number of bugs in the product.)
Firefox only uses Freetype on Linux (the system-installed version) and on Android, and the vulnerable code path was only reachable in Firefox with a non-default pref set (gfx.downloadable_fonts.keep_color_bitmaps).
Is there any reference that these are actually zero-days, or are these just particular CVEs chosen on some other basis? E.g. the sources/links do not indicate that these were actually exploited in the wild, and there's no indication that these were not responsibly disclosed.
Safari is very popular in some markets (e.g. 36% usage share in the USA), and you cannot select a more secure browser if you want to browse on your iDevice.
Also, using Safari signals you are a high-value target, so a Safari zero-day is generally worse for Safari users than, say, a Chrome zero-day.
It kinda makes sense though. IE is horrible, but its usage share is in the single digits, so it's not a big attack surface to target. Chrome and Safari are far more interesting from a bad actor's perspective.