WebKit is the jQuery of Browser Engines (ejohn.org)
252 points by jimsteinhart on Feb 13, 2013 | 204 comments



The web is not open and becoming increasingly less so.

People love to talk about how the web is about open standards and such, but it really is rather closed.

It's driven less by standards and more by de-facto implementations. Soon we can get rid of the standards committee and just talk to the implementers of webkit to define the "standard".

And I think even worse has been the wholesale discounting of plugins. I still strongly believe that being tied to JavaScript as really the only client-side language is a mistake. It's not a great IL, and limiting the language for such a pervasive platform is scary. A powerful plugin model would be, IMO, one of the best things for a truly powerful web.

I wish the web was more open. I wish browsers were truly extensible runtimes that specified lower-level abstractions and allowed more innovation at the top of the stack.

It feels like we're walking into the dark age of the internet.


This comment captures a lot, perhaps not in the way Ken intended, but I feel compelled to respond.

Let's start with the thesis statement: "The web is not open and becoming increasingly less so."

On its face, this statement is not only false, it is painfully so. Sort of like saying the world is not round and becoming increasingly less so. Pretty much anyone would say "the world is clearly round, and it's impossible to change that." Similarly, there is absolutely nothing preventing Ken, or anyone else, from building an entirely different "web," just like Tim Berners-Lee did back at CERN. So the definition of the word 'open' here clearly needs some additional verbiage.

The next statement helps a bit, "It's driven less by standards and more by de-facto implementations. Soon we can get rid of the standards committee and just talk to the implementers of webkit to define the 'standard'."

This deliciously captures a debate that has raged for 30 years at least. "Which came first, the standard or the code?"

Back when I was young and impressionable, the debate was between something called the "ISO 7-layer networking model" and "TCP/IP". You see, the International Organization for Standardization had decided that the world needed a global networking standard, so they got their best standards engineers together to come up with what they gloriously christened the "Open Systems Interconnection," or OSI, set of protocols. Meanwhile, a scrappy bunch of network engineers and hacker types in a loose-knit organization called the Internet Engineering Task Force were building networks that spanned countries. They wrote code and brought it to the meetings, debated the stuff that worked well and the stuff that didn't, and then everyone went back and wrote more code, etc.

The forces of evil ignored the IETF and focused on the ISO working groups, since the latter were making standards and the former were just playing around with code.

As it turned out, working code tended to trump standards. Debating changes with 'it should/it might' instead of 'version A does/version B doesn't' meant that changes got made to standards based on convincing arguments about things that had never been tried or experienced in practice. The result was that the OSI standards had a lot of stuff in them to avoid issues that weren't issues, and were missing responses to things that actually were issues.

A number of people found the 'code first, standard later' methodology superior for that reason, assuming the code was available and unencumbered by patent or licensing restrictions. The latter of course became a much bigger problem when the focus switched to the IETF and the 'big guns' started their usual games.

My first response is then: "open" means anyone can implement and contribute new stuff. And by that definition the web is very open. However, since the community favors an implementation model over a theoretical standards model, the 'cost' to influence change is that you have to write code, as opposed to just making a good argument. And that disenfranchises people without good coding skills.

The second part of this screed is prefaced with this: "And I think even worse has been the wholesale discounting of plugins." This speaks to the other side effect of "open," as in "we don't make any boundaries we don't have to."

From a mass adoption point of view, the more variability you have in an experience, the harder it is to capture the larger audience. This is why cars and motorcycles are all driven in basically the same way, televisions have 'channels' and 'volume', and browsers have an address bar and bookmarks.

The unification of the structure allows for mass learning and talking in generalizations that remain true without device-specific knowledge. Can you imagine how hard it would be to write a driver's manual if every vehicle had its own customizable UI and indicators?

So as a technology matures the variability is removed and the common practices are enshrined into the structure.

What that means in a practical sense is that if you're trying to push the envelope in such a mature technology you will face higher and higher resistance. However, you are always allowed to create an entirely new way of doing things.

This isn't the 'dark age' it's the post renaissance start of the industrial revolution. Except instead of widely accessible books we've now got widely accessible information and a de-facto portal for accessing it.


I think this is unfair to the parent. Comparing web standards to the OSI model is really not, in my view, a like-for-like comparison at all. Web "standards" are much more specific, prescriptive, and detailed than the OSI model.

It's hard to say whether the parent is correct about the dark age, but such a thing clearly has been the case in the past with regard to standards. There was a time in the not-so-distant past when browser vendors, particularly MS, did not care much at all about conforming to any sort of standards, and created a mess that things like jQuery were partially created to solve. So I think there is real ground for the parent's point. The issue is whether it is still getting worse.

One thing I think is different from previous years is that the programming community is less accepting of totally non-standard, even weird, proprietary implementations.


You could argue that the IETF has become everything the ISO working groups used to be, and that 'rough consensus and running code' is long gone.

I seem to flip-flop on this, personally. Some days the standards process seems bright and cheery; at other times, I fear Apple/Google/Microsoft are running the show.


"You could argue that the IETF has become everything the ISO working groups used to be, and that 'rough consensus and running code' is long gone."

Yes you could; I was there in the trenches trying to urge them not to go down that road. I really lost it when we had gotten Sun to release all claim to any rights to the XDR or RPC code libraries so that the IETF could endorse it as an 'independent' standard. There were enough implementations out there, and a regular connectathon that tested interoperability. Paul Leach, who made it his life's work to prevent any sort of standardization of RPC, successfully rallied Microsoft to overwhelm the otherwise rational members of the IETF and derail that process. It was so blatant, and so petty. I remember telling Vint Cerf that I marked that day, and those events, as the death of the IETF as a body with any integrity. Many working groups soldiered along and did well until they became 'strategic' to some big player, and then their integrity too was ripped out of them. Sort of like that movie 'Invasion of the Body Snatchers.' Very sad.

Over the years I've toyed briefly with starting a new networking working group.


Sadly, a group of backwards people stopped WebDB just because SQLite being fast, fully featured, and in the public domain was not enough for them.


Let's step back from the "open" buzzword and look at the empirical facts.

Less than a decade ago you made sure your web sites ran well in Internet Explorer, a closed-source browser that was allowed to stagnate. IE treated the W3C's standards more like "guidelines" than a specification.

- Today, every major web browser (except IE) uses an open source rendering engine (or the browser itself is open source).

- Every major web framework and library is open source.

- Most of the servers running the web are powered by an open source OS.

- The standards bodies are actually working faster than ever on new versions of ECMAScript and HTML.

- IE's marketshare is smaller than ever.

Years ago some guys had a crazy idea to make a browser for KDE. Today it's powering much of the desktop web and almost ALL of the mobile web. Perhaps I just like a good love story, but it seems like this is a pretty great achievement for "open". Now, it seems pretty disingenuous to say "the web is not open" and even more so to say that just because more people are working on the same open source project that it's, "becoming increasingly less [open]."

[edited for formatting]


... and before IE it was Netscape: people didn't like that either, but you (and most people who talk about browser history) seem to forget or ignore that :(. If you go back and read through the W3C mailing lists, people really, really hated Netscape (the by-far dominant web browser at the time, one for which books on HTML would have sections dedicated to optimizing, and would even go as far as to say being Netscape-only was fine) for seemingly making up HTML as they went along (almost all of the stuff in HTML that is deprecated, including all of the markup that was for style and presentation only, was originally Netscape-only HTML extensions) and for refusing to take up the charge of CSS. Microsoft was even occasionally described as the potential savior that would come in with a second implementation that paid attention to them (and in fact you then find a ton of praise on the list for Microsoft publishing open DTDs for IE).

Despite all of this, Netscape (a company whose business model at the time relied on selling web browsers and getting contracts with ISPs to bundle their software with subscriptions) managed to get Microsoft's hands slapped so hard by the justice department for having the gall to give away a web browser as part of an operating system (something we now all take for granted: no one complains that Apple pushes Safari with OS X, nor do nearly enough people scream loudly about the fact that alternative web browsers on iOS are only possible if you use Apple's rendering engine in a crippled mode, defeating the purpose, despite Apple having near-monopoly status on the mobile web) that Microsoft never quite got back the courage to keep moving forward given the new constraints they were under. Thankfully, in the process, Netscape still died, and from its ashes arose the idea that an open-source web browser would be interesting and viable, leading to the ecosystem we have today.


Well, to be honest, why did Opera once have ads?

And on Netscape and CSS: http://news.ycombinator.com/item?id=2108940


The "almost all of the mobile" part is really bad, and exactly resembles the IE situation on the desktop before. I hope Mozilla will shift the balance there again.


The problem with saying "WebKit is the new IE" is that IE was allowed to stagnate because it was a singular browser with a dominant position in the market. When that position was achieved, it was no longer necessary for the company maintaining it to continue to compete.

WebKit, in contrast, isn't controlled by any one company. The people using it have access to the source, and more often than not are contributing to the project themselves.

I don't think the competition is going to end, it's just going to change form.


Stagnation was only one aspect of the problem. Another big aspect was the generally bad quality of sites created with only IE in mind. The same often happens with WebKit on mobile.


Honestly, in my view "guidelines" is even a strong word for what Microsoft did with the browser and "standards." This is not to hate on MS at all, but honestly, even though you could, I suppose, argue IE has gotten much better, I don't really see a reason for it to exist anymore. It's been such a bad boy and has so few redeeming features that I really think the "blue e" on the desktop should simply be a shortcut to whatever your default browser is, be it Chrome, Firefox, Opera, or whatever (but not IE, because its development, in my view, should stop).


Here's another interesting way to think about it: look at the mobile web on iOS. Does it matter that WebKit is open source and forkable? No, since Apple actively prevents "alternative" visions of what the web should be since they just don't allow custom builds of WebKit/JS on there. Any browser you make can only add superficial features, forcing you to use the built in WebKit. You may have a great idea that would instantly make everyone want to use web apps on iOS instead of native apps, but since you 1) can't ship that browser on iOS, and 2) can't convince Apple to commit that change to WebKit proper, you are effectively locked out.

Compare this to a plugin-driven environment vs a standards-driven environment. Say what you like about Flash, but it was able to guerrilla video onto the web without anyone's permission.


> ...look at the mobile web on iOS. Does it matter that WebKit is open source and forkable?

The version of WebKit used on iOS is actually not open source and forkable; even WebCore, which is LGPL, Apple works around: rather than releasing code changes for iOS-specific features, they release the binary .o files users can link in.

Hell: Chrome for Android isn't even open source. People tend to totally forget that "WebKit is open source" is meaningless in the general case, as the BSD license allows specific forks to be closed, and all of the mobile ones hold stuff back.


WebCore/JSCore for iOS are indeed open source, but yes certain pieces are released as .o's that you just link with it, and additionally it is released as source dumps instead of nicely managed versioning. If you go to http://www.opensource.apple.com/release/ios-61/ and download WebCore you will see plenty of source in there. The point is that you could for example meaningfully edit WebCore (for example adding python scripting support, or perhaps even putting in ogg support, whatever), link it with the .o pieces, and have an interesting new product, but still not be able to release it on iOS due to the rules.


(edit: This comment's first paragraph is wrong, as pointed out in the reply below; in fact, this functionality is not in WebKit, it is in WebCore. However, it is still closed source, and it is not available on the website that was linked to: you cannot find the source code for anything that makes WebKit on iOS work on iOS, as it is all closed. The point stands.)

WebCore != WebKit. You were saying that WebKit was open source and forkable: no, WebKit isn't. There is a library used by WebKit called WebCore that is, but WebCore doesn't provide most of the juicy iOS functionality; Apple actually seems to actively avoid touching WebCore, lest their lives become more difficult due to it being under LGPL.

The reason, then, that I brought up WebCore was as a demonstration that even for things where Apple must release at least some source code (as WebCore is under LGPL), they still weasel around it: WebKit, which is under BSD, has no such protection, and you will note that there is simply nothing available for it on iOS at all.

(Trust me: I routinely download all of opensource.apple.com, to find not just new packages but redactions, and have scripts to reconstruct git repositories out of the various tarballs for key projects, which I then export for others in our community to more easily be able to work off of; you forward me there as if I haven't heard of it... ;P.)


No, trust me: I was on the original iPhone team and was responsible for Mobile Safari and "WebKit" on iOS, and you are pretty confused about the project.

For starters, "WebKit" is an overloaded term. There is "WebKit" the framework, which is a bridge between the actual gears (WebCore and JSCore) and applications. In other words, it is an API: an increasingly thin Obj-C layer on top of all the C++ that does the real work. Then there is the "WebKit Project", which is an umbrella term for all that stuff together (WebKit/WebCore/JSCore). Chrome, for example, uses neither WebKit proper (if I recall correctly) nor the engine in JSCore (opting for V8 instead), and yet it is still considered a "WebKit browser". That's because what makes you "behave" like WebKit is WebCore, which actually handles the DOM, rendering, CSS, etc. So saying that Apple releases WebKit for iOS is perfectly acceptable terminology, even if you want to be pedantic about it. Now I don't know what you define as "juicy functionality", but I can assure you that WebCore is not just some helper library or something; WebCore more or less IS WebKit. It is certainly enough for you to be able to build your own custom browser for iOS. In fact, even if the iOS version were completely closed source, you could still take the 100% open source desktop WebKit and port it to the phone (just like Nokia and Google did for their phones).

So I guess I'm missing the relevancy of your point. If you just wanted to rant that Apple doesn't open source as much as it should, then I sympathize, but it really has nothing to do with the point I was making that due to the separate restrictive nature of the App Store policies, it doesn't matter if WebKit is or isn't open source because you aren't allowed to ship a custom browser engine anyways (at least not one that runs JavaScript).


So, I just spent some time digging around in these libraries, to figure out where I might be wrong about this, and it seems like this is my mistake: a bunch of functionality I thought was in WebKit is actually in the closed-source parts of WebCore. Specifically, I'm talking about the tiled rendering (TileCache.mm) and all of the multitouch logic (PlatformTouchEventIPhone.mm). Even simple things like the copy/paste support are redacted, as is scrolling (yes: scrolling). For other platforms, all of this code is available.

This closed source part contains tons of simply critical things, such as how to interact with the iPhone's memory management notification system, how to manage the network state transitions, how to interact with embedded video... pretty much everything about MobileSafari that makes it MobileSafari as opposed to a less-than-half-operational desktop version of Safari with a touch screen (which would suck) is closed source.

I'm therefore quite sorry that I thought that this stuff was in the other library, but my point about the "open source and forkable" stands, and I think it stands pretty well: I can't fork WebCore and make meaningful changes to it for this system. In fact, even for people who have access to the system's internals (we jailbreak users), the few people who used to recompile WebCore for it (the Hebrew community) gave up and moved to writing Substrate extensions instead.

To be clear: I consider these iOS-specific things to be "the juicy parts" of MobileSafari: if you want to compete, you have to have really strong compatible answers for them. It isn't sufficient to take a desktop copy of WebKit and recompile it, because if you did that you'd just have a totally unusable browser experience... you wouldn't even have as much functionality as embedding a UIWebView in your application and attempting to externally script it.

So, yes: you are right that these are in WebCore, and that my complaint about "WebCore != WebKit" was wrong. The reason I made that argument was to try to reconcile your insistence that WebKit for iOS was open source with the reality that there really is no source code available from Apple for anything but "something that renders HTML (slowly, and missing features)"; my reconciliation was wrong, but the reason for it is still correct: WebKit for iOS, including WebCore, is not "open source and forkable" enough to make a web browser.

To the extent to which one can then put in the elbow grease to add back all of these missing parts in order to make a web browser, honestly one may as well be starting with any other rendering engine that isn't yet ported to this platform. Therefore, if WebKit for desktop systems is relevant to this discussion, then Gecko being open source is equally as relevant and libcurl in general being open source is largely as relevant. You can also build web browsers out of those.

Given this, I'm having an increasingly difficult time trying to figure out where your correction to kenjackson's argument was: you tried to give him a different way to think about it (in a way that might undermine his argument that WebKit is a defacto implementation due to an inability to install your own copy on iOS), but it is starting to seem like you just agree with him? Am I simply misunderstanding why you were responding to him?


Yes, that is the misunderstanding; I do agree with him (you can see another response to him here where we continue agreeing: http://news.ycombinator.com/item?id=5214213 ). I was not correcting him by saying "Here's another way to think about it"; I was offering another interpretation of why kenjackson is right by looking at the case of mobile in particular.

I chose to focus on mobile for iOS precisely because here you have the greatest example of how open source is helpless and irrelevant. Even if Apple were to satisfy all your requirements for WebKit being open source, you still would not be allowed to compile it and ship it in your app, let alone modify it and ship it. Even if I write the best browser ever for iOS, I am not allowed to ship it on iOS. This is why I keep coming back to it not mattering whether you can or can't fork WebKit for iOS, just like it doesn't matter whether you can or can't write a completely new engine from scratch for iOS, just like it doesn't matter whether you can or can't fork Firefox for iOS: due to the nature of the platform, the web is closed on iOS PERIOD. Apple is THE gatekeeper of all features that enter the iOS web. Continuing to agree with kenjackson, that is why he is right that a better runtime or plugin system is ultimately more important for the web to be open than source code being released: as long as I can have a direct relationship with the user, where they can install a plugin and modify the behavior of their browser, there is a shot for non-dominant market players to always influence the direction of the web (the same way Adobe created the video revolution of the web without needing to own a browser or a cell phone or any other way of forcing people to use their tech).

I was really confused why you kept arguing with me about how open source WebKit is when my point was "whether or not it's open source and you can fork it, it doesn't matter, because the web is closed on iOS for deeper reasons".


And that's bad why? So the web only works with one rendering engine. One rendering engine that's open source and can be used and modified by anyone for any purpose.

Standards are great for things like protocols (even languages), but an entire web browser is a tad more complicated than TCP or even C++. No two browsers have ever implemented HTML/JS/CSS perfectly and they never will. If that's the case, then what's the point of a "standard" anyway?


One rendering engine that has some serious limitations, like not being very parallelizable, because of highly entrenched implementation choices.

Which means that if you want hardware capable of rendering the web it can't be low-power highly-parallel hardware; it has to be high-power-consumption fast-serial-operation hardware. Why is that bad? I guess that's a matter of perspective. I think that would be a terrible outcome, personally.

I should point out that I'm not aware of any compiler that has implemented C++ perfectly, and I doubt any ever will given that it's a moving target. So why bother having multiple compilers or a C++ standard at all? For example, why does the WebKit argument not apply to gcc? And note that in compiler-land not being able to compile some codebases is OK as long as you can compile the codebase your user cares about, while the Web equivalent (only rendering some websites but not others) is a lot more problematic, because the typical compiler user compiles fewer different codebases than they visit websites. And also because using different compilers for different codebases is a lot simpler than using different browsers for different websites.


You know that you can always start an open source project that fixes these parallelization issues and start building out an engine that is better, right? It'd probably be a 5-7+ year project, but it certainly is doable.

In fact, it's possible that the poor parallelization support will be the Achilles' heel of WebKit on a long enough time scale.

This is no different from the Achilles' heel of the DOM: procedural-style immediate-mode graphics instead of retained-mode graphics. Browser apps will never compete with iOS apps in terms of user experience until this procedural approach is replaced with a declarative, functional-reactive approach.

Think long term. The Windows hegemony eventually buckled under its own weight. There's no reason to think that WebKit won't eventually do the same on a long enough time scale. Figure out what will lead to its collapse, because that is an opportunity. In fact, letting WebKit lead the way allows you to learn all the ways in which WebKit does it wrong. WebKit will continue to trailblaze on the interface, but it doesn't have to be the end-all be-all of implementations for those interfaces.

Between Tizen and B2G, there is plenty of innovation in the web browser space. I just hope that transclusion is always considered a first-class citizen in this brave new world.


> you can always start an open source project that fixes these parallelization issues and start building out an engine that is better

Sure. We (Mozilla) are doing that right now.

> It'd probably be a 5-7+ year project

If there is no WebKit monoculture. If there is, such that the project has to duplicate WebKit bugs after reverse-engineering them, then it's a lot longer, if possible at all (because some of the bugs are parallelism bottlenecks).

Which is precisely my point. A WebKit monoculture would make it less possible to start such an open source project.

> Think long term.

You mean the one in which we're all dead?

Even if a hypothetical WebKit monoculture "merely" delays the advent of more-parallel rendering engines by 20 years, as opposed to preventing it altogether, that's still a huge loss in my book.


I don't know if it does or does not. Maybe. Was the Opera browser engine doing anything to provide a more parallel engine option? If not, it arguably wasn't helping in this respect either.

Since WebKit is open source, can't you just submit bugfixes for the bugs that are parallelism bottlenecks? Seems like that would make a lot more sense than coding another engine to accommodate those bugs. If it truly is a bug, there shouldn't be any problem with submitting a bugfix and getting it accepted.

Since you're working on Firefox, are there any examples of WebKit "bugs" that prevent parallelism that you could not fix yourself? Does Mozilla have a team of WebKit engineers whose sole job is to fix WebKit so that those monoculture problems are mitigated and don't become a problem for other browser engines? At the end of the day, all you guys need to defend is the interface, not the implementation. Fixing each other's engines before bugs become features seems like a good way to accomplish this. ref: http://xkcd.com/1172/

The most important abstraction to fix isn't even a WebKit abstraction, it's a W3C abstraction. Everything was doomed from the get-go because of the one-to-one relationship between the window and the document. I agree with Kay here about TBL & Co being shortsighted in what the web could have become if a richer interactive experience, instead of a document-based experience, had been considered from inception.

Look at Twitter. That's not a document; that's an application. Each tweet in the interface is a document. Every tweet in that feed is a document that has been "transcluded" into the app Twitter built on top of a document. There needs to be a standard way to "transclude" documents with a reference URL that allows interactivity. The fact that the only hyperlinking option we have today is the <a> tag is unfortunate. There need to be more ways of hyperlinking than an <a> tag. You need to be able to window directly to a document or document fragment at a different URL. The #hashanchors aren't sufficient, since they only describe a beginning, not an end, to the fragment being excerpted. iFrames kind of provide an alternative, but this was never explored properly. The host app should also be able to provide a cached copy of the contents of any sub-document, for performance and to guarantee that a copy of the referred document is always available in the parent context.
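As a rough sketch of the workaround we're stuck with today (script-driven, same-origin only; note that `startId` marking the beginning of the excerpt is my own convention here, since nothing standard lets you express a fragment range at all):

  // Fetch another document and copy one fragment of it into the host page.
  // There is no standard way to say where the excerpt *ends*, which is
  // exactly the complaint above.
  function transclude(url, startId, hostEl) {
    var xhr = new XMLHttpRequest();
    xhr.onload = function () {
      var doc = new DOMParser().parseFromString(xhr.responseText, 'text/html');
      var fragment = doc.getElementById(startId);
      if (fragment) hostEl.appendChild(document.importNode(fragment, true));
    };
    xhr.open('GET', url);
    xhr.send();
  }

Everything lost along the way (interactivity, attribution to the source URL, cross-origin access) is what a real transclusion primitive would have to restore.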

edit: downvote? srsly? without a response? downvoting is for comments that don't contribute to the conversation, not for comments you simply don't agree with.


You're arguing that it's not necessarily terrible if Opera switches to WebKit, in terms of monoculture issues.

That may well be true. But what others in this thread are arguing is that it would also not be terrible if everyone else switched to WebKit too, and I believe they're wrong about that.

> can't you just submit bugfixes for the bugs that are parallelism bottlenecks?

You mean bugs like being written in C++ and not architected around parallelism?

Bolting on parallelism is _hard_. Have you ever tried doing that with an existing multi-million-line C++ codebase? I've tried with Gecko, and I've spoken with people who have tried with WebKit, as well as reading a good bit of WebKit source, and it's not really feasible unless you drop everything else and focus all your energy on it. And maybe not even then.

> are there any examples of WebKit "bugs" that prevent parallelism that you could not fix yourself

And get the patches actually accepted in WebKit? Lots.

Again, you seem to think the problem is some small issues here and there, whereas the problem is more that you have lots of code touching non-const data through C++ pointers, which means that if you try to parallelize you either get tons of data races or lock contention that kills performance or most likely both.

> At the end of the day, all you guys need to defend is the interface, not the implementation

As long as there are multiple implementations. Unless you include in "interface" everything that's web-visible, but policing that is a huge endeavor that no one working on browser implementations has the manpower for.


Assuming that no webpages are reliant on those bugs, sure. One of the problems with everyone targeting a single rendering engine is that they become reliant on the bugs of that engine, to the point that it becomes difficult to make any changes without breaking compatibility. Look at IE, for instance, especially the IE7-compatibility mode in later versions, which wasn't actually IE7-compatible.


Hell, don't look at IE, look at Windows: the Windows guys had to take a CORRECT implementation, add a check for whether SimCity was running, and turn on a special flag to run the old version of the allocator.

http://ianmurdock.com/platforms/on-the-importance-of-backwar...

Now multiply that story by every badly written WebKit site that relies on some backwards-ass crazy bug that no maintainer sees fit to fix.


As an example, I personally figured out where legacy color parsing is in the Netscape classic source: http://stackoverflow.com/questions/8318911/why-does-html-thi...
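For the curious, the leniency is easy to reproduce from the console; these are roughly the cases that Stack Overflow question dissects:

  // Legacy color parsing never rejects a value; it munges it into RGB.
  document.body.bgColor = 'chucknorris';  // comes out a dark red
  document.body.bgColor = 'grass';        // comes out green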

It is so subtle even Netscape's own Gecko rewrite did not get it completely right the first time: https://bugzilla.mozilla.org/show_bug.cgi?id=121738


And to stretch that a bit, the same applies to operating systems, text editors, heck, clothes colors, clothes types, food, etc.

No choice and no diversity has always been a bad idea and has never, ever led to efficiency, progress, etc. It has always, every time, led to the opposite.


This is an incredibly short-sighted comment. The engine is just a client application. There are millions of websites on the other end of the line. Any change to web standards also affects them. That's why it's incredibly harmful to treat any single implementation as a de-facto standard.

We need (good) web standards, because we need consistent web architecture where features work together and somebody does long-term planning.


A monoculture by itself has a very bad property: it's easy to use and misuse on a large scale. What runs the same everywhere is great for app developers and crackers alike.

Its size makes it a valid target for all kinds of dubious organizations. And not just for exploiting bugs that already exist, but for introducing new exploits into the source.


Web being open has nothing to do with the number of rendering engines for browsers. Even having zero of them would not affect the openness of the web in any way. Firewalls would.

  > It's driven less by standards and more by de-facto implementations. 
That was always the case. In fact, one of the goals of the WHATWG (the fathers of HTML5) was to standardise how code is rendered even when it is invalid.
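You can see the payoff of that goal in any modern browser: the recovery behavior for invalid markup is itself specified, so every conforming parser builds the same tree from the same bad input.

  // Mis-nested tags are repaired identically by any HTML5 parser:
  var div = document.createElement('div');
  div.innerHTML = '<b><i>text</b>';  // invalid: </b> arrives before </i>
  console.log(div.innerHTML);        // "<b><i>text</i></b>" everywhere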


> It feels like we're walking into the dark age of the internet.

The persuasiveness of your argument is harmed by this sort of melodrama.

By the way, the "Dark Ages" are named such because of a lack of written historical records from the Early Middle Ages. The negative connotation attached to the phrase by the general public is considered inaccurate by historians.


> Soon we can get rid of the standards committee and just talk to the implementers of webkit to define the "standard".

Your comment in 1995: Soon we can get rid of the standards committee and just talk to the implementers of Netscape to define the "standard".

And in 2001: Soon we can get rid of the standards committee and just talk to the implementers of IE to define the "standard".

And in 2006: Soon we can get rid of the standards committee and just talk to the implementers of Firefox to define the "standard".

...and for more comparison:

Your comment in the late 70's on computers: Soon we can get rid of the hobbyists and just talk to Apple to define "standard".

And in 1996: Soon we can get rid of Apple and just talk to the creators of the PC to define the "standard".

And in 2010: Soon we can get rid of the PC and just talk to the creators of the iPad to define the "standard".

Just sayin'


+1

I've said for years that a pluggable JavaScript engine should be something fundamental to browsers. Extending it further, plugins that allowed for alternate and complementary lower-level technologies (easily embedding a Python engine, for example) would lead to more competition and innovation.
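You can fake a small piece of this today, since browsers ignore script tags whose type they don't recognize; in-browser compilers like CoffeeScript's bootstrap themselves exactly this way. A minimal sketch of that loader pattern, where `compileToJs` is a hypothetical stand-in for whatever engine you'd plug in:

  // Collect scripts of a custom type and hand them to an alternate engine.
  var scripts = document.querySelectorAll('script[type="text/x-mylang"]');
  for (var i = 0; i < scripts.length; i++) {
    var js = compileToJs(scripts[i].textContent);  // hypothetical compiler
    (new Function(js))();                          // run the compiled output
  }

Of course, this only gets you languages that compile to JavaScript, which is rather the point of wanting a real pluggable engine.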

We just had a discussion (or, I had a rant) about this at a local web meetup last night. 10 years ago it was "IE only". We're moving into "WebKit only" these days, especially if you're targeting mobile users. In some ways it doesn't feel like we've progressed all that much.


if the code for the de facto implementation is open source, does it matter? i see the code and the standard as the same thing in different languages personally.


Of course it matters. The "standard" becomes driven by the peculiarities of a specific implementation, rather than the best thing for the base of customers that are served by the standard.

I think it's fine to have a reference implementation, but we need a broad set of implementations (with actual users) so that the standard doesn't get blinders on it due to an implementation decision made on a de-facto standard.


i see what you mean yes, that does make sense. to keep it honest so to speak.


They are different things because it's unrealistic to fork WebKit and get any significant market share. So even though you might make a worthwhile change to the engine, realistically you need that change to be accepted by WebKit proper for it to matter.


Forking WebKit is effectively the same as writing your own browser, with regards to standardization. At the end of the day you want to get everyone to agree on the standard -- having an implementation that everyone can use only helps to sweeten the deal.


Now that I think about it, the response was more in regard to the grand-OP's desire for a lower-level extensible runtime, and his lamentation about plugins, than a comparison to standards. In other words, which web is more open:

1. One in which all the code is open source but there are huge hurdles to releasing your own browser, and any new feature is thus at the mercy of just a few big companies (Apple, Google, etc.).

or

2. One in which perhaps all the browsers were closed source, but adding new features to any such browser really was just a matter of referencing a script on a web page?

The question is more or less only useful as a thought experiment at this point, of course, and in particular I don't feel that the "standards" process was ever particularly open to begin with, so I don't think things have, or will, necessarily get much worse.


I like how you've stated it, and made my original premise less confrontational and more of a question about what you value as "open".


I don't think you can say the code is the standard. An implementation will have many quirks or things not related to the issue we are interested in "standardizing." Where do you draw the line? That would mean no implementation that was not simply 100% identical would conform to the standard.

You could say "Well, the standards document is irrelevant because no one follows it anyway," but that's another issue.


I'd be a lot more worried about proprietary codecs like H.264 and HEVC, and DRM schemes on the web, than about having one open source rendering engine as the "standard" for web browsers. At least you can fork WebKit, and anyone can collaborate on it anyway.


Presto was not open source. WebKit is open source.

So, the internet is becoming more open, not less.


proprietary binary plugins != open web


I don't disagree, but I also see it more as WebKit being the "Linux kernel" of browser engines. Safari and Chrome both use WebKit but are very different in what they offer, so they're like Ubuntu and Red Hat.

Is it a bad thing that AIX and Solaris fell by the wayside in a rush to Linux? I don't think so. So neither should adopting WebKit as a sort of common kernel in browsers, IMHO. But that's all it is.. MHO ;-)


"Is it a bad thing that AIX and Solaris fell by the wayside in a rush to Linux? I don't think so."

Note that Solaris innovated with ZFS, which helped spur Linux to implement btrfs. Competition matters, even in OS kernels.


I'm a little ignorant of the details here, but was ZFS a part of the Sun kernel?

I know the Linux kernel is mostly monolithic; is the Solaris/OpenIndiana kernel the same?

Even if it is, it seems unlikely to me that the core kernel team had much to do with ZFS.

It's really more about competition between file systems, or so it seems to me. Maybe I'm splitting hairs.


Yes it was part of the kernel. Sure it had a team dedicated to it but so does everything big.


The difference is that we've been actively moving to trying to make things people actually care about (like, say, access to basic information or government services) kernel-agnostic, and we view cases when a specific kernel is required to accomplish some basic computing task as a failure. With browser engines, some people think that way but many don't, as this discussion makes clear.


But Linux isn't even close to being the only kernel being worked on. Neither absolutely nor from a marketshare perspective.


But WebKit isn't even close to being the only engine being worked on. Neither absolutely nor from a marketshare perspective.

Hence, the comparison. Firefox and IE both comprise a major chunk of the market.


What's your point? There's also Trident (IE) and Gecko (FF), just like there's NT and BSD.


There are dozens if not hundreds of kernels in development today. Some developed as small niche side projects. Some in proprietary embedded products. Some purely as research experiments. I do not see that type of diversity in browser engines and I think it is absolutely as important.


Diversity for the sake of diversity isn't going to get anywhere. That is basically gambling that something good will fall out of a different implementation just because it is different.

Instead, focus on solving problems. If the best way to solve a problem is to use webkit, then why do otherwise? If the best way is to go back and fork webkit from 2 years ago, and start your project from there, do that instead.

But don't be different just because you want to be different. Target your ambition toward something more meaningful.


Difference ensures competition and a variety of points of view. The problem with monoculture is that it _enforces_ the lack of difference. So if the guys in control of the monoculture stop progressing, or progress in a direction that you do not agree with, you're basically fucked, and it takes enormous effort and time to change it.

It has happened so many times and with so many things (not even software-related) that I'm amazed it's not the first thing that comes to mind.


So Firefox would be Mach, Trident would be Windows, and the assorted other smaller browser engines would be the other 'kernels'?


I disagree with much of this article.

> as a contributor to WebKit you have the complete ability to drive it in a direction you wish (often for the better)

Not really. Follow the internal WebKit politics and you see a lot of conflicts. For example, Google wanted to push multi-VM support (for Dart) and Apple blocked that.

> WebKit is already a de facto standard

On mobile. Mobile isn't everything.

Also, should we have said "IE6 is already a de facto standard" and given up?

> I think one thing is clear already: WebKit has completely and unequivocally won mobile at this point. They are nearly the only rendering engine used on the vast majority of mobile browsers, including the soon-to-switch Opera Mini/Mobile browsers too. There is no reason to worry about a slippery slope, the slope has already been slid down. In order for any other browser to remain relevant in the world of mobile (which, you must admit, is quickly becoming the only world we live in) they must keep feature parity with WebKit.

Again, this is utterly defeatist. Even if it were 99% true, should everyone give up?

> At this point it’s honestly a business/engineering decision for Mozilla and Microsoft (as it always has been).

No, Mozilla is a nonprofit, and the decision would also turn on whether it is good for the web or not. I'm surprised to see John Resig not realize that - he used to work at Mozilla.

edit: And regarding the main point: jQuery worked in a space that was not standards-based. There were multiple JS libraries, and they fought for market share. No one tried to develop a standard that there would be multiple implementations for. Comparing jQuery to WebKit is odd.


"On mobile. Mobile isn't everything." And.. mobile is not nothing. mobile is huge.


Agreed.


  In the case of JavaScript libraries virtually everyone has
  standardized upon jQuery at this point.
This guy really lives on his own planet. Maybe most websites that only need to add a small piece of JS functionality are using jQuery, but I seriously doubt that "virtually everyone" writing large JS projects is using jQuery. Google Closure Tools, Sencha/ExtJS, and MooTools remain quite popular, and a host of developers are skipping compatibility layers altogether and only supporting IE9+ and other recent browser versions, particularly those targeting mobile devices.


I don't think this is a particularly controversial opinion to have at this point: http://trends.builtwith.com/javascript/jQuery http://trends.builtwith.com/javascript


Most websites fall into the former "only need to add a small piece of JS functionality" category. A whole lot of websites include jQuery in default frontend templates and don't even use it on the public-facing website (Drupal, WordPress, and other software with a huge installed base).

Vague deployment statistics mean very little.


I'm not sure what you are saying. That people have jQuery loaded on their websites but aren't using it? If that's your argument, then that's now a wholly separate argument. jQuery having a huge installed base is what we're talking about, and is what John was referring to. Seems like you are changing your stance now that it's fairly obvious your assertion was incorrect in the face of research.


Clearly you have some kind of issue with jQuery but just because you don't like it doesn't make it any less popular, useful or prevalent. Here's a fun drinking game - open up the top 1000 websites and do a shot for each one that uses jQuery. You will be in hospital by about #20 I think.


None of the top five websites use jQuery.

Of the top 20, I've got one fifth of them using jQuery: Amazon, eBay, Wikipedia, and MSN. Not in the hospital yet.


I disagree with your notion that the top 5 sites are a good sample for development trends, but you did miss some uses of jQuery.

Google uses jQuery (and hosts it for millions of sites), just not on their very optimized homepage. They built a very large framework of their own to power their apps, but a lot of their informational sites use jQuery. Here are some examples:

https://developers.google.com http://www.chromeexperiments.com http://developer.android.com


I just checked Amazon, eBay, and MSN; they use jQuery.

The only one you listed that doesn't seem to is Wikipedia. You really should actually try the websites before you list them.


You ought to read comments before replying to them. I clearly said that only those four are using jQuery, and that the remaining sixteen do not. I checked all of them by hand before stating this, except for Wikipedia because it is common knowledge that MediaWiki uses jQuery.

Also, your idea below about sniffing for $ is a bit embarrassing. Firefox, Chrome, etc. have come with $ for many years as an ID or query selector, and it has nothing to do with jQuery. Prototype and MooTools also provide $, and probably a dozen or more similar libraries you haven't heard of.


you ought to read comments before replying to them.

I clearly said you can not just check for `$`.


I think Wikipedia removed jQuery due to the impact it had on battery life - see Who Killed My Battery (http://www2012.wwwconference.org/proceedings/proceedings/p41...) for some details.


People on comp.lang.javascript (which has its problems, and I've been away for a while) have been noting this for a while. jQuery chews far more cycles than comparable libraries with its function overloading, extra $(this) invocations and such.


but still the majority of them do, by a wide margin. I don't even know why you're debating this, it's completely obvious that jQuery has by far the largest mindshare.

edit: ok now I get it, I had a look at your website. You like Closure so jQuery must suck, right? Also I would be in the hospital after 5 shots....


Well, don't provide a test like "you'll be in the hospital by #20" if you haven't vetted the results of it first.

My personal advocacy for the Closure Tools does not discredit my views on other tools, and I'm hoping this ad hominem nonsense doesn't fly in such a forum. Quite a few people come into the IRC channel and are using the Closure Compiler with jQuery and I don't give them shit. I take no issue with you using, liking, preferring jQuery.

You know why I like Closure? Because it makes the hard things possible. When you're talking about JavaScript running the web, it's websites like Gmail that matter the most. And you'd have a hell of a time getting Gmail to run as smoothly as it does using a library like jQuery, or even ExtJS.


You're right that I should have checked first, and really, if you think about it, it's no surprise that YouTube, Yahoo, Google, etc. are using their own tools. That does not detract from the point that jQuery has massive, ridiculous mindshare and completely dominates the web, though. Your point about it being impossible to build a Gmail-style app in jQuery is extremely dubious. There are an awful lot of complicated JS apps built on jQuery.


If you check, that's right: Amazon, eBay, and MSN all use jQuery (at least on their homepages).

You can't simply look for the global `$` or a single file named jquery.

I work on large corporate applications and we use jQuery, but we would fail the `$` and jquery file test. We don't expose the global `$` and jquery gets combined and minified with our own code.
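A slightly more honest console check, though a build like ours that exports no globals at all will still pass undetected:

  // Detect jQuery even when the page has reclaimed `$` via noConflict:
  var jq = window.jQuery || window.$;
  if (jq && jq.fn && jq.fn.jquery) {
    console.log('jQuery ' + jq.fn.jquery);  // version string, e.g. "1.9.1"
  } else {
    console.log('no global jQuery; a bundled copy may still be in there');
  }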


You seriously underestimate the complexity of Gmail. It may well be the most complex JavaScript application in existence with an appreciable user base.


Asana to me feels far more complex and robust than Gmail. There are tons of things I can do that feel desktop-like in Asana that I can't do in Gmail.


Are you arguing that Gmail is more complicated than Google Docs?


You make a fair argument. Google has several state-of-the-art JS applications. I thought of Gmail in particular because it probably sees much higher demand and so matters of optimization are probably a bit more important to it. This makes the amount of network activity, client-side caching, etc. it manages really impressive.

But no doubts that Google Docs is in the same ballpark.


Before you insult a poster, please look at what he is actually refuting: that virtually everybody uses jQuery.

Virtually everybody is far more than a majority.


No, I think the vast majority of websites really do use jQuery. Certainly, Wordpress uses it, so pretty much every Wordpress blog out there does. I just tried the Spotify homepage- yep- Huffington Post- yep... I think that everyone really has standardised on jQuery unless they're doing something specialised.


I don't think he's living on his own planet; he's actually right - jQuery runs the client-side web. Not for everyone, but easily 90% of sites (that actually have JavaScript functionality at all) will use jQuery.


Rather, the practice of providing a DOM API compatibility layer runs the web, jQuery being the most widely used library of this sort. When you step out of the god-object API jQuery provides and look at the internals, jQuery, MooTools, and Closure look surprisingly similar. I find the latter to provide the best compatibility (so does Google), but that is another discussion altogether.
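Underneath all of those APIs sits the same handful of shims; something like this (simplified, old-IE era) is the floor every one of these libraries is built on:

  // Normalize event binding across addEventListener (standard) and
  // attachEvent (IE8 and below), fixing up `this` and the event object.
  function listen(el, type, fn) {
    if (el.addEventListener) {
      el.addEventListener(type, fn, false);
    } else {
      el.attachEvent('on' + type, function () { fn.call(el, window.event); });
    }
  }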


Your opinions on which JavaScript libraries provide "the best compatibility" have little bearing on your original statement, that jQuery is not as popular as it actually is.


"jQuery is not as popular as it actually is"

Hmm.


Read as "[...] your original statement that jQuery is not as popular as [I believe] it actually is."


jQuery has a 90% market share[1] among JS libraries so I think it isn't an exaggeration.

[1] http://w3techs.com/technologies/overview/javascript_library/...


But the biggest websites generally don't use 3rd-party JS libraries at all--Google, Yahoo, Facebook, Twitter, etc. all write their own javascript libraries.

Sure there is a long tail of sites that do use jQuery, but most of them don't do very much or get much traffic.

If you look at jQuery's market share by aggregate user sessions or by aggregate time on site across the entire web, it does not look nearly as important.


> Sure there is a long tail of sites that do use jQuery, but most of them don't do very much or get much traffic.

Most sites in general don't get much traffic, but it's absurd to argue that jQuery is not popular amongst large websites. You don't think sites like nytimes.com, craigslist.org, twitter.com (that's right, check the source), live.com, netflix.com and pinterest.com push a lot of traffic? Even more sites use Sizzle, jQuery's selector engine.


Yahoo uses the YUI library


YUI stands for Yahoo User Interface. They wrote it and open-sourced it, like Google with the Closure Library. I wouldn't count it as a third-party library.


Right, it's not 3rd party, but it is included in the statistics that show 90% jQuery market share, and the argument was that the biggest websites are not represented there.


I'm not sure what you're trying to say, but I don't see anybody making that argument in this thread.


Twitter uses jQuery.


Perhaps he is overstating it a little, given that he's the creator of jQuery.


I like the arguments: "Don't worry about <x bad thing> happening, because it's already happened!"


Pretty much! People worry that the transition to a WebKit monoculture would result in some terrible things. On the contrary Chrome has shown that a shared rendering engine does not result in stagnation. Stagnation is a separate concern from a shared core.


On desktop the only browsers that use Webkit are Chrome (about 1/3 of the market), Safari (about 1/10) and now Opera (currently unreleased). So roughly two thirds of the market are on non-webkit engines which I think contradicts your argument as there's still competition there.

However, on mobile (a market which Chrome has only just entered) you could say that WebKit is dominant, and we can already see problems there. To be honest, though, I think that's misleading. Mobile Safari has been ridiculously dominant compared with other WebKits, in mindshare and marketshare, and it's that monoculture causing the problems we currently see.

Serious new contenders like Chrome and Opera entering the mobile market with webkit renderers will, I think, actually help that situation to some degree by actually competing with Mobile Safari and not being half-hearted also-rans.


According to Statcounter.com's current numbers, Chrome is 36.9%, Safari is 8.57%, and Opera is 1.2% = 46.67%, leaving 53.3% for everything else. That's not "roughly two thirds".

Also, Safari and Chrome are both trending upwards, everything else is trending downwards.

http://gs.statcounter.com/#browser-ww-monthly-200807-201302


When it comes to mobile, "everything else" is the android default which is also webkit.


iOS browsers have to use WebCore for rendering/JS, but WebKit has always been optional but default on Android, WebOS, Bada, etc.

Once you consider that mobile != iOS, it seems silly to say Mobile Safari is the ridiculously dominant platform.


I think you're forgetting Opera's massive use in the Far East. Aren't they overall more used than iOS on mobile if you count global stats?


Ah, but it does result in stagnation on many axes. Where's the Google or Apple effort to add significant intra-page parallelism to WebKit? It's too hard in that codebase, so it's not happening. If you require that codebase for a web browser, that means no significant parallelism in web browsers.

(There's some work being done to parallelize the rendering pipeline, and a bit on painting, but parallel CSS layout seems to be completely off the table in existing browser implementations.)


Since you know a lot about WebKit and parallelism, especially intra-page parallelism, I'd really like to know a lot more about the problems with it. Can you give a brief rundown of the issues and the design decisions they made (and the correct decision IYHO) and maybe some links on where I can find more info?


So a caveat: I don't feel like I know a lot about WebKit. I know a bit about WebKit. I do know a lot about Gecko. ;)

The issues are basically endemic to every large C++ application I've seen, actually: shared memory accessed via pointers from many different places, lazily computed values, and shared global (or even just per-page, in the case of intra-page parallelism) caches.

These are all reasonable things to do, so I'm hesitant to call them "incorrect", but if you want to design with parallelism in mind you have to either avoid them or carefully design around having low contention on shared resources, access to mutable shared resources protected by some sort of serialization mechanism, etc.

As a simple example, last I looked WebKit's CSS selector matching algorithm cached some state on nodes as it went (as does Gecko's). That means that as things stand you can't do selector matching on multiple nodes in parallel because you can get data races. This one instance can be worked around by deferring the setting of the cached bits until a serialization point, but you have to make sure you do this for every single algorithm that touches non-const members on nodes that you want to parallelize... And C++ doesn't really give you a good way to find all such, given "mutable" and const_cast.
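To make the hazard concrete, here is the shape of the problem sketched in JavaScript (not WebKit's actual code, which is C++; `expensiveMatch` is a hypothetical matcher):

  // Lazily caching per-node state during selector matching. Fine when
  // single-threaded; if two threads matched selectors over shared nodes,
  // the check-then-write below would be a classic data race.
  function cachedMatch(node, selector) {
    if (!node._matchCache) node._matchCache = {};  // write to a shared node
    if (selector in node._matchCache) {
      return node._matchCache[selector];           // fast cached path
    }
    var result = expensiveMatch(node, selector);   // hypothetical matcher
    node._matchCache[selector] = result;           // the racy write
    return result;
  }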

Another example: When WebKit does layout-related things, it directly touches the DOM from the layout code, which is a perfectly reasonable thing to do when you think about it. But it does mean that it can't do layout in parallel with running JS that can modify the DOM. For that you need to have layout operating on an immutable snapshot of the DOM.
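
And a toy sketch of the snapshot idea, under the same caveat that LiveNode, freeze(), and countBoxes() are made-up stand-ins (a real engine, e.g. the Servo design linked below, is vastly more sophisticated):

  #include <string>
  #include <thread>
  #include <vector>

  // Hypothetical live DOM node that script can mutate at any time.
  struct LiveNode {
      std::string tag;
      std::vector<LiveNode> children;
  };

  // Deep-copied snapshot: layout only ever reads this frozen copy, so it can
  // run on another thread while script keeps mutating the live tree.
  struct Snapshot {
      std::string tag;
      std::vector<Snapshot> children;
  };

  Snapshot freeze(const LiveNode& n) {  // taken at a serialization point
      Snapshot s{n.tag, {}};
      for (const auto& c : n.children) s.children.push_back(freeze(c));
      return s;
  }

  int countBoxes(const Snapshot& s) {   // stand-in for real layout work
      int total = 1;
      for (const auto& c : s.children) total += countBoxes(c);
      return total;
  }

  int main() {
      LiveNode dom{"html", {{"body", {}}}};
      const Snapshot snap = freeze(dom);

      std::thread layout([&] { countBoxes(snap); });              // reads the frozen copy
      std::thread script([&] { dom.children.push_back({"div", {}}); });
      layout.join();
      script.join();  // no race: layout never touches the live tree
  }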

As far as more info.... https://github.com/mozilla/servo/wiki/Design is perhaps a start for what we think we know about how this _ought_ to work. Maybe. ;)


So earlier I said that making another rendering engine would be a 5-7+ year project. How about taking WebKit, forking it, and making it highly parallelizable, piece by piece? Is that even possible, or is a big rewrite from scratch with parallelism in mind the only way to go?


We certainly considered that path, with both Gecko and WebKit.

In theory that's possible, but in practice it would likely take just as long, and at the end you would have a rendering engine that was years behind the competition. Not least because intermediate stages wouldn't be upliftable back to the main codebase: in many cases, partially parallelizing things leaves you with worse performance than parallelizing either nothing or everything.


I agree with that, it's just that the way the argument was phrased was funny :)


"…because it's already happened and the world hasn't exploded."

It's hard to know now what the very-long-term effects will be, but John's point that the web has benefited from Chrome's creation is hard to argue with. As for whether it's benefited more or less than if Chrome had used a different rendering engine, we'll never know…


I believe KHTML->WebKit has pretty much illustrated that having standards that allow new implementations to be developed is a good, essential thing.

We'd have been in an interesting, and perhaps worse, state if Safari and Chrome were both Gecko-based instead of competing with it.


That's an interesting assumption to make: Just because Safari or Chrome hypothetically used Gecko doesn't mean that competition would've halted. Why couldn't it have been any different from when Chrome used WebKit as its basis and absolutely trounced Safari?


I wouldn't say Chrome WebKit trounced Safari WebKit. The browser surrounding the engine was better. That says little about the engine itself.


Why would we be in a worse state if Chrome and Safari were Gecko based?


We'd have one less rendering engine.


I think we should at least touch on the real reason Chrome is the de facto standard and baseline for web page rendering: it's essentially to front-end development what IntelliJ is to back-end.


It is very nice to develop for.

Firefox isn't terrible but its developer tools always seem a little clunkier than Chrome's.


Increasingly, jQuery is not just an implementation, but is rather an API. That's why we're seeing the rise of alternatives like zepto.js that match the jQuery API de-facto standard but are implemented better (smaller, dropping IE6 compatibility).

Replace "jQuery API" with "Web API" and this actually argues against John's point. Multiple implementations are better for everyone.


I'm not sure Zepto could be defined as "better", considering that it only implements a small subset of the jQuery API. In the browser world you can't get away with implementing a small subset of a specification; it'll be considered broken and people won't use it.

I'm not arguing against multiple implementations. If Mozilla were to switch to WebKit, rewrite its DOM implementation to be 20x faster, and then release that -- that would be absolutely stupendous! Much in the same way that the Chrome team created a new JavaScript engine that was much faster than Safari's JS engine. I am arguing that the writing is on the wall for the common parts of a browser: browser vendors' time will be used much more efficiently by collaborating with each other on one implementation instead of writing a number of separate implementations.


"If Mozilla were to switch to WebKit, rewrite its DOM implementation to be 20x faster, and then release that -- that would be absolutely stupendous!"

Rewriting WebKit's DOM implementation to be 20x faster wouldn't be possible without rewriting WebKit. The DOM implementation is one of the most central parts of any rendering engine. We're working on doing that (Servo), but not by building on top of WebKit for precisely this reason.

"Much in the same way that the Chrome team created a new JavaScript engine that was much faster than Safari's JS engine."

They did that by replacing JavaScriptCore wholesale, rather than building on top of it. This was only possible because JavaScript is a standard, not defined by JavaScriptCore's implementation. If JSC had had a monopoly and the Web started relying on JSC's quirks, then V8 might never have happened.


Inasmuch as I consider jQuery a kitchen-sink approach, anyone who tries to trim it down to the core (dropping old browsers, which need lots of ifs and elses) may well be an improvement, in my opinion.


In the past there was some amount of competitive advantage in owning the rendering/browser engine. You could add unique features, fix things your competitors wouldn't, whatever.

These days performance is heavily driven by the JavaScript runtime. While it's challenging to write a browser engine, it is much, much more challenging to write a really fast JIT'ing JavaScript runtime. It seems unlikely Opera would have been able to close the gap with WebKit on that front, much less surpass it.

At that point, any competitive advantage they hold in features is being offset by a fairly substantial performance penalty. Good move making the switch. Differentiate elsewhere.


> While it's challenging to write a browser engine, it is much much more challenging to write a really fast JIT'ing javascript runtime

That isn't really true. Sure, modern JITing JS engines are, arguably, the most technically advanced parts of a modern browser. However, they are relatively small and self-contained; a suitably (i.e. crazy-) talented team of engineers can get performance in the ballpark of V8/SpiderMonkey/etc. in a surprisingly short amount of time.

Most of the difficulty of making a browser fast is chasing the bottlenecks across multiple layers. For example, it's no use having a super-fast JavaScript engine if your DOM implementation is so slow that real sites always bottleneck there. And there's no point in having a fast DOM if your style layer is too slow to relayout when the underlying tree changes. And having a fast style layer doesn't help you much if your final compositing and painting are holding everything back. And of course different sites have radically different behaviour, and what is an optimisation for one case might slow down a different case.
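
Back-of-the-envelope illustration (the 20/80 split here is invented): even doubling JS engine speed barely moves the needle if the page spends most of its time elsewhere.

  #include <cstdio>

  int main() {
      // Hypothetical page profile: 20% of time in JS, 80% in DOM/style/paint.
      const double js = 0.2, rest = 0.8;
      // Make the JS engine 2x faster; Amdahl's law gives the overall speedup.
      const double speedup = 1.0 / (js / 2.0 + rest);
      std::printf("overall speedup: %.2fx\n", speedup);  // prints ~1.11x
  }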


This issue is so misunderstood, mostly because it's so easy to market, measure, explain and graph the relative performance of JavaScript engines; but as James says, they account for a relatively small part of the overall performance of browsers.


Somehow I find that, despite all the pretty graphs showing how fast Chrome/Firefox are, surf (by suckless), which is a ~1k-line-of-code wrapper around WebKit, consistently manages to feel faster.

EDIT: Chrome also used to feel very fast on release, and has been getting gradually slower as far as I can tell. I remember when it would finish rendering a page almost before I hit enter.


Correct me if I'm wrong (and I very well may be), but isn't the JS runtime separate from the rendering engine?

I was under the impression that Chrome uses WebKit for rendering and V8 for JS.

So, even if Opera switches over to WebKit, that shouldn't affect JS.


Opera announced that they'll be using V8 for JS.

So why Opera instead of Chrome? They'll differentiate themselves with the UI, mouse gestures, ad-blocking, etc.


You're right. The point he is making is that there is no need to maintain a rendering engine if it is no longer a point of differentiation; you can focus your efforts instead on improving the features that make your browser stand out.


I actually see there being a lot more room for improvement in DOM performance than in JavaScript. Most of the time, poor DOM performance is a bottleneck in my code long before poor JavaScript performance, now that JS VMs are so fast.


Ahh, looks like I misunderstood, thanks.

But will developers/users really flock to Opera just because it has a fast JIT? For me, Opera is just something to check for compliance.


> But will developers/users really flock to Opera just because it has a fast JIT? For me, Opera is just something to check for compliance.

I think it's the reverse: users will like Opera if it has UI features they like. It's highly unlikely that Opera would be able to beat Chrome / Safari on speed but there's plenty of room for innovation in other areas now that the web is becoming fast enough for most users.

Not having to support a modern browser engine just to be able to build those features is the major win for Opera, because they can focus on things users actually care about rather than obscure feature compatibility or baseline competitive performance.


Actually, writing a fast and compatible browser engine is a much larger project (at least an order of magnitude larger) than writing a fast JS JIT.

Just for scale, the latter takes about 2-3 years as recent history has shown, with a team that numbers a few dozen people at most. The former takes hundreds of developers, and several more years...


While that may be true, we haven't actually seen a new layout engine written recently, so it's pretty difficult to estimate how long it'd take. WebKit was mostly functional as KHTML many years, and tens to hundreds of developers, ago.


We've seen a new layout engine written. Gecko. It took about 15-16 years to get to where it is now, with on average probably ~100 developers. Granted, you can argue it wasn't the most efficient path to get there, but I doubt that you could rebuild Gecko (or WebKit) from scratch today in fewer than 5 years with a team that's about that size...

For that matter, it's not like KHTML sprang out of the void fully-formed; I distinctly recall it being written. Again, the "the path wasn't optimal" arguments may apply.


I understand that a new layout engine could be an order of magnitude more work, but I'd like to be optimistic.

The problem I see is that when we start talking about projects that require 100+ people and 5+ years of work, on a project that has no real viable intermediate stages, history seems to suggest it just doesn't get done. The difference with Gecko is that it was advancing with the state of the web, rather than spending 10 years catching up.

A very large part of the problem is that you can't just render part of HTML5 and expect to ship in a major browser. What do you do with a half-done layout engine?

Does a project like Servo really have any hope of completion if that's the case?


That's a really good question. I wish I knew the answer.

Such projects have in fact happened. Windows Vista development started in 2001 and did not finish until end of 2006, with a much larger team than we're talking about here. Of course it's not clear to me that this was _planned_ to take 5 years...


Ah yes, NGLayout :)

And of course moving from XPCOM to a more direct smart pointer implementation, and various other stages along the way.


More like half a dozen people and something like 12-18 months (jgraham will correct me) to make the world's fastest JS JIT. But your bigger point stands.


Carakan was 16.5 months from first commit to shipping (and a month or so more to not being notably buggy), with a team of five for the vast majority of that time. The only code carried over verbatim from Futhark was the regexp engine (though that had machine-code generation added to it), and the parser was also pre-existing (though not shared with Futhark!).

For comparison: MS were the last to do a major rewrite of their layout engine, which started, AIUI, before IE7 shipped (Oct 2006) and shipped in IE8 (Mar 2009), and more-or-less reached feature parity with others with IE9 (Mar 2011). To be fair, they were working from a far more archaic codebase than anyone else, so probably had more work, but they also had a far larger team than anyone else working on this.


When discussing timeframes, it's important to keep the moving goalposts in mind.

Back in 2009 it took on the order of 16 months to write a competitive JIT. But today's JITs are a good bit faster than they were in 2009, which means more special cases that need to be considered and made fast, from what I've seen of JIT development.

I should also note that I included QA resources of various sorts in my team size estimate....


Right: I think if Carakan were done now it would likely take two years to being competitive, given the same team size (and I included QA in team size — varying between 1.5 and 2).


"This page best viewed in Internet Explorer".

Remember those days? This is not a good thing. The HTML spec should be the standard, not WebKit's bugs.


The difference is that WebKit is open source with several large companies contributing to it. If it doesn't follow the HTML spec you (or anyone else) can fix the bug.

IE was closed source and Microsoft disbanded the team that developed it (IE6). If it had a bug, there wasn't anything you could do about it. You just made your site work around the bug, possibly breaking it on other browsers.


Not to derail the topic, but did anyone else read that headline as a negative, only to find that it wasn't (necessarily)?


Maybe it depends on what your view of jQuery is:

In the case of JavaScript libraries virtually everyone has standardized upon jQuery at this point.

In other words, as far as he cares, things other than jQuery have no right to exist. Although it's very popular, whether there are better JavaScript frameworks than jQuery is probably even more arguable than whether there are better rendering engines than WebKit.

I believe it comes pretty close to illustrating what is so wrong about his arguments.


Are there better JavaScript libraries than jQuery at certain things? Absolutely -- look to Backbone, Angular, Meteor, etc. etc. Are they better than jQuery at doing DOM manipulation? I think that's an easy argument simply by looking at the numbers: http://trends.builtwith.com/javascript/jQuery

jQuery or WebKit being dominant platforms doesn't require that innovation stop; it gives innovation the ability to explode: when you don't have to worry about nit-picky cross-platform incompatibilities or standardization, you get to focus on performance and building sweet frameworks like Backbone and Angular.


  I think that's an easy argument simply by looking at the
  numbers
Popular does not imply "better."


Popularity is highly correlated with quality.

Given that A is 10-100x more popular than B, what are the odds that B is "better" than A? Slim.

Now, popularity does not imply that something is the best. Often times there is something better that isn't as popular. But generally the popular thing is better than most alternatives.


Popularity is not correlated with quality so much as with "good enough for the price".


Backbone, Angular, Meteor etc etc ALL use jQuery!


Wrong. AngularJS, at least, brings its own jQuery-like functions and does not depend on jQuery itself.


AngularJS uses the jQuery API, like Zepto does; it is a hidden dependency. But you are basically using a subset of jQuery when you use AngularJS.


And Gecko uses most of the WebKit API. We're talking about implementations here, not specs.


The implementation is the same. If you load the jQuery script, the jqLite code is disabled in AngularJS. That is only possible if the subset code respects all the behavior of the main code. And Gecko doesn't use the WebKit API at all; Chrome couldn't swap its WebKit code for Gecko without a significant rewrite.


"In other words, as far as he cares things other than jQuery don't have a right of existence."

I don't think he made that point at all. He is merely reflecting on the fact that jQuery has the most market share and by a long way.


I actually don't like jQuery at all and I read the headline as a negative, so your theory holds true, at least for me.


No, given that it's from ejohn.org...


Yeah, same here. I assumed this was going to be some article about why webkit is bad for the web somehow. Funny how statements like that can have such dramatically different interpretations.


WebKit and jQuery -- and Bootstrap -- are becoming the new 'Win32 API.'


The cool part about jQuery is that I don't have to use it. Same with Bootstrap. So it's not a needless tyranny.

With browser engines things get a little more political - mono-culture is not a good thing. But at least Firefox holds enough of a market share, and of course IE is now a much better player in this space - so WebKit can't get away with too much silliness.


Theoretically yes, mono-culture is not a good thing. Having more from-scratch rendering engines would be good for the robustness of web standards. However, web standards are hugely complex at this point, and it is increasingly infeasible to implement them from scratch. With WebKit at least you have an open-source pluggable engine, so you're not talking about branded product monopolies where one or two bad actors can foul things up for everyone. An IE6-like scenario is completely impossible with WebKit.


This has always struck me as a really poor argument. For one thing, because WebKit itself is by far the youngest rendering engine around and also among the most complete. You can't reasonably argue both for a WebKit monoculture and an unreasonably high barrier to entry.

Gecko was an open-source, ostensibly pluggable engine (even MSIE is pluggable, in fact), so why did we need WebKit to bring about the monoculture? Would advocates of it have been satisfied if it had come about then? Probably not, and rightly so, because Gecko has stagnated, as most large codebases managed by people with specific interests do.

And we are talking about branded product monopolies here. Should Google and/or Apple fall behind, who's going to take up the cause of moving things forward again? People don't use WebKit, they use Chrome and Safari.


What exactly do you think my argument is? I'm just talking about the way things are, not some ideal that I'm holding up. Also, your commentary has major holes in it:

> webkit itself is by far the youngest rendering engine around and also among the most complete

WebKit was formally released by Apple almost 8 years ago, after being developed secretly for at least a couple of years before that. That encompasses the entire era of the modern mobile web. It was based on KHTML, which I believe was released in the late 90s, in an era when CSS was barely off the ground and not used in any meaningful capacity by web designers. That's the same era when Gecko was first developed and released as a clean break from the previous Netscape rendering engine, which was deemed a dead end from an engineering standpoint. So no, WebKit is not new.

> Gecko was an open source ostensibly pluggable engine (even MSIE is pluggable, in fact), so why did we need webkit to bring about the monoculture?

I don't know, but neither Gecko nor IE was ever meaningfully used as a pluggable engine in a major consumer product outside of the original organization's control. I don't know if there were technical reasons why they were never embraced on a wider basis, but they weren't, therefore they are not very informative on the question of what happens with an open-source rendering engine monoculture when multiple large parties and user bases are relying on it.

> Should Google and/or Apple fall behind, who's going to take up the cause of moving things forward again? People don't use WebKit, they use Chrome and Safari.

This sentence is hard to argue with because it's a non sequitur. First of all, anyone can take up the torch at any time. The only time this hasn't happened in the history of the web was when Microsoft had a monopoly and their core interests were being eroded by the web. Unlike IE, WebKit is LGPL, so anyone is free to start innovating from the current state of the art. As to the second point, that people use Chrome and Safari, that's not entirely true. On Android, for instance, the default browser is not Chrome. Also, WebKit is being embedded in all kinds of devices. But beyond that, does the fact that people use Chrome and Safari give Google/Apple the power to hold innovation hostage? First, they are competitors; but even if they colluded, it would be very hard for them to hold off the whole market if everyone else's web browsers were getting better and better. This is not the Windows-monopoly-plus-unsavvy-consumer market of the early 2000s. Everyone is used to computers now, and in the web era they won't be corralled into an inferior platform by sheer business inertia.


I did not say it was brand new, but it is the youngest, and that's inarguable. Its pre-WebKit state also was not exactly much to write home about, and I say that having been a user of Konqueror back in the day.

Google seems to be replacing the AOSP browser in Android with Chrome, btw.

I think you underestimate the non-technical cost of entry of a new browser by a lot while overestimating the technical. I'm not saying that it's a small feat to make a compliant browser by any means, but if a non-Google company were to attempt to market a new browser the way Google did (including ads of equivalent value to links on Google's front page), it would be absurdly expensive. There's a reason all the dominant and growing players in this space are massive corporations: a plucky underdog stands little chance against Google, Apple, or Microsoft. So who else's browsers would they have to hold off exactly?


No, it's only 15 years old out of 20 years of the history of the web. Sheesh.


I think you're missing the point. Firefox & IE are browsers; WebKit is a browser engine; market share has nothing to do with WebKit. Chrome gained market share because instead of reinventing the browser engine, they improved on it and re-branded it. John's point seems to be that IE & Mozilla using WebKit wouldn't result in a monoculture, because they could still produce their own features and not worry about reinventing the engine.


That assumes that the WebKit engine is already as good as we'll need it to be or can be improved as needed in arbitrary ways.

But it's not, and it can't be.

Think about how something like the Servo project would happen (or not) in a WebKit monoculture world.


Why say Win32? Why not say WebKit and jQuery are the new Linux? It's actually a better analogy.

Software monoculture is not inherently a bad thing; it can be a good thing. Re-inventing the wheel isn't progress. The world is much better off with Linux than 20 different Unix variants. And the world is likely better off with WebKit than dozens of different browser variants.

The issue isn't having a software monoculture, it's who is in control of it. The situation with both Linux and WebKit is that we have all the benefits of a monoculture and very few of the downsides.


Re-inventing the wheel isn't progress, but coming up with radically better solutions is. And that's much harder to do if you have to not just match a specification but also be bug-for-bug compatible with an existing large codebase that you have to reverse-engineer to achieve said compatibility.


Because that would be a terrible analogy. Using Linux as an example actually goes against your point. We do have "20 different Unix variants". Just because you live in a Linux bubble doesn't mean there isn't a world outside of that bubble. Almost all software that runs on Linux isn't using the Linux API; it's using a gradually munged-up collection of APIs from over the years, including POSIX, BSD and SysV APIs. That software works on dozens of different systems. Software written to a common API that is implemented by dozens of different systems seems a lot more like an argument in favour of more rendering engines implementing a common API.


WebKit, yes - because all websites must comply with it if they want to succeed.

jQuery and Bootstrap, no. They are useful libraries, but developers can choose not to use them and still succeed. Few of the biggest websites use them, for instance; they mostly write their own JavaScript and templates.

But they all test against WebKit.


True. I don't work at a large company, and we definitely don't have a huge install base, but we develop several large JS apps, none of which use jQuery.

Only our smaller, cobbled together pages use jQuery. If it grows, we switch to something that meets our idiomatic style better: smaller modules in a CommonJS/AMD style.

jQuery takes over your code, and it seems like WebKit is doing the same thing. Some of our developers only include -webkit- prefixes until I complain loudly enough for them to throw in the rest: -moz-, -o-, -ms-.

I hope FF never switches, because then we'd be even more locked in.


I don't see the connection at all. How are they at all similar to the "Win32 API"? The Win32 API was required to write Windows applications. It's like saying that WebKit and jQuery are becoming HTML4. Does not compute.


Let's hope it doesn't get that bad.


WebKit and jQuery are good. Bootstrap is a fundamentally poor product, and I don't mean the concept of there being a standard framework that people can build upon, I mean the execution of Bootstrap itself.

So I freaking hope not.


I look at who the author is and hesitate to disagree, but I've had a better experience in some areas with Firefox on Android (notably WebGL) than I have had with Chrome / Chrome Beta.

Personally, aside from a few small wrinkles, I prefer the experience of using Firefox over Chrome on Android.


I think it is obvious that WebKit has won, and by a wide margin, when it comes to browser engines.

For Opera it is the best move. They can now focus the majority of their development time on making the browser great, instead of putting a decent chunk of it into effectively replicating what WebKit does. I think in the coming year or so Opera will become a far better browser for it.

As for Mozilla and IE: you would expect Microsoft to have more than enough resources to keep working on Trident, or whatever it is called these days.

For Mozilla, is their OS tied to their own engine? I don't know how committed they are to it. For all the releases of Firefox, tabs still aren't sandboxed, and phpMyAdmin often freezes the entire browser when looking at monster tables... perhaps they would benefit from spending more time improving the browser and less time working on rendering.

I wonder what would have happened if Microsoft, Mozilla or Opera had open sourced their browser engine with WebKit. Perhaps we would have seen a split and more competition in this area.

Now it is about who has the $$$ to continue to develop their own proprietary engine.


"I don't know how committed they are to it."

Extremely, especially in this context. The reason Opera is switching (web compatibility issues if you're not the dominant implementation) is exactly the reason Mozilla would fight such a switch tooth and nail; and remember that, unlike Opera, they aren't driven by profit when doing so.

Here's an extensive reply from a Firefox developer: http://www.quora.com/Mozilla-Firefox/Will-Firefox-ever-drop-...

"I wonder what would have happened if Microsoft, Mozilla or Opera had open sourced their browser engine with WebKit."

I have no idea what you mean.


He means that there are different parts of a browser you can focus on and innovate in. If the rendering engine were "out of the way" and they didn't have to worry about that part, they could focus a lot more on other stuff.

Right now Microsoft has to focus on making Trident catch up with WebKit, and it's still 2 years behind WebKit in HTML5 features. Go to html5test.com and see how far behind IE10 is. It's further behind than Chrome 10 was when IE9 launched 2 years ago.


Mozilla has been open source since day one.

Mozilla's engine at one point was the go-to for a cross-platform embeddable browser, but things changed: https://groups.google.com/forum/#!topic/mozilla.dev.embeddin...

Edit: Should clarify, Mozilla as an entity that releases web browsers, not "Mozilla" as in the Netscape days.


Uh.. Mozilla's browser engine has been open source since 1998 or so. It predates WebKit even existing.


"I don’t think anyone can successfully argue that Chome/Chromium isn’t a better browser than Safari [...]"

That makes it sound like Chrome was somehow superior to Safari. Chrome has some nice features that Safari lacks (and vice versa), but from a browser engine perspective they are definitely on par.


That stuck out for me as well. I could argue that Chrome isn't a better browser than Safari for me, but functionally they're both using WebKit, so they render sites identically in most cases.


Aside from JS performance, I'd argue that Chrome's rapid updates make it a better browser. Safari is at best months old.


Sure, there's a lot going on that makes Chrome a great browser, but it's still not my daily driver, as I prefer Safari for a myriad of non-technical reasons. I like the iCloud tab sync, the Twitter integration, and the fact that it blends significantly better with the system style than Chrome does (which is arguable given how little else, even among Apple's own applications, tries these days).


Except when it comes to the JavaScript engine. Safari's JS engine is the slowest out of Chrome/Firefox/Safari: http://arewefastyet.com/


I'd still argue that Chrome isn't a better browser for me, regardless of JS processing speed.


I wouldn't say mobile is the only world we live in; that's a bit of a stretch for me.


Boring. They can switch to Gecko with IPC embedding for a change: https://wiki.mozilla.org/Embedding/IPCLiteAPI


I find that funny, because even if you add up everything WebKit-based, Gecko+IE+Opera+others still have a higher browser share. And most of the "large parts" of the world don't even really use WebKit (I'm looking at China, India, Africa, parts of Europe, etc.), most of which are IE/Gecko/Opera.


Mobile.


https://html5test.com/results/desktop.html

Why does Firefox do so poorly? ;_; Mozilla is such a good organization (though to be fair it can only survive with Google's money).


Because much of what it tests isn't in HTML5. Some of what it tests isn't even standards track (e.g. WebSQL). The tests are also usually pretty superficial and test things like "does this property exist", not "does this feature work".


I know this is OT, but how hard would it be to create a thin layer so that WebKit runs on bare metal?


Mozilla is doing this with Gecko. Google is sort of doing this with Chromebooks (AFAIK). There is little to no incentive for Apple or Google to do this on their flagship products, iOS and Android, since they already have dev ecosystems using Objective-C and Java respectively.


Depends on the metal, which is why it's nice to have an OS as an abstraction layer :)


Wasn't that the goal of webOS?


I'd say the big risk with WebKit, which no one is addressing, is that it's LGPL, not GPL. Someone could get a dominant position and use a closed fork of it. (The fact that Apple open-sourced WebKit is to its credit.)


The LGPL doesn't work that way. You can't fork WebKit and keep your changes closed unless you don't distribute your browser to anyone. Now, you could use WebKit along with (for example) a closed JavaScript runtime, but that really has nothing to do with WebKit itself.


Right, my mistake, but you can effectively borg it.


Yeah, because creating a closed fork of an open-source project is really popular.


Great points. Nice read.



