Now this is big. Doesn't this mean Node.js would benefit from asm.js support in V8 as well?
A few months ago, Brendan Eich wrote: "Get your tums out, pal. We're taking PNaCl down for good this year with http://asmjs.org/. Cross-browser."[1] With the Unreal Engine announcement and now this, it seems he's one step closer to that goal. Now only JSC and Trident are left...
I don't see this as big. Kbr wouldn't in any way be responsible for adding asm.js support to v8. He's one of the core guys on accelerated graphics, but isn't involved in the JS engine. Also, you'd be well served to consider anything Brendan Eich claims in the context of his own motivations. Because other JS engine developers aren't necessarily enthused about signing on to support a spec that Mozilla developed mostly in secret without their input, relies extensively on engine quirks, and doesn't yet (ever?) address many hard problems like code caching and threading. I'm not saying asm.js isn't interesting, but it's not a guaranteed win either.
> Because other JS engine developers aren't necessarily enthused about signing on to support a spec that Mozilla developed mostly in secret without their input,
I don't think that's accurate. We started work on asm.js only a few months ago; at the beginning, we had no idea if it would work, so we just threw some ideas around among ourselves. When we had something we felt could actually work, we immediately put it up on
and worked on it there. asm.js has been discussed on IRC (#emscripten and other places), the emscripten mailing list, etc., as we further improved the spec.
Around the middle of last month, we got some performance numbers and felt we could say something about speed, so I mentioned asm.js in my talk at mloc.js and in other more visible places, but it was definitely not a secret even before that.
> relies extensively on engine quirks
That would be a bad bug in the asm.js spec. Can you please elaborate?
Yet when Google spent only a few months developing the first version of Dart, then released it to the public and iterated on it for two years in full public view (and it still isn't even in Chrome nightlies, while asm.js has already been dumped into Firefox nightlies), the same criticism was leveled at Google by Mozilla.
I think Mozilla would get a lot more benefit if people stopped with the juvenile "You're going down!" adversarial mentality. If someone from the Dart or NaCl team had tweeted "2013 is the year NaCl and Dart destroy JavaScript!", imagine the uproar. This is about exploring the solution space for solving technical challenges related to maximizing performance, helping developer productivity, and strengthening the Web, and there are many potential solutions. (I for one don't see the "Web" as JavaScript; I see it as HTTP, URLs, 'drive-by' execution, indexability, transparency, platform independence, and location independence. HTML, CSS, and JS are just particular concrete embodiments of those principles, and 30 years from now they could be implemented in a radically different way.)
I think asm.js is great for a slice of game genres, and I fully support it. The cheerleading and politics just aren't needed; let the code, benchmarks, and game demos speak for themselves.
Couldn't agree more. I think asm.js is very cool, and very likely a better approach than PNaCl (which I have been a big fan of for a while). But the "take THAT, Google," sentiment is really off-putting. Native Client was never about "take THAT, Mozilla" (or anyone else), it was about serving an unmet need that no one else was serving.
> I think Mozilla would get a lot more benefit if people stopped with the juvenile "You're going down!" adversarial mentality. [..] The cheerleading and politics just aren't needed; let the code, benchmarks, and game demos speak for themselves.
I agree.
Also, btw, I think NaCl is wrong for the web, but an amazing technology, designed by super-talented people (I met a few). It solves a very hard problem in an impressive way. It already has use cases beyond the web, and I imagine it will continue to. For the web, however, as I said, I think it is inappropriate due to portability and standardizability concerns (PNaCl solves some of the portability concerns).
> Yet when Google spent only a few months developing the first version of Dart, then released it to the public and iterated on it for two years in full public view (and it still isn't even in Chrome nightlies, while asm.js has already been dumped into Firefox nightlies), the same criticism was leveled at Google by Mozilla.
I think a big part of Dart's PR problem began with the fact that it was leaked. The leak was an internal document that spoke somewhat bluntly - fine for an internal document, but an unfortunate way for the project to be revealed to the world.
> Yet when Google spent only a few months developing the first version of Dart ... the same criticism was leveled at Google by Mozilla.
Yeah, right. Dart was originally Dash. Before that, it was planned modifications to JS, but nobody wanted to turn JS into Java, so you threatened that JavaScript "would be replaced".
Dart/dash/google.js is just open-washing. Nobody outside of Google has any say in it, and that's why they don't take to it.
No, Dart had nothing to do with planned modifications to JS, and if you think it was in the oven for a long time, why, then, have two years passed with it still not shipping in the browser and the language grammar undergoing many changes? It has spent an enormous amount of time, the vast majority of it, in public development. The Dart2JS compiler has been rewritten from scratch multiple times since then.
I work for the GWT team, which also owns the Google Closure Compiler. At the same time, our team was kicking around the idea of taking the Closure type annotations and making them actual (optional) grammar terminals instead of boilerplate JSDoc comments (pretty much exactly what TypeScript or JSX is today), and then proposing those changes to TC39 (one difference: we didn't have classes, we had structural types). A separate team was working on Traceur, which was prototyping a different set of changes to JS, and they too were planning to go the TC39 route. Google is a big company and can have multiple teams prototyping many approaches. This stuff about turning JS into Java sounds like religious nonsense from people who have an aesthetic derangement over classes and OO. There are advocates on both sides, not just Googlers (in fact, there are Googlers opposed to classes in JS), and ES6 has been fielding class proposals. We were testing how various extensions "felt" and how they worked with tooling; for example, structural typing in the presence of recursive types has issues.
There's nothing "non-open" about coming up with language extensions, doing a prototype implementation, and then putting it out for public review and drawing up a spec. That's exactly how all of the IETF specs get done: rough consensus and running code. Talk is cheap; a proposal is better understood if it comes with an example prototype. Mozilla does its own prototyping of extensions to JS; it's the only way to really do a sanity check. As with IronMonkey, no one was under any illusions that something like Traceur was being pushed as an official product for people to adopt and use; it was very clearly a test bed for experimentation.
If we take your definition, no one outside of Mozilla had any say over asm.js: they developed the complete asm.js spec without involving the public, then dumped it fully formed, along with an implementing VM, emscripten integration, and even an Unreal Engine demo, all before any standardization activity. The draft spec has no non-Mozilla editors or contributors listed or acknowledged, as far as I can tell.
One of the real shames here is all the focus on Dart/NaCl while native apps distributed on mobile OSes are eating our lunch. I want ChromeOS and FirefoxOS to be a success, but they are more likely to succeed if Mozilla, Google, et al. can work together and avoid attacks. It pains me to see this, because both sides are doing tremendous work to move the web forward, and pettiness can harm the spirit of cooperation.
> if you think it was in the oven for a long time, why, then, have two years passed with it still not shipping in the browser and the language grammar undergoing many changes?
As an outside observer, I'd say that as a company with basically unlimited resources for a decade, Google's devs are probably slacking off quite a bit, not really driven to get anything done in language design.
asm.js is basically just a spec for what emscripten was already doing. There's not much to it beyond that, other than the linking/module bit, so there's not much to even discuss with anybody else. Dash/Dart/Pepper is a major piece of technology that is not a subset of existing tech the way asm.js is; it carries a much higher expectation of collaboration.
So rather than believe that Dart wasn't anywhere near done when it was announced, as evidenced by the massive changes, 20000+ commits since then, and the time it's taken to get close to version 1.0, you prefer to think that the engineers are just slacking off.
Why is this a game technology and not a general-purpose one? I think Mozilla might want to expand their message a bit and provide some non-game-engine demos.
>I don't think that's accurate. We started work on asm.js only a few months ago; at the beginning, we had no idea if it would work, so we just threw some ideas around among ourselves. When we had something we felt could actually work, we immediately put it up on
As others have pointed out, Mozilla has been less than receptive when similar situations were reversed. I also know from working with the NaCl and PPAPI teams that they repeatedly reached out to Mozilla in an effort to publicly develop a mutually agreeable standard, but they were not well received. So, given past precedent, the way asm.js was introduced isn't necessarily bad, but it certainly feels a bit hypocritical.
>> relies extensively on engine quirks
>That would be a bad bug in the asm.js spec. Can you please elaborate?
Perhaps "quirks" isn't the right word for things that are idiosyncratic but legal either by standard or de facto convention. It's certainly an ingenious way to get backwards compatibility while basically defining a new IR, but it is a bit... quirky.
As I mentioned elsewhere, I think asm.js might really catch on. It could be the IR that finally becomes universal, in large part because your backwards compatibility strategy is so damn clever. That said, I also see a few obvious pain points that will need solutions:
* Reasonable debugging support (maybe just a metadata standard morally equivalent to symbol files)
* Threading (even if just co-routines; it seems some behavior needs to be specified)
* Load performance (validation and code caching seem needed, which are very tractable but just mean more work)
> As others have pointed out, Mozilla has been less than receptive when similar situations were reversed.
That's a long conversation, and I likely know nothing of the non-public aspects of it. From what I know of the public stuff, I don't think it is the same in reverse. I do see the general similarity you refer to, I just think there are some fundamental differences that explain the different responses.
Happy to discuss this more if you want.
> Perhaps "quirks" isn't the right word for things that are idiosyncratic but legal either by standard or de facto convention.
I still don't know what you mean, can you please say what concretely in the spec you are referring to?
>> Perhaps "quirks" isn't the right word for things that are idiosyncratic but legal either by standard or de facto convention.
>I still don't know what you mean, can you please say what concretely in the spec you are referring to?
You're using a subset of JS as an IR, which is something the language was never designed for and not inherently good at. To make that performant you have to hack in things like manual memory management and use type coercions to game existing engine behavior. To get full performance you need significant engine changes including a mode switch, validation step, and special purpose compiler. And of course, the JS as IR doesn't really qualify as human readable in any meaningful sense.
And I happily admit that type coercions and similar tricks are a very clever way to structure your IR while simultaneously maintaining compatibility and improving performance in existing JS engines. However, in doing so you're already relying heavily on unspecced implementation details of existing engines.
So, my point is that it's a hack... a really brilliant hack that might actually be the best path forward for a universal IR on the Web. But it's still a hack, and it brings with it some serious pain points that still need to be resolved.
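To make the coercion tricks concrete, here's a rough sketch of the idioms in question (names are made up for illustration; this is ordinary JS following the asm.js patterns, not necessarily a validating module):

    // "Manual memory management": the whole address space is one typed array
    var heap = new ArrayBuffer(0x10000);
    var HEAP32 = new Int32Array(heap);

    function store(ptr, value) {
        ptr = ptr|0;                 // |0 coerces to (and declares) a
        value = value|0;             // 32-bit integer
        HEAP32[ptr >> 2] = value|0;  // 32-bit store at byte offset ptr
    }

    function average(x, y) {
        x = +x;                      // unary + coerces to (and declares)
        y = +y;                      // a double
        return +((x + y) / 2.0);     // the outer + marks a double return type
    }

Existing engines already specialize on these patterns, which is exactly the "gaming existing engine behavior" being described.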
> You're using a subset of JS as an IR, which is something the language was never designed for and not inherently good at
I agree it was not designed for it. However, I am not sure what "inherently good at" means - after V8 showed up in 2008, for example, many types of code suddenly became very fast. Were they always inherently fast?
> To get full performance you need significant engine changes including a mode switch, validation step, and special purpose compiler.
I disagree. Firefox did pick that option because it seemed best, but I believe it is possible to achieve similar speeds with other approaches. As the slide here shows,
on many benchmarks, even without any new optimizations, modern JS engines are already fast on asm.js code. And with new optimizations, even without the things you just mentioned, they should be able to get fast on the rest. I saw some activity on the V8 bug tracker indicating possible work in that direction, which I am very curious and hopeful about.
> And of course, the JS as IR doesn't really qualify as human readable in any meaningful sense.
Yes, it is not intended to be - like the output of the Closure compiler, etc. Note, though, that emscripten can generate asm.js in debug mode, which is not minified and is actually quite readable - you can recognize function and variable names, for example. It looks a little quirky, to be sure ;) but it is more readable than compiler IRs like LLVM's, for example, in my opinion (and certainly far more readable than x86 or ARM assembly).
> However, in doing so you're already relying heavily on unspecced implementation details of existing engines.
Performance is not specced at all for JavaScript. Again, when V8, Nitro, and TraceMonkey came out in 2008-2009, many types of JS code suddenly got fast. There wasn't a spec for any of that. No spec says modern JS engines should use an int32 representation when a value never leaves the int32 range, but all modern JS engines do that (in hot functions).
> So, my point is that it's a hack... a really brilliant hack that might actually be the best path forward for a universal IR on the Web. But it's still a hack, and it brings with it some serious pain points that still need to be resolved.
I fully agree it is a hack, and it has various pain points. It's a compromise, not a clean solution from scratch. I hope we can resolve many of those pain points in time, and that we can do that in collaboration with all browser vendors together.
> I also know from working with the NaCl and PPAPI teams that they repeatedly reached out to Mozilla in an effort to publicly develop a mutually agreeable standard, but they were not well received.
They weren't well received, I suppose, because Pepper is a binary API that duplicates everything already in the browser. Why does Pepper have to be so large? Because with NaCl it's incredibly awkward to call JavaScript functions from native code, and vice versa. It's not webby.
That would be a reasonable argument if Mozilla weren't still adding extensions to NPAPI, a far less webby technology that's a much worse design and outright dangerous for end users. And yes, NaCl has its awkwardness, but it's far better than NPAPI, and if you look a bit you'll realize that NaCl's awkwardness exists specifically because it can support things that asm.js can't. NaCl has real native threads, clean debugging, and a simpler path for legacy code, but the cost is the complexity of the implementation.
The funny thing in all this is that I'm really not a proponent of NaCl. I find it technically very interesting, but I never considered architecture-specific NaCl viable for the Web, and there are still kinks to work out with PNaCl. I just find the attitude towards asm.js curious, because it carries so many of the past criticisms of NaCl (not human readable, initially developed in private, etc.).
Why would Mozilla add extensions to NPAPI, for more advanced plugins that must be trusted or expensively sandboxed because they are natively compiled? Mozilla is doing everything in the web, adding sound and graphics APIs to HTML rather than duplicating them for legacy native code. It doesn't even make sense for Google to do Pepper, let alone Mozilla.
Real threads essentially mean that every call must copy all its data (or else another thread could modify the data while trusted code is using it), or all threads have to be suspended. That's clumsy and not a good solution.
Your statements here are just nonsensical. And given that your account was created the day of this post, and hasn't commented on anything else, it's hard not to read this as simple trolling.
> other JS engine developers aren't necessarily enthused about signing on to support a spec that [company] developed mostly in secret without input from other JS engines, relies extensively on engine quirks, and doesn't yet solve many hard problems
And I would argue that any spec tagged with these kinds of complaints rarely sees broad adoption, regardless of who made it. From my perspective, the only thing asm.js has going for it that offsets the downsides is that the code at least runs in other JS engines even if they don't explicitly support it. And that alone might actually be enough.
> Because other JS engine developers aren't necessarily enthused about signing on to support a spec that Mozilla developed mostly in secret without their input, relies extensively on engine quirks, and doesn't yet (ever?) address many hard problems like code caching and threading
It's a spec for a subset of JS with guaranteed AOT compilation — it's hardly one that needs to go through a thousand committees to get somewhere where all those concerned are happy. If you want to do AOT compilation, everyone basically needs the same subset. Indeed, the most contentious part of asm.js is probably how you isolate it off from the rest of the script. It's simply not a spec that defines very much.
And what engine quirks does it "extensively rely on"? If it relies upon anything apart from ES5, it's broken and the spec needs fixing.
Similarly, it's out of scope for it to deal with anything like code caching and threading: the former is, depending on how you tackle it, either an implementation detail (see how Carakan and V8 implement it today) or something quite radical for the platform (but not something that needs to be tackled immediately), and the latter is one solution to a broader problem (parallelism in JS). Neither seems like something asm.js should be rushing to solve.
An AOT-compiled subset of JS is not in itself novel: this is not the first time it's been talked about. What is novel (to my knowledge) is the idea of making compiler behaviour explicit, via a token in the code, along with isolating the globals. Enabling AOT compilation is in many ways the "golden grail": although one must still be wary of compilation time (see SunSpider, where many of the benchmarks complete in under 10ms, so you don't have time to spend compiling them), the warm-up time is currently relatively large.
asm.js is not a redesign of JS from the ground up; it is an evolutionary step in improving support for a certain style of Emscripten/Mandreel-generated code that Google is at least somewhat interested in, given its inclusion in the Octane benchmark.
Performance improved in Chrome on at least one Emscripten benchmark when generating asm.js code rather than output from the previous backend, without any further change to Chrome, probably because asm.js does a better job of formalizing the various assumptions that this style of code relies on.
What I meant was not using asm.js as a compilation target, but rather implementing performance-critical modules in the asm.js subset and getting the extra performance out of V8, sort of like how Rusha[1] did it.
Calling asm.js code is not free either, it's worth mentioning. I measured it at (very) roughly 2ms/call for the code I posted yesterday. (At asmjs.org they mention they intend to address this.)
It's not a no-op: it has to check types and maybe convert representations, much like a C/C++ FFI (but with more freedom for the implementor, who isn't committed to the C ABI, so there's potential to go faster than a C FFI).
But I agree it needs attention, and apparently so do the Firefox devs. In http://wry.me/hacking/Turing-Drawings/ the asm.js version ran much slower than the regular JS version, until I adapted it to make fewer calls. The slowness also makes some other uses I've thought of impractical.
(The figure I gave was not carefully measured; just enough to say "yep, that'd account for why this is amazingly slow".)
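For the curious, the adaptation boiled down to batching. A rough sketch, with invented names (the stub stands in for a real compiled asm.js module):

    // Instead of paying the per-call type-check cost once per element,
    // copy the batch into the module's heap and cross the boundary once.
    var heap = new ArrayBuffer(0x10000);
    var HEAPF64 = new Float64Array(heap);

    var mod = {
        // stub standing in for an asm.js export that halves n doubles
        // in the heap, in place
        halveAll: function (n) {
            n = n|0;
            for (var i = 0; i < n; i++) HEAPF64[i] = HEAPF64[i] * 0.5;
        }
    };

    var data = [2.0, 4.0, 6.0, 8.0];
    HEAPF64.set(data);           // one copy in...
    mod.halveAll(data.length);   // ...one boundary crossing...
    var out = HEAPF64.subarray(0, data.length);  // ...results read back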
It could help solve the cross-platform extension problem; Windows users often get left out. It would make Node.js more like Java: a write-once, run-anywhere platform.
Anyway, I agree. The people in the previous thread clamoring for NaCl and claiming asm.js isn't supported by other browsers now look really, really foolish in hindsight.
I'm moderately happy the Web is moving in this direction. I'm glad Mozilla is moving it in that direction :)
I have no idea why Mozilla was so resistant to PNaCl. It's a nice step beyond the nasty legacy JavaScript mess. Among other things, JavaScript is necessarily single-threaded... Yeah.
DRY? I think NaCl would require Mozilla to implement another VM or parser and do the same optimizations it already does for JS. So why should they repeat themselves?
"WebWorkers are not really a multithread model. Game engines could be written to leverage message passing based isolates, but in some cases, but doing it in an optimal form isn't exactly developer friendly, it's one of pains of developing the PS3 SPEs is that they didn't share system memory and you had to DMA stuff around."
Anyway, he said that the problem with Pepper is that its API is gigantic and its only spec is the implementation in Chromium. Other browsers could port the API, but it would come at a very high cost due to WebKit-specific glue code.
The "problem" with pnacl is that it means the end of Javascript's death grip on the browser, which Brendan Eich is interested in extending for obvious reasons
To be clear, there are two things that have historically been called "Pepper". The first was basically a better version of NPAPI, with the limited scope that implies. That was replaced by something that's basically a Google-proprietary replacement for the entire web stack, with both the scope and the problems that implies.
It may run on multiple threads, but I don't think it exposes threading to the runtime. ECMAScript is certainly single-threaded and async by design, so V8 would be straying far from the spec.
Web Workers open a new, separate thread that you can only communicate with through message passing. They're not part of the core language but of the browser's libraries, which is why you can't use them in Node.
They're also more directly comparable to processes than threads, as they do not share memory (though there are a couple of proposals to allow them to).
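For context, this is roughly the shape of the model (it's the browser's Worker API, which is also why Node doesn't have it):

    // main.js - spawn a worker; the only channel is messages, and
    // posted values are copied (structured clone), not shared
    var worker = new Worker('worker.js');
    worker.onmessage = function (e) {
        console.log('sum:', e.data);
    };
    worker.postMessage({ n: 1000000 });

    // worker.js - runs on its own thread with its own heap
    onmessage = function (e) {
        var sum = 0;
        for (var i = 0; i < e.data.n; i++) sum += i;
        postMessage(sum);
    };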
"Cross-browser". Why exactly is it asm.js cross-browser? Because they just said they'd put it in V8? The argument is moot since if Mozilla said that first about PNaCl (that they'd adopt it) it would be PNaCl that would be the cross-browser solution.
No, I don't care about backwards-compatibility. I care about the solution leveraging existing infrastructure and not being a hack.
asm.js is, at its heart, nothing more than a compiler hint. It tells the runtime that the JavaScript you're writing in this function follows some conventions that happen to be easy to optimize, but it's still JavaScript code. A browser that doesn't support asm.js will pause for a few nanoseconds to wonder what the heck that strange string at the start of the function is, then go on to execute the JavaScript code the same way it always does, with the same result as if it had known what asm.js is.
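To make that concrete, a minimal sketch (the strange string is the "use asm" prologue; function names are illustrative, and everything here is ordinary JavaScript):

    function MinimalModule() {
        "use asm";            // engines that recognize this may AOT-compile;
                              // all others treat it as a meaningless string
        function add(x, y) {
            x = x|0;          // the coercions double as type declarations
            y = y|0;
            return (x + y)|0; // declared int return type
        }
        return { add: add };
    }

    var m = MinimalModule();
    m.add(2, 3);  // 5, with or without asm.js support in the engine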
As far as I can tell, they haven't said they'd put it in V8. One member of the Chromium team, who doesn't work in that area, has opened a feature request.
This seems to be much ado about nothing unless and until it's accepted and assigned.
On the note of Node.js, I was wondering if there will ever be a Node.js based on IonMonkey + the Baseline Compiler instead of V8.
The engineer in me says: cool idea. And judging by libuv [0], the Joyent/Node.js team seems open to abstractions, if help is offered.
Assuming the technical task isn't too expensive in terms of leaky abstractions or ripple effects, one has to ask the next question: if V8 works... why would it be worth it?
I would like to note that I don't work on V8 full-time anymore, and all opinions I express on Twitter are mine alone, not those of my employer or my current or former team.
I am writing a blog post that condenses my feeling towards asm.js into something tangible.
I would also like to add that it still excites me a great deal to see JavaScript become faster and faster, and that I expect real-world JavaScript written by human hands to get faster in the future; I don't think we need AOT-compilation shortcuts for that.
It's not just about the browser experience. I see all this as a preemptive move to make developing games (and other "rich experience" apps) on Firefox OS more attractive to new developers. Who knows, we may even see some ports of existing games for it pretty soon.
Very smart move IMO.
Edit: To clarify: the fact that the devs here are interested means the Mozilla folks got the right idea.
Can you port, say, OpenGL 4+ games to asm.js and the browser, or will you be able to use only the WebGL/OpenGL ES 2.0 API? If the latter, we will only see mobile/indie games on the web for now, at least until WebGL gets more advanced graphics APIs.
I don't see OpenGL 4+ making it to asm.js, but if my limited experience in gaming has shown me anything, it's that people don't flock to games for the eye candy alone. Case in point: Angry Birds.
I think WebGL/OpenGL will do nicely for creating a game like Ikaruga (http://en.wikipedia.org/wiki/Ikaruga). It's probably one of the hardest but most enjoyable games I've ever played, and I think mobile/indie game developers will pull it off. If anything, this brings a new democracy to the previously restrictive (if not outright closed, as on iOS) world of game development for devices and consoles.
I just wasn't sure whether this is bound to WebGL or not. But even if it is, it's only a matter of time before WebGL gets graphics APIs as advanced as the full OpenGL (5-10 years).
I think we're already going to see OpenGL ES 3.0 implemented in WebGL within the next 2 years, and even if WebGL never adopts the full OpenGL (it has a lot of cruft anyway), keeping up with the OpenGL ES APIs should be good enough (OpenGL ES 3.0, ES 4.0, ES 5.0, etc.).
I was thinking OpenGL ES would become the most popular graphics API anyway (on any platform). It's just that Nvidia announcing full OpenGL 4.3 support for Tegra 5 kind of made me re-evaluate that. But if other mobile chip makers just stick with OpenGL ES, and WebGL does too, then OpenGL ES should still become the most popular graphics API in the next few years.
>Can you port, say, OpenGL 4+ games to asm.js and the browser, or will you be able to use only the WebGL/OpenGL ES 2.0 API? If the latter, we will only see mobile/indie games on the web for now, at least until WebGL gets more advanced graphics APIs.
Why, did you expect AAA games on the browser?
One step at a time people, one step at a time...
Mobile games of the kind you get on an iPad, running at 60fps, are a perfectly good target for the web browser.
I filed Issue 2424, so let me explain: my fix was a very hackish, local one (not the way I'd really like to see it fixed) that just demonstrated how much performance can be gained.
A generic implementation is obviously preferred. I would like to do it myself, but I don't work on V8 full-time anymore, so I can't dedicate my own time to it.
Now I'm gonna say this plain and clear: I don't care which wins between PNaCl and asm.js; I just care that one of them wins and gets into every browser with decent performance.
Both approaches have their merits and drawbacks. I personally prefer asm.js because of backward compatibility and portability, but if we could avoid a 5-year fight about which one is gonna bring near-native perf to the browser, it would be absolutely awesome.
For me, it's not that important that every browser supports it. I have a large base of business clients who have no problem using whatever browser gets the job done best.
Smart move. asm.js could basically be the gate-opener for Chrome OS. An OS that is just a browser suddenly makes sense if you imagine running things like Photoshop, Office, or your complete development environment from any browser. Emscripten already has examples of Qt apps running in the browser; this enables a whole new world of web development.
Whether it's asm.js or PNaCl or something else, it seems we are headed toward a future where the client part of web applications is crafted with compiled languages rather than JavaScript. That would be closer to the way mobile is now, with the exception that applications are discovered through a browser. It's starting to feel as if the JavaScript renaissance we are having is taking us to a place where actually crafting web applications in JavaScript is factored out. That would certainly change what it means to be a front-end web developer. Maybe JavaScript will still have a place for very simple document-type content, while heavier applications will be built with compiled languages. It's going to be interesting to watch this unfold.
I'm just glad this development will allow you to use other languages besides JavaScript. Emscripten is cool, but not practical without something like this standardized. If it can render an Unreal level, it should be able to render a Qt widget sufficiently. Everything I've seen in this regard so far is inferior to just using HTML/CSS. As this tech develops, that may not always be true.
Are you talking about the top-10 websites? Let's take them one by one:
- Google. I don't think so; it's not even one of their major server-side languages now.
- Facebook. Likewise.
- YouTube. See Google.
- Yahoo!. I don't know anything about their server side.
- QQ.com, Taobao.com, Baidu.com. Given that the Chinese do not pioneer web platforms, I don't see why they'd do this.
- Windows Live. I know MS "likes" JavaScript, putting it in Windows and all, but it's not one of their major languages. It's not even their own :P
- Wikipedia. Would require a switch from MediaWiki or a rewrite. While I'd very much like to see PHP replaced, it's currently doing its job. I have a feeling the language does not matter much; MediaWiki itself is not very complicated.
- Amazon.com. Don't know.
Do you seriously see 5 or more of these turning to server-side JS?
It might be a nitpick, but I wonder what you mean by this statement:
- QQ.com, Taobao.com, Baidu.com. Given that the Chinese do not pioneer web platforms, I don't see why they'd do this.
From what I've seen and heard, this isn't true. Taobao in particular seems to be running on a pretty cool stack, and has contributed a lot with OpenResty, which is nginx with modules, LuaJIT and some goodness, backed by Redis, Memcached, MySQL and/or Postgres (inside nginx' event model).
Top 10 is impossible to say, since we don't know who will be in those slots 5 years from now. Internet time is insanely fast; things come and go very quickly. I think it'd be fair to say half of the top 1000 sites will be on server-side JS, at the rate of Node's current development. Development is now getting to the point where they aren't adding new features but focusing on making it as efficient as possible at what it is (give this a listen: http://javascriptjabber.com/052-jsj-node-npm-with-isaac-schl...). I made the switch from PHP recently, and I don't think I'd ever go back. Love it. Yes, there is a slight learning curve, but if you already know JavaScript you'll do fine. Knowing how to use callbacks and using some sort of middleware like Express is the way to go when developing applications with it.
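For anyone wondering what that callback-plus-middleware style looks like, here's a minimal sketch (routes and filenames invented):

    var express = require('express');  // npm install express
    var fs = require('fs');
    var app = express();

    // middleware: runs for every request before the route handlers
    app.use(function (req, res, next) {
        console.log(req.method, req.url);
        next();  // pass control to the next layer
    });

    // the callback style: don't block on I/O, pass a continuation
    app.get('/motd', function (req, res) {
        fs.readFile('motd.txt', 'utf8', function (err, text) {
            if (err) return res.send(500, 'could not read file');  // Express 3-style
            res.send(text);
        });
    });

    app.listen(3000);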
Very unlikely; Amazon's backend is basically Perl talking to an SOA via an internal service definition language. A change to server-side JS would require a rewrite of almost everything (which isn't unprecedented; the migration from a single giant C++ binary to the Perl system 10-odd years ago was such a rewrite).
While I agree that it's terribly unlikely, one could argue that having the services backing the frontend fleet written in JavaScript was "powering" Amazon.com via JS. There's no technical limitation on what you can build those services in, so it's theoretically possible.
One could indeed, and again there is some precedent for writing services in "unusual" languages. However, the service ecosystem, tooling, and deployment was/is entirely custom; they'd need to add support for JavaScript to an awful lot of internal tools to make it work well. The effort to get Java supported as a first-class development language was pretty substantial, and over the years a number of internal teams with clever names sprang up to try and change the status quo in one way or another (SVN > Perforce! Ruby > Java!) with varying degrees of success.
So, yeah, no technical limitation, but explaining to Jeff that you've spent a year retooling the build system to support node.js but not actually shipped any features yet might lead to limitations of another sort :-)
> So, yeah, no technical limitation, but explaining to Jeff that you've spent a year retooling the build system to support node.js but not actually shipped any features yet might lead to limitations of another sort :-)
Indeed, and this is why I agree that it's pretty unlikely :).
During my stint at Amazon we barely had time to build the infrastructure we absolutely needed to launch features on the artificial deadlines imposed from somewhere in the stratosphere. How anyone had time to work on paying off technical debt or building new infrastructure is beyond me.
Err, people have moved on from straight parallel code. It's all about message passing and isolation now, which is something JavaScript can get perfectly easily (and sort of already has).
First, it's the browser that has Web Workers, not JavaScript. Second, message passing is prohibitively slow for things like, say, video games. Third, it's not at all clear that we gain any benefit from using JavaScript.
Not saying you're right or wrong, but venerable technologies have a tough time dying quietly. Also, most developers tend to be fairly conservative in what they choose, especially where load and performance are concerned.
[1]: https://news.ycombinator.com/item?id=5226967