What happened to the idea of standardising Mono (the open source .NET) bytecode as a new browser language?
Then you can use whatever java-like/basic-like/functional-like language you want, and have it compile to bytecode. Also, with sophisticated JITs for that style of bytecode already lying around, you could probably instantly beat javascript's performance even under the best engines, because of static typing etc. rather than relying on ridiculously complicated static analysis.
It'd have to be supported side-by-side with javascript for the foreseeable future, but the earlier you get a change like this in, the better...
1. JavaScript's speed is actually not far from Mono's now. And constantly getting closer.
2. I don't think anyone but Microsoft would support .NET bytecode in the browser, simply because Microsoft controls .NET and has patents on it. It would take a lot more to reassure other browser vendors than Microsoft's existing CPs.
3. Running dynamic languages on Mono is slow. Look at the speed of all the dynamic languages on Mono (or the JVM for that matter), and compare them to native implementations of dynamic languages, in particular JavaScript and Lua. The native implementations beat dynamic languages on Mono by a large margin, simply because .NET is a bytecode made for static languages. Of course you can say that you prefer to have static languages in the browser over dynamic ones; that's a legitimate opinion, but not everyone would agree.
4. Standardizing on bytecode can inhibit innovation. JavaScript doesn't have a bytecode standard, which has let JS engines implement very different ways of running it (V8 doesn't have a bytecode interpreter at all, for example). Of course standardizing on syntax also inhibits innovation, just in other ways; it isn't clear which is better.
5. Static languages compiled to JavaScript (that is, that use JavaScript as their 'assembly') are getting quite fast. Some benchmarks I ran found them to be around 5X slower (on the development versions of SpiderMonkey and V8) than gcc -O3, compared to Mono which is 2.5X slower, http://syntensity.blogspot.com/2011/06/emscripten-13.html
6. There is already Silverlight/Moonlight which does .NET in browsers, and it hasn't been very successful. (Of course it is a chicken and egg thing, if it were bundled in browsers it might be more popular. But the failure of Silverlight is a disincentive to add Mono to browsers nonetheless.)
For all these reasons, I don't think Mono has much of a chance to be included in browsers. Most of the same arguments apply to other static-language bytecodes like NaCl.
>> 3. Running dynamic languages on Mono is slow. Look at the speed of all the dynamic languages on Mono (or the JVM for that matter), and compare them to native implementations of dynamic languages, in particular JavaScript and Lua. The native implementations beat dynamic languages on Mono by a large margin, simply because .NET is a bytecode made for static languages.
It's not fair to compare JavaScript, which is already approaching a limit of how fast it can go, with the speed it gets running in Mono. Why? The two implementations have a vast difference of amount of energy and resources thrown at them; had Google, Mozilla, and Microsoft wanted JS to run fast on Mono, it would run fast on Mono.
Also, saying that .NET is a bytecode made for static languages is kind of iffy now that the "Dynamic Language Runtime" is part of .NET.
1. The Alioth results are not necessarily final - they compare a single JS engine, and we have several fast ones now (SpiderMonkey with type inference can be significantly faster on some benchmarks, for example). Even so, the median speed there is 2X, which is fairly close. Admittedly there are some bad cases though, in particular pidigits (badly written benchmark code? bug in v8?).
2. It is true that JS on Mono has had far less work done, and that the DLR exists. However, the fact remains that dynamic languages are a late addition to the JVM/.NET model. For example, one very important thing for dynamic language performance is PICs (polymorphic inline caches), and to my knowledge there is no good example of fast PIC performance on the JVM or CLR. In fact, we don't even have a good example of a generic virtual machine that can run multiple dynamic languages fast (Parrot exists, but is not that fast) - all the fast dynamic language implementations are language-specific, so it shouldn't surprise us that VMs built for static languages don't do that well either.
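To give a feel for what a PIC buys you, here is a toy sketch of the caching idea in plain JavaScript. Real engines do this in JIT-compiled machine code, keyed on hidden classes/shapes; using the constructor as the "shape" here is a simplification made for the sketch, and the names are invented:

```javascript
// Toy inline cache for a property lookup. A real engine caches a
// shape -> property-offset pair at each call site; here the "shape"
// is approximated by the object's constructor.
function makeCachedGetter(propName) {
  let cachedShape = null;   // last shape seen at this "call site"
  let cachedGetter = null;  // accessor specialized for that shape

  return function get(obj) {
    const shape = obj.constructor; // stand-in for a hidden class
    if (shape === cachedShape) {
      return cachedGetter(obj);    // fast path: cache hit
    }
    // Slow path: generic lookup, then cache a specialized accessor.
    cachedShape = shape;
    cachedGetter = o => o[propName];
    return cachedGetter(obj);
  };
}

class Point { constructor(x) { this.x = x; } }
const getX = makeCachedGetter("x");
getX(new Point(1)); // slow path, fills the cache
getX(new Point(2)); // fast path: same shape as last time
```

The point is that most call sites only ever see one or two shapes, so the check-and-dispatch almost always hits; a VM without this machinery pays for a generic hash lookup on every property access.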
In my opinion it makes no difference that we have several fast engines where some are faster at some things than others. When executing in the browser you don't get to pick and choose how and where your application will be executed. If you run into performance problems on one of the engines you can: A) dismiss a subset of your users and their performance problems, telling them to use a browser with a faster engine (they won't), B) only allow certain functionality based on a user agent string, or C) limit your application's scope to one that runs suitably in the slowest of the engines you're willing to support. In essence, if the application runs great in browser A but chokes in browser B, are you willing to say bye bye to your B users to take advantage of performance gains in A? I've been in this situation, and in my experience I've always had to look away from the faster browser rather than the user.
Outside the browser you probably have a little more freedom, but it's not like you get to pick and choose in the style of "Oh, I'll execute this function in V8 since it does this faster, and that function in SpiderMonkey since it's faster there". For this reason, I don't think the fact that Alioth only has measurements for one engine would make a significant difference in the overall comparison. You'd be, for the most part, gaining performance in one place by sacrificing it in another.
Anyway, in my personal experience, I've run into performance problems in JS a lot more often than with C#. I also have to go through a lot more tedious practices to ensure my JS code runs as fast as it can, whereas in C# Some.lookup.with.lots.of.dots.does.not.scare.me(). That's why your claim sort of surprised me. Then again, the last serious JS performance problem I had was 6 months ago (before FF4), so maybe a lot has happened in those 6 months.
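For example, one of those tedious practices is hoisting a long property chain out of a hot loop, something a statically typed C# compiler handles for you. The object and names below are made up for the sketch:

```javascript
// A deep property chain, as you might find in a config object.
const app = { config: { physics: { gravity: 10 } } };

// Naive version: every iteration walks app -> config -> physics -> gravity,
// which without good inline caches means repeated dynamic lookups.
function sumNaive(n) {
  let total = 0;
  for (let i = 0; i < n; i++) {
    total += app.config.physics.gravity;
  }
  return total;
}

// Hand-hoisted version: the chain is resolved once, outside the loop.
function sumHoisted(n) {
  const gravity = app.config.physics.gravity; // resolved once
  let total = 0;
  for (let i = 0; i < n; i++) {
    total += gravity;
  }
  return total;
}
```

Both functions compute the same result; whether the hoisting actually matters depends on how well the engine's caches and loop optimizations handle the naive version.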
By the way, I'm not too informed on how type inference is done in SpiderMonkey, so I may be completely wrong in mentioning this, but it sounds like they're trying to speed up a dynamic language by mimicking static typing. If that's how far they're going to go to improve performance, maybe soon enough JavaScript will in fact sit better on Mono/.NET/the JVM?
I agree with your point about multiple JS engines, indeed you can't pick and choose the best results. What I was trying to say is just that the best results we see are an indication of where things are going. But again, I agree, we are not there yet; right now each user has just one JS engine, which hits problems on some benchmarks. Static languages have much more consistent performance.
About the last 6 months: Yes, a lot happened during that time, namely FF4's JaegerMonkey and Chrome's Crankshaft. Both are significant improvements.
About typing, yes, in a way that could let this code run faster inside the JVM or Mono. If you can figure out the types, you can generate fast statically typed code for those VMs. However, type analysis can be both static and dynamic, and it should integrate with the PICs and so forth. So even with that, I don't expect dynamic languages to be able to run very fast on static language VMs.
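A toy sketch of what that specialization amounts to: a generic '+' in a dynamic language must check operand types at runtime, while a version the compiler has proven (by static or dynamic type analysis) to only ever see numbers can skip the checks entirely. The function names here are invented for illustration:

```javascript
// What the VM conceptually executes for a dynamic '+': check types,
// then pick numeric addition or string concatenation.
function addGeneric(a, b) {
  if (typeof a === "number" && typeof b === "number") {
    return a + b;          // numeric path
  }
  return String(a) + String(b); // fallback: string concatenation
}

// What a compiler could emit for a call site where type analysis has
// proven both operands are numbers: no checks, just the add. This is
// the kind of code that maps directly onto a static-language VM.
function addNumbers(a, b) {
  return a + b;
}
```

The catch, as noted above, is that when the analysis can't prove the types, you need guards and deoptimization paths, and those interact with the inline caches rather than replacing them.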
They raced up, and down, and around and around and around, and forwards and backwards and sideways and upside-down.
Cheetah's friends said "it's not fair" - everyone knows Cheetah is the fastest creature but the races are too long and Cheetah gets tired!
Falcon's friends said "it's not fair" - everyone knows Falcon is the fastest creature but Falcon doesn't walk very well, he soars across the sky!
Horse's friends said "it's not fair" - everyone knows Horse is the fastest creature but this is only a yearling, you must stop the races until a stallion takes part!
Man's friends said "it's not fair" - everyone knows that in the "real world" Man would use a motorbike, you must wait until Man has fueled and warmed up the engine!
Snail's friends said "it's not fair" - everyone knows that a creature should leave a slime trail, all those other creatures are cheating!
Dalmatian's tail was banging on the ground. Dalmatian panted and between breaths said "Look at that beautiful mountain, let's race to the top!"
Ok, sorry for using the term "not fair", I should have probably said that it's "unsound" to compare the speeds of current dynamic languages on Mono/.NET with the speeds of v8, SpiderMonkey, etc. The cause being that the speeds of the latter were fueled by very high browser competition and a lot of resource investment. Dynamic languages on .NET got the benefit of neither of these, so it should not be surprising that they are slower than their native implementations (which also get better funding/bigger communities). The comparison would have made more sense if Microsoft or other companies had thrown millions of dollars at the Iron* languages and still couldn't make them fast.
> which is already approaching a limit of how fast it can go
Do you have data to back this up? At least SpiderMonkey has projects in the works that give significant speedups on various workloads already, and lots of headroom left...
I would not be surprised to see another factor of 5 or so speedups on various code in the next few years in JS implementations.
Yeah I know, it's not a proper citation; I tried to find where exactly that was said at Google IO but found nothing so far. Either way, I didn't question it at the time I read it because, given how dynamic JavaScript is, I'd imagine there's only so much you can do to speed it up. Then again, this was coming from Google, and for all anyone knows the cause might just be them focusing more on Native Client instead of V8 for apps that need performance.
Ah, interesting. Yeah, that sounds like they're just planning to stop optimizing V8 or something, since you can clearly do better than that. The type inference branch of JaegerMonkey is already faster than V8+Crankshaft on compute-heavy (as opposed to GC-heavy, where V8's better garbage collector gives it a big edge) workloads, and that's without LICM or smart register allocation or any of the other global optimizations that are still coming online.
It's unfortunate that Google is deciding to focus on Native Client, with its portability issues, if that's what's going on.