The State of JavaScript - Brendan Eich (brendaneich.github.com)
380 points by ryanstewart on Oct 9, 2012 | 260 comments



This presentation saddened me.

The presentation focused on what it perceived as missing features: structs (seriously?), classes, modules, syntactic sugar, macros, etc. But the huge gaping holes in Javascript are not missing features. They are fundamental errors in the language. Things like ==, numbers as strings, eval, incorrect definitions of false, semicolon insertion, and -- heaven help us all -- improper lexical scoping.

Language designers tend to incrementally add junk to languages until they are complex, unwieldy monstrosities like C++ or Java. Rarely do they fix fundamental errors in the language because that would require backward-incompatible changes. So they stick to adding lipstick to the pig. But JavaScript isn't like other languages: its fundamental errors are so glaring, and impact so negatively on the language, that the benefit of jumping to a "JavaScript 2.0" massively outweighs its incompatibility disadvantages. That's why we see languages like CoffeeScript cropping up despite all their downsides, notably debugging.

The class bit particularly made me sad: JavaScript has a perfectly cromulent, even elegant, object model in the form of prototypes. But a variety of syntactic sugar hacks, weird constructor stuff, and general desperation to be a class-based language have sullied what would otherwise be an elegant mechanism. The solution appears to be: move more towards classes! Thus we still have all the language hacks, and two generally incompatible object models to boot. Plus structs!

Somehow after reading this presentation, I was struck with Yoda's admonition: Eich seems to be looking to the future, never his mind on where his language was.


You seem to misunderstand "structs" -- see http://wiki.ecmascript.org/doku.php?id=harmony:binary_data, this is an extension of WebGL's typed arrays, which are already in all the new browsers (IE10 too).
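
For anyone who hasn't used them, a minimal typed-array sketch; the binary data proposal extends this model with named struct fields:

    var buf = new ArrayBuffer(8);        // 8 raw bytes
    var asInt32 = new Int32Array(buf);   // view the same bytes as two 32-bit ints
    var asBytes = new Uint8Array(buf);   // ...or as eight unsigned bytes
    asInt32[0] = 0x01020304;
    asBytes[0]; // 4 on little-endian hardware -- both views alias the same buffer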

As for implicit coercions, I enjoyed Gary Bernhardt's "Wat", referred to it, and at past talks even mocked along with it. At Strange Loop, I went through each "Wat" in the "Wat Secrets Revealed" slide series (use the down arrow when you see it greyed in).

Of course (!) I regret the implicit conversions that make == not an equivalence relation with disparate types on left and right (NaN is a different story: blame IEEE754). Who wouldn't? As I said at Strange Loop, some colleagues at Netscape were hot for lazy/loose number/string matching, and "I was an idiot! I gave them what they wanted."
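
For readers who haven't memorized the table, a few of the coercions in question (all real ES5 behavior):

    0 == '';            // true  -- '' coerces to the number 0
    0 == '0';           // true  -- '0' coerces to 0 as well
    '' == '0';          // false -- so == isn't even transitive across these values
    null == undefined;  // true, yet null == 0 is false
    NaN == NaN;         // false -- that one is IEEE 754's doing
    0 === '';           // false -- === never coerces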

There may be hope of fixing even ==, if we get macros done right. You would opt into macrology redefining == to be === or whatever you want. But this is in the future.

And that's the point: JS must grow compatibly, given its cursed/blessed position in the web. There is no other option that costs as little incrementally. True, we could paint into a corner. I don't see that happening, not with the vibrant and very smart JS developer community (communities, really) with whom we are working.

On a practical level, I once ran into someone who used to work at IDEO and became a JS hacker in the course of doing a startup. I asked him about == quirks and the like. He just shrugged and said "you learn what to avoid and move on." That is the voice of practical wisdom (until such time as macros help fix the quirks for good).

So my advice is cheer up!


> You would opt into macrology redefining == to be === or whatever you want.

Oh, please let the syntax for this be something like

  let == = ===;
:-P


Let's not run an extra let statement if we don't have to!

    if (== != === || == !== ===)
      let == = ===;


Shouldn't that be var to get the right scope?


May be off-topic but: Is it just me, or are macros just a new way to get confused while reading JavaScript? Introducing language-foreign syntactic constructs seems to me superfluous and confusing - This is the job of transcompiling languages like CoffeeScript.


Reading the macros first helps. They must be defined at top of program or module, if I recall sweetjs.org's design correctly, for the staged hygienic expansion to work well.

Aside from that, you're right. But JS has higher order functions and objects with ad-hoc methods, so it can be used according to many paradigms, which can make it hard for a reader unfamiliar with the dominant paradigm in code at hand.

This is not a deal-killer for macros, although with sweet.js as the prototyping tool, your assertion about "This is the job" is satisfied. Sweet.js works with node to do AOT macro expansion at present. There's effort to integrate it in the browser too, but this will be easier with ES6 module loaders (not yet prototyped in SpiderMonkey or V8).
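
To make that concrete, here is a pattern-matching macro in roughly the shape sweet.js was exploring at the time; treat the exact syntax as illustrative only, since the project was still in flux:

    // Illustrative sketch: defined up front, expanded away before the JS parser ever runs.
    macro swap {
      rule { ($a, $b) } => {
        var tmp = $a;
        $a = $b;
        $b = tmp;
      }
    }
    swap(x, y);  // hygienic expansion: the generated `tmp` can't collide with a user `tmp`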


Of course, as in every aspect of understanding, here for source code, it is important first to learn the context, here the macro definitions. My concern is that this will impose more than just a paradigm - it will impose new syntax which could effectively completely ruin the readability of JavaScript source.

The macro syntax is definitely not simple, and it could possibly get really complex for more elaborate syntactic definitions, thus rendering the source much less readable. Are the overall benefits of introducing macros to JavaScript really worth the cost in readability? And does the effort to integrate macros into the browser mean that it'll be possible to evaluate macros "runtime"?


> The macro syntax is definitely not simple, and it could possibly get really complex for more elaborate syntactic definitions, thus rendering the source much less readable.

True, but the same can be said of any API. Reading the definitions can help but for both complex macros and complex APIs built today out of just functions and objects you need to document your abstractions. Macros don't change this, they just give you another abstraction axis (syntactic) to work with.

As with most things it just depends on how you use it. Sure you can abuse macros to make tons of crazy, undocumented, hard to understand syntactic extensions that destroy readability. But you can already do that today. Used wisely macros can increase readability, used poorly they can decrease it.

> And does the effort to integrate macros into the browser mean that it'll be possible to evaluate macros "runtime"?

Not sure what you mean here. By definition macros are expanded at compile time (well, parse time really). The browser doesn't change this.


I meant compile time of course, thank you for your clarifications. I'm eager to see how this turns out.


All abstractions can make your code unreadable; the key, as always, is to create good abstractions and document them clearly. Macros are syntactic abstractions. It's just as important to document them as it is for functions or objects. Of course, if you just have a little local macro that you're using for convenience, looking at the implementation may be sufficient. But when you write a macro that you want to share from a module, rather than requiring your clients to read the implementation, you document the syntax and the semantics, just like you would if you were writing a separate language.

But by having macros directly in JS, instead of having to use a whole language that compiles to JS, you can combine syntactic features from different sources. For example, right now there's no way to use one feature you like from CoffeeScript with another feature you like from TypeScript. You just can't combine them. But with macros, you could import two different syntaxes from two different libraries and use them in the same code.

On top of that, if we actually had macros in a future version of the standard, you wouldn't even have to precompile the macros offline, and you wouldn't need a preprocessing step at all. (For latency purposes, you might want to preprocess macros offline as an optimization. But for development, not having to do a preprocessing step is more convenient.)

Dave


So this is all in the holy name of making JavaScript the assembly language of the web? Making it possible for every JavaScripter to write "his own" JavaScript syntax definitions, meaning that I (as a contributor or just a casual observer) would have to read his whole collection of macros before I could begin to understand the code?

I don't think this can be compared to APIs, as they still follow the regular syntactic definitions - this will be like reading a completely new language every time I read a different repository. (Of course this is a worst-case scenario, as I imagine that many macros will be used across several projects, but still.)

Is all hope of writing vanilla JS gone? And aren't macros kinda going in the opposite direction from the ES specs? There'd be no use for many of the ES6/7 features, as they could just be mocked up in macros.


"So they stick to adding lipstick to the pig. But JavaScript isn't like other languages: its fundamental errors are so glaring, and impact so negatively on the language, that the benefit of jumping to a "JavaScript 2.0" massively outweighs its incompatibility disadvantages."

"Jumping to a JavaScript 2.0" is the hard part. Any successor to JavaScript has to have a compatibility story with all the JavaScript code out there, as well as the DOM APIs and so forth. Either you ship two VMs, in which case each page has two incompatible worlds (yet both can access the DOM -- think about the massive complexity this entails), or you have to think about language versioning, which requires at the very least a functioning static module system.


The biggest challenge with multiple VMs is doing GC over both: cross-VM GC is a current research topic, and nothing has been shown to be doable without a noticeable perf impact.


This presentation focused on work that's more recent. Improving the scoping story with let/const is one of the things that TC39 agreed on relatively early, though:

http://wiki.ecmascript.org/doku.php?id=harmony:let

http://wiki.ecmascript.org/doku.php?id=harmony:const

http://wiki.ecmascript.org/doku.php?id=harmony:block_scoped_...
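
(The classic case these proposals fix, as a quick sketch under the proposed per-iteration `let` semantics:)

    var withVar = [];
    for (var i = 0; i < 3; i++) withVar.push(function () { return i; });
    withVar.map(function (f) { return f(); });  // [3, 3, 3] -- one function-scoped i shared by all

    var withLet = [];
    for (let j = 0; j < 3; j++) withLet.push(function () { return j; });
    withLet.map(function (f) { return f(); });  // [0, 1, 2] -- a fresh block-scoped j per iteration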

Some of the new-function/lambda-syntax proposals even support Tennent's Correspondence Principle to some degree.


We gave up on TCP for => function syntax. I think Java reached the same conclusion.

It's really hard to retrofit TCP onto a C-like language with statements as well as expressions. At best you please only some programmers and confuse others, at a fairly high implementation cost (e.g., return from within a lambda called after its containing function deactivated).

See https://mail.mozilla.org/pipermail/es-discuss/2012-March/021... and especially https://mail.mozilla.org/pipermail/es-discuss/2012-March/021....


Thanks, it's nice to hear it from the horse's mouth.

Does it make sense to think about TCP as being a matter of degree? Would it be correct to say that the number of constructs that would break TCP is fewer with let, const, and => than with earlier versions of ES?

I think a "partial TCP" would matter for manual refactoring, if not for (e.g.) a future macro system. On the other hand, perhaps it's more confusing to mention it if it's not total.


Saw dherman's comment just now, he and you are right that there is a spectrum of TCP, and if you use a subset of JS to write expression-bodied => functions, you get the refactoring property that people associate with TCP.
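
(A minimal sketch of that refactoring property: wrapping an expression in an expression-bodied => changes neither `this` nor `arguments`, unlike function () {}:)

    var xs = [1, 2, 3];
    xs.map(x => x * 2);  // [2, 4, 6]

    var scaler = {
      factor: 10,
      scale: function (ys) { return ys.map(y => y * this.factor); }  // `this` is still scaler
    };
    scaler.scale(xs);    // [10, 20, 30]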


Though the expression version of => comes pretty close to TCP.

Dave


I think you're being too hard on JavaScript there. Type coercion, ASI, and function scoping are all language features, not errors. Likewise, eval is not an error; it's dangerous, but it's also a feature (and one that nearly all dynamic/scripted languages provide). As Crockford said, "JavaScript is the only language people feel like they don't need to learn to use." That statement alone describes why so many developers end up scratching their heads at stuff like `1 == "1"`. That's not an error or an oversight in the language - it's a feature that the developer didn't realize he/she was using. It's hard to make it 10 minutes into ANY educational material on JavaScript without it explaining equality (==) vs identity (===) comparisons.

Lastly, the new class stuff doesn't actually change the inheritance/object models in JavaScript. It's syntactic sugar on top of prototypes, and the "weird constructor stuff" is quite analogous to existing constructor functions. For example, from the [ecmascript wiki](http://wiki.ecmascript.org/doku.php?id=strawman:maximally_mi...): "Class declarations/expressions create a constructor function/prototype pair exactly as for function declarations."
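
(A minimal sketch of that correspondence, going by the maximally-minimal proposal:)

    // The proposed class form...
    class Point {
      constructor(x, y) { this.x = x; this.y = y; }
      norm() { return Math.sqrt(this.x * this.x + this.y * this.y); }
    }

    // ...is sugar for roughly the constructor/prototype pair people already write by hand:
    function PointDesugared(x, y) { this.x = x; this.y = y; }
    PointDesugared.prototype.norm = function () {
      return Math.sqrt(this.x * this.x + this.y * this.y);
    };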

I'm clearly a bit of a fanboi and obviously biased, so take all of this with a grain of salt - but most of these new language features are a good thing. I'm glad to see JavaScript evolving in big ways. It was getting boring watching new editions come out with nothing more interesting than Array.prototype.reduce in them. Especially when you consider server-side contexts like Node.js, stuff like typed arrays, generators, Maps, etc. are welcome additions.


Eval, sure, but the way that type coercion works in JavaScript is an error. Boxing in JavaScript is an error. You can figure them out, you can code around them, but they are bad and can't be justified. Not all coercion is bad, but some is.

Have you seen code that uses .valueOf() for anything good or useful? new String()?
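
(The sort of thing being objected to, for anyone who hasn't run into it:)

    var prim = "abc";
    var boxed = new String("abc");
    typeof prim;    // "string"
    typeof boxed;   // "object"
    prim == boxed;  // true  -- the box is unwrapped via valueOf()
    prim === boxed; // false
    if (new Boolean(false)) { /* runs! a boxed false is an object, and objects are truthy */ }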


Pre-JIT-compiling JS VM days, I did see new String used intentionally to eliminate auto-boxing overhead on every method called on a big string.

For ES4 (after AS3), we tried eliminating boxing. This is overtly incompatible, a web-breaking change. It won't fly.

Java has primitives that can be autoboxed or explicitly boxed too, which is why JS has them. I was future-proofing for "LiveConnect", which shipped in Netscape 3 (Nick Thompson did the integration).

But I was also in a terrible ten day hurry, so found unboxed primitives easier to deal with in the Mocha first implementation.

If I did not have the "make it look like Java" and ten-day marching orders, I like to think I would have resisted the lazy-programmer second reason. But it's all water way under the bridge now.

Implicit coercions, e.g., '' => 0, were influenced by Perl 4. Nuff said!


Sorry, missed the Yoda misquote at the end, I must respond!

Yoda said "where you are" but you used "was". Possibly just a tense-typo, but it matters. Tracking where JS is is a part of the TC39 job, but only a part. It's easy to fall into a trap where we standardize only what developers can express given the language's current semantics and so miss the chance to extend those semantics.

But I'll take the tense-corrected bait: yes, I talk about the future. The past (including its most recent instantaneous sample known as "the present") is greedy. With no one to look ahead or synthesize ideas from compile-to-JS and other languages that win user support, JS will tend to stagnate, all else equal. Champions (not just me) must fight for a better future.

Stagnation on the web is a social ill. It costs developers tons of time. We know this not only from IE6, but from Android 2.x WebKit. Some not-disinterested parties might want to make use of a crisis of this sort, to force a "new JS" (Dart? remember WPF and WPF/E in the past) to be adopted for want of anything to relieve the stagnation.

Not me. The cost is too high, the lessons learned in JS will be half- or three-quarters-lost, and too much code and developer brainprint will be thrown away. I'm reminded of the XML Utopia that was planned in w3.org to save us from bad old HTML, before a few of us called b.s. and started the whatwg.org to do HTML5.

The web is an evolving system, JS is part of the web standards and must evolve too. Skate where the puck will be. Or to wannabe Yoda on ice: where the puck will be, you must skate!


> The web is an evolving system, JS is part of the web standards and must evolve too.

I don't see too many language gizmos in the presentation which reflect the requirements of web standards. Most of them are extensions to a language which desperately needs modification more than extension at this time. Evolution in a language is a matter of where you spend your development resources. &rest-args, weak maps, modules, and so on would be nice to have. But I would gladly sacrifice them to the gods to get rid of JS's global variable issues. It seems to me that ES is mostly building more and more features on top of a foundation of sand rather than taking a breath and revisiting how to reinforce the foundation.

Apple did this recently. OS X 10.6 (Snow Leopard) was an entire release that consisted of almost nothing but cleaning house. Few new features, just heavily revised internals. It's probably the most important release Apple has done in a very long time.

Now one can make the argument that fundamental fixes to long-standing language flaws is a challenging thing to produce given the bulk of development work which relies on the old language. That's a different discussion and one worth having. But moving forward with gizmos simply for the "future"'s sake, without considering the current sad state of the language, is I think misguided. I would strongly urge the committee to take a step back and identify the top twenty most problematic features of the language, and how they might be able to develop a "strict" version of the language which fixes those features, yet retains interoperability with code files written in non-strict form. Then they can go back to adding new gizmos.

(BTW, the "was" is due to the original Empire quote). http://www.imdb.com/character/ch0000015/quotes


Ok, "was" -- but not "you", rather, "he" as in "where he was".

With "me" it's a question of "is". JS is used for purposes far beyond its original design limits. A victory condition and a call to evolution. ES6 is Ecma TC39's attempt to hit that target.


>Things like ==, numbers as strings, eval, incorrect definitions of false, semicolon insertion, and -- heaven help us all -- improper lexical scoping.

==, numbers as strings => Problem: Type Coercion

eval => Problem: Interpreted Language

incorrect definitions of false => Problem: Type Coercion

semicolon insertion => Problem: Language

improper lexical scoping => Problem: Not block scope?

>The class bit particularly made me sad: JavaScript has a perfectly cromulent, even elegant, object model in the form of prototypes.

Both forms of object creation are valid.

[Object.create] http://jsfiddle.net/X4Bxq/

[new] http://jsfiddle.net/UGTga/

Both take advantage of prototypes (the latter takes full advantage today, with all browsers. The former requires a hack where you lose your type information).

What is more important is that they are both awkward. So much so that most "JavaScript" programmers don't even use them. Those that do spend most of their time arguing about which one is more correct. This is a problem. Introducing 'class' would sort this out by providing an easier syntax for class creation and an end to the arguments.
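
(For reference, a minimal side-by-side of the two patterns, so nobody has to open the fiddles:)

    // new + constructor function:
    function Dog(name) { this.name = name; }
    Dog.prototype.speak = function () { return this.name + " says woof"; };
    var rex = new Dog("Rex");
    rex instanceof Dog;  // true -- the "type information" survives

    // Object.create (ES5):
    var dogProto = { speak: function () { return this.name + " says woof"; } };
    var fido = Object.create(dogProto);
    fido.name = "Fido";
    Object.getPrototypeOf(fido) === dogProto;  // true, but there's no constructor to instanceof against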


> The presentation focused on what it perceived as missing features: structs (seriously?), classes, modules, syntactic sugar, macros, etc.

The nature of a widely-used technology is that you can't remove features, you can only add. And yet adding features causes an increase in complexity. So what to do? The answer is to add features judiciously: prefer general features that cover a wide array of use cases and can provide better ways to do things that the existing features don't do or don't do well. (But also avoid over-general features that destroy important invariants -- for example, just say no to call/cc or threads.)

> But the huge gaping holes in Javascript are not missing features. They are fundamental errors in the language. Things like ==, numbers as strings, eval, incorrect definitions of false, semicolon insertion, and -- heaven help us all -- improper lexical scoping.

ES6 -- and potentially down the road, macros -- are paving paths to fix many of these problems you mention, and other important problems besides (e.g., callback hell). Lexical scoping is partially improved with ES5's "use strict" and further improved with ES6 modules. Block scoping finally exists thanks to `let`. ES6's module loaders allow translation hooks to enable dialects or alternative languages to be run in-browser (which you can do with preprocessing and build steps and directly, but using them in-language streamlines the "shift-reloadability" development experience). Module loaders also provide a saner eval, which allows you to completely sandbox the evaluated code. Macros could even allow rebinding operators like `==` to have cleaned-up semantics. We intend to try this out with sweet.js, building something like "restrict mode" (http://restrictmode.org) as a module you can import.
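
(For contrast, the eval behavior a sandboxed, loader-based eval gets rid of: direct eval today runs with full access to the caller's scope:)

    function leaky(code) {
      var secret = 42;
      return eval(code);   // direct eval: the evaluated string sees and can mutate `secret`
    }
    leaky("secret");       // 42
    leaky("secret = 0");   // the string just rewrote the function's own local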

Dave


> We intend to try this out with sweet.js, building something like "restrict mode" (http://restrictmode.org) as a module you can import.

Oh cool!


> NaCl? Not portable.

To dis NaCl on this basis and not even mention PNaCl is dishonest. http://www.chromium.org/nativeclient/pnacl/building-and-test...

> Defined by implementation.

As it should be, until the implementation settles and it's clear what interfaces should be standardized. What a waste of time it would have been to standardize the pre-PNaCl work, for example. I wouldn't expect Rust (for example) to be standardized at this point either.

> No view source.

It's highly unlikely that JavaScript spit out by a code generator (this would be the competition for NaCl) is going to be at all readable. I'm guessing it will be about as readable as the portable BitCode that comprises the PNaCl image (which you could view if you really want to).

It's disappointing to continuously see this anti-NaCl propaganda from Mozilla. Here you have a promising and highly innovative technology that is pushing the bounds of what is possible on the web. It's being developed completely in the open with papers and code being published continuously. Mozilla's mission is "to promote openness, innovation and opportunity on the web." I just can't see what part of that mission involves campaigning against an open technology that could advance the web and help it compete with native apps.

I could understand if their position was "we're reserving judgment until it's portable, stable, and standardized." I could understand "we want to be more involved in the process." I could understand "we are waiting to see if it can demonstrate a compelling advantage over JavaScript." But everything I have heard indicates that they are publicly and completely opposed to ever supporting it, which will make it all the harder for them to ever change their mind on this point without losing face.

I grew up watching Mozilla develop from an unstable binary called "apprunner" into a full-featured open-source browser with cutting edge extension capabilities. I downloaded almost every single milestone and tried it out, craving the day when I could ditch crappy old Netscape 4 for good. I got warm fuzzies when the Mozilla Foundation was created; it felt good knowing that there would always be a way to use the web with open source software, and that there would always be an advocate for openness and freedom. I just never expected to see them fighting against open technology. It's disappointing.


PNaCl is both not done and (last I checked) not totally machine-independent due to LLVM encodings of machine word sizes. See also http://comments.gmane.org/gmane.comp.compilers.llvm.devel/43... for doubts on the wisdom of using LLVM bitcode for a long-lived, widely-distributed object file format.

PNaCl is a fine research project, but unfortunately both NaCl and PNaCl are tied to Pepper, a gargantuan API specified nowhere and implemented only in chromium.org code.

To say this is "Open Technology" is to reduce "Open" to the level of "Big company Big Bucks Open-washing." There is nothing open about an unspecified research project without a proven multi-party governance structure that's dominated from start to finish by Google, and which only Google could afford to staff and push -- including via big-money distribution deals with game developers and distributors.

As I said at Strange Loop and in past talks, don't shoot the messenger: Microsoft and Apple will never adopt NaCl/Pepper. It is a non-starter as a web standard.

Why pray tell should Mozilla fall on Google's sword here? Why should we beg to be involved more "in the process" years after it started? Who are you to say that NaCl/Pepper is better for developers or anyone else than a cross-browser approach targeting JS VMs, which are already there and getting fast enough with typed array memory models to compete with PNaCl? (We aim to demonstrate this.)

NaCl/Pepper looks like an incumbent power's technological folly, similar to Microsoft Active X or Google's Dart-as-a-native-VM. Just because a big company can pay for it does not make it "Open" or "Good" or good for the web.

You've been free with charges of dishonesty, but I'll refrain from drawing conclusions about you from your position except to say that what you write is astoundingly naive -- at best. For anyone building a competitive browser that is not Chrome or chromium-based, what you propose is a money pit in direct and opportunity costs, with no clear path to standardization, where Firefox would always be behind in "Pepper conformance" compared to Chrome. The answer is no.

You'll get the same answer from any other browser vendor not free-riding off of chromium/Google.


There is a wide gap between fully embracing a technology and spreading misinformation about it. I respect Mozilla's decision not to integrate NaCl, to argue that it's premature to talk about doing so while it's underspecified and coupled to Chromium, and to set criteria that it must meet before it will be on the table for further discussion. I can understand concerns about cost and governance and an unwillingness to jump on what is perceived as a "Google treadmill." None of my comment was about any of that.

What I can't understand is the fundamentalist reaction to the very idea of native code, the ignoring/dismissing of serious work to solve the problem of portability, the liberal use of words like "never" and "non-starter," spread of FUD by invoking inaccurate comparisons like ActiveX (vis a vis its security model) and DLL Hell, and the spreading of misinformation. For example, PNaCl is not, and as far as I can tell never has been, dependent on machine word size. The link you cited doesn't apply because it is arguing against a different approach than what PNaCl actually does.

PNaCl works by defining a little-endian ILP32 machine as the target and fixing all important characteristics of this machine independently of the characteristics of the underlying CPU. This abstract machine's characteristics are defined in such a way that they can efficiently be translated to native code on any modern CPU. This is all covered in the introductory doc: http://src.chromium.org/viewvc/native_client/data/site/pnacl...

> a cross-browser approach targeting JS VMs, which are already there and getting fast enough with typed array memory models to compete with PNaCl? (We aim to demonstrate this.)

This is a far more reasonable and compelling story. By all means talk up your stuff and argue that you can win in the free market of ideas. I'm not arguing that I or anybody else should be able to dictate to developers what technology they use; on the contrary it is the Mozilla argument of "no one gets to the machine except through our VM and our GC" that paternalistically ties developers' hands and limits their options.


You are still being free with accusations of spreading misinformation and other evils. If you want to have a real exchange, cool it! Just try to imagine how a hardball from me casting aspersions on you for suspected bad or unfair (to Mozilla; "fair" to Google) motives might feel.

Thanks for the PNaCl pointer. My comment was based on LLVM bitcode having machine word size dependencies. This was an issue a while ago. I should have checked to see if it remained one.

This correction doesn't alter the general unreadiness of PNaCl for the web, on several fronts. Pepper is one, but PNaCl performance lagging NaCl is another. The Chrome Web Store features games ported via NaCl, for performance -- not PNaCl, which would be significantly slower. On this basis alone, it's premature for you to push PNaCl ahead of Google.

> This is a far more reasonable and compelling story.

Well, gee, thanks a ton! :-|

I've been telling this story clearly since Fluent in May. That you chose not to hear it and instead flung accusations and told sob-stories about big bad Mozilla is your doing, not mine.

Here's a final clue: all browser vendors, definitely including Chrome, make the rule (not an argument) "no one gets to the machine except through our VM(s) and GC(s)" -- outside of a few dying plugins, which are even source-licensed and co-released.

And that brings back my final point: NaCl is for safer plugins, which are OS-specific anyway. The likeliest evolution of SFI or CFI enforcing compilers and runtimes as plugin hosts is via the OS, not the browser. Write a letter to Microsoft and Apple, not to Mozilla!


If I sound argumentative and fired up, it's because I feel like Mozilla has been casting stones on this issue for years. Imagine how you would feel if Google executives were publicly criticizing Mozilla efforts like Persona, arguing that they would never support them and no one else will either, basing their criticisms on issues that you are actively fixing.

(For what it's worth, Persona looks promising to me personally, and I also like Rust very much, a lot more than Go. I say this to demonstrate that I'm not just a Google partisan and that I admire a lot of what comes from Mozilla).

I am much happier to discuss this dispassionately on a technical basis. I'm much happier if I don't have to argue against what to me are very unfair accusations, like being as proprietary as Silverlight.

> Here's a final clue: all browser vendors, definitely including Chrome, make the rule (not an argument) "no one gets to the machine except through our VM(s) and GC(s)"

I don't understand the argument you are making: (P)NaCl are specifically designed to allow execution of untrusted code without making it run on top of a VM or GC. And (P)NaCl executables are OS-independent. I don't understand what you're getting at here.


>(P)NaCl are specifically designed to allow execution of untrusted code without making it run on top of a VM or GC

And this is the argument he's making: that does not fly with browser vendors. They DON'T want to have code run OUTSIDE their VM/GC.


Thanks for the support, but that's not what I meant. NaCl + Pepper is like a VM where the compiler does the heavy lifting so the native code can run safely (Software Fault Isolation, SFI -- wild pointers lead to a safe non-exploitable crash), rather than a JITting or MMU- or hypervisor-based VM doing the heavy lifting at runtime.

It's quite clever, but still enough of a new thing that Chrome also sandboxes NaCl'ed code out of process. Belt and braces are good. No silver bullets.

But a VM is as a VM does. This is part of Google's VM-set and not any other browsers. The rule still applies.

Truly unsafe native code in plugins (e.g., un-NaCl'ed Flash) runs out of process too, and sandboxed to some extent, but it can cause problems that are not contained (and did at the last CanSecWest Pwn2Own contest, IIRC).


He said "including Chrome." Chrome supports NaCl. This does not compute.


It's not hard: Chrome sandboxes NaCl'ed code and links it against a runtime, Pepper. That's a "VM" by any other name.


These are shallow arguments:

* Pepper is "inspired" by NPAPI; clarify your point.

* PNaCl performance lagging isn't a solid argument: you know it'll get better, and the solution might even be to cut LLVM out save for bitcode.

* "Nobody does this at the moment," so why does it belong in the OS?


In reverse order:

* Why in the OS? I didn't say "belong", just "likelier". That is because plugins are native code compiled by OS-dependent toolchains, and OS vendors are few (three that matter) and lock up native code these days via SDK licenses, app store rules, and even kernel-level restrictions.

In contrast, there are four or five competitive browsers, only one of which has Pepper and the rest do not -- and will not.

* I do not know how much better PNaCl can get. The shallow argument here is your assertion that "you know it'll get better". The same could be speculated about JS performance at Emscripten-generated code, and that works cross-browser. That's the cross-browser path of least resistance, compared to the practically unpassable Pepper barrier.

* Pepper is "inspired" by lots of APIs, but here the shallow shoe fits your new-HN-user drive-by. NPAPI is a sunk cost all browsers save modern IE have paid out for years. Pepper is new and much bigger. Have you even read all the interfaces?

The bottom line is that whatever PNaCl performance wins may lie in the future -- and I will believe them when Google does as shown by Chrome Web Store games being PNaCl'ed not NaCl'ed -- Pepper is the blocker for any cross-browser adoption in reality.

This ignores principled objections to more native code on the web, as a "social ill". Let's take that up separately, because it could override any technical argument. I'm happy to stop on the Pepper point for now, since Google manifestly is stuck there.


Why differentiate plugins? What makes a VM with JIT not a plugin save the browser vendor shipping it with the browser?

Why wouldn't other browsers have Pepper?

Compilers are as good as what they've been tuned for. In my view PNaCl's shortcoming is startup time because it lacks a JIT and LLVM's back end is too slow for now. Speed up the backend or JIT code and you'll get close to GCC performance while being portable and somewhat language agnostic.

Yes I have seen pepper, and most of the interface relates to the GPU. How is sunk cost better, when a big part of the API can be backed by what canvas relies on?

You would consider adopting PNaCl and pepper in FF if there were games that targeted them? If the code were contributed to Mozilla?

What do you mean by "more native code"? Can't view source?

I appreciate the answers.


>Why differentiate plugins? What makes a VM with JIT not a plugin save the browser vendor shipping it with the browser?

I think you answered your own question with the "save" part.

The vendor shipping it with the browser means it controls it, it has responsibility for it, it secures it, and it allows it. End of story.

>Why wouldn't other browsers have Pepper?

JS is a necessity for a web browser/vendor, and is already present in it. Pepper is not, and there are NO signs it will be. Do you see any movement towards adoption as of now? I see the opposite, the abandonment of even old style plugins.


I'll rephrase: why is any new VM to be relegated to the OS? The presence of incumbent VMs?

JS is an incumbent. Pepper is similar to NPAPI, and has nice features which are compatible with HTML5's implementation (as in canvas). Saying it shouldn't be adopted because nobody is adopting it is circular.

What are old-style plugins? Anything not JS?


I said clearly why Pepper is not being adopted: it is a gigantic pile of API and implementation specified only by the C++ in chromium.org svn. Other browsers cannot port all that OS and WebKit glue code except at very high cost, direct and opportunity -- and even then on a bet that Pepper + NaCl wins, and again on a treadmill far behind Chrome.

Do you actually work on a browser codebase? If so, have you worked on competing browsers' codebases at all? Do you begin to see the problem? It's not quite Active X (open source is a small help), but it's on that slope and uphill only a bit.


> Other browsers cannot port all that OS and WebKit glue code except at very high cost

Why would any other browser need that glue code? The Pepper API is large but fairly straightforward and doesn't change dramatically between revisions. In addition, I don't believe Google has ever said that they wouldn't make the development process around those changes more open (at least making them public before pushing the new implementation out to the world).

> Do you actually work on a browser codebase? If so, have you worked on competing browsers' codebases at all? Do you begin to see the problem? It's not quite Active X (open source is a small help), but it's on that slope and uphill only a bit.

I've only worked on Webkit a small amount (mainly doing security analysis) but I worked with Pepper a good deal and I've worked on Gecko for a decent while now. I really don't see the incompatibility; there are plenty of good arguments against NaCl, but I don't think there's a fundamental problem there. I can definitely understand not wanting to allocate resources to the issue, but not being opposed to the issue in general.


> Why would any other browser need that glue code?

Because other browsers do not use WebKit, or at least chromium WebKit. Are you really asserting that no glue code is required on any other browser?

> The Pepper API is large but fairly straightforward

Where is the spec? You are not in the real world here.

There are plenty of differences between Gecko's audio APIs and Pepper's. If you really work on Gecko, mail me about this. I have reason to doubt your claims here.


The link he cited still does apply. It discusses several different issues. PNaCl's portability only covers a subset of them.


Thanks! I happen to agree with Dan Gohman (http://comments.gmane.org/gmane.comp.compilers.llvm.devel/43...), but I'm not sure where Chris Lattner ended up on this.

Much is possible in software, so perhaps some day, or under some transformation, LLVM bitcode would be suitable as a stable long-term object file format.

There's still a point here: PNaCl is pushing a stone up a very tall hill. ANDF and other Universal Object formats go back to the 70s if not earlier. It's very difficult to standardize such things, never mind Pepper.


FWIW, the aim for LLVM is to avoid breaking the bitcode format now that 3.0 has shipped -- not that it's platform-independent or anything else yet.


Work with, say, the Khronos Group to establish an OpenCPU standard with a source code and possibly an intermediate representation.

Socialize amongst CPU vendors, and interest platform makers in the mobile and desktop space.

Watch it absorbed by web standards.


I don't understand why you feel it necessary to make your points in such an inflammatory manner. Your arguments are well made, why do you feel the need to, for example, call someone 'astoundingly naive?' Being rude doesn't make your points more convincing and I would have hoped you were above that kind of thing. It's a pity because you have a huge amount to contribute.


I went out of my way to say that Haberman's position as I understood it -- not he himself -- was "astoundingly naive". This after he called me dishonest and speculated on motives. Are you using the same yardstick with me as with him? I think not.

Arguing about motives is a form of the _ad hominem_ fallacy, and I was avoiding it, in contrast to my fine counterpart. Yeesh!


Oh come on. My label of "dishonest" was in regards to a statement, not you personally, just like your label of "astoundingly naive" against me.

And I didn't speculate about motives. I'm not sure what statements of mine you're taking so much offense to, but your speech has also been brusque and uncharitable at times ("Who are you to say...", "Here's a final clue:").

I also went out of my way to empathize with Mozilla's concerns and reasoning for not wanting to support NaCl, whereas you show no appreciation for why someone might ever legitimately want to run native code on the web.


You wrote, very first comment at top:

"To dis NaCl on this basis and not even mention PNaCl is dishonest."

That was in response to my slides. You were calling me dishonest. Come on yourself!

You then went on about "propaganda" and scary salt crystals. Something is off right there. Mozilla doesn't make propaganda and we have a tiny fraction of Google's budget (which I can assure you has been deployed commercially to push NaCl).

I don't think your tone or content are balanced on any of this, and you at least climbed down on the salt crystals. Can you do likewise on the "dishonest"?


You seem a lot more interested in getting me to take back things than you are in taking back your misleading slide.

Substitute whatever adjectives you want if the ones I used offend you, but the point still remains that the most vocal criticism of (P)NaCl comes from Mozilla and it is anything but "balanced."

I would feel more inclined to issue an actual retraction if there was any indication that I was mistaken about this or that it would change.

That said I'm not really interested in arguing further, since we've clearly reached an impasse. I admire the work you have done with JavaScript, and I admire the work Mozilla has done over the years on many great products.


The slide I showed is not misleading. NaCl is not portable, PNaCl is still not ready for prime time based on Google's own actions, and you protest too much and do not practice what you preach.

"Mozilla" meaning me, bzbarsky, blizzard (previously), roc on the plugin-futures list, others have been forthright compared to the mostly-silent other browser vendors, who haven't even spoken via corporate or individual channels on this non-issue apart from my pal Maciej at Apple coining "Active G" to refer to Pepper.

If this circumstance makes you shoot us, the messengers, you need to read more Greek tragedy!

We're telling you why NaCl/Pepper are a no-sale among all the non-chromium browsers. You don't like the reasons we give, but that's no justification for your ascribing to us bad motives or a dishonest agenda or techniques ("propaganda"). We have been perfectly clear about the unacceptably high cost of Pepper, and the single-company control problem of all of NaCl/PNaCl/Pepper.

Your own misstatements are yours, and you should retract or not based on their righteous or wrongful nature, not on what anyone else does. That you excuse your conduct based on your grievance with us is thoroughly broken, as a piece of moral reasoning.

At this point you are perfectly clear: you want a free lunch (from all browsers, but especially from Firefox), we won't give it to you, so you call us names and imply that we act out of bad motives. That makes you persona non grata in my book. Good luck!


> At this point you are perfectly clear: you want a free lunch

Nope, I wanted a footnote that says "they're working on it." That is about 97% of what I wanted from this discussion. ES6 isn't "ready for prime time" either, being an unfinished spec, and yet the entire presentation was about that. You're comparing JavaScript's future with NaCl's present, and not mentioning that it is Google's stated purpose to remove the glaring limitation that is the basis of your discounting it as a technology (at least as far as that slide is concerned). How is that not misleading?

I have never once suggested, in a single one of my messages on this thread, that non-Chromium browsers should adopt NaCl/Pepper in their current form (in fact I have said exactly the opposite, that I can understand their reasons for not wanting to), and yet you continue to attribute this viewpoint to me, calling me "astoundingly naive" for it, and issuing no retraction for that (despite all the retractions you demand from me). How is this the moral high ground?

I would be happy (more than happy, actually) to completely retract my statement, since it certainly gives me no pleasure to think that you would mislead your audience, but you are declining to demonstrate that I was wrong or that you have an interest in being entirely forthcoming with your audience. By pressuring me to retract my statements while feeling free to say what you want I feel you are bullying me. I'm not a huge fan of how you are ascribing inaccurate viewpoints to me and calling me names for them, and yet I am not demanding that you retract it all or I will discount your existence as a person (incredibly harsh, by the way).


I've expressed my personal opinions about NaCl here on HN before too: http://news.ycombinator.com/item?id=2057611


For what it's worth, I think the comment you linked is far more fair and balanced criticism. If all of the statements I was hearing from Mozilla people sounded like that, I would have no beef.


I have a different writing style from Brendan, but I don't think there is any significant substantive difference between what he said in this thread and what I said almost 2 years ago.

(Although, the fact that PNaCl is still an experiment and not the mainstream of NaCl nearly two years after I wrote my comment should be further cause for concern.)


The significant substantive differences, to me as a reader, are:

- You don't make final-sounding judgments like "never" or "non-starter" that preemptively reject any future evolution of the technology.

- Your criticisms are highly pragmatic and specific, such that it is clear what hurdles the technology would have to clear to address them, and you don't close the door to the idea that they could (even if it seems unlikely to you).

- You don't fall back on ideological arguments like native code as a "social ill" that would suggest that your true objections run deeper than what any technical improvements could possibly address.

Thank you for that.


Wow, I didn't notice that the thread continued for so long after I posted.

The things you cite are pretty much all because I would never give 0% probability to a future event. Who knows? Things change. But I think it is quite unlikely that NaCl will become a widely accepted part of the Web platform, and I think that would be a bad thing in its current state.

My "highly pragmatic and specific" criticisms seem to me like they say the same thing as Brendan's original slide bullets, just with more detail. I did not mention "no view source", but I agree that is a significant downside, if not necessarily as much of a showstopper as the others. Being a single-vendor-controlled technology is the biggest showstopper.

Another big issue that I didn't mention, and which I think also aligns with Brendan's criticisms, is that adding a major new technology to the web platform requires tremendously compelling use cases and a good argument that they cannot be handled with existing technologies. I don't think that case has really been made for NaCl.

And yes, I did jokingly coin the term "ActiveG" to refer to NaCl. Though I believe it was another wag who later referred to Dart as "GBScript".


Oh hey, I just saw this now.

For what it's worth I thought you were from Mozilla when I wrote my post, not that it matters that much either way.

I think you made some substantial points that were not covered in Brendan's slides, specifically:

- A standard with only one implementation is de facto controlled by one entity. This is a great point, and different than Brendan's point "defined by implementation." Brendan's criticism would be solved simply by standardizing (P)NaCl under multi-party governance, which I fully expect Google will do at some point. [0] Your criticism is not solved unless it is actually practically feasible for someone else to implement that standard.

- Relying on binary-level validation of binary code has a lot of attack surface. This is a great point that I've seen others make, though I believe it is being addressed (perhaps since you wrote your message) by having multiple layers of defense (ie. also running in a separate process inside a ptrace sandbox).

It doesn't bother me that you joke around with your friends by calling it "ActiveG," because in the context of serious discussion you acknowledge that it has "a better attempt at security design than ActiveX." It does bother me when others seriously compare the two, as if a completely unsandboxed execution environment can be compared to a serious attempt at sandboxing.

In any case, now that it is supposed to be shipping soon (http://news.cnet.com/8301-1023_3-57534803-93/google-offers-l...) we should get a better chance to see if it truly can demonstrate a compelling improvement over JavaScript.

[0] Just wanted to mention that though I work for Google I am not involved in (P)NaCl and have no inside information about it.


The purpose of my "final-sounding judgments" is to cut the crap in the short term that keeps getting dumped on Mozilla by you and others with an enlarged sense of entitlement, and an unjustified assumption that NaCl/PNaCl + Pepper is somehow "better" in the large, and so should get free support from non-Chrome browsers.

If you think overcoming some hurdles enumerated by Maciej will get PNaCl and Pepper support into Safari (where, BTW, Pepper is closest to porting cleanly by any measure, e.g., patch size), you're dreaming. Remember, I did not coin the funny "Active G" phrase. That shoe was fit by the guy you're trying to suck up to here.

If an indefinite-future-tense, and therefore worthless, promise from me to be open to portable SuperPNaCl in 2020 will get you to stop bashing Mozilla and flattering Apple, then here it is. Indeed my plan is for JS to evolve and mutate to be that portable object format for native code to compile to.

Emscripten is a promising sign along this path. And the PNaCl folks are targeting JS too, I hear, so this looks like a common goal.

Now can you stop defaming Mozilla?

/be


Now can you start ignoring haberman like you said you would? Your belligerent comments are getting a bit old at this point. You hate NaCl. We get it. Criticizing a slide makes you upset. OK. Understood.


I always reply when I have something worth saying.

"Hate" is the wrong word. There's nothing to hate in a well-done SFI-enforcing compiler. I admire the R&D effort. We -- bad old Mozilla, including evil-me -- are seriously considering using NaCl (not Pepper) for SFI in native code parts of Servo. No "hate" here, so do try to grow up: "You hate NaCl" is just weak, beneath you, as an analysis of motives and causes.

Let's step away from NaCl as pure tech for safer compilation of unsafe languages.

I do call Google's business strategy pushing NaCl+Pepper via games and even (till it all fell apart last November) Flash heavy-handed, and one-sided to such an extent that what's pushed can never be standardized.

What's belligerent here is for someone (haberman), who I hear works for Google, a company that has all the money and power in the world, and which makes actual "pro-NaCl propaganda" as well as pays game developers and others to use the Chrome-only tech (NaCl, not PNaCl, note well), to fire first on this thread. It wasn't me who showed "belligerence".

That first shot even tried open-washing, lamely, and it tellingly put Mozilla in the subordinate position. ''I could understand "we want to be more involved in the process."''

If you don't work for Google or hold a brief for them, ask yourself why haberman's presumption is that Mozilla, or any other browser maker, should be asking or begging to be included in a process that we were excluded from and practically speaking could never participate in without equal big bucks and market power.

The great thing about the Web is that no one owns it. Browser vendors, tool makers, and developers (whom the first two groups avidly court) have to reach consensus via standards. And I find that developers, not just pure web devs, definitely including @kripken (Emscripten's creator), @jashkenas (CoffeeScript), and others, are more creative and faster moving than many highly-paid C++-first hackers, including those perpetrating single-vendor follies such as NaCl+Pepper.

Whine at me, cry "hate" and "belligerence" while turning a blind eye to the big gun who fired first, and shoot the messenger. It doesn't matter. NaCl+Pepper have already lost to JS for portable cross-browser high-performance native code compilation.


That's why you have been supporting Emscripten with... just one developer, its creator? Shouldn't there be a whole team working towards this goal you envision?

Currently, Emscripten should have as many people working on it as three.js has.


There is a team working on the longer-term goal, including game platform, JS engine, and Mozilla Research people. Why did you assume otherwise?

Also, Emscripten has a strong github community. Unlike Google we can't afford to pay everyone who might be needed -- we also prefer not to if we can build a wider community from day one.


Ah I see I should have said "Google" where I said "NaCl". My mistake.


You're doing better! But no, I don't "hate" Google. Big companies and big groups of people in general have inherent morally failure-prone properties. Google fights these, and in many ways still manages "don't be evil".

Heavy-handed and one-sided strategies and tactics may be odious, I may "hate" them -- you should too if they're not well-justified and likely to prevail -- but that's not the point of this exchange, which haberman started. Nice try deflecting, though.

The point of my slide, and of my comments here, is to make the case for what's best for the Web. So let's get back to that.

What is best for the Web? Not NaCl, we all agree. PNaCl? Not with Pepper as a mandatory API for all browsers to adopt. And PNaCl has a JS back end.

Steve Jobs killed Flash. Plugins are in decline. However well Google, Mozilla, and others use NaCl for native code safety, on the Web JS looks highly likely to continue, and to get super-fast for the well-typed and GC-free code produced by Emscripten.

This all points to a future where evolved JS is the PNaCl format that works cross-browser. We're already working on this at Mozilla, but via Emscripten not PNaCl. If Google aims its formidable engineers at the same goal and works in the standards bodies early and fairly, great. I'd love that.


Do you think the current status of Emscripten justifies your words? We'd all like to believe that.


The challenge now is more on the JS VM side, optimizing the Emscripten-generated idioms and the typed array memory model. Also, longer-term, we're working on JS language evolution via Ecma TC39.

These are focus areas of the team I just mentioned.
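
(The flavor of code in question, as a hand-written sketch rather than actual Emscripten output: one big typed-array "heap" plus |0 coercions that let the JIT treat everything as int32:)

    var HEAP32 = new Int32Array(1 << 16);   // the whole "address space" is one typed array
    function sum(ptr, n) {                  // ptr is an index into HEAP32 in this toy
      ptr = ptr | 0; n = n | 0;
      var total = 0, i = 0;
      for (i = 0; (i | 0) < (n | 0); i = (i + 1) | 0) {
        total = (total + (HEAP32[(ptr + i) | 0] | 0)) | 0;
      }
      return total | 0;
    }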


Well we agree. My answer is not to justify PNaCl, but to let you know that when you say "we commit to this", it means giving some serious investment to Emscripten and to building a community. So far Emscripten has only been mentioned in 1-2 slides of each JS talk. And I think we both agree that, serious project as it is, it needs a few people posting updates and letting everyone know about the great work being done by @kripken.

Thanks though for your answer.


Maciej and I agree, but you are holding Mozilla to a different standard from Apple. Another reason I'm striving to ignore you (for both our sakes).


Please stop signing your posts.


Why?


House style: http://ycombinator.com/newsguidelines.html

Please don't sign comments, especially with your url. They're already signed with your username. If other users want to learn more about you, they can click on it to see your profile.


Ok, sorry about violating house style. Old habits die hard.


Does PNaCl work? Do you know the answer to that question?


I have not personally used it, but the documentation at http://www.chromium.org/nativeclient/pnacl/building-and-test... indicates that it is at least capable of running spec2k. I don't know what's complete and what is incomplete. I do know that it is the stated goal of the NaCl project to achieve portability through PNaCl; that alone makes it deserving of mention in this context (https://developers.google.com/native-client/overview#distrib...).


Taking your points one at a time:

1) If PNaCl ever happens and is not directly tied to Chrome's internals (which it is at the moment), the discussion can be revisited.

2) Google is opposed to anyone else being more involved in the process. Other people have tried.

3) Mozilla has no particular concerns about "losing face" if a pragmatic decision is needed. We're a lot more worried about consequences for users and the web than we are about our egos or "face".

4) Calling "NaCL" an "open technology" is about on par with calling Silverlight an "open technology", for what it's worth. Granted, the source is open, but again it's tied to various Chrome-specific stuff that is underspecified and would be incredibly difficult to integrate into any other browser.

Basically, as far as I can tell your argument comes down to saying that Mozilla should be open to implement PNaCl (not NaCl), if it were being developed completely differently and had different goals. We might be, if that counterfactual held. But it sure doesn't, and I don't see any hope of it holding. If that ever _does_ happen, we can revisit this discussion, of course.


> Calling "NaCL" an "open technology" is about on par with calling Silverlight an "open technology", for what it's worth.

Silverlight is closed-source, patent-encumbered, and released by a company with a history of "embrace, extend, extinguish." That someone who appears to be speaking for Mozilla would draw this comparison is, again, disappointing.

> If PNaCl ever happens and is not directly tied to Chrome's internals (which it is at the moment), the discussion can be revisited.

This claim is directly at odds with the public statements of Mozilla's Chris Blizzard, who argues against the very idea of native code delivery to browsers. His arguments aren't against the NaCl implementation, process, etc, they are fundamental arguments against native code in general: http://www.theregister.co.uk/2011/09/12/google_native_client...

> Basically, as far as I can tell your argument comes down to saying that Mozilla should be open to implement PNaCl (not NaCl)

I'm not even hoping for that at the moment, at this stage I'm only hoping for them to stop maligning it publicly, like Chris Blizzard saying it will lead to DLL hell, or like with Brendan's slide that desaturates a picture of salt as if (P)NaCl is going to come for your children in the night.


> and released by a company with a history of "embrace, extend, extinguish."

It's worth considering technology on its merits, not just based on past behavior of companies. In recent years, Microsoft has been much more of a team player in the web space than Google has, for what it's worth.

That said, I didn't claim NaCL was in all respects identical to Silverlight. I said it was comparable. It's more open in some ways (open source), less in others (e.g. no independent reimplementations, and precious little chance of any as things stand). The provenance is equally unpalatable, from my point of view; Google may not be aiming for "extinguish", not least because that's not very likely with the web at this point, but it's certainly aiming for "embrace, extend, coopt", which is not much better.

> That someone who appears to be speaking for Mozilla

In general, people who work on Mozilla speak for themselves. The cases when they're speaking for "Mozilla" are very rare and always marked as such. In this instance, I'm speaking for myself.

> This claim is directly at odds with the public statements of Mozilla's Chris Blizzard

Chris and I don't always agree on everything. But some of his arguments are certainly valid. I didn't say I'd adopt PNaCl with open arms; just that the discussion should be revisited. As long as we're talking about things that are hardware-dependent, there's just no point having the discussion at all.

> at this stage I'm only hoping for them to stop maligning it publicly

What you view as "maligning" someone else may view as an attempt to keep Google from pushing hardware-dependent code as part of the web platform, which is what they're trying to do. All a matter of perspective, I suppose.


Was the salt crystal image scary? Boo hoo!

It was from Dave Herman, and it was not intended to be scary at all. It's appealing to physics and chemistry nerds. Salt has had a bad rap, to borrow from Montgomery Burns on eggs.

This is descending into silly-season political talk. Google chose the NaCl + Pepper pun. They can take the scary images, if those images truly are scary.

BTW, Chris Blizzard works for Facebook now.

Back to more serious topics...

UPDATE: to be fair to blizzard, he was objecting (as bz reminds me) on behalf of Mozilla to paving the web with x86 or other machine-dependent code, however compiled. Mozilla still opposes more such machine-specific plugin code. Think of NaCl as a safer plugin compiler, nothing more. We believe the Web should not need plugins to fill decade-long gaps from the '90s that real "coopetition" among browsers in standards bodies can fill much sooner, without the problems that plugins bring.


So this is why my ears felt warm this morning.

Yeah, even though I don't work at Mozilla anymore, I still think {P}NaCl is a bad idea for the web as a whole (and its users, by effect) and also for Mozilla (as one of its most important stewards.)

Carry on.


> or like with Brendan's slide that desaturates a picture of salt as if (P)NaCl is going to come for your children in the night.

Really now. I created that slide, taking the picture from Google's own NaCl web site:

https://developers.google.com/native-client/

I zoomed it in because I thought it looked pretty. Let's all just take a deep breath.

Dave


Mea culpa on that point (the image). I drew the wrong conclusions about its intent.


As it should be, until the implementation settles and it's clear what interfaces should be standardized.

I believe Brendan Eich is suggesting that the preferred process is for a technology to be defined by a draft of a spec, and have multiple implementations' interfaces settle down before being standardized, rather than having one definitive implementation that unilaterally determines what is settled down and what isn't.

That seems quite reasonable to me. When such a thing is possible, doesn't it sound like it would lead to a better spec for a better technology?

> It's disappointing to continuously see this anti-NaCl propaganda from Mozilla.

Could you elaborate on all the anti-NaCl propaganda you're continuously seeing? Is it actually Mozilla's propaganda, or Brendan Eich's personal opinion? Personally, I thought that at least in these slides, he was quite balanced in presenting both the pros and the cons of a technology directly aimed at taking the spotlight from his baby (by being used instead of JS for high-performance browser games and stuff).


>I believe Brendan Eich is suggesting that the preferred process is for a technology to be defined by a draft of a spec, and have multiple implementations' interfaces settle down before being standardized, rather than having one definitive implementation that unilaterally determines what is settled down and what isn't.

As in, Mozilla would implement a common core of NaCL features, perhaps omitting some of the less-core ones that Chrome has, and adding their own extensions, and they could see what was good and what was bad, and experimental features could gradually become part of the standard? Sure, sounds good.

But for Google to try to define a standard when they have only one immature implementation would just be stupid.


I've seen it too from multiple people, and I don't think it's hyperbole to call it propaganda. We had a representative from the Mozilla Foundation speaking at our local JS conference, and I asked him if the situation with NaCl w.r.t. Firefox was because no one had implemented it or because Mozilla is "morally" against it. He answered that Mozilla is categorically against the idea of NaCl.


Mozilla is categorically against the idea of NaCl, yes. Because it's tied to particular hardware.

PNaCl, if it ever happens, will be a separate discussion.


The "no view source" argument is pretty weak. Most major websites nowadays serve you unreadable compressed JS soup already. Personally, I'd love to have a cross-browser bytecode alternative to Javascript, preferably running in a VM accessible outside of a web browser.


A lot of people say they want a bytecode for the web. The problem is, bytecode vs text is just an encoding issue, and it's practically insignificant. It's a platform either way, and the real issues are what services the platform will provide.

For example, NaCl doesn't provide garbage collection (GC), last I checked. Applications can link in GC libraries themselves; however, that's more code for clients to download, and it may mean that the GC can't do all the low-level things that modern GCs do to get good performance. Is this an acceptable tradeoff for the web? I believe questions like this one are the important ones.


Wanting bytecode has nothing to do with bytecode being better; people want bytecode because they want their language of choice to be a first-class citizen, and JavaScript-as-assembly means their language will always be second class.


I've never seen a bytecode that wasn't just a straightforward source-to-source translation of some language. JVM bytecode is pretty much just another way to encode Java (invokedynamic notwithstanding). .NET bytecode is pretty much just another way to write C#.

Bytecode really is nothing more than just a compressed source encoding of some language. It doesn't magically result in a VM that can efficiently encode all semantics of all programming languages, and I don't believe such a VM can exist anyway.


> Bytecode really is nothing more than just a compressed source encoding of some language.

I was with you until you said "compressed".

There are plenty of examples where the bytecode representation is bigger than the original source files.


Not sure about the JVM, but .NET compiles to an intermediate language called CIL before creating the bytecode. I agree with you, but to bytecode advocates, this is an emotional issue. They do not want JavaScript to be the first-class language and their pet language to be "held back" by it.


What about LLVM? It seems able to deal with a fairly large number of languages with fairly different paradigms (including Rust :) )


That's similar to saying that x86 asm is able to deal with a large number of paradigms. LLVM IR might be a viable target to avoid writing your own native code generators, but that does not a universal bytecode VM for dynamic languages make.

It sure would have been nice if Parrot had succeeded there...


Languages targeting bytecode platforms are just as liable to be second-class citizens as ones targeting JavaScript.

Scala and Clojure don't support proper tail calls. Why? Because the JVM is the Java Virtual Machine! I believe there were similar issues for dynamic languages on the JVM for the longest time as well.

And now just imagine implementing a Haskell->JVM compiler. I wouldn't be surprised if that's actually more difficult than implementing a Haskell->JavaScript compiler.

So yes: with JavaScript as the target, other languages are second-class citizens. But with some bytecode, that probably won't change much. Knowing how these things get designed, the end result could easily be that every language including JavaScript becomes a second-class citizen.


It's rather easy to make minimized JavaScript readable again, though. For example, Chrome's Web Developer Tools have such a feature built in.


Readable is a bit of a stretch when the JS is thousands of lines of files concatenated together with variable renaming. Google Closure Compiler also does function inlining among other optimizations; it's not like you can do view source in Gmail and expect to get anywhere without a lot of reverse-engineering effort.


Rust is never going to be part of the Web. Content will never be able to execute Rust code. If Mozilla had been proposing to integrate Rust as a potential client-side scripting language (which wouldn't happen to begin with), the process would have been totally different.


That's a good thing. Rust is THE alternative to C++, and it should not lose focus by compiling to JavaScript. Rust should not compile to JavaScript; it should remain native (LLVM).

@pcwalton Please assure me that Rust won't compile to JavaScript, and that it will remain native (C/C++ compatible, ahead-of-time). Rust is my last haven; let it remain Rust only!


Well, we aren't going to stop, or even discourage, people from writing Rust-to-JS compilers, but we won't be working on such a thing...


Hey, thanks...

I believe anything can reach very high, granted it doesn't lose its aim/target... JavaScript as a target is interestingly beautiful, but this might open the way for yet another Dart. Rust should remain what it's meant for.


LLVM doesn't mean native: LLVM means whatever there is an LLVM backend written for, and this includes higher-level things such as JS.


On a side note, has there been any experimentation with Rust on NaCl or with Emscripten? I'd love to write some web demos with it.


Rust can't run without its runtime yet, which means the runtime needs to be wrapped into JavaScript code on the Emscripten side. The runtime needs some important parallelism and memory functionality, and last I heard most of it isn't easy to implement in JavaScript (workers, typed arrays, and other HTML5 goodies probably bring it closer to being feasible).

NaCl is a more realistic target - in theory it should be as simple as making the compiler use NaCl's GCC instead of the system's C compiler. However, I don't know how strict the sandboxing is; there might be certain things that the generated Rust code isn't allowed to do.


See Xax for another approach.

The issue here is that NaCl and Xax are proprietary approaches done inside companies, coupled very tightly to existing ISAs, and not developed entirely in the open.

They are both brittle with their own particular weaknesses - NaCl is weak at handling dynamically generated code, and Xax has safety issues (there are user mode instruction sequences that can freeze some x86).

PCC may be clever, but the fact remains that less effort would be required to add sandboxing of processes and peons and yield higher performance.

PNaCl from what I recall relies on LLVM which is yet to demonstrate low latency code generation performance - in contrast to older more mature code generators such as TAOS, LuaJit.

Don't get me wrong, I am in favour of a WebCPU component for low latency, near native code emission and execution. I just don't think those with the pedigree are doing the work.


> The issue here is that NaCl and Xax are proprietary approaches

If you had told me in the 90s when Microsoft was king that someday "proprietary" would be used to describe a completely open-source, documented, published technology that its creator encourages others to adopt, I would have laughed and said it's impossible.

> NaCl is weak at handling dynamically generated code

They got it working for x86 (http://static.googleusercontent.com/external_content/untrust...) and it's currently just not prioritized for PNaCl AFAIK.

> PNaCl from what I recall relies on LLVM which is yet to demonstrate low latency code generation performance

Sounds like the kind of problem that can be addressed once other much bigger problems are solved. It's not like any such problem would be fundamental or insurmountable.

> in contrast to older more mature code generators such as TAOS, LuaJit.

I don't know about TAOS, but LuaJIT 2.0 (with its completely rewritten code generator) is only 5 years old compared to LLVM's 12 years. Even LuaJIT 1 is only 7-8 years old.


"proprietary" is a spectrum, not a binary decision.

You can be open-source, documented (though NaCl is not so documented in practice because of the Pepper dependencies), published, and encouraged to adopt, but if your development is controlled completely by a single company and if you depend on other, undocumented, parts of that company's software stack, then you are more proprietary than something with an open (as in, developed in the open, with many participants) standard and no dependencies on a particular implementation.

Obviously you'd be less proprietary than, say, ActiveX, but that's a pretty low bar nowadays. NaCl is not competing against ActiveX; it's competing against the web platform as it exists. And that's definitely much less proprietary than NaCl.


Sigh. Not all open source projects are community projects. That doesn't make them proprietary. I'm disappointed to see this kind of confusion on HN.

Rust doesn't have much of a community around it, besides Mozilla. Does that make it "proprietary"? Nope.

We all know NaCl is not going to be adopted-- not because it's not good enough, but because it's too good, and would threaten the native app ecosystems of Apple and Microsoft. pNaCl, same story. Google might end up using it as an app delivery mechanism in ChromeOS; that's about the limit of its potential usefulness.

It's particularly ironic to hear Brendan Eich complain about the lack of a standards-first approach in NaCl, since ECMAScript was designed behind closed doors at a single company. Anyway, ECMAScript seems to be good enough for building web UIs, and it's even a little less verbose than its "older brother" (whom it resembles not at all). So I think its quasi-monopoly is secure. I hope they pull the TypeScript extensions into the core language in the next version.


Rust has a healthy number of committers who are not employed by Mozilla. I will let pcwalton fill in details if necessary.

"Proprietary" as in "sole proprietor" is appropriate for a project with zero governance, launched by Google after some incubation closed-source, dominated by Googlers.

NaCl is not adopted because it's machine-dependent!

PNaCl is not ready. Show me Chrome Web Store games compiled with it and not NaCl, then we'll talk.

As I've written before on HN (see https://news.ycombinator.com/item?id=2998374, "I've paid my dues"), JS was created by me in a tearing hurry in 1995 for Netscape, the would-be market power that nevertheless avoided a monopoly conviction (unlike the other guys).

There is no "quasi-monopoly" here. Someone on HN schooled me on "monopoly" (http://news.ycombinator.com/item?id=2998590).

The issue with JS is not "monopoly" in the econ 101 sense. The issue is that JS is more than good enough, and getting better under competition and cooperation in the standards bodies. Therefore it is very hard to displace, and just as hard (if not moreso: a displacing language might be backward compatible) to supplement with a second language/VM in all browsers.

You should respond to this technical fact (by which I mean, the circumstance is well-founded in software costs).


According to the Apache project, a project is "considered to have a diverse community when it is not highly dependent on any single contributor (there are at least 3 legally independent committers and there is no single company or entity that is vital to the success of the project)." Rust might meet that standard in the future, but it is not there yet.

With regard to JavaScript versus NaCl / PNaCl / etc-- I've heard all the debates before, and they are kind of tedious. ECMAScript is a good language for some things, but making it the only option is goofy. I think Mozilla is shooting itself in the foot by not supporting PNaCl, which is the one thing that could potentially save their "boot to Gecko" initiative from disaster. I guess the Adobe Flash and ActiveX experience left emotional scars that haven't healed yet. Oh well. Their loss, Apple/Google/Other app stores' profit.


There's nothing to "support" yet with PNaCl. It's still at the "we have no idea how to make this work" stage, last I checked.

And if it _could_ be gotten to work, you still haven't addressed why Mozilla should be willing to get on the Pepper "upgrade to keep up with all this unspecified stuff we're changing" treadmill.

It's not just scars from Flash and ActiveX; it's a distinct reluctance to bet everything on a technology you have 0 control over, and which one of your direct competitors controls completely. Now why would Mozilla be hesitant to do that? You tell me.


I think you missed my point.

Say you have an open-source project with a single owner, who makes all the decisions about it, and is willing to totally change it around to suit his needs.

Would you stake your business on use of that open-source project? Only to the extent that you're sure your needs align with the project owner's. Unless, of course, you're planning to fork anyway.

That's where NaCl is at the moment.

It's also where Rust is, even more than NaCl. Anyone who is not in the business of working on Servo is nuts if they're relying on Rust for anything important, so far. In my opinion. So yes, from my point of view Rust is definitely proprietary to Mozilla at the moment, and asking anyone else to use it (again for anything important, not just experiments) is just a recipe for disaster.

As far as NaCl adoption, Apple and Microsoft have their own reasons for not adopting it, for sure. But this subthread is about Mozilla's reasons, which certainly don't match those of Apple and Microsoft.


Let's check wikipedia.

"Proprietary software or closed source software is computer software licensed under exclusive legal right of the copyright holder.[1] The licensee is given the right to use the software under certain conditions, while restricted from other uses, such as modification, further distribution, or reverse engineering"

Muddying the waters by referring to open source software as proprietary software does not help. And I am sure the folks at Mozilla, being open source advocates themselves, would tell you the same thing.

Reminds me of an ex-boss (from a long time ago) who referred to all open source software as "freeware." Hey guys, the 1990s called, they want their bad hacker movies and confusion about the software business back.


You're confusing "proprietary software" and "proprietary technology". They're not the same thing.

There are lots of examples of open-source software implementing proprietary technologies of various sorts (example: x264). There are lots of examples of proprietary software implementing open technologies of various sorts (example: Opera).

I'm not talking about the licensing model for the NaCl source code; I'm talking about the openness or not of the entire NaCl technology stack.

As far as folks at Mozilla go, I think they might point you at http://www.readwriteweb.com/archives/how_to_spot_openwashing...


On TypeScript, are you seriously asking for warning-annotations? The class syntax is in for ES6, not original to TS. Or do you mean 'interface' as structural type (record width subtyping relation) declaration form?


I like the structural subtyping. This is a feature I also like in Go.


I know +1's don't help but ditto from me.

I haven't had time to use TS much yet, but the combination of a structural type system (not needing to inherit from IFoo; if you have the members, that's enough) and optional typing looks like exactly what I want.

Hopefully this pays off in terms of not only self-descriptive code but also greater support for refactoring tools.


In many ways, TypeScript is the "anti-Dash" (see leaked Dart memo). It builds on ES6. It does not inject novel runtime semantics. It just checks annotations _a la_ JSLint. Smart!


> It's highly unlikely that JavaScript spit out by a code generator (this would be the competition for NaCl) is going to be at all readable.

With SourceMaps it's possible to make them readable[1].

Also, a problem with NaCl is that it forces developers to move to a different toolchain. Most developers would feel more comfortable and productive developing in the browser rather than moving to an IDE.

[1] - https://wiki.mozilla.org/DevTools/Features/SourceMap


> Also, a problem with NaCl is that it forces developers to move to a different toolchain. Most developers would feel more comfortable and productive developing in the browser rather than moving to an IDE.

So make a toolchain that runs in the browser. There's nothing stopping you from building NEXE modules at runtime; you could compile LLVM itself for NaCl and embed the whole toolchain in the browser and compile at runtime.


There's audio processing stuff that I'd love to do in web browsers, but I refuse to touch Javascript with a ten-foot pole. I am the kind of person who would like NaCl to be adopted outside of Chrome. If you are the kind of person who is more interested in building the "app" part of that kind of program, you have nothing to fear from NaCl -- just think of it as opening up the "standard native code" (native browser code, like video codecs, gzip decompression, etc) part of the web browser to everyone. You don't really care about what's going on under the hood when your users decompress some data you served to them, as long as everything works properly, right? That's pretty much the definition of something that most web developers couldn't care less about, and that people who do (for lack of a better term) "fast computing" care desperately about.

To give a recent example, what if you'd like to start serving Opus audio to all your users? The standard response to that idea now is "that's funny, Internet Explorer will never even support Vorbis, and that's over a decade old." If safe native code execution was a standard part of the browser right now, it would be trivial to distribute an Opus decoder alongside the "web app" you implemented in Javascript. Opus would just be your competitive advantage, not something you need to beg people to implement. It would already be ubiquitous.

Before anyone mentions X implementation of Y audio codec in Javascript in an attempt to discredit the value of native code execution, just stop. I highly doubt it will work acceptably on my mom's computer (the average computer that accesses your site is almost surely a hell of a lot less powerful than you may assume it is), and I highly doubt you'll ever be able to rival the performance of native code for processing on the order of video codecs anyways. Not to mention the fact that the majority of the nontrivial Javascript audio demos you've heard were made possible in no small part due to standardized, native code linear filtering and convolution.

"Frustration" would be the word that sums up the whole "web app" movement to me. When I see that someone's made a client-side GIF animating "web app," all I can think about is how it would have taken two seconds to hook an existing highly performant C GIF encoder up to some Javascript, and instead we had to wait for someone who knows Javascript to hack a painfully slow (and therefore useless for the vast majority of users) alternative together. You know how ubiquitous similar server-side services are? That's because all it takes to make one is a simple Javascript/HTML form (or HTTP, if you want to get trivial)-based interface to an existing C program. Just think of where client side web apps could be right now if the same were true for them.


I've been doing audio processing in Flash for a while (most recently, sample-based synthesis implementing a decent portion of SFZ and SF2, with 64-voice polyphony and filters) and I did have to push an unusual degree of effort into optimizing the sample copying code, with a Haxe macro that generates an optimal inlined loop for each combination of parameters. It can still use up most of a core when maxed out...

...however, I did some math and some extrapolation and determined that within the next three to five years this domain will be completely reasonable to approach from JS, driven by a combination of technologies:

- access to GPGPU from the browser. DSP work can generally be defined in terms of a shader, although it's still a very poorly understood area.

- more general-purpose cores, faster JIT performance, and possibly single-threaded hardware improvements as well. These things compound easily, so we could end up with a 50-100x larger JS performance envelope for this domain without even considering the GPU.

- maturation of the existing and planned audio APIs for common tasks. As you point out, this isn't interesting from a "ground-breaking tech" perspective, but in covering typical application needs, it's as important as the others since it's both convenient and optimized out of the box.

- maturation of cross-compilation technologies, smoothing over the "code has already been written" issue.

In a lot of ways, all JS has to do to be competitive is the "catch-up" work. It takes quite a while in tech time, but in human time, most of us will be around in the next decade.


You're right. Browsers are constraining innovation to a top-down approach, where browser vendors try to design and implement alternatives to things like TCP, POSIX.

What would be better is if browser vendors exposed a core low-level API to trusted installed web apps (as opposed to web pages) and then let open source build on that.

For example, instead of coding up IndexedDB and leaving no alternative, just provide proper POSIX, and let the database community do its thing.


Because there's more interest in improving the (publicly accessible) web as a platform than improving installed web apps.

Regardless, to expose such a low-level API would eliminate one of the big advantages of the web: given a browser, I can use any web app on any device. Take, for example, a TV (typically a closed platform, but increasingly often including a fully featured browser): you likely wouldn't be able to install anything that a certain web app depended on (and even if you could, what are the odds that it's been tested on a big-endian platform?).


No, it would not eliminate the big advantage of the web as you say, merely deepen it.

See Tim Berners-Lee: http://lists.w3.org/Archives/Public/public-webapps/2012JanMa...


None of what timbl said in that post (nor, from memory, anything else said in that thread) covered shipping native code to the browser: it was merely talking about privileged web apps, which is a very different problem area (and one that should definitely be explored!).


Yes, my comment was concerning the top-down approach taken by browser vendors, and that web apps need to be empowered with low-level APIs such as TCP and POSIX. I don't think I, or TBL, said that we need to ship native code to browsers to do this. We don't. We don't need another language. JavaScript is fine. We do need better low-level APIs. We do need to get away from relying on browser vendors to spec and implement everything correctly, because with such a massive surface as they're targeting, they won't. We do need to push this responsibility out to the edges. We need to drastically reduce the standard API and move the functionality from high level (as in IndexedDB) to low level (as in POSIX). We need a strong, powerful, stable, small core, and OSS must do the rest. That's the general idea. The finer details can be nitpicked and that won't invalidate the general idea. Browser vendors are simply doing way too much.


How would you expose POSIX on Windows?


POSIX is an interface not an implementation.


> but I refuse to touch Javascript with a ten-foot pole.

Why? Honest question here: what are the problems that are making you unwilling to even try it? Is performance the main problem?

> just think of it as opening up the "standard native code"

As long as the browser is running on a small set of target hardware architectures. And everyone else gets locked out, right?

If PNaCl ever happens, that might change, but at the moment that's how NaCl works: you tie your "web page" to a particular set of hardware architectures when you use it.


I don't think NaCl is targeted at web devs that are comfortable developing in the browser. It's targeted at the other devs that aren't comfortable with JS and who write C++ (or other) code (like in games or other graphic-intensive stuff).


I agree ... and I'll extend it by asking why you'd "dis" the other languages he attacks. Once again the old adage is proved: "When all you have is a hammer, everything looks like a nail". Let's get over the JS insecurity, admit it's useful for some tasks and that it sucks for others and get on with some useful discussion about when each of those statements is true.


Also, NaCl supports threads while Emscripten does not. For a large set of applications (including my own games) this is a deal-breaker for using Emscripten.


The last slide was interesting: http://brendaneich.github.com/Strange-Loop-2012/#/50

  - First they said JS couldn't be useful for building 'rich internet apps'
  - Then they said it couldn't be fast
  - Then they said it couldn't be fixed
  - Then it couldn't do multicore/GPU
  - Wrong every time
  - My advice: always bet on JS
It turns out that JS might soon be a mature language which can be used to build real apps. That future has arrived (or at least you can see the train).

ES6/7 + hardware-accelerated graphics + continued broadband adoption and speed increases (gigabit internet) make the 'web as a platform' dream a reality. I'd love to peek at Microsoft's medium-term plan to deal with this.

Also: I wonder how long until we see a full replacement for something like, e.g., 3ds Max, with the GUI on the client and rendering done in the 'cloud'?


It's disingenuous to claim that javascript is not a "real" language already. This is like claiming that PHP or VB aren't "real" languages. You may not like it, you may think it's missing crucial, key features, but there is a lot of JS code out there in the wild producing value for businesses and customers and generating billions of dollars in revenue.

Edit: looks like I misread the parent post, my apologies.


Looks like I didn't communicate too well. I wholeheartedly agree that JS is a real language. It's my main language and I've written tens of thousands of lines of the stuff. It has its quirks but I like coding in it. My "JS as a 'real' language" was a tongue-in-cheek go at those who constantly criticise it for being a toy language. I've edited my original comment.


I'm honestly surprised he didn't mention Atwood's Law: any application that can be written in JavaScript, will eventually be written in JavaScript. http://www.codinghorror.com/blog/2009/08/all-programming-is-...

The crux of the argument is, "As a software developer, I am happiest writing software that gets used. What's the point of all this craftsmanship if your software ends up locked away in a binary executable, which has to be purchased and licensed and shipped and downloaded and installed and maintained and upgraded? With all those old, traditional barriers between programmers and users, it's a wonder the software industry managed to exist at all. But in the brave new world of web applications, those limitations fall away. There are no boundaries. Software can be everywhere."


But without an ecosystem of frameworks like OSX or .NET, it will be difficult to develop apps with good and consistent experience.


Any toolkit that attempts to shoehorn a consistent experience onto the web is doomed to failure. People have made efforts, earnest efforts, at solving the problem (Dojo, Ext, Closure, Cappuccino). None have caught on, and at the moment I'm assuming none will.

Background paragraph: I've been thinking about the issue since 2004 or so and always assumed that a widget toolkit would develop like it did for desktop environments and that one or two would eventually win. The launch of mobile app development changed things. The kinds of apps being delivered were outright better than what equivalent web apps were delivering. Not the perf (I assume perf is going to be solved by better hardware and engine improvements) but what the apps accomplish. My conclusion is that every project has a semi-fixed budget of resources and that web developers burn roughly 50% of that budget on building a custom one-off interface solution for every project they work on. The obvious answer is components, but component frameworks had not been catching on, so the question is why.

The normally cited problem is the composition problem. It's incredibly difficult to achieve component isolation in the DOM, and that is what the Web Components effort is attempting to address. I approve of the effort but I do not believe it will solve the problem. It's possible (ref: YUI3) to write components that will cleanly drop into pages that keep their CSS tweaks class-based, and that restriction isn't unreasonable in an app. Component use happens more frequently in apps but is not the default like it needs to be to escape the one-off tar pit. The issue is social.

The web's roots in documents mean that pretty much everybody wants a custom UI. All existing component libraries, following desktop toolkit precedents, come as a packaged markup/CSS/behavior combo. These two requirements are opposed, and that opposition is the core problem. I have never used a component library and not had to customize a significant number of components. When doing so, I either have to maintain a private fork of the component or suffer in download/perf as the framework is super heavy and caters to all possibilities. The YUI3 (and, it looks like, current Dojo) approach of core+plugin composition with automatic dependency resolution is the best solution I know of, but overriding is still non-trivial. Tweaking the CSS usually means having to build a complete theme, and using generated markup is hit and miss.

My current thinking is that Web Components are a start, but there needs to be a good solution for selectively overriding a part of the markup and for overriding style. CSS preprocessor features are also required to achieve styling goals efficiently. The rise of Bootstrap might prove me wrong by starting as an incomplete system, getting buy-in (multiple themes, adoption for simple uses), and growing into a more complete one, but we'll see.


That's another "Then"... you could add:

  - Then they said it didn't have an ecosystem of frameworks like OSX or .NET
The frameworks will come. It's undeniable that the web, and the tooling surrounding it, is reaching critical mass.


Yeah and we'll probably only have to wait another 5 years!

It's funny how .NET languages can have a consistent, predictable, and precise framework built up around them, and yet somehow we're supposed to believe that JS is better than bytecode.


Is it just me or does the proposed ES6-standard smell of heavy feature creep?

- we get classes _and_ modules _and_ typed objects. Yeah, those all have their merits and are all somehow different, but having them all seems to add only slightly more value while increasing the overall difficulty of the language considerably.

- same with macros and codegens. While they certainly are different animals, there seems to be overlap in the area they will be employed.

To me, there seems to be feature envy on the side of the JS crowd to try and bake every nice feature into the language. I am not convinced this is the right direction.


we get classes _and_ modules _and_ typed objects

There aren't "typed objects". There is a new library for working with binary data efficiently.

Classes and modules serve different purposes. Classes provide syntactic sugar for prototypal inheritance. Modules provide a baked-in mechanism for encapsulating and sharing code.

same with macros and codegens

Neither of these are on tap for ES6. Brendan showed them as interesting experiments that may be proposed for a future release.


Quite right about macros, but codegen is happening now, all over the place.


> Is it just me or does the proposed ES6-standard smell of heavy feature creep?

I've been on TC39 for years and the vast majority of what we do is cut.

> we get classes _and_ modules _and_ typed objects.

Classes are syntactic sugar for one of the most common dynamic patterns in JavaScript: object factories. Modules are static collections of code. Very different things, especially in a dynamic language like JS. Typed objects are not even remotely related; they're a low-level API for efficiently working with structured binary data.
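
To make that concrete, the nearest shipped relative of the binary data API is typed arrays over an ArrayBuffer; a minimal sketch (the StructType/ArrayType surface syntax itself is still a draft, so it is not shown here):

  // View the same 8 bytes as floats and as raw bytes.
  var buf = new ArrayBuffer(8);         // 8 bytes of raw storage
  var floats = new Float32Array(buf);   // view them as two 32-bit floats
  floats[0] = 1.5;
  var bytes = new Uint8Array(buf);      // same storage, byte by byte
  console.log(bytes[0], bytes[1], bytes[2], bytes[3]);  // the little-endian encoding of 1.5 on most hardware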

> same with macros and codegens.

As Yehuda says, macros are not part of ES6 but rather an experiment at http://sweetjs.org to consider for future editions. I don't know what you mean by "codegens." Brendan's slides weren't talking about a feature, but rather an existing usage pattern that we need to serve in addition to human code-writers.

ES6 actually does a remarkable job covering many use cases with small modifications that smoothly integrate with the existing language. You might want to try experimenting with some of these features, many of which are implemented partly or completely in SpiderMonkey (as Brendan's latest blog post details). Most of the features are small improvements to paper cuts, and as everyone I've ever spoken to who's written SpiderMonkey-specific JavaScript attests (for example, in the Firefox front-end or in addons), they make your life so much nicer. I'm particularly thinking of let and destructuring, and I suspect parameter defaults and rest-arguments will be hugely popular as well. The few bigger features, especially modules and generators, are for addressing the most important gaps in JS.
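
For readers who have not tried them, a rough sketch of a few of those paper-cut fixes as specced for ES6 (the exact forms SpiderMonkey implements today may differ slightly):

  // Destructuring pulls fields out of objects and arrays in one step.
  let { x, y } = { x: 1, y: 2, z: 3 };
  let [first, ...others] = [10, 20, 30];

  // Parameter defaults and rest arguments replace the old `arguments` dance.
  function greet(name = "world", ...extras) {
    return "Hello, " + name + (extras.length ? " and " + extras.length + " others" : "");
  }
  greet();                     // "Hello, world"
  greet("JS", "ES6", "TC39");  // "Hello, JS and 2 others"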

Dave


Aside from my personal distaste for his backing up the semantic truck and dumping it into ES6, I think it's a bit annoying -- to the point of being disrespectful -- that Brendan doesn't mention V8 in his history of JavaScript. Without V8, there is no JavaScript on the server-side (sorry, Rhino and SpiderMonkey), there is no Chakra and there is no TraceMonkey/JagerMonkey/IonMonkey: given that JavaScript had survived for a decade in its pre-V8 state of abysmal performance, it's entirely reasonable to assume that it would have slept away another decade had it not been jarred out of its slumber by V8. So it would be nice to see some respect where it's clearly due...


> I think it's a bit annoying -- to the point of being disrespectful -- that Brendan doesn't mention V8 in his history of JavaScript

Bullcrap. Why would V8 deserve any more mention than SpiderMonkey (which had none, by the way) in a talk about the semantics and language evolution of JavaScript, when it's Gecko/SpiderMonkey which pioneered and implemented roughly 95% of these evolutions in the first place? Why would it deserve more than Trident, which, through XMLHttpRequest, is the one responsible for the vast majority of the language's actual popularity? Because it's your pet runtime and you don't like others?

> So it would be nice to see some respect where it's clearly due...

You may want to take this advice for yourself, your comment is dismissive, insulting, contemptuous and contemptible.


> You may want to take this advice for yourself, your comment is dismissive, insulting, contemptuous and contemptible.

It was none of those things. You two merely disagree.


> It was none of those things.

It was, and still is. It chides Eich for not including a blurb about something which has little to no relation with the presentation itself (the presentation's core was not javascript runtime performances or even javascript runtimes in general), and claims "disrespect" over that non-inclusion, all the while — as other commenters also noted — getting most if not all of its assertions wrong.


> Without V8, [...] there is no TraceMonkey

TraceMonkey shipped before v8 did.

v8 might exemplify the JS performance trend, but it didn't start it. Mozilla, Google, and Apple (with squirrelfish extreme) independently developed JS JITs at around the same time, driven by the increase in JS-heavy web apps.

So no, JS performance wouldn't have "slept away another decade"; the changing web situation demanded the change (as evidenced by the fact that all 3 orgs independently moved in the same direction at essentially the same time).


Edit: Actually, I just realized I'm off here. The June 2009 release of FF was a final release, while the September 2008 release of Chrome was a beta. From what I can tell, though, the beta release of TraceMonkey was a day after the beta release of Chrome.

>TraceMonkey shipped before v8 did.

No, that's not true. TraceMonkey shipped in June 2009 with FF 3.5. V8 shipped with the first release of Chrome in September 2008. At the time, TraceMonkey was in beta, but it's unclear which began development first.

That said, the idea that V8 sparked the JS arms race is preposterous.


I was going by this blog post from Brendan Eich:

https://brendaneich.com/2011/06/new-javascript-engine-module...

Where he says "[...] TraceMonkey, which we launched ahead of Chrome and V8".

Maybe "shipped" was the wrong word. I suppose there's some definition of "launched" that makes the statement true; tracemonkey landed, and was announced, in August 2008. But you're right, the Chrome beta (Sept. 2, 2008 according to wikipedia) did precede FF 3.1 beta 1 (Oct 18, 2008).


TraceMonkey was announced on 2008-08-23, after 2 months of development[1].

V8 had its first public release together with Chrome 2008-09-02 as you already mentioned, but development appears to have started in 2006 according to some of the copyright notes in the initial SVN export[2].

[1] https://brendaneich.com/2008/08/tracemonkey-javascript-light...

[2] http://code.google.com/p/v8/source/detail?r=2


Which seems to confirm there would have been a TraceMonkey even without V8 ever existing.


Yes. I did not take Bryan to task on that, but it really gives V8 a bit too much credit, since V8 could hardly have caused Andreas Gal to work on trace-JITs before 2006 (on Java, for his UCI PhD; then on JS in collaboration with Adobe and Mozilla).

Tracing was a good rocket to strap on SpiderMonkey-the-2008-era-interpreter but it fell to a combination of the PIC-based approach V8 championed and Brian Hackett's Type Inference work (PLDI 2012, http://rfrn.org/~shu/drafts/ti.pdf).


> Without V8, there is no JavaScript on the server-side

The first implementation of server side Javascript was nearly 20 years ago:

http://en.wikipedia.org/wiki/JavaScript#Server-side_JavaScri...

Naturally, it was never all that popular, but... it has been around.


I maintained an app using Netscape's LiveWire 12 years ago. Had that project been free (as in beer) I think it could have seen pretty good adoption.


He meant that server-side JS was relatively non-existent before Node.js; he even listed alternate server-side JS implementations, none of which is anywhere close to the phenomenon that Node.js is now.


> I think it's a bit annoying -- to the point of being disrespectful -- that Brendan doesn't mention V8 in his history of JavaScript.

Chill. It's a timeline of language evolution - I don't see how V8 is relevant. Mentioning the AJAX revolution isn't strictly relevant either, but serves the purpose of separating orthodox from next-gen Javascript.

Btw, while work on V8 started 2 years before work on TraceMonkey, apparently the latter was announced first.


>there is no TraceMonkey/JagerMonkey/IonMonkey

The latest date I can pin down for the beginning of the JS arms race was Safari's JavaScriptCore in March 2008, followed by SquirrelFish in June. V8 has had a huge impact in driving innovation and setting the pace of the race, but it did not fire the starter pistol.


Howdy, Bryan. A few comments.

First, it's not all about _moi_. As with any mature language with multiple implementations, there is a committee, Ecma TC39. It has reps from all the bigs plus PayPal (Doug Crockford) and Yahoo!. We avoid design by committee, instead focusing on paving the cowpaths and in a few cases working with champions -- single innovators or RPG's "resonant dyads" _a la_ ken&dmr -- to do focused design. The best example in my view is the Proxy design by Mark Miller and Tom Van Cutsem, followed by Modules by Dave Herman and Sam Tobin-Hochstadt.

We have a few goals, including minimizing kernel semantics and extending syntax for usability only where we have experience and positive user-testing results (e.g., destructuring, let, generators, classes as prototypal sugar a la CoffeeScript).

In this light, we aren't dumping new semantics into ES6. If you count fairly, we are filling the big semantic gaps in ES5 (AKA JS as we know it). For instance, where is client-side synchronous "require" without the ES6 module system? No fair that Node's require is sync (against all sound doctrine!). Sync XHR is from the jank-devil, and we cannot do sync require as an API client side.
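
To illustrate (using the declarative import form; the final ES6 module surface syntax may differ, and "lib/math" is a made-up module name):

  // Imports are statically visible, so a client-side loader can fetch
  // "lib/math" before any code in this module runs -- no synchronous XHR.
  import { sum } from "lib/math";
  console.log(sum(1, 2));

  // Contrast with CommonJS require, an ordinary function call that would
  // have to block mid-execution to fetch the module in a browser:
  //   var math = require("lib/math");  // fine in Node, jank in a page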

Second, as others have noted, V8 was stealth'ed for two years (I learned of it in 2006 but Google withheld, see https://brendaneich.com/2011/06/new-javascript-engine-module...) and then unleashed just as Apple and Mozilla got their JIT acts together. At the time some said V8 was 3x faster, but the actual ratio on stupid benchmarks was more like 1.3x at first.

V8 is an amazing piece of work, but it did not contribute to the language evolution that my "very brief history" slide diagrammed. It certainly contributed by being the fastest, but until after 2010, that was the extent of its influence, for several specific reasons.

For the first few years as open source, V8 foreswore diverging from JavaScriptCore (the first WebKit engine, Apple's fork of KJS from KHTML, which V8 in some ways forked WebKit to replace) on any new ES5 or ES6 proposals, to try to keep on (or get back on, in view of the fork) Apple's good side. Fat chance!

Even now, V8 still keeps prototyped ES6 implementations hidden behind a flag. I expect this flag to be lifted before ES6 is ratified.

Another thing that hurt: Google has moved V8 to Munich because the Aarhus team wanted to do Dart, Google explicitly chose to invest in Dart over JS, and the no-remote-teammates rule hurt. IMHO the Munich team is quite strong but only ~10 people. This limits its effectiveness compared to Chakra (~60 heads?) and even SpiderMonkey (~12, more with interns).

Finally, whoever thinks I disrespect V8 clearly does not listen to my talks. The video will be up, give it a view when it's live if you have time. There is no disrespect from me toward V8 or the team that built it. I do mention it favorably in all talks, including Strange Loop (if my memory serves).

Lars Bak did the Strange Loop first night closing keynote, which my insane schedule prevented me from attending. He and I have had no meetings since that one in 2006, although he met with some Mozilla colleagues in 2009. I bear no bad feelings, although Lars did tell a gratuitous whopper around 3:50 in this talk: http://www.youtube.com/watch?v=T2TJYBmDZHI (see my blog cited above for what happened with Mozilla being "offered" V8 in 2006 -- Google reneged). He is the best VM hacker I know of. (Mike Pall is another superhacker of note.)

I hope this sheds some light. I respect V8 while noting that, as with anything human, it and the technical/product/open-source/standards politics around it show some significant flaws. That is inevitable.

Mozilla has flaws too, but they tend to be different in kind. For example, we do open from the start and try to partner even against long odds (e.g., Tamarin/ES4). I'm not bragging, because I think that can be a mistake. I've learned from V8 and modified Mozilla strategies accordingly.

So, hats off to V8!


Brendan, I appreciate the thoughtful reply. I think the only reason for my criticism (and the word "disrespectful" was too harsh -- I should have left it at "annoying") was that your last slide ("Always bet on JS") implies (to me, anyway) that JavaScript's high performance is its manifest destiny. And that, to me, understates the contribution of V8: the high performance of JavaScript was not a foregone conclusion; it required guts, innovation and hard work. As for the assertion that a trace-based JIT beat V8 to market, a trace-based JIT also has well-known failings, so it's not a terribly meaningful data point -- other than that it speaks to the aspirations perhaps for a higher performing JavaScript. But we clearly agree that Lars and team developed a terrific VM in V8, and that V8 was important in the history of JavaScript -- which was my only point.


Never mind TraceMonkey -- the JavaScriptCore kids rallied in 2008 and did SquirrelFish Extreme, which held its own going into late 2008.

The chosen measure of these new VMs was a set of benchmarks, SunSpider from Apple and the V8 Benchmarks from Google. While V8 had the best GC and most optimizations, on these suites at least, for about two weeks for TraceMonkey (and longer for SFX), V8 was not that far ahead.

You can find the charts via Google still.

V8 had the longest lead time, not just working on what was released with Chrome but trying other approaches first, learning from them, and starting over. That's huge and it has paid off well.

But I don't agree that any architectural failing of one VM counts more than public, reproducible benchmark scores. Even V8 had to do Crankshaft.

Architectures evolve and supersede one another, but the developer and user benefit -- the public benefit -- comes from the competition. V8 was not alone in driving competition.


I wonder what he would have said about TypeScript; it seems it was launched after this conference (though there is a large chance he was in the loop, no pun intended).



Too bad he wasn't in the loop, though; MS could have gained some additional momentum by involving him: https://twitter.com/BrendanEich/status/255649668548136960


> I wonder what he would have said about TypeScript

Likely nothing. The presentation has a slide on compiled-to-JS languages, and a link to a listing site. Eich doesn't seem to mind languages compiled to JS so far.


The fact is that adding classes to JavaScript fundamentally alters the Lispiness of the language in a detrimental fashion.

I don't want to see JavaScript turned into Java script.


People already use it as though it had classes, so it makes sense to add the sugar. It will prevent errors such as SomeObject.prototype.foo = {} by not allowing data on the class body.
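
Roughly, the sugar looks like this (a sketch of the maximally minimal class proposal; the semantics are still the familiar constructor-plus-prototype):

  // Today: the common constructor/prototype pattern.
  function Point(x, y) { this.x = x; this.y = y; }
  Point.prototype.norm = function () {
    return Math.sqrt(this.x * this.x + this.y * this.y);
  };

  // ES6 sugar for the same thing. Only methods are allowed in the class
  // body, so there is no spelling for shared mutable data like
  // Point.prototype.foo = {}.
  class PointC {
    constructor(x, y) { this.x = x; this.y = y; }
    norm() { return Math.sqrt(this.x * this.x + this.y * this.y); }
  }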


Some people already use it as though it had classes.

I don't like those people.


It has objects which you can use as prototypes, and you can create instances of those objects using the new keyword. Doing this has the side effect of much better performance. With all of that being the case, what do you expect people to do? The good thing about maximally minimal classes is that they prevent people from making mistakes by assuming that prototypes work just like classes in Java/etc.


In my experience (and some benchmarking), there's no real difference in speed between using a Module-based factory method or using object prototypes. Modern JS engines optimize for both.


Well it's not very good at not having classes either, due to the constructor mess (and `instanceof` being broken if you're doing constructorless JS as far as I know, which is very frustrating — of course it's even more broken if you have cross-frame objects involved) so...
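
For anyone unfamiliar with the constructorless style, a small illustration of the instanceof problem (hypothetical objects):

  // "Constructorless" JS: create objects directly from a prototype.
  var point = {
    toString: function () { return "(" + this.x + ", " + this.y + ")"; }
  };
  var p = Object.create(point);
  p.x = 1; p.y = 2;

  // With no constructor to test against, instanceof tells you nothing useful;
  // isPrototypeOf is the check that actually works.
  console.log(p instanceof Object);     // true, but vacuous
  console.log(point.isPrototypeOf(p));  // true
  // (instanceof also misbehaves across frames even with constructors, since
  // each frame has its own copies of the built-in constructors.)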


Agree. This is just speculation, but JavaScript classes also seem like they could be easy to mess up, particularly for a beginner web developer.

I think there's a lesson from C and C++: don't add complexity unless it's absolutely necessary.


Here's a more detailed write-up on some of the ES6 features mentioned in the slides -

https://brendaneich.com/2012/10/harmony-of-dreams-come-true/


My jaw dropped on the bytecode slides. Even though it's probably foolish to expect that the JavaScript creator would offer non-JavaScript solutions, I would guess that it would be significantly better to compile JavaScript/CSS to a more efficient, stable, portable target bytecode or source language (Go? Scheme? PostScript?) than to compile languages to JavaScript. For example, when I played with it, the Hello World ClojureScript example compiled to a JavaScript file of over 100 kilobytes. It seems like Go and JIT'ed languages have demonstrated that extremely fast compilation of extremely efficient high-level native code is possible.

Long-term, it seems like it would be significantly better to encode these web language extensions (new CSS/JavaScript syntax) as libraries instead of as API/runtime extensions, so that each new version of web languages doesn't require all the runtimes (Firefox, Chrome, Safari, IE) to adopt it and old runtimes to phase out. The current standards/runtime process doesn't seem to be extremely well thought out. Has anyone on these standards communities ever had to actually build or maintain a web application?


> Has anyone on these standards communities ever had to actually build or maintain a web application?

Yes.


Am I the only one who really dislikes the recent trend of just posting slides without any kind of commentary at all?

Slides are not very helpful on their own :(


Surely the fact that NaCl is currently led by Google should not disqualify it from being a potentially valuable contribution to web apps. Most prevalent technologies started in this way, including JS. And most started in a much less 'open' way than NaCl. The point surely is not where they originate but rather whether they offer enough distinctive value that they are compelling enough to be adopted more widely and hence become a key part of our ecosystem. Hence the question is: does NaCl have such a compelling value proposition? Evidently, in Brendan's eyes, and those of several of his colleagues, the answer is no. But other people can disagree and hope that the value is sufficient to eventually win over a broader market share. A good example of such a community is those with large code bases in, say, C or C++ who would rather port their code than rewrite it all, with all the ensuing maintenance problems. IMHO NaCl does have something distinctive to offer and I wish the project well.


Kind of annoying presentation style quirk: some slides are on a vertical stack which you access by hitting the down arrow instead of going right. You can tell from the bottom right which directions are available from the slide you're on.


You know, maybe it was optimized for the guy presenting it? :P


I just used the mouse wheel and it always went in the right order, even when vertical.


Same, but it still seems unintuitive and pointless. Why not just keep them going from left to right? I'm pretty sure the target audience for this only really cares about the content and anything else just gets in the way.


"You are in a maze of twisty little passages, all alike."


Personally I use it to map my presentation into chapters. It allows me to better know where I'm at while presenting, and allows people reading my slides to get an idea of how it's laid out inside my brain.

It was just used by a colleague to organise his slides into:

Topic ▼ Code example ▼ What that code renders into

The last slide was actually rendered live from the slide before it. Absolutely badass.


Probably to provide the possibility of not drilling down into details so you can change the presentation based on the audience.


I really liked the presentation of the slides, and enjoyed using the arrow keys to navigate. It was fun, and could have been very boring (for me). It made me read more than I would have. Viewing those slides in isolation was good; it made me study some of the functions with more focus.


If you hit the space bar you can see a "map"; space again to return to where you were.


Every time I see a slide deck with zero context or explanation, I want to light my eyelashes on fire.


Video coming (conferences like Strange Loop are good about this but you had to be there to get the full effect), plus see https://brendaneich.com/2012/10/harmony-of-dreams-come-true/ for a companion blog post that focuses on some of the slides.


Hope they have the video out sometime! Seems to be some crossover with his keynote at Fluent: http://www.youtube.com/watch?v=Rj49rmc01Hs - but these slides go into a lot more depth and include some of the newer goodies coming along.


We're working on it....


I would really like support for 64-bit integers and native 64-bit math.


See https://bugzilla.mozilla.org/show_bug.cgi?id=749786 based on http://wiki.ecmascript.org/doku.php?id=strawman:value_object.... ES7 at this point. Help test. I need to put up a rebased patch...


Thanks Brendan, that is good news, added to the cc list, looking forward.


There are some proposals to make this happen... They're all in pretty early stages, sadly.


What kind of apps would this help with?

JS is never going to take over on the server for me until it can compete with Python's statistics support, and I didn't see much in these slides that suggests it could.


Node.js modules and apps that work with C++ libraries/services that return 64bit numbers. Not having 64bit integers is an issue.
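A quick sketch of why plain numbers don't cut it here -- they're IEEE754 doubles, exact only up to 2^53, so larger 64-bit values get silently rounded:

  var big = 9007199254740992;             // 2^53
  big + 1 === big;                        // true: 2^53 + 1 is not representable
  9007199254740993 === 9007199254740992;  // also true, same rounding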


Is the lack of bignums (more than int64) the real issue? JS has IEEE754 binary double so that's the same as in Python.

Library code can be cross-compiled a number of ways, so I wonder whether the blocker is the lack of Pythonic long.


This is from a section on custom iterators in the related blog post (https://brendaneich.com/2012/10/harmony-of-dreams-come-true/):

"We require opt-in to avoid future-hostility against custom iterators for collection objects. Such objects probably do not want any kind of general property iterator default, which if left on Object.prototype, might be object-detected and prevent installation of the correct custom iterator factory."

I can see the sense in that, but I find all the little caveats in JS are one of its weaknesses, which makes me dislike this idea. Am I wrong to think they should have just put in a default iterator, but made it easy to spot so that you could replace it with a custom iterator where appropriate?


Python does not have a default iterator for its objects. It does for dicts of course, but JS objects are not dicts (lots of issues there).


I would just think default iterators are a sensible default, especially for maps, and that getting them in would be worth the very slight pain it might cause people wanting to attach custom iterators.


Maps have a default iterator as shown in my slides. Sets too.

Best to take this to es-discuss. For now, we're sticking with Python, which did not give its top-class (non-dict) Object a default iterator.
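For the curious, Map's default iteration looks roughly like this (a sketch; exact draft syntax may still change):

  var m = new Map();
  m.set('answer', 42);
  m.set({ x: 1 }, 'object keys work too');

  for (var [key, value] of m) {
    console.log(key, value);   // entries, in insertion order
  }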


I would expect to hear something like that not from a respected technologist but rather from yet another "cool kid". He could have just said "JavaScript is awesome" on a single slide, and that wouldn't have told us much less than the whole presentation did.

> First they said JS couldn't be useful for building rich internet apps

Who said it is? Rich - yes. Anything near the complexity of desktop apps - never (think Photoshop).

> Then they said it couldn't be fast

Benchmarks, maybe?

> Then it couldn't do multicore/GPU

Webworkers are nice, but you can add bindings to all of this stuff for virtually any programming language.

> JavaScript's parser does a more efficient job... than the JVM's bytecode verifier.

Figures again?

> No view source

How is viewing minified JS going to help me? Especially in light of what he had on his previous slide:

  function f() { L0: g(); L1: if (p) goto L0; o.m(); goto L1; }

Good luck "view source" on what he calls "the assembly of the web".

Then the screenshots of 3D games that presumably use WebGL - that just doesn't cut it. Just about every game demo I tried out on my previous-generation high-end ATI graphics card had performance issues. And the level of graphics is comparable to what native games had 10 years ago. That's a joke.

> Typed arrays

Until they add records so that I can declare an array of any type efficiently, don't even bring this up. This is an ad-hoc solution.
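To make the complaint concrete (a sketch using only today's typed arrays, not the proposed binary-data API): typed arrays are flat arrays of a single numeric type, so an "array of records" has to be faked by interleaving fields by hand:

  // One Float32 per element -- fine for raw numbers:
  var heights = new Float32Array(1024);

  // An "array of {x, y, z} records" means manual interleaving:
  var verts = new Float32Array(3 * 1024);
  function getVertex(i) {
    return { x: verts[3 * i], y: verts[3 * i + 1], z: verts[3 * i + 2] };
  }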


Why do we need to make Photoshop in JS?

I can't make a printing press in C++, but why would I want to?

Photoshop was made during an era where computers were used to make media for physical printing. You would take a picture with an analog camera, digitize it, manipulate it in Photoshop, and then have it ready for print. "Save For Web" is the closest you get to using Photoshop for publishing to the web, and you'd have to admit it's a bit of an afterthought in the whole experience.

Software doesn't exist on its own. It exists in an input and output environment, beyond just mice and monitors. It exists to capture information, manipulate it, and then republish it.

In many ways, memegenerator.net does a better job of consuming, manipulating, and publishing content for the web than Photoshop does.

Photoshop will probably not go away. There are still printing presses and there are avenues to publish things made with them. Media tends to gain a lot of inertia by the time it gets to the point of being a household name. However, this inertia doesn't really impede some other form of media from gaining its own momentum. Photoshop is busy being Photoshop, not memegenerator.net.

If you ask my opinion, I'd say there is plenty of ground somewhere between memegenerator.net and Photoshop and there is no better language and environment to create these tools than the environment where they will be published: JavaScript running in a web browser.


> Photoshop was made during an era where computers were used to make media for physical printing. You would take a picture with an analog camera, digitize it, manipulate it in Photoshop, and then have it ready for print. "Save For Web" is the closest you get to using Photoshop for publishing to the web, and you'd have to admit it's a bit of an afterthought in the whole experience.

JavaScript was made during an era when rich web applications didn't exist and the only need was client-side validation of form input. Why do you deny Photoshop the chance to evolve but allow it for JS?


Photoshop is a notorious pig. I know many designers who have already ditched it for in-browser development. See the latest JSConf.eu talk on this topic: http://2012.jsconf.eu/speaker/2012/08/29/because-f-k-photosh... (slides: https://speakerdeck.com/u/nrrrdcore/p/js-dot-conf-dot-eu-201... -- but you probably had to be there).

I sense trollery here: "but you can add bindings to all of this stuff for virtually any programming language." Who said otherwise? The bogus claims against JS (going back to the "RIA" era, where IBM and Macromedia/Adobe made such arguments) already fell.

The issue is not what languages can program the GPU somehow -- because only JS is supported directly in browsers, the issue is whether JS cannot. Clearly (WebGL, River Trail, even GLSL embedded in an unknown-type script and downloaded via JS) that's false. But it seemed true once, which led to the false anti-JS prophecy.

At the risk of feeding a troll, I suggest you use the down arrow on the "goto L0" slide to see how JS enables compiling control effects without goto.
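(For readers who don't chase the slide: one standard way to compile such gotos away is a loop-plus-switch state machine -- a sketch, not necessarily what the slide shows, with p, g and o as in the quoted snippet:)

  function f() {
    var state = 0;                          // 0 ~ L0, 1 ~ L1
    for (;;) {
      switch (state) {
        case 0:
          g();                              // L0: g();
          // fall through to L1
        case 1:
          if (p) { state = 0; continue; }   // if (p) goto L0;
          o.m();
          state = 1; continue;              // goto L1;
      }
    }
  }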

And play BananaBread, for crying out loud (https://developer.mozilla.org/en-US/demos/detail/bananabread). You simply do not know what you are talking about by your next-to-last paragraph.

Yes, typed arrays were ad-hoc (so are many incremental web standards that win). Binary data (arrays and structs, which compose) as proposed in ES6 are not.


> See the latest JSConf.eu talk on this topic: http://2012.jsconf.eu/speaker/2012/08/29/because-f-k-photosh.... (slides: https://speakerdeck.com/u/nrrrdcore/p/js-dot-conf-dot-eu-201.... -- but you probably had to be there).

Just skimmed through the presentation; yes, I had to be there, because the slides don't tell much on their own. On the other hand, there is no point in going to a conference where the speaker uses the word "fuck" on every other slide and concludes with "YOU ARE ALL AWESOME". I'd prefer to attend more technical talks, without any "awesomeness". And I can't see how she made Photoshop irrelevant. I'd expect a more credible study on that.


> Photoshop is a notorious pig. I know many designers who have already ditched it for in-browser development.

The point was not about using Photoshop for design on the web, but rather its complexity. IDEs, CAD, engineering applications. Well, we don't even have to go that far: when are Google Docs going to handle 500k-row spreadsheets? I use Google Docs casually for quick and simple stuff, but for any serious work - probably not.

> I know many designers who have already ditched it for in-browser development

I know many people (all of them extremely bright) who use Linux on the desktop. I use Linux too (I am not implying I am a clever bloke here). Does this mean Linux is winning the OS war? Otherwise this is just argumentum ad populum. I'll look through the slides and get back to you.

> I sense trollery here: "but you can add bindings to all of this stuff for virtually any programming language." Who said otherwise? The bogus claims against JS (going back to the "RIA" era, where IBM and Macromedia/Adobe made such arguments) already fell.

I apologize if you suspected a troll in me, but I did not imply a comparison with Flash. Flash is just another competitor for you. What I meant by "but you can add bindings to all of this stuff for virtually any programming language" is that if I use C/C++ or any other language with access to the OS APIs, I don't depend on the wits of browser vendors shipping a particular API for me.

> And play BananaBread, for crying out loud (https://developer.mozilla.org/en-US/demos/detail/bananabread). You simply do not know what you are talking about by your next-to-last paragraph.

Believe it or not, this is exactly the game I was referring to. The link was on HN last week or so. To be more specific, my graphics card is an AMD Radeon HD 6970, capable of running Crysis 2, and I was experiencing lag. As for the graphics, have you ever played Half-Life 2? It was released in 2004 and I can't see this game having any better graphics, if not worse. Not to mention that Half-Life would have at least 3x higher FPS on my hardware. Could you please clarify "You simply do not know what you are talking about"?

> Yes, typed arrays were ad-hoc (so are many incremental web standards that win). Binary data (arrays and structs, which compose) as proposed in ES6 are not.

Ok, it's good to see JS is going in the right direction. And thanks for taking the time to respond to my message. I am actually excited about the work you do at Mozilla on Rust, and I'd like to see a language like that become available on the client side. But I understand that it's not going to happen anytime soon.


Every time I read an article or see a presentation like this - about JavaScript, Node.js, etc. - I feel like I have somehow ended up in a club of very particular interests (say, leather clothes with large openings on the back). And instead of discussing the main subject, the members every time tend to talk about world politics, religion, and science, and how all of these are affected by the size of the openings on the back. Weird.


Need some advice here. I want to learn JavaScript -- but after reading all that, I wonder if all those changes/additions mean that I should wait? Or will learning JavaScript now make little difference, with all those changes/additions making sense once enacted?


Don't wait. First, no guarantee that stuff will actually get adopted, it's still quite abstract. Second, even if so, it will take a while for browsers to catch up, so people will be coding in current JS for a while. Third, much of the current conceptual hurdles will still be relevant, particularly functional style and async design patterns.

Would highly recommend Javascript: The Good Parts, don't know where I would be without it.


For the record, I just bought "Javascript and jQuery: The Missing Manual 2e" and am impressed with its style of teaching.

Yes, as someone who has a programming background, it sometimes seems to be directed at those with no programming experience, but it still helps to start from the beginning and walk through.

As someone with some CSS and HTML knowledge and no CLUE what the DOM was or how JavaScript interacts with HTML/CSS, it's been a good few days for me.


Overall pretty impressive feature list. I guess proxy is intended for the monkey patch crowd.


Or for anyone who wants to implement a DOM in JS, say. Or a security membrane (Caja, say). There are a bunch of other use cases too...
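For illustration, a toy logging wrapper in the direct-proxy style (new Proxy(target, handler)) -- a sketch of the membrane idea, not Caja's actual code:

  var target = { secret: 42 };

  var logged = new Proxy(target, {
    get: function (obj, name) {
      console.log('get', name);
      return obj[name];
    },
    set: function (obj, name, value) {
      console.log('set', name, '=', value);
      obj[name] = value;
      return true;
    }
  });

  logged.secret;      // logs "get secret", returns 42
  logged.secret = 7;  // logs "set secret = 7"

A real membrane would also wrap the values flowing through the traps, but the hooks are the same.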


One of the slides mentions WeakMap. The issue with them is that they have weak keys, not weak values. If you're trying to keep track of weak values (say to prevent duplicate instantiations of managed objects, while still allowing them to be garbage collected, for example), WeakMap won't work.

https://developer.mozilla.org/en-US/docs/JavaScript/Referenc...

Is there any work to add weak values to JavaScript? There's a node module, but nothing for client-side code.

https://github.com/TooTallNate/node-weak
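A minimal sketch of the distinction (browser-flavoured; the div is just a stand-in object):

  // Fine: the *key* is the object, so the entry can die with the node.
  var node = document.createElement('div');
  var metadata = new WeakMap();
  metadata.set(node, { clicks: 0 });

  // Not fine: an id -> instance cache. WeakMap keys must be objects
  // ('user:42' won't do), and even a regular Map holds its *values*
  // strongly, so cached instances can never be collected.
  var cache = new Map();
  cache.set('user:42', { name: 'Ada' });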


I think he missed mentioning the new lambda syntax. Hope it's still in ES6.

Also, my pet peeve: there is no language that needs await/defer as much as JS.

EDIT: Maybe it could be done with macros, which are on their way.


> there is no language that needs await/defer as much as JS.

http://wiki.ecmascript.org/doku.php?id=harmony:generators

is a more general abstraction, which allows expressing async/await-type constructs: http://taskjs.org/
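The underlying pattern is roughly this (a hand-rolled sketch, not task.js's actual API; assumes ES6 generators and promises):

  // Drive a generator that yields promises, resuming it with each result.
  function spawn(genFunc) {
    var gen = genFunc();
    (function step(value) {
      var next = gen.next(value);
      if (!next.done) next.value.then(step);
    })();
  }

  function delay(ms, value) {
    return new Promise(function (resolve) {
      setTimeout(function () { resolve(value); }, ms);
    });
  }

  spawn(function* () {
    var a = yield delay(100, 1);     // reads like blocking code...
    var b = yield delay(100, a + 1);
    console.log(b);                  // ...logs 2, without ever blocking
  });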


I skipped arrow function syntax (=> only in ES6). It was a bit much and I wanted to focus on APIs and compilation.
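(For anyone wondering what was skipped, a two-line sketch of the arrow syntax:)

  var squares = [1, 2, 3].map(x => x * x);   // [1, 4, 9]
  var add = (a, b) => a + b;                 // arrows also bind `this` lexically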



See https://speakerdeck.com/u/dherman/p/es6-deep-dive a bit more than half-way through -- the "Something Completely Different" section.


Thanks Brendan, stops an exorbitant waste of time :P

Great presentation btw.


I tried running it through a few different free online OCR services and haven't had any success yet.


To those who say that creating a JavaScript 2.0 would break compatibility... couldn't you have a .js2 extension and compile a .js fallback? Newer browsers would ignore the fallback and use the .js2. Thoughts?

Example:

  <!-- JavaScript 2.0, unsupported browsers ignore -->
  <script src="main.js2" fallback="main.js"></script>

  <!-- Fallback compiled to current JavaScript -->
  <script src="main.js"></script>


There is also a thread about the related blog post:

http://news.ycombinator.com/item?id=4629952


I was really upset that I didn't know all this about the future of JavaScript, and then I googled and realised this is from the guy who invented it.


With regards to the comments here:

I don't think pNaCl could ever happen. How do you get an IR that's general, yet detailed enough to capture the information needed to optimize effectively, without pulling in half of llvm codegen?

The issue isn't a web 'bytecode'; the issue is a standardized compiler bytecode.

Also, haberman, are you still working on the gazelle parser?


The Strange Loop talk seems to be quite similar to the one given at BrazilJS:

http://www.youtube.com/watch?v=84l0BrOlJwk (part 1) http://www.youtube.com/watch?v=IlQTjb794as (part 2)


What's the compile-to-JS language mentioned on

http://brendaneich.github.com/Strange-Loop-2012/#/22

with the lambda-in-yin/yang logo? It mentions macros, but nothing on the altjs.org front page mentions macros.


That's Clojure (or—in this case—ClojureScript, which compiles down to JavaScript), which is a Lisp derivative that runs on the JVM. http://clojure.org


And https://github.com/clojure/clojurescript/wiki for the ClojureScript site-ish.


ClojureScript is a Clojure-to-JS compiler.


ClojureScript is really defined as a Clojure dialect; it has semantic differences from the "main" Clojure and is a slightly different language (in part because features are missing from ClojureScript, in other parts because things were changed to work better and may or may not be folded back into Clojure).

ClojureScript macros are written in Clojure though.


Aren't most of these language features already in the now-defunct ES4, aka ActionScript 3? I'm just wondering why that language is never mentioned when it's already a glimpse of the future for JavaScript (for better or worse... who knows!)


Not working for me in Chrome; works OK in Firefox. Could really do with a very basic guide on how to use the slideshow. Simplicity is good, but this is bordering on frustrating and irritating.



I thought we weren't supposed to use JavaScript directly anymore. That's why all the 'first intro to' tutorials for all the hottest frameworks are in CoffeeScript.


I am wondering what JavaScript would look like if it had been built from the ground up to be the ASM of browser languages.


Is there any estimate as to when this (mostly) awesome stuff will actually be implemented in browsers?


Firefox already ships with some (most?) of these features.


And according to [0], Chrome apparently has some (most?) of these features locked behind a big flag (seems to be "Enable Experimental JavaScript" in about:flags, no idea what's implemented though)

edit: http://kangax.github.com/es5-compat-table/es6/ - considering the sea of red, it's rather clear that both Firefox and Chrome only implement some ES6 features (note: this crashes Chrome 22 after enabling "experimental javascript", which may explain the big flag; also, the latest Firefox 16 improves quite a bit on the listed Firefox 13: it adds const, default params, rest params, proxies and all the new Number functions)

[0] https://news.ycombinator.com/item?id=4633069


Trying to read this on my Galaxy Nexus was a non-starter...


The first slide was a pita, but once you get past that it's OK. I swiped over the top half of the screen in portrait. Google Chrome beta.


It was clumsy, but worked well enough for me. Any idea what library is being used?



Fixed that for you ;)

  - First they said JS was not practical for building 'rich
    internet apps' because of performance issues combined with
    the well known maintainability problem with JS in large
    code bases.
  - Then they said it wasn't currently fast
  - Then they said it hadn't been fixed
  - Then it doesn't do multicore/GPU
  - Right every time
  - My advice: always bet on JS* being behind most other
    languages because it is a slow moving standards based
    language created in a dictatorial fashion by a committee
    with input from a few powerful entities.


I am sure glad that a small committee of people can use their influence to dictate the standards of the web. We get so much more progress and innovation this way and at such a higher rate of speed don't we?

We don't need no stinking competition in the space. The standards people have a few people in high places here and there that the management in your company will blindly listen to to keep the standards alive.

Long live the dictatorial monarchy of the all powerful omniscient governing body of web standards.


What's the point of "Map" and "Set" in Javascript? Objects can already act like either.


* Objects as maps have limitations: only string keys, and literal objects have a prototype so you inherit potentially risky properties/behaviors

* Objects as sets are kinda terrible: they're not syntactically awesome and you need to override #toString to make them work correctly. They also don't support any set operations, which sucks.


You can only have a string as a key if you use Object. You'd need to build a hash for the key then, and that makes things much more complex.


Also see http://www.devthought.com/2012/01/18/an-object-is-not-a-hash... from earlier about some pitfalls that you might not immediately consider.

(HN Thread: https://news.ycombinator.com/item?id=4629544)


Objects can only have String keys. Maps and Sets can have arbitrary object keys.
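A quick sketch of the difference:

  var o = {};
  var k1 = { id: 1 }, k2 = { id: 2 };
  o[k1] = 'a';
  o[k2] = 'b';
  Object.keys(o);    // ["[object Object]"] -- both keys collided
  'toString' in o;   // true -- inherited from Object.prototype

  var m = new Map();
  m.set(k1, 'a');
  m.set(k2, 'b');
  m.get(k1);         // 'a' -- distinct object keys, no inherited cruft
  m.size;            // 2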


In 8 years we can look forward to using this... Where will other languages be in that time???

Ahh committee decided standards.


We can already use quite a bit of ES5, and it's not so old. I'm sure the same will be true of ES6.

Also, you can compile from ES6 to ES5 or ES3 I would expect, plus there's also server-side tech like node.js.


Thx for sharing!


I was downvoted because I said something hollow. I shall take the lesson, and:

Thx for the down-voting! :-)



