Every day I learn something new... and stupid. (jwz.livejournal.com)
127 points by acgourley on Oct 15, 2010 | 61 comments



From the comments:

Ten days to implement the [Javascript] lexer, parser, bytecode emitter (which I folded into the parser; required some code buffering to reorder things like the for(;;) loop head parts and body), interpreter, built-in classes, and decompiler... Ten days without much sleep to build JS from scratch, "make it look like Java" (I made it look like C), and smuggle in its saving graces: first class functions (closures came later but were part of the plan), Self-ish prototypes (one per instance, not many as in Self).

That's from Brendan Eich, the guy who created Javascript. Guess you never know when your 10 day project might go big and become something like the assembly for the web. Release early and iterate often seems to hold up quite well here....


Holy fuck. He did all of that in 10 days?

Even with a case of Red Bull, a bottle of Adderall, and a heart full of courage, I doubt most of us could come close to that level of productivity. We'd get bogged down in the details. I know I would.


It's not clear to me whether he only coded for ten days, or whether he sat down, thought through all the algorithms and such, and coded at the same time.

Moreover, there were no social media sites like Twitter, Facebook, etc. at that time :)


No waterfall process at Netscape, or anywhere near me since the '80s. I designed and coded at the same time. That, plus lack of sleep, shows in some of the gaffes.

But arguably (Doug Crockford may have argued this) the whole process required JS to make more out of fewer, stronger primitives (first-class functions, prototypes). I know I didn't have time for much else, as I said at the ICFP 2005 keynote.

As I told Peter Seibel in "Coders at Work", besides lack of time, I couldn't add anything like (Pythonic, dynamic) classes. That would have encroached on Batman-Java; can't have JS-Robin-the-boy-hostage getting too big for the Netscape/Sun-1995-era-batcave.


That's really interesting. I think JS is better for those constraints. Seems like a classic disruption: something perceived to be a toy turns out to take over the universe. Some people may cling to the idea that Batman is the "serious" alternative... meanwhile Robin is installed on approximately every fucking computer in the world. It took 10 years to figure out how great the DNA that made it into JS was, but this is what makes web apps possible. We're extremely lucky to have it. Thank you!


I've found that there usually is a moment in a programmer's technical career where it just clicks. I find that what seems like productivity is almost always a direct result of a higher order of understanding.

An analogous situation is your average first-year PhD candidate. Initially, making some sort of contribution to the field feels overwhelming and almost impossible. But once you've spent a year or two reading papers and having coffee with the leaders in the field, everything comes together. That same PhD student starts to churn out quality papers every 6 months or so.

I think back to what it was like watching my dad program in Scheme when I was in high school. I got the same curious and overwhelming feeling then as I do now when working on certain areas of distributed systems. There's no reason to believe that the barriers that grownups face are any less insurmountable than the ones children do. :)


See also interviews with Douglas Crockford and Brendan Eich in Coders At Work for some interesting info.


Don't miss the comments on this one, since Brendan Eich (of "invented JS" fame) and Zawinski worked together at Netscape from the beginning.


The one I liked: jwz to Brendan Eich:

“I'm still bummed that I failed to talk you in to making #!/usr/bin/javascript work back then, because I think that we were still in the window where we had a shot at smothering Perl in the crib...”


The comments were more interesting than the article. I had no idea it was created in ten days (if I understood Brendan correctly). I also had a good laugh at jwz's comment:

Brendan's house and my nightclub thank us for selling out early and often.


It may be old hat at this point, but I'd recommend 'Coders at Work'; the interview with jwz goes into more depth.

http://www.codersatwork.com/

I didn't know about the incredibly short Javascript creation story until that book.


I am surprised that jwz wasn't aware of this.

For the record, I think it was a reasonable decision. If you can't make something work well then at least make it simple. JavaScript's treatment of numbers is extremely simple: every number is a double precision floating point number.
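
For the curious, a quick sketch of where that one-type design shows its seams (integer values stay exact only up to 2^53):

  var max = Math.pow(2, 53);  // 9007199254740992
  alert(max === max + 1);     // true: 2^53 + 1 rounds back down to 2^53
  alert(max + 2);             // 9007199254740994 -- representable values are now 2 apart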


Yes. While I understand annoyance with floating point accuracy bugs (having been burned by inch/mm conversion and pi/radian bugs before), numbers over 2^53 are large enough that they aren't likely to appear unexpectedly; if your domain needs bignums, you'll probably know it. So it's a reasonable trade-off.


I also don't understand what jwz's problem with the JS numbers is.

Once you have bit operators, you can do bit arithmetic. So yes, it's a bit slower than having pure integer types; so what? The JS engines were something like 100 times slower until recently, and nobody really, really cared until Google did V8.

I really think Brendan was right in his decision and I don't get what jwz would like, except to maybe have explicit types in the language, like, hm, JScript.NET or ActionScript.

But even there, as far as I know, ints are 32 bits. 64-bit OS use is still not so common, so at the time those languages were introduced it was the best engineering decision.

jwz misses the mark this time.


> nobody really, really cared until Google did V8

Tamarin and Squirrelfish were benchmark-battling before Chrome was publicly released. I think the current focus on performance was sparked by the CSS selector engine battle among the major JS frameworks.


SpiderMonkey (TraceMonkey in August 2008) was battling SquirrelFish (SquirrelFishExtreme in fall 2008) on untyped JS (Tamarin did well only if you added type annotations). V8 was "first" only behind Google's firewall, until Chrome was released in early September 2008.


jwz is a Lisp hacker, and he pretty clearly wants bignums, not explicit types.


Thanks! So he'd prefer slower execution but with less chance of accidentally losing the least significant bits? Protecting the programmer from himself? Or just having bignums for... what, actually? Playing with new crypto algorithms? Why does that have to be in the core language?

It's still a question of engineering trade-offs, certainly not as clear-cut as he presents it. In light of the ever-growing importance of small devices, the best direction to take is probably the one that will most often use less battery power, no matter what a programmer's ideals may be.


See https://bugzilla.mozilla.org/show_bug.cgi?id=5856, the most-dup'ed JS bug in bugzilla.mozilla.org. Or this fun blog post from 2004 that blames Apple ("Apple Flunks First Grade Math"):

http://www.mikeindustries.com/blog/archive/2004/08/apple-cal...

Finite binary floating point representations do not handle powers of five well. It takes much greater precision (bignums under the hood) to round the last digit properly when converting from binary to decimal (see the David M. Gay and Guy Steele paper and code for dtoa.c). Without hints from the JS hacker (Number.prototype.toFixed) the bad rounding shows up -- try javascript:alert(.1 + .2).
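
Concretely:

  alert(.1 + .2);              // 0.30000000000000004: neither .1 nor .2 is exact in binary
  alert((.1 + .2).toFixed(1)); // "0.3" -- the programmer-supplied hint mentioned above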

bignums (among alternative solutions) fix this bug.

bignums also are not necessarily a whole lot slower than doubles, since you can optimize to fixnums for common cases. JITs can do pretty well. It's not as if double is so fast, even with SSE4, that int doesn't still win.

But the main bug to fix is the rounding or powers-of-five inexpressiveness issue. It's a real usability problem.


Of course I know about the effects you mention and about dtoa.c. Still, thanks; it's important for the context.

I'm biased, as I actively use floating point calculations, and I don't know of a convenient representation that would be as fast as the one directly supported by hardware -- you can really get one FP addition per processor cycle(!) on modern x86 processors, potentially more on arrays. That is really as fast as ints (not counting the additional adders and shifters that are there for address calculations and can be used in parallel). But I'd also think mobile CPUs should be considered in something as widely used as JS. Anyway, from my perspective, fast indexable arrays are missing in a lot of modern languages, and I think AS3 did something about that.

And I believed bignums are lists of integers, so still not enough for .1 + .2 -- I thought for that we'd need rationals or decimal FP?


Yes, SSE4 has some great -- and parallel as mentioned by both of us now -- bandwidth to throw at adds. FP still hurts if you have to cross the channel from the CPU and back again, and you do with common JS.

Mobile is much worse. We disable SoftFP in SpiderMonkey by requiring modern-enough ARM, but Adobe can't in Tamarin (we both use the Nanojit back end). Really slow. Even real FP is not nearly the same as on SSE4.x.

Indeed bignum is a big-int format, but since there's no finite precision limit, you can do as someone in jwz's blog comments suggested and use milli-cents or whatever for currencies, and never suffer rounding. But you do have to scale.
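
A toy version of that scaling trick (my own sketch; the helper name is made up), parsing from strings so no double ever rounds the input:

  // non-negative decimal dollar string -> integer milli-cents
  function toMilliCents(s) {
    var parts = s.split(".");
    var cents = parseInt(parts[0], 10) * 100 +
                parseInt(((parts[1] || "") + "00").slice(0, 2), 10);
    return cents * 1000;
  }
  toMilliCents("0.10") + toMilliCents("0.20")  // 30000 milli-cents: exactly $0.30

Plain doubles keep integer arithmetic exact below 2^53, so this only needs actual bignums once amounts get astronomically large.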

EDIT: so to be clear, I agree that bignums don't solve the .1 + .2 problem. IBM favored IEEE754r to handle that case, but no one could agree on the exact integration or worthiness of that finite-precision decimal format.

Sam Tobin-Hochstadt's "fast, precise, rational: pick two" conclusion (cited by me in jwz's LJ in reply to someone pushing ratnums as all three) still holds. I think a case can be made for bignums on this two-out-of-three basis, but you'd still want double and you might even want decimal.

This need for several numeric types led us to work on value types, so library authors can extend the language with new numeric types (including operator and literal support), and the TC39 committee is not the bottleneck and the one-size-fits-nothing-well decider.


The same is true in Lua (only one number type, which is double by default), though you have the option to compile it to use a different number type instead.

http://www.lua.org/pil/2.3.html

In particular, "long double" on x86 can represent 64 bit integers without loss as well as being floating-point (64 bit mantissa, 15 bit exponent, 1 bit sign). Downside is that it's quite large: 16 bytes.


Also, Lua has a couple of bignum libraries available (a GMP wrapper, etc.); it's explicitly noted in the manual that using doubles for Lua numbers by default is just a pragmatic trade-off, and they can be adapted as necessary.


That's 10 bytes. Is it padded to 16 for alignment?


Yep. sizeof(long double) == 16, at least for me on gcc and Linux. Keep in mind that different compilers implement "long double" differently. According to Wikipedia, MSVC++ just makes it a synonym for "double," but most other compilers on x86 make it this 80-bit "extended precision" floating point.


With MSVC on x86, you can pass /Qlong-double to the compiler for it to "properly size" the long double type :)


This applies to numbers in JSON as well: if one end is JavaScript then you can't assume large integers will survive a round-trip.

(GWT emulates longs.)
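
For example (any JSON that lands on a JS double has this problem):

  var o = JSON.parse('{"id": 9007199254740993}');  // 2^53 + 1, say a 64-bit database id
  alert(o.id);  // 9007199254740992 -- silently off by one

The usual workaround is to ship such ids as strings.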


It's like an AI Koan: "One day a student came to Moon and said, 'I understand how to avoid using BIGNUMs! We will simply use floats!' Moon struck the student with a stick. The student was enlightened."

Better would be ... Moon then struck the student 100000000000000000-10-1000000000000000000+10 times. The student was enlightened


In a comment a bit over a year ago, I noted that JS was being increasingly used as an object code; I asked whether people thought it was suitable for that use.

http://news.ycombinator.com/item?id=792547

There was some disagreement in the replies to my comment, but the general consensus seemed to be that JS is very suitable for use as an object code.

But this article makes me wonder. Increasingly, we're compiling HLLs into an object code that only kinda-sorta has integer arithmetic. Is this a good idea? I'm dubious, to say the least. Certainly this property of JS puts some nontrivial constraints on the design of an HLL that can be efficiently compiled into JS.

In any case, interesting post.


Since I replied in that thread, here's an update. A year's subsequent experience has diluted my enthusiasm, though not too much (say by 20%). JS is an easy language to generate because it's so flexible. Some of its most loosey-goosey weirdnesses (like the arguments pseudo-array) turn out to work really well for representing other languages' constructs (such as, say, named args). The big downside is if you're trying to do anything computationally intensive. To the extent that you want code to do anything numeric or binary, you're in trouble - forced to work at too high and too kludgey a level. Of course JS wasn't designed for anything like this and the fact that we're even talking about it in this context is a testament to fantastic technical and not just market success.

A couple side notes...

Actionscript is the inverse case, an abominable language that manages to wreak incredible damage on something pretty good (ECMAScript) in surprisingly few steps -- mainly, as far as I can tell, by trying to turn it into something "proper", a.k.a. Java, a fate that JS itself was fortunately forced to avoid -- but it provides access to one level lower than JS does, and this is a big deal for some kinds of programs. We're probably going to use Flash (when available) as a computational accelerator for this reason.

Second, our results suggest that the real gamechanger here is V8. I know that most of the benchmarks out there show Tamarin and other VMs somewhat competitive with V8. Not in our world. I'm talking a couple of orders of magnitude difference. It's astonishing. If everyone would just use Chrome, we would have no performance problems at all.


> an abominable language that manages to wreak incredible damage on something pretty good (ECMAScript) in surprisingly few steps

Can you please give some specific examples? I'm curious. I think you mention yourself that you can get faster execution with it, so that's the positive side. What's the damage?


I gave a concrete example in a comment on jwz's blog, referencing my blog for the 2007 @media ajax talk I gave, showing how AS3's type annotations can slow things down. If you have a loop and annotate the loop control variable as int, but it flows into expressions that evaluate arithmetic, then per the unchanged-from-JS rules, you get IEEE754 double evaluation. This means widening, not wraparound, on overflow. It is non-trivial to optimize this to remain in the ALU. But the int storage type means stores have to be narrowed correctly from double to int.

Not annotating the loop control variable lets two things happen: 1) everything on the FPU, which if SSE has a lot of bandwidth in parallel to the integer units, which can still handle addressing and known-int chores; 2) tracing JITs can speculate, and type inferencing JITs can infer, that the loop control fits in an int, and all evaluation and storage can use int domain.
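
If it helps, here is the shape of those two regimes sketched in plain JS (my own illustration; in AS3 the narrowing comes from the int annotation rather than an explicit |0):

  var a = [1, 2, 3, 4];

  // untyped: an inferring or tracing JIT is free to keep i and sum
  // in integer registers for the whole loop
  for (var i = 0, sum = 0; i < a.length; i++)
    sum += a[i];

  // forcing int storage by hand (|0 is ToInt32): every store is narrowed
  // from the double-evaluated result back to a 32-bit int, which is the
  // extra work an int annotation imposes on mixed arithmetic
  for (var j = 0, total = 0; j < a.length; j = (j + 1) | 0)
    total = (total + a[j]) | 0;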


Yes, once you have type inferencing, starting from "just a number" is more convenient, and forcing the bigger number back to int everywhere, needed or not, is certainly problematic. Moreover, the fastest conversion (on typical hardware) of IEEE numbers to ints is not truncation but rounding. Excuse me for not knowing this, but does JavaScript today manage not to truncate in the places (if they exist) where rounding would be acceptable?


I wish!

Converting from double to int32 or uint32 in JS is far from a simple truncate or round, although it entails floor. From ECMA-262 5th Edition:

9.5 ToInt32: (Signed 32 Bit Integer)

The abstract operation ToInt32 converts its argument to one of 2^32 integer values in the range −2^31 through 2^31−1, inclusive. This abstract operation functions as follows:

1. Let number be the result of calling ToNumber on the input argument.

2. If number is NaN, +0, −0, +∞, or −∞, return +0.

3. Let posInt be sign(number) * floor(abs(number)).

4. Let int32bit be posInt modulo 2^32; that is, a finite integer value k of Number type with positive sign and less than 2^32 in magnitude such that the mathematical difference of posInt and k is mathematically an integer multiple of 2^32.

5. If int32bit is greater than or equal to 2^31, return int32bit − 2^32, otherwise return int32bit.

NOTE Given the above definition of ToInt32:

• The ToInt32 abstract operation is idempotent: if applied to a result that it produced, the second application leaves that value unchanged.

• ToInt32(ToUint32(x)) is equal to ToInt32(x) for all values of x. (It is to preserve this latter property that +∞ and −∞ are mapped to +0.)

• ToInt32 maps −0 to +0.

--- end snip ---

Note that this path is rare in code that does not use shift or bitwise-logical operators or certain built-in functions. See https://bugzilla.mozilla.org/show_bug.cgi?id=597814.
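
Those steps transliterated to JS, if anyone wants to play with them (a sketch, not engine code):

  function toInt32(x) {
    var number = Number(x);                                   // step 1
    if (isNaN(number) || number === 0 || !isFinite(number))   // step 2: NaN, +/-0, +/-Infinity
      return 0;
    var posInt = (number < 0 ? -1 : 1) * Math.floor(Math.abs(number)); // step 3
    var int32bit = posInt % 4294967296;                       // step 4: modulo 2^32...
    if (int32bit < 0) int32bit += 4294967296;                 // ...with positive sign
    return int32bit >= 2147483648                             // step 5: wrap into [-2^31, 2^31)
      ? int32bit - 4294967296
      : int32bit;
  }
  toInt32(4294967301)  // 5, same as 4294967301 | 0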


It's all the little things they screw up by trying to turn JS into Java. It seems like that's the way to 'improve' the language (add 'proper' classes, 'proper' typing). But the result is a mishmash. They've added stucco to an oil painting. Sorry for the lack of specific examples, but my way of dealing with the incessant irritations is to deliberately forget them.

Even the performance is misleading. It comes from one thing, the static typing, and having played that card there doesn't seem to be anywhere else for them to go. AVM2 bytecode is very interesting; if you try to optimize it you get counterintuitive results: it either stays the same or gets slower. (The one exception is the fast memory opcodes that were added for the abortive Alchemy project, but that's another story.)

The proper point of comparison is V8. Adobe had an eternity of a head start. They just bet on the wrong horse. Had they understood the problem more deeply they could have grabbed the V8 guys before Google even thought about it. In that case their head start might have turned into total dominance of the browser runtime. The biggest advantage Flash has -- 95%+ market penetration -- is one of the most valuable assets on the internet. (Is any single asset more valuable? I mean executables, as in IE counts but google.com doesn't.) Imagine if Adobe had done V8 before Chrome existed. The few users who didn't have Flash would have had to install it just to make web apps usable. People might not have bothered for a 2x speedup, but a 1000x speedup? I think so.

Edit: to be clear, I'm not talking about (heaven forbid) Flash apps. I'm saying that Flash could have become the standard runtime for web apps. They were on the VM performance track years before anyone else. But it's easy to see that they weren't thinking about this at all, because the facilities for communicating between Flash and the browser are unbelievably poor. (They actually marshal all calls into XML messages! Grrrrrrraaaaaaagh.)


> Sorry for the lack of specific examples, but my way of dealing with the incessant irritations is to deliberately forget them.

Pity, as the rest of us can't learn anything without examples of the errors.

> I'm not talking about (heaven forbid) Flash apps. I'm saying that Flash could have become the standard runtime for web apps.

And how were they supposed to reach that with only Flash, which runs the way it runs? How were they to make a whole browser that would be widely accepted? What do you think they should actually have done?


> What do you think they should actually have done?

What I said: make the fastest JS VM in the world years before others got started and make it cheap to call into from the browser. Then web apps could have used Flash's JS if it were available and just run more slowly if it weren't. Since V8 runs far faster than AS3 (in any measurement I've done), this was obviously technically possible. Had Adobe done this, Flash's extraordinary ubiquity would have meant we had access to something like V8 in nearly every browser out there, right down to IE6. Instead, what they produced was a dead end -- faster than the old JS implementations but not as fast as the new ones -- and a monstrosity called ExternalInterface that makes interaction between JS and AVM2 too slow for all but the most expensive computations.


> make the fastest JS VM

> ExternalInterface that makes interaction between JS and AVM2

So how were they to have a JS VM without those overheads, short of making a full browser? Note that Google doesn't manage that either; they just have the thing that inserts the whole Chrome window into an IE frame.

(Examples or specifics for each of your claims are still missing.)


gruseom's what-if sounds a lot like my ScreamingMonkey plan of 2007:

http://brendaneich.com/2007/07/new-projects/

https://wiki.mozilla.org/Tamarin:ScreamingMonkey

The idea was for Tamarin to be as fast as V8 on untyped JS, and (via the Flash vector) distributed and integrated with all IE versions via the COM ActiveScripting interfaces to the native DOM and browser objects.

This was harder than it sounds. Tamarin was not fast on untyped code and the work to make it so never happened. We at Mozilla paid Mark Hammond to do the COM integration with IE, but he ran into global object API mismatches and couldn't make progress without help from Adobe. ES4 failed and Adobe departed.

ChromeFrame is really a super-ScreamingMonkey, but will it get distribution on the scale that Flash has? If it did, I bet authors would target it. With IE9 there's less need for it, but IE8 on Windows XP seems likely to be the "new IE6".

I suspect something different will happen from the best-laid plans of Alex et al. for ChromeFrame, just as our ScreamingMonkey dreams died. Either Microsoft will manage to move people off of XP, or other browsers will convert most XP users away from IE8, or something I can't foresee will happen -- but not ChromeFrame ubiquity.


They could have started by not translating every single ExternalInterface call into XML and then back again. However, this is a useless argument: my thoughts are just a silly thought experiment and you're just being adversarial. Tell you what, have a free point on me.


Brendan responds on his blog to set the story straight: http://brendaneich.com/2010/10/should-js-have-bignums/


Some tea leaf reading....

So, Google built GWT as an abstraction layer over JavaScript which has, fairly effectively (I do a lot of GWT work), made JS little more than a really poorly purposed intermediate form akin to Java bytecode. GWT's pretty fast, but it's certainly limited by the need to express its compiled form in a language designed for humans. It also, interestingly, has a facility for breaking apart a codebase into multiple components for loading in the background.

Now, enter Google's Native Client... It allows a program to be expressed in a very low-level representation (a subset of the x86 instruction set) and interact with the DOM/Event model just as does JavaScript (please fact check this). The Google folks are now working on using LLVM inside the Native Client so that programs downloaded over the web may be expressed in its bit code as a means of platform independence. Hmmmm....so what's the difference between a bunch of bit code and a bunch of machine generated and optimized Javascript? Still a lot, but we're getting pretty close. One would have to send his own infrastructure (garbage collector, etc.) along for the ride with the LLVM Native Client scheme, but that seems like a surmountable problem. There's even a project to build a JVM/CLI impl on top of LLVM (http://vmkit.llvm.org/).

Google seems to be working toward making the browser into a sort of X server, and they're attacking the problem from both ends... an SDK with a community around it in a widely-used language (GWT), and a sandboxed, platform-agnostic runtime which will allow for a wide variety of languages/ecosystems to run upon it. With the exception of bleeding-edge games, whatever remaining value native applications hold for 95% of users will be entirely eroded. We can also observe Google's web printing initiative as yet another means of decoupling most people's computing needs from any particular full-fledged operating system.

What I don't understand is how Google will manage to get this scheme adopted as a standard. I believe Mozilla said they aren't in favor of it. And Microsoft certainly won't play along until they risk having no browser market share without it. End users don't care that their computing experience involves DOM and JS -- they just care that it's secure and works well. As a developer, I'd love the freedom of building apps in a language of my choice (presuming it can compile down to LLVM and run in the browser/Native Client sandbox).

Do people on HN think I'm nuts? Is the endgame of the web a universal application "player" as Microsoft has surely feared for over a decade? What is there to lose in such a scheme? One downside for Google would be that machine-readable content is required for search. But, search is so important for app/site developers that all involved would be willing to make accommodations (see Google's published scheme for allowing GWT/AJAX-y apps to be made crawlable).


The problem I have with GWT is that fundamentally Javascript is a better language, warts and all, than Java.


You've received several votes for your comment, and I respect that.

But I also respect those who prefer a language which offers type safety. One which does not lay traps of unexpected coercion and unfathomably unusual truth tables. I respect a language which can guarantee and enforce immutability at compile time. I respect a language which has had a sensible packaging system from the outset. One which served as the foundation for many as an introduction to object-oriented programming. A language whose primary implementation is based upon a powerful, performant virtual machine, and one which has had excellent support for concurrent execution of programs and components thereof for nearly a decade. I respect a language whose implementations are not wildly divergent. I am fortunate enough to work with a language whose underlying implementation is flexible enough to host dozens of both static and dynamic languages - including JavaScript itself.

What I do not respect are blanket unqualified absolutist statements and fundamentalisms. I understand that you may have found yourself frustrated at points during which you've written Java in the past. I would like to hear about these, and understand them better. But I'd also encourage you to avoid such totalizing statements, as for a variety of reasons, many may have very good reasons for enjoying things you do not.


Javascript and Java manifest their defects differently. Javascript has its own special deployment problems (largely due to history, not just intrinsic problems), but for the most part its defects surface as surprising behavior in certain specific cases (unusual truth tables, to be sure, but hardly unfathomable); these can bite you, but once known they can be avoided and worked around. Java's faults generally manifest as limitations on what you can do. This makes Java seem more polished, but the result in practice has been the generation of quite a lot of ugly and cumbersome code to work around those limitations, and this is not a good thing.

Java is not a bad language, but its limitations have led to a lot of bad engineering. Javascript is far from perfect, but its core elegance has led to increasingly sophisticated and increasingly powerful uses of the language. In another, say, 10 years the state of Java development will almost certainly continue the status quo of today, whereas the state of javascript development is likely to have considerably advanced.


  > but its core elegance has led to increasingly 
  > sophisticated and increasingly powerful uses 
  > of the language
Huh? What are these uses of JavaScript that are increasingly sophisticated and powerful? This JavaScripter wants to know!

  > whereas the state of javascript development 
  > is likely to have considerably advanced.
If JavaScript continues to advance at the lightning pace it's been advancing these past ten years we'll be in exactly the same spot we are now!


I'll let someone else cite sophisticated and powerful JS uses.

On the language evolution front: the http://wiki.ecmascript.org/doku.php?id=harmony:proposals features are very likely to be in the next edition, by end of 2013. Some are already implemented in Firefox.

http://wiki.ecmascript.org/doku.php?id=strawman:strawman contains the full laundry list of possible additions, but among those, http://wiki.ecmascript.org/doku.php?id=strawman:simple_modul... is worth calling out. The module system is likely to be a major feature of the next version of the standard.

JavaScript's standard stalled after ECMA-262 3rd Edition, but that was pretty much because of the IE monopoly and the death of Netscape. All web standards stalled, or went off to XML la-la land.

That was then (1999-2004). Since Firefox restarted browser competition in 2004, then with Safari, the iPhone, etc., and since 2008 with Google Chrome, which clearly provoked major work in IE9, things are moving again, and standards bodies (still dysfunctional in some hard-to-fix ways) are more balanced than ever in terms of vendor representation.

So yeah, the next ten years seem likely to be different from the last ten.

Not only due to browser competition, but also from the Bell's Law device shift to always-connected, instant-on mobile and away from desktop, indicated above via "Safari, the iPhone, etc."

Some fear this shift means non-interoperable, siloed apps and closed app-stores will dominate, but my money is still on the Web. The Web can evolve to have better apps and stores too, provided browser and Web platform markets remain competitive.


I realized after making this comment that some people, swannodette included, were not aware of ES5, which my pointing to Harmony overlooked. ES5 support is almost there in the latest browsers:

http://kangax.github.com/es5-compat-table/

It does not fill some of the big gaps left in JS (no module system, that's coming in Harmony), but it helps and it got the standards committee back together (sans Adobe).
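
For a quick taste of what ES5 adds (runnable in the browsers that table marks green):

  "use strict";                        // ES5 strict mode
  var point = Object.freeze({ x: 1 }); // ES5: Object.freeze
  alert(Object.keys(point));           // "x" -- ES5: Object.keys
  try { point.x = 2; }                 // assignment to a frozen property...
  catch (e) { alert(e.name); }         // ..."TypeError" under strict mode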


I fear this shift means non-interoperable siloed apps which happen to be written in JavaScript and deployed via HTTP. The open web of repurposable semantic markup that lives at a permanent URL is eroding.


> Huh? What are these uses of JavaScript that are increasingly sophisticated and powerful?

Uh, any web app? Gmail? Google Maps? The one I'm working on? Surely we don't need to make a list.

> If JavaScript continues to advance at the lightning pace it's been advancing these past ten years we'll be in exactly the same spot we are now!

That's incorrect. The language didn't need to change much. The two things that have changed are so major they couldn't be majorer: (1) it took people 10 years to actually figure out what they had in JS; (2) the implementations needed to catch up. While 1 may be more or less done, 2 is still in full swing. This will enable further innovation. It's not even clear we need major changes to the language itself. I'd be much more excited if the VMs were opened up to apps.


"What are these uses of JavaScript that are increasingly sophisticated and powerful?"

jQuery and Node.js alone are perfect examples.


I'm not in love with GWT because it's Java, although I probably am below average among HNers in my dislike of the Java language.

It has been rare for Google to introduce some piece of technology to the world that, even if it first seemed to be a one-off oddity, was not part of some grander plan which had yet to emerge. Take GOOG411 as an example -- it was a cheap way to get mileage on voice recognition tools for Android. Take Google Maps, Android, and GOOG411 all in isolation, and the company seems a bit nutty. But once it became clear that they were all part of a strategy for location-based advertising, everything began to make sense.

I look at GWT, the Native Client project, and Web Printing as a few similar data points. Clearly, Google is positioned to come out ahead as the Windows desktop loses prominence. So, what might we infer about Google's intentions given the historical cohesiveness of its technology strategies coupled with the existence of these projects? The point of my post was not to argue the merits of GWT but to ask whether it is logical to presume Google's intent is to replace JavaScript as the sole means of programmatic expression on the client.


Except for that one time when you need a crypto library no one's written in JS yet, so you just use GWT to make the Java one run in the browser.


I love that most of the Google Collections stuff can be used in GWT. Sadly, its predicates and transformers would benefit from the closures found in Javascript.


It's more accurate to say Google wants to make the browser into a Sun NeWS server, which did exactly as you describe, but with PostScript, not JavaScript. Sun also had a project to do this with Tcl/Tk. And really, that was one of the early visions for Java too. So Sun had three tries at this, and it never caught on. Perhaps Google will have more luck... Or perhaps it's a wild goose chase.


So basically, the only numeric type in JavaScript is a float, which can only represent integers up to 2^53 accurately. Not good if you need an int larger than that.


If you need an int larger than that in JS, you use a bignum library.
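
Or roll a toy one; the core of such a library is just schoolbook arithmetic on arrays of small digits. A sketch (unsigned addition only):

  // digits in base 10^7, least-significant first
  function bigAdd(a, b) {
    var BASE = 10000000, out = [], carry = 0, i, s;
    for (i = 0; i < Math.max(a.length, b.length) || carry; i++) {
      s = (a[i] || 0) + (b[i] || 0) + carry;
      out.push(s % BASE);
      carry = s >= BASE ? 1 : 0;
    }
    return out;
  }
  bigAdd([9999999], [1])  // [0, 1], i.e. 10^7 -- and it never loses a bit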


...yes, that would follow.

Ow. Ow. Ow.

At least the bignum library will be easier to write in JS than in C...


You don't have to roll your own, do you?


I've given up on reading jwz. De-CSS-ing doesn't help, and Readability only shows the comments.





