Heh, I was waiting for this post. Coming in the next 24 hours:
- "Why I use minimal semi-colons in Javascript"
- "ASI is broken but I like it"
Honestly I'm shocked at the defense of this practice (of ASI "abusage"). It speaks loudly to Jacob's (fat@github) ranking of ego-stroking and showboating over creating readable code, particularly for a library "Designed for everyone, everywhere" [1].
I'm confounded that such an attitude can survive in a large engineering organization like Twitter. Google's style guide [2] specifically prohibits this ("Always use semicolons.") and yes, this is the same standard we use internally for writing Javascript.
Just because you can doesn't mean you should. FFS, just get over yourselves and your hipster bullshit and use semi-colons.
What surprises me is not that Fat is a bit of a jerk (note, I "get" it, but I don't respect it), but that Twitter lets him keep being a jerk and actively hostile while acting as a representative of the company.
If he wrote Bootstrap by himself, he's free to call every single user an idiot if he wishes, but when it's being released as a Twitter product, one would think Twitter would demand some modicum of respect to be shown. Can you imagine a Googler acting this way on the Chromium or Android projects? Yeah, there are decisions made on those where people aren't happy, but the comments aren't laden with zingers either.
If Douglas Crockford comes down and tells me I'm doing JavaScript wrong, I would hope I'd show a bit more respect to someone who's most likely my better.
> What surprises me is not that Fat is a bit of a jerk (note, I "get" it, but I don't respect it), but that Twitter lets him keep being a jerk and actively hostile while acting as a representative of the company.
My observation of open-source community dynamics tells me that being hostile is an effective way to lead a large community. I've noticed that contributors act more carefully around someone who is mean to people in general and that these people attract wide followings at conferences. (There are certainly exceptions, though, like Larry Wall.)
> Can you imagine a Googler acting this way on the Chromium or Android projects?
Being nice also works, but if you compare Chrome's reach to this Javascript framework's reach and use that to scale the number of contributors, you'd expect a lot more contributions to Chrome.
I think fat is a jackass, but if he wasn't, I never would have heard about this project, which now has multiple front page articles on multiple tech sites. I imagine this is good for Twitter.
> I'm confounded that such an attitude can survive in a large engineering organization like Twitter.
I'm not surprised. A bit disappointed, but not surprised.
Have a look at the original OAuth specification sometime. Note how they misuse the word "key" to mean "identifier". I can't imagine that anyone writing a crypto specification wouldn't know that this would be confusing; I think it's more likely that the person just didn't care. Also, there's the fact that the whole thing could have been written as just a way of getting Basic authentication credentials, rather than making up a whole new authentication scheme.
Thank you. I agree both of these show a casual disregard for "how the real world works", where not everyone is a "rock star", some are lowest common denominators using the web as best they can (and good for them for even trying!), and some just need to get things done.
Every extra exception you require in a user's mind makes your work and the web in general just that little bit less accessible.
That it is being done for "style" in an arena of notoriously unsophisticated users, really feels like a giant middle finger.
Kudos to Google for their style guide freeing up this particular "exceptions" pigeonhole from a JS programmer's mind, so she can use the synapse to get something done instead.
Page state and uniform resource location, I feel, are utterly different. Hash bang shouldn't be which user's tweets we're looking at, or which article on Gawker.
Google's examples show the hash bang after the query. Twitter put the query after the hash bang.
"Do your best to never use a semicolon. This means avoiding them at line breaks and avoiding multi-statement lines. For more info, read Mislav's blog post [2]."
FWIW, I agree with the Google style guide. Maybe I'm interpreting it wrong, but ASI struck me as a fail-safe to protect coders that forgot the occasional semicolon and was later mis-identified as a feature. Having Brendan Eich weigh in on this is akin to having Thomas Jefferson pop his head into the Ninth Circuit and clarify an issue over Constitutional intent. If it were me, I'd listen to Thomas Jefferson.
> My advice on JSLint: don’t use it. Why would you use it? If you believed that it helps you have less bugs in your code, here’s a newsflash; only people can detect and solve software bugs, not tools. So instead of tools, get more people to look at your code.
Unless I'm missing something, that easily qualifies as the dumbest thing I've read on the net in the last week--and I've been to /r/politics.
You're not missing anything. It's dumb. JSLint doesn't find all the bugs in my code, but it does find quite a few. There are several places where I deliberately ignore its recommendations, but all in all I take what it says seriously. I'm sure you do likewise.
[2] has an inaccurate title. Semicolons are not optional, as is made clear by [2]'s final section that recommends putting ; before ( where ( starts a statement. Nothing optional in that case!
As I noted in my blog, it's not just (, either. ( is by far the likeliest source of trouble, but / + - [ all can serve as infix operators or prefix operators or (in the case of /) lexical delimiters.
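A quick sketch of the + and - cases (a hypothetical snippet, not from Bootstrap):

    // + and - silently continue the previous statement:
    var total = 5
    -1    // no ASI fires: this is still "var total = 5 - 1"
    +2    // and still the same statement: total ends up as 6

No error is reported anywhere; the three intended statements quietly become one.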
Which is easier to remember, the rules in [2] plus the full "when you must use a semicolon because it is not optional" rule? Or the rules that people who know C, C++ and Java follow?
YMMV, but it's no slam dunk, and going "light" on semi-colons risks disastrous bugs. Going heavy tires out readers a bit with chicken-scratching but carries little risk of adding bugs (the worst case is if(C);T -- oops, and empty statement linting helps).
As usual with programming, there is no substitute for thinking and practicing, learning from the bottom up, and avoiding bad religion.
No, "optional" means you can skip it, period, full stop.
False advertising plus module patterns plus concatenation equals big enough trouble for people to be outraged about the false advertising. Better to tell the truth, from the title down.
If you are using your brain fully, perhaps you can avoid firing the footgun. Is this the best use of your brain? I am not so sure.
How about this: I never lead a line with an infix operator, ever, and every other language I use does not require semicolons. Why would it be a bad thing to just pay attention to lines leading with (? Because JS is the exception, it actually takes more brain power for me to remember a semicolon on every line than only when I am leading with (, especially without a compiler complaining at me.
Would you admit that, at least for me, it's a YMMV type of thing?
Of course. I wrote "YMMV" and "I am not so sure" -- those explicitly expressed my doubts about the wisdom of absolutism on either "side" (there are several sides, actually).
NPM fans can use the NPM style well, I've seen it. Whether it scales to the masses remains to be demonstrated, but that may be true of any coherent style.
JS is used by many hackers who do not know all of its rules. It seems to work, mostly (an amazing feat, no?), but I have heard eyewitness testimony from people burned by statements starting with ( that ended up preceded by an unterminated statement, and the kingdom was lost for want of a ;.
Under maintenance, edits tend to make Murphy an optimist. I suspect the NPM crew selects for the best and gets the best. How that would hold up at scale with regression to the hacker mean is an open question in my mind.
As the founding member of the npm crew, I'd like to provide some history.
I started using this style originally because I -- not new to JavaScript, and adhering to the "semicolons everywhere" style -- was bitten twice in rapid succession; first by a leading [ causing a function to receive "undefined" as an argument (instead of two arrays), and then again by a leading ( resulting in a non-function being called. Both were unnecessarily difficult to debug, because my brain couldn't see the error. (Even when JSLint complained, I thought it was lying. ,] looks a lot like ], at first glance, and is valid JS. Maybe I'm a little dyslexic or something.)
So, I thought, "New rule: Commas first, prefix ( and [ with semicolons." Problem solved.
But ending a line with a semicolon seemed kind of silly when I was then prefixing with a semicolon the next line, even though I'd learned that it was absolutely necessary for any kind of "safety" (even as an expert, even as a semicolon user). So, I thought, if I'm doing this silly prefixing thing to make ASI safe, why not just let it be safe, and drop them everywhere?
So I did. And it's pretty nice.
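For concreteness, the two failure modes above look roughly like this (a reconstruction, with hypothetical identifiers):

    // leading [ : the bracket indexes the previous line's value
    fn([1, 2]
    [3, 4])
    // parses as fn([1, 2][3, 4]); the comma operator makes that
    // [1, 2][4], which is undefined -- fn gets a single undefined arg

    // leading ( : the previous line's value gets called
    var x = notAFunction
    (function () { /* ... */ })()
    // parses as var x = notAFunction(...)(), a runtime TypeError
    // nowhere near the real mistake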
I don't understand why people get so upset about how I write programs, especially those that they don't even use or contribute to. The first time a professional JavaScripter cursed me out at the top of his lungs in a bar, literally spitting in my face while he screamed at me, I realized that there was some essential human weirdness involved here, and it became much too interesting to drop. Every time I ask people to please not lie about JavaScript to newcomers, they accuse me of being dogmatic or a semicolon hater.
But I maintain the JavaScript in Node.js as well, which follows Google's (occasionally insane) C++-inspired semicolon-ridden JavaScript style, and I'm fine with that. It's kind of nice, too, in a different way. More dots. Vim's auto-indenter works a little better.
What's wrong with just not lying? I don't get it at all. People who already know C and Java are usually veteran enough to grok weird language warts. People from Python and Ruby have an easier time learning the exceptions than sticking to the rule, since their habits are usually to omit the ceremonial EOL punctuation. And newcomers don't have any preconceptions anyway; they're busy learning why we call "this" a string, and why this is always something different, and crazy shit like "a function is an object and you construct an instance of it with new which calls it on a this that inherits from its prototype". (I mean, seriously, read that sentence to a nonprogrammer, and ask them what it means. It's gibberish!)
But people react to it like I've insulted them personally, when I just ask that they not fill newcomers' heads with lies and fud. I still don't understand it.
I'm having trouble imagining your examples. The first one sounds like you did something like
    fn([1,2,]
    [3,4])
(but presumably longer) and a comma first would fix that (though so would believing jshint or using a debugger :), but a leading semicolon obviously couldn't be used there. I can't picture the second problem, though, at least not in a way that a trailing semicolon wouldn't have fixed.
On the topic of interaction with other developers, this:
> I don't understand why people get so upset about how I write programs
is unfortunate and is probably the biggest problem in this thread (and the previous one). At this point, I've accepted that some people like comma and semicolon first, and if that's the style of a project I'm contributing to, I'll say "this is really dumb" to myself and then write it in that style anyway. It's correct Javascript, and I think plenty of projects are stupidly designed, or inscrutably written, but at some point you suck it up or go elsewhere. It's just a style.
However, there are some red flags in your post that you should think about if you're interested in some dialogue. This is based just on this one post, and I don't think we've ever actually met, so I don't want to go too far here, but statements like
> I realized that there was some essential human weirdness involved here, and it became much too interesting to drop
> Every time I ask people to please not lie about JavaScript to newcomers
> What's wrong with just not lying? I don't get it at all.
> I just ask that they not fill newcomers' heads with lies and fud. I still don't understand it.
sound like pretty classic trolling for the response you're getting. I'm a firm believer in being precise when explaining topics (especially to newcomers), but there's a difference between trying to correct people that are saying incorrect things and trying to provoke a reaction out of them.
(To his credit, Crockford is actually good about saying "this is valid javascript, but it's a developer anti-pattern" and then explaining why he thinks so. I just disagree with about 25% of his anti-patterns and, of course, he's usually kind of a jerk about them)
I find it odd that they (Github) don't really bother to justify their stance on this. The guide just says "don't use them", but not why they find it in their interest to do so. It'd be interesting to get a better understanding of why they think this.
Personally I don't see the point in omitting something that leads to known problems instead of including it and encountering zero problems.
Aren't the GitHub team members rather youthful, in general? Skilled, yes, and also opinionated and outspoken. Things might change as they get older and settle into a more balanced lifestyle.
That is what surprised me a lot -- that there would be so much defense of the practice. I work on C/C++ for a living in an enterprise environment, and there are style rules followed diligently. Everyone knows that there can be a lot of cool/hipster code written in C/C++, but no one does it because of maintenance concerns. Writing code is much easier than maintaining it.
What are the universally accepted style rules for C/C++? Every organization I have worked in has had to develop their own guidelines around style in C/C++.
What really surprises me is that such a large number of people are quick to attack someone for writing legal code simply because they disagree with the style used. It feels awfully close to attacking someone for using a different bracing style from your preferred bracing style.
> What are the universally accepted style rules for C/C++?
Whenever I've been asked this, I've pointed people at the JSF-AV coding standards or MISRA.
> What really surprises me is that such a large number of people are quick to attack someone for writing legal code simply because they disagree with the style used.
The important part of the lifecycle of a piece of code isn't the writing of it. It's the maintaining and rewriting that matters. If one coding style makes it easier to introduce errors than an alternative, that's a bad thing. It's not an aesthetic concern. Semicolon-less JS is demonstrably less robust.
The feature in question is intended as error correction for those cases where the programmer accidentally left out a semicolon. A misguided feature maybe, but it's clearly being abused by intentionally leaving out every (or most) semicolons.
> Honestly I'm shocked at the defense of this practice (of ASI "abusage")
Devil's advocate here: could one argue that taking advantage of ASI everywhere can make it safer, because it forces you to be fully aware of ASI?
Suppose you always put in semicolons. You can STILL get bitten by ASI. The classic example is
    return
    {
        ...stuff...
    };
ASI is going to put a semicolon right after the return and break your code.
Someone who strives to take advantage of ASI everywhere they can is going to remember that ASI is going to apply on that return, and they will code the above so as to take it into account.
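And the ASI-aware version of that return is simply (a minimal sketch):

    return {
        // ...stuff...
    };
    // keeping the brace on the same line as return prevents ASI
    // from terminating the statement early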
Now that you mention it, every language should have a "Zen" statement, or if you don't want to call it that, a short description of its philosophy. This in addition to the concise problem/solution statement that probably every piece of software should have.
I really don't understand how a particular usage of semicolons and logical operators is indicative of ego-stroking, showboating, and "hipster", as if the word hipster even means anything at all. If fat _said_ something to make you use those words then that would make sense, but as it is I'm very confused.
Simple. Look at the other comments, and it becomes apparent there are downsides to relying on / abusing ASI, and no advantages. (The only claimed advantage is that the lack of semicolons is visually appealing, but this is nullified by the fact the rules are sort of confusing and not apparent to all contributors.)
Thus people are inferring this fat fellow is doing it only out of illogical, ego-based personal preference, because nothing else makes sense.
I’m not saying they are right — I couldn’t possibly know — but given the facts, it’s a reasonable assumption.
FIY the main claim is that it's a more reliable way to write js: programmer mistakes/missing semi-colons are more easily noticed, and much less common, since the rules are simpler. It's there in every debate about this.
I misspelled FYI - "for your information". You said there are only downsides, and no advantages; I just told you what the claimed advantage is for "abusing" ASI.
I wasn’t. Apologies if it seemed mean-spirited — I only wanted to understand your point. I’m probably being thick, but I still don’t quite follow. It seems like you’re saying that if you leave semicolons out (relying on ASI), this makes “mistakes/missing semicolons easier to spot”? That doesn’t make sense to me. :-/
I am currently writing a style guide (for a small agency), and I have added semi-colons as preferred for JavaScript. And abusing the && operator like in the now infamous code snippet is definitely out.
Our code needs to be maintainable for inexperienced developers. I don't want to make it easier for them to make mistakes and I don't want them to ponder rare (though valid) syntax. I feel this is a valid reason to maintain this rule.
    // This we can all understand
    clearMenus();
    if (isActive) {
        $parent.toggleClass('open');
    }
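For contrast, a sketch of the &&-style equivalent this rule disallows (mirroring the snippet above):

    clearMenus()
    isActive && $parent.toggleClass('open')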
Once a developer understands ASI and other details he is free to feel elated with his new-found knowledge, but I'd prefer if he chooses to know rather than to apply these powers.
"ASI [“Automatic Semicolon Insertion”] is an error correction
procedure. If you start to code as if it were a universal
significant-newline rule, you will get into trouble."—Brendan Eich
I think the other thing that goes unmentioned is that a single semi-colon on a line in a file would pretty much end this debate.
Perhaps I've worked too much in teams, but surely "path of least resistance" has to factor some, right? I mean, if the choice is add a semi-colon to a line in a file versus asking somebody else to rewrite the compiler to be able to accept it, surely common sense would just be to add the semi-colon.
It doesn't mean that you're wrong, or that your code is broken, or that Crockford is smarter than you... but now you have code that works and compiles and that people can use.
Sure, it is the easier solution right now. But in the long run that would mean Crockford would be able to dictate the style of Javascript to be whatever the hell he wants.
Given how important Javascript is now (and will be in the future), we cannot allow any one person to retard the entire future.
Anyway, use Google's Closure Compiler -- it can actually handle Javascript.
I think that's a visceral reaction based on his rudeness.
So long as he keeps building tools and utilities that people rely on, and that people use, he is at least influencing the direction of Javascript and, knowing that, I'd think one might take heed when Crockford says "this will break in the future", as he happens to have more insight into the language than most.
I understand that he wasn't polite, and I certainly understand the desire for the 'asshole' to be proven wrong. And maybe he is. Of course, that doesn't make him wrong, and it doesn't mean that he's "retarding the future" either, necessarily.
But the fact that jsmin is perhaps the most widely deployed JS minification utility does matter, whether anyone chooses to accept it or not.
And if you really think that he is retarding the future, the correct answer probably isn't to bicker about it on the internet, but to create a better compiler (or work on getting a better compiler accepted) as de facto.
At the end of the day though, Fat is the guy who has to deal with all the tickets talking about 'This doesn't compile in jsmin', and I'd think his life would ultimately be better off if he just added a semi-colon.
Of course, that's my opinion only, so take it for what it's worth.
But of course, we won't break existing code like those two lines from Bootstrap in the future (not per the promises/concurrency strawman, anyway). This was a point my blog took pains to make.
Doug may have forgotten the [no LineTerminator here] restriction, or he may not want it to the left of infix-! for promises, but I am certain that the whole of TC39 will not agree to such a breaking change.
We don't need to work on a better compiler than JSMin, because we already have three -- UglifyJS, Closure Compiler, and YUI Compressor. They are all better than JSMin by any metric you choose, and they also don't break on automatic semicolon insertion.
That doesn't sound right to me. Making a language's syntax more liberal is not a sane way to correct errors. If anything, JavaScript's ASI is an error creation mechanism.
"Here is your new apartment. You must use a key to enter it. If you forget your key, you can buzz the superintendent during business hours, and if the super is home, (s)he will open the door for you.”
If you try to take your key with you at all times, the super will save your bacon once in a blue moon when you forget your key. On the other hand, if you think of the door as only needing a key when you wish to have it opened on nights and week-ends, you are on your way to trouble.
This is not a good analogy. More like the superintendent will let you into a different apartment, or let you into your apartment and then punch you in the face. You just don't know.
But you don't need analogies to understand why ASI is a bad idea. ASI means that when you leave out a semi-colon you get unexpected behavior instead of a syntax error. Failure is always better than the unknown. At least then there's a chance the bug will be found and fixed.
Trusting artificial intelligence to make up for human stupidity doesn't look like a very good idea to me.
ASI is supposed to kick in in exceptional scenarios where you would otherwise have made a mistake and don't want to be nagged about it. And that is supposed to be rare.
Now if you make the exception the norm and expect tools to make up for bad practices, then it's not going to help.
And this is why I think Python's forced indentation is in some ways bad: it makes the code from a bad and a good programmer look the same. Merely forcing code indentation won't magically transmogrify a bad programmer into a good programmer. There are many things to good programming, and indentation is just one of them.
Forgiving or masking bad practices, or making them look good, doesn't help in the long run. It only encourages such behavior. I am sure bad programmers can slip into these communities more easily than elsewhere, because they are difficult to flush out and their mistakes are often forgiven or made to look good.
The difference is that if you forget your key, you know that you forgot your key, and you have to actively call the superintendent to have them "save your bacon." With ASI, you might never know that you had forgotten your key.
I agree that it would be nice to have something like "-Wall" that could warn you that you're doing things the "wrong way" but, uh, isn't that what JSLint is?
That's a terrible analogy. What's the analog for receiving a syntax error instead when you "forget your keys"? The analogy gives you bizarre and wrong choices and tradeoffs.
As usual, analogies do more harm than good, especially when understanding a relatively simple technical issue.
You forget your keys, the super doesn’t let you in, you’re locked out. That’s the equivalent of a syntax error.
On another topic, this isn’t really a technical issue, it’s a people issue. Everyone understands what the JS interpreter’s behaviour is, what Bootstrap.js does, and why JSMin doesn’t minify it. What is being discussed here is what choices people make to please themselves and others as opposed to the compiler.
If @fat didn’t care about people, he’d use semicolons and JSMin would compile Bootstrap. If Crockford didn’t care about people, JSMin would compile Bootstrap just the way it is. When we’re talking about multiple ways of writing code that does the same thing, it’s almost entirely about people and not technical considerations.
Yes, he's acknowledged many times (including in this post) that it was a mistake. That was the intention, but it wasn't successful. It should have gone all in (no semicolons at all) or not have existed.
I think it was a reasonable decision at the time. The idea was that JavaScript was not going to be some language people wrote hundreds of thousands of lines of code in, and that the target audience would become frustrated if the language didn't "help" them. For example, if JavaScript truly were to always require semicolons at the end of lines then that means this would be a syntax error:
    <a onClick = "someFunc()" > ...
    > Error, missing semicolon after ")" on line 1
That could get pretty annoying pretty fast, and simple little event handlers like these were probably the main use of JavaScript for the first x years. It also matched the existing philosophy of HTML which was to really go out of its way to make sense of whatever was in the file.
Now of course, the constraints are different: people want to use JavaScript for huge codebases and serious projects, so it makes sense that these decisions are no longer appropriate.
Given that these are all formal languages, though, isn't that equivalent to saying that ASI is, formally speaking, part of the language's syntax? It's all just parsers! Apart from the social conventions around them, of course.
No, there's an important distinction. ASI does not even kick in without a syntax error.
Yet the "expectation of ASI" or (I think more likely) "expectation of newline significance" makes people believe that they'll get a ; inserted by separating two things by one or more newlines.
Most languages do not specify error correction. HTML5 of course does; CSS too; among general programming languages it's much less common. The spec for ASI does not fit in the tried-and-true LR(1) formalism used by ECMA-262. Parsing is not all ad-hoc or equally well-formalized and proven.
In addition to ASI, ECMA-262 has to use lookahead restrictions and a bit of semantic checking to cope with what could be purely syntactic concerns (say, if it could use GLR instead of LR(1)).
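One concrete example of those restrictions (this one appears in the spec itself): the grammar forbids a line terminator before postfix ++/--, so the token binds forward instead:

    a = b
    ++c
    // parses as: a = b; ++c;  -- the [no LineTerminator here]
    // restriction before postfix ++ forces a semicolon after b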
I suppose I'm thinking of that as implementation details; from the perspective of languages, it appears ECMA-262 does specify a well-defined language. Any input string is either rejected (not in the language at all), or is mapped unambiguously to an abstract syntax tree. So from that perspective, any sequence of characters that gets you an abstract syntax tree is a program in the language! How precisely it gets mapped is "innards of the parser" if you view languages as just string->AST mappings.
The error-correction view seems to want to add a third category, strings that are in some sense "errors", but nonetheless get unambiguously mapped to an AST. Which is strange from a classical formal-languages view, because if a string gets mapped to an AST, it's in the language, and the procedure that mapped it constitutes the parser! That category seems more like "warnings" to me, i.e. you probably shouldn't do this, but it will produce a program if you do.
Yes, ECMA-262 completely specifies (modulo bugs) sentences that are accepted or rejected, and for those accepted, their meanings.
But that doesn't alter ASI's error-correction nature, which is not an "implementation detail" -- it's in the spec and all too observable.
You're right, it has the character of a warning system, like Dart's unsound "types". But if it had been noisy (consoles in the early days were costly), too many developers would have ignored the warnings, and users would have paid for the overhead.
Your concluding sentence is spot on, I agree people should use semicolons in JS. Relying on a Ruby-like coding style in the large is way too risky.
I'm curious: does ASI actually hurt the performance of JS parsing? I mean, will there be any performance difference between code that properly uses semicolons and code that relies on ASI?
It's interesting because the 5.1 specification doesn't say it is an error correction mechanism, it states: "For convenience, however, such semicolons may be omitted from the source text in certain situations".
Just because you can doesn't mean you should. A majority of the JS code out there has semicolons. Key individuals on the committee responsible for furthering the development of the language recommend always using them. Not including semicolons has been shown to result in code errors. It can also introduce build errors when your code is minified, and it creates mental overhead when other developers try to read your code. I just don't get it. Why fight it? Then again, I'm coming from a C/Java background, and they just seem so natural to me.
I suspect it is a matter of background, as you imply. Twitter has its roots in Rails, and I believe they also use Scala heavily. Both Ruby and Scala use newline as a statement terminator. GitHub is another Rails shop that avoids semicolons in JS. Zepto.js recently removed the semicolons from its code base, and its author was a core Rails developer.
Is it me, or did the author tack on a new topic regarding the use of && and ||?
The fact that they return the controlling operand is quite useful. Consider a situation where you want to check the property of an object, but don't know if that object is null:
    var foo = obj && obj.bar;
I find this much more readable than:
    var foo = null;
    if (obj) {
        foo = obj.bar;
    }
You can accomplish the same thing with a ternary operator, but it's longer and, to my eyes, less scannable:
    var foo = obj ? obj.bar : null;
Also useful when setting optional parameters:
    function myFunc(foo, bar) {
        foo = foo || {};
        bar = bar || 'default_value';
    }
(take care however: the empty string evaluates to false, as does [] and 0)
Your last parenthetical point is exactly why I'd avoid those uses of && and ||. Write a default_to() function (or a method in languages that allow methods to be defined on the null object) that wraps all those corner cases up.
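A minimal sketch of such a helper (the name and behavior are one plausible reading of the suggestion, not an established API):

    // substitute the fallback only for null/undefined, so that
    // '', 0, and [] pass through unchanged
    function defaultTo(value, fallback) {
        return (value === null || value === undefined) ? fallback : value;
    }

    // e.g. inside myFunc from the example above:
    bar = defaultTo(bar, 'default_value');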
Your post has made me realize, however, that I'm not sure whether the language I'm reimplementing is meant to be value-preserving in the case of those operators when an explicit coercion from a type to Boolean has been defined. Shows how often they're used like that in production code, I guess.
The eternal fires of high level bike shedding: semi-colons, whitespace and curly braces, no programming discussion or language can escape all three.
Be liberal in what you accept and conservative in what you send (i.e. just use semi-colons) and your systems will work and play well with others being interoperable.
True, but that is a language; products and implementations sometimes need to be more liberal, since they are the front lines/interface/API -- even falling back on broken standards or less clean implementations (URL over version header on RESTful APIs, for one example) to work in some cases where it's absolutely needed, even if that makes it harder to do so.
Shipped software should try to work or be usable in the shipped format; get funky behind the product layer if you want.
Here, the minified JavaScript that everyone uses should just work with the current infrastructure. If it doesn't, provide the tool that fixes the problems for people to ship with.
This whole argument is silly (the semicolon-in-JS argument, not this post itself). JavaScript was built for semicolon use; use them. If you don't want to use semicolons, then at least make sure you aren't breaking major libraries that are in mass use. If it does break something many people use, then work with the creator to fix it. If he refuses to fix it, then ignore everyone using those libraries, provide a weird workaround, or just use a semicolon! Personally, I've created a branch off of Bootstrap and another library (can't remember off the top of my head) that adds in semicolons so as not to break JSMin, which is critical to my deployments of websites.
You haven't been around long enough, or just aren't cynical enough; I went to bed knowing that there would be at least 5 more posts about it in the morning.
If it's anything like the "Email is dead" one, we've got some more coming. We need at least two arguing for Crockford (one with a liberal amount of "hipster" name-calling), someone arguing for @fat, and @fat eventually explaining himself (because it would be a cold day at Oracle before Doug Crockford said anything other than "you suck").
If I see people errantly slinging around "hipster" for any reason, let alone to pettily attack people who prefer a certain style, well, I'm going to be even more annoyed than seeing more god damned "email is(n't) (the best thing/going to screw your mother)" posts.
And you thought one of those five would be from JS's inventor, participating in this thread? I didn't expect that -- makes me feel HN is still a great resource.
Quotes:
On the use of '&&' instead of an 'if' statement: "If you were really having fun with it you could lose the if all together... Each is perfectly valid. Each behaves the same. It’s just a matter of preference and finding the style that makes most sense to you."
JSLint is described as an "unnecessarily strict linter".
"The majority of lines however don’t end with semicolons because they simply aren’t necessary and I prefer the minimalist aesthetic. For me, \n character is enough and the semicolon character is redundant."
... IMHO: Don't be a JavaScript hipster. Add semi-colons.
The purpose of the semicolon in javascript is to eliminate ambiguity. Choosing not to avail yourself of it is like driving without a seatbelt -- of course it's possible, but I always thought the job of a programmer was to be as unambiguous as possible.
You're not the only person reading your code. Your code isn't just being parsed by five wholly different javascript engines, it's also being parsed by the brains of every programmer who chooses to read your code.
Expecting others to learn a few stupid rules and lame syntax hacks in order to understand and match your style seems moronic. And defending the decision by noting that "the spec permits it" is doubly so.
When I program in C-syntax languages, I always surround loop and conditional blocks with braces, even if they contain a single line of code. Even though these languages allow braces to be omitted, I consider it bad style and error-prone (although I have never seen a real bug where a programmer made the error of including multiple statements without braces). The languages do allow this style, and however distasteful I personally find it, I would expect compilers to handle it properly.
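The hazard being braced against looks like this (a minimal sketch):

    if (done)
        cleanup();
        notify();  // indented like part of the branch, but runs
                   // unconditionally whenever this code executes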
I think the same should apply here: the minifier should handle the language as specified, and if it can't, then it has a bug.
I wish I had made newlines more significant in JS back in those ten days in May, 1995.
I wish Brendan were not as skilled a hacker as he was. It's an amazing achievement that he produced Javascript in such a short time. Unfortunately, it ended up having the sorts of flaws that are inevitable with such a timeframe, yet it was "good enough" that everyone decided it was easier to live with the flaws than to break compatibility and fix them. If the initial version of Javascript had been worse, we might be using a much better version today.
If javascript were any worse, it may have been sidelined entirely before it had a chance to improve, and we would be using proprietary application stacks like iOS even more heavily for client side development.
JS has improved a lot in 17 years. It's still incumbent, and with ES6 and further Harmony work, it is still improving (much of ES6 is in V8 and SpiderMonkey, or coming very soon).
Good luck displacing JS. I mean that sincerely, especially if you work in the open. Dart had a rough start but even ignoring that, I don't think it will succeed. We'll see.
This looks like a good place to fulfill a promise I made to report back if my hopes were dashed about Dart (http://news.ycombinator.com/item?id=2989686). They were dashed pretty quickly. I've seen nothing compelling enough to prefer Dart to a fixed-as-much-as-is-practical JS. Especially if such a JS becomes more suitable as a compilation target (better numerics and so on).
If a major framework breaks a major toolchain and they both disagree on a fundamental principle, then neither promotes pragmatism or "Get Things Done".
Lars Bak, who leads Chrome's V8 team and was tech lead for Java's HotSpot VM, argues that JavaScript is a competitive 'bytecode' even compared to Java or CLI bytecode [1]. Among other things, Bak claims that JavaScript source is more compact than traditional bytecode.
[1] about 3/4 into this interview: http://channel9.msdn.com/Shows/Going+Deep/Expert-to-Expert-Erik-Meijer-and-Lars-Bak-Inside-V8-A-Javascript-Virtual-Machine
I'm personally in favor of leaving JS mostly alone, especially at the syntax layer, and then let transcompiler solutions like CoffeeScript evolve to relieve the syntax burden and add semantic improvements.
CoffeeScript is far from perfect--it has syntax quirks of its own--but it's more pleasant for me to write in, being used to Python and Ruby. (And, yes, I understand JavaScript too; I just don't like the syntax.)
Eventually transcompiler languages will evolve to take advantage of different JS engine improvements. So far, this isn't a goal of CoffeeScript, but other abstraction layers might already be doing that.
> Eventually transcompiler languages will evolve to take advantage of different JS engine improvements.
I wonder what opportunities there are here that haven't been exploited yet and that don't require replacing JS with a new language. For example, could JS implementors define a more-easily-optimizable subset of JS? Then transpilers seeking performance could target just that subset.
It's an interesting idea. My assumption is that stronger JS engines already do a good job of optimizing simply-written JS, whether the JS is hand written or produced by a transcompiler.
Transcompilers that were target-specific could probably target server-side JS at first. e.g. If you're just building a node.js app, there's no reason for the polyfill language to make any concessions to IE, for example. So one "optimization" is simply avoiding legacy cruft in the generated output. But I could also see exploiting specific features of cutting-edge JS engines.
One of the stated reasons for Dart is that the V8 team was hitting a wall with making JS fast, beyond which a language with cleaner and more optimizable semantics is needed. I wonder, though, whether the nature of that wall has been clearly written up anywhere. I'd like to know if there are tricks in the category of "This would be annoying to write by hand in JS, but would be easy for a compiler targeting JS" that could be used to get around it.
> Transcompilers that were target-specific could probably target server-side JS
I had an idea recently that (to me at least) is super exciting: someone should make a good language that compiles (à la Coffeescript/Parenscript) to JS but also to Lua. JS and Lua are close semantically, so it might not be so hard. (If it did turn out hard, it probably wouldn't be worth doing.) That would be a really interesting server-side alternative to both Dart (whose philosophy appears to be "run our VM on the server and compile to JS for the client") and Node.js.
JS is becoming the compiler target "bytecode", without needing verifiers or new and complex standards to be adopted by multiple browser vendors.
JS has gaps as a target language, for sure. We are working on filling them (e.g. 64-bit and other int types).
These are easy bugs to fix compared to creating a new, portable, and future-friendly bytecode standard in addition to keeping up with a competitive JS engine. Since JS is incumbent, it's hard for any browser vendor to justify a new thing with zero users at first, and too much risk of non-standardization or JVML-like albatross status.
Languages should either (1) treat newline as a statement delimiter (as in Python) or (2) not do so and instead treat some other character (e.g. ";") as a statement delimiter (as in C etc).
JavaScript falls between two stools and that's bound to cause problems.
Just use a fucking semicolon. Seriously, it's like people are embarrassed to write JavaScript. If you ever work with other people this shit doesn't fly very far.
- "Why I use minimal semi-colons in Javascript"
- "ASI is broken but I like it"
Honestly I'm shocked at the defense of this practice (of ASI "abusage"). It speaks loudly to Jacob's (fat@githib) ranking of ego-stroking and showboating over creating readable code, particularly for a library "Designed for everyone, everywhere" [1].
I'm confounded that such an attitude can survive in a large engineering organization like Twitter. Google's style guide [2] specifically prohibits this ("Always use semicolons.") and yes, this is the same standard we use internally for writing Javascript.
Just because you can doesn't mean you should. FFS, just get over yourselves and your hipster bullshit and use semi-colons.
[1]: http://twitter.github.com/bootstrap/
[2]: http://google-styleguide.googlecode.com/svn/trunk/javascript...