I hope the question doesn't come across as flamebait, because I'm genuinely interested in the response here. My background is mainly in the largely semicolon-less Python and Ruby. This may have impeded my understanding of the issues involved. :^)
ECMAScript-derived languages have inherited many syntactical conventions of C-style languages (e.g. braces to demarcate blocks). However, while semicolons are required to terminate a statement in C-style languages like Java, they are optional in modern variants of Javascript and Actionscript, where a line feed is considered to imply a semicolon if the statement on that line is complete.
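To make it concrete, here's a minimal sketch of the kind of semicolon-free code I mean (the names are made up purely for illustration); each line break is taken as the end of a complete statement:

    // Each statement is complete at the end of its line,
    // so the line feed acts as the terminator.
    var greeting = "hello"
    var count = greeting.length

    function shout(text) {
        return text.toUpperCase()
    }

    alert(shout(greeting) + " (" + count + " letters)")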
Being able to leave out semicolons is blissful as far as I'm concerned. When coding in Java, I frequently run into compiler errors because I have neglected to type a semicolon, whereas I can't think of a single time when omitting semicolons in Javascript or Actionscript has caused me problems.
However, the code style of others (e.g. website Javascript, or Actionscript open source projects) suggests that my opinion isn't widely held; I see semicolons everywhere.
Now, Javascript on a web site has to run in many different browser environments, and I can understand how fear of incompatibility could keep coders locked to semicolons (although, as I said, I haven't run into any browser problems myself). But Actionscript gets compiled, which removes any uncertainty about how the syntax will be interpreted on the client. So why are semicolons still being used now that they are optional in Actionscript 3.0? Do semicolons provide a secret benefit that I'm not aware of?
1. Minification -- If you want to minify your JS, you can run into problems if you haven't inserted all the necessary semicolons, because stripping the line breaks can make adjacent statements run together (see the sketch after this list).
2. Convention -- If you write JS code with no semicolons and then show it to other JS programmers, you'll get a funny look, a slap in the face, or a free copy of "Javascript for Dummies", depending on the context.
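To illustrate the minification point, here's a minimal sketch assuming a naive minifier that simply strips line breaks (the variable names are made up):

    // Source written without semicolons -- valid, thanks to the
    // semicolons implied at the line breaks:
    var message = "hi"
    var louder = message.toUpperCase()

    // After a naive minifier removes the line breaks:
    //   var message="hi"var louder=message.toUpperCase()
    // That's a syntax error, because the two statements have run together.
    // With explicit semicolons, the minified output stays valid:
    //   var message="hi";var louder=message.toUpperCase();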