This may be intentional, but it seems a bit confusing. The original name of JavaScript was LiveScript when it was initially released with Netscape in the 90s.
Oh, and if you buy a ticket by Sunday you could win a FREE pair of conference pyjamas with the attractive slogan 'Old programmers never die, they just become Mostly Functional'.
Self-serving self-promotion: I work for a small shop that does conference recordings, so if you need help figuring out that side of things, you can shoot me an email at chris at intelliquestmedia dot com.
We're based in the States, but the owner is a Brit. I'm sure he'd jump at an excuse to head back and do some recording in the UK.
I would absolutely love to attend this, but I don't think I could justify an international trip right now (getting married in 1.5 months). What are the chances you'll do another one next year?
Well Turing is an annual event (http://turingfestival.com) and I usually organise something on the Fringe @ Turing - it varies from year to year. Last year it was a CEO to CEO event, the year before there was a mini-Erlang factory and a Lean Startup day. Best thing is to follow @turingfestival on twitter and me @gordonguthrie and mebbies @mostlyfuns. Mebbies we will do 'it' again, mebbies we won't, prolly something(s) else or related...
What this article calls currying is actually partial application.
Partial application is a technique where you take a function that requires n arguments, pass in the first one and get a function that needs n-1 arguments.
Currying is a technique where you take a function that takes n arguments and turn it into a function that can be partially applied. E.g. in Haskell it works with tuples as arguments. There is a function 'curry :: ((a, b) -> c) -> (a -> b -> c)' and its counterpart 'uncurry :: (a -> b -> c) -> ((a, b) -> c)'.
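A hedged transliteration of those Haskell signatures into JavaScript (the names `curry`/`uncurry` here are illustrative, not from any library in the thread):

```javascript
// Sketch of Haskell's curry/uncurry pair in JavaScript.
// curry turns a function on a "pair" (two positional arguments)
// into a chain of one-argument functions; uncurry reverses it.
const curry   = f => a => b => f(a, b);
const uncurry = f => (a, b) => f(a)(b);

const add  = (a, b) => a + b; // expects both arguments at once
const addC = curry(add);      // addC(1) returns a function awaiting b

console.log(addC(1)(2));          // 3
console.log(uncurry(addC)(1, 2)); // 3
```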
You get a function that can take N arguments and return a value, or 1 to N-1 arguments and return a function representing a partial application, which itself exhibits the same behavior:
add3 = (a, b, c) --> a + b + c
x = add3 1
y = x 1
z = y 1 # z = 3
If you look at the compiled code, it is actually an abstraction over a partial application, but at that point is that not just an implementation detail?
The main difference between the two, as I see it, is that partial application binds a certain argument to a fixed value, while currying only changes the way the function is called (thus allowing for easier partial application). It is easier to see on a function with more than two arguments. Imagine f:(A×B×C)->D. Currying it will yield f':A->(B->(C->D)), so you would call it as f'(1)(2)(3). On the other hand, partially applying it on the first argument would yield f'':(B×C)->D.
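To make the same distinction concrete outside the type notation, a hedged JavaScript sketch (the names are illustrative): currying changes the call shape, while partial application merely fixes an argument.

```javascript
// f plays the role of f : (A × B × C) -> D
const f = (a, b, c) => a + b + c;

// Currying yields f' : A -> (B -> (C -> D)); the call shape changes:
const fCurried = a => b => c => f(a, b, c);
console.log(fCurried(1)(2)(3)); // 6

// Partial application on the first argument yields f'' : (B × C) -> D;
// the remaining call looks like an ordinary two-argument call:
const fPartial = f.bind(null, 1);
console.log(fPartial(2, 3)); // 6
```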
Of all the "JavaScript.next" languages I hear about on a regular basis, it seems to me Microsoft's TypeScript is the one most likely to succeed, if only because every time I read an article about it, most of the comments are extremely positive, and it looks like there are already some fairly large projects written in it.
In comparison, I don't see a bright future for Dart or LiveScript (although I secretly root for Dart because I have more confidence in Google to take this language somewhere interesting).
I really like Julia. I'm currently playing around with it whenever I find some spare time. I don't see it as a competitor to R; I think it could well become something along the lines of Python or Go.
It is LLVM-based and already really fast, even though it is still at 0.2 and the JIT seems to have a lot of room for optimisation.
What's more, it seems to offer just the right blend of language features:
- Easily include C libraries via a simple FFI [1]
- It is homoiconic like Lisp and thus allows for fantastic macro facilities [2]
- It has solid parallel programming support via a coroutine implementation (Tasks) (similar to goroutines as far as I can tell)
- It is an impure functional language
- In contrast to Go it has generics, so functional constructs like map, apply, drop, reduce, fold, partition & friends are already in there (or can easily be implemented) [3]
- It has optional types: you can assign types, and the compiler will check them, flag errors, and be able to create optimised code, but you don't have to [4]
- Running external programs is a joy [5] (Example: a = readall(`echo hello`))
The community seems to be very alive. There's a simple web framework called "Morsel", and I've recently set it up against a couple of contenders from the web framework benchmark (cpoll-cppsp, phreeze, and some others). Even though it is still at version 0.2, the performance for the JSON serialization benchmark would be pretty close to Scalatra (I have yet to publish these numbers, will do so soon).
I really hope that Julia will grow, as I love the choices that went into the design of the language; it would be a shame if it ended up as only a replacement for R instead of something much bigger, as it is such a nice language.
I have been following Julia development for a while now (about a year and a half). It seems like it will be a great tool, but it is still VERY immature. The amount of breakage, especially with regards to important packages like plotting, has become something of an "in" joke. I'm really looking forward to seeing them get their package management situation worked out and a set of core packages nice and stable.
> I'm really looking forward to seeing them get their package management situation worked out and a set of core packages nice and stable.
I really don't understand why each language ecosystem has to go through this phase. What would be so bad about implementing packaging the way it has been done in OSGi for a decade? It's language agnostic, since the bundle can export/import namespaces or arbitrary capabilities, it has solved the dependencies-are-a-graph-not-a-list problem, it supports versioning and multiple parallel versions of the same package.
Believe me, I'd much rather use a standard packaging system for all languages. I'm not a Julia developer, so I don't have control over what they're doing. But ultimately, I need to be able to install packages in order to use the language to solve problems, so whether they're reinventing the wheel (as you see it; I'm not as familiar with OSGi as you seem to be) or not, it still just needs to get done.
I concur. I can't really use it for any important projects right now, so I'm just following along and playing with it for simple and small scripts where breakage and failure are not a problem :)
Julia does seem promising. Interesting that you see it as more along the lines of python than R.
I just finished the data science Coursera class, and while we used Python, we didn't get into R. I've played around a little with R on my own, and while I certainly don't think it's too difficult to learn for a programmer with a math background, as a programmer I feel much more at home with Python. Given the choice, I'd rather use Python than R just because it feels more natural to me.
If Julia is truly a programming language, I'd agree that it would be more of a competitor with Python than R (in the sense that its audience would be people like me who lean toward Python)... but I think it could be very successful by making it very easy for a programmer to stay within a programming language. In other words, it wouldn't compete with R; it would compete with Python by bringing what you get from R to a programming language.
(Hi there, I took the coursera course too!) I disagree with your compare and contrast of R as not-a-programming-language. R is absolutely a real language in every sense of the term. R, and its precursor SPlus, both have a very lispy feel. If you should ever try scheme after learning R, the biggest superficial difference you'll find is that the parentheses move:
f(x1, x2) becomes (f x1 x2)
R has a number of warts as a language (e.g. how it deals with copying objects in memory, amongst others), however it's just as "real" as Python, which also has its warts.
I do like the direction Julia's heading, even if I'd prefer to see something like Clojure/Incanter, or better still, something like Racket be the next baton holder.
Oh, I'm not too surprised to hear that R is a full-fledged programming language. I found Python very natural to use, since I currently program mainly in Ruby (and Java/C++ a while ago). Python felt easier for me than R, but that's probably related to my background.
Some of this may also simply be a result of doing some exercises in Python (in the Coursera class), and not doing any R. I was pretty bummed that we didn't get into R at all in the Coursera class, because sometimes just getting set up a little bit (like having some sample skeleton files for homework) and a few exercises can make a big difference. Not meant as a knock on the course, which was free and (I thought) very good. But a bit of exposure to R would have been helpful (while I appreciate the importance of visualization, given the choice between Tableau and an intro to R, I'd definitely have preferred R).
I was pleasantly surprised by Tableau. While I feel ggplot2 in R is the platinum standard for sane statistical visualizations, it was clear that the Tableau people have put some thought into some kind of underlying grammar. I'd love to see a ggplot style interface to Tableau.
R was really where I started to see the world through functional eyes. I originally came to R with a background involving a little C, a little Perl, a little ObjC, and a whole lot of Python.
Eventually, I hit a point with R where I was trivially doing pretty complicated things that would have been... ugly in Python. Even my Python now looks more like R. This was partly due to finding Peter Norvig's Udacity course, but also partly due to the fact that his somewhat un-"pythonic" python style really, really resonated with my R-tainted brain.
I agree that Tableau was more impressive than I expected, however I also think mere pivot tables in Excel are under-appreciated for perhaps their biggest strength: reaching the many mathematically minded, analytical people who do not work in the IT department that supports their desktop workstations. For people in these positions even Python may be considered too low-level, yet they already have Excel, and if they have the budget for Tableau they can get that too without threatening any sysadmin types.
Anyway, on the topic of R, I highly suggest the Johns Hopkins data related courses on Coursera. Many of them use R as the central tool, and the three I've taken really stood out for how much the instructors reminded me of getting a private tutorial from a bright colleague on an area of their own expertise.
Based on your description, you'd probably already know most of what would be covered, but the Roger Peng class on "Computing for Data Analysis"[1] is starting again soon, and it includes an overview of R that might have some gems for you. I liked that the lectures were relatively succinct, and the assignments put it in practice.
For X to be the next baton holder, X would need to be quite widely adopted, and the superficially Matlab-like look and feel of Julia should really help with this.
The (self-described) bland, Python-esque syntax is absolutely a boon for mass adoption amongst those who feel that Python-esque syntax is "natural." That, however, doesn't change my preference for parentheses, or my wish for people to stop fearing the S-exprs!
I've been reading the Julia docs this morning, and it actually looks like it might be a good fit for my problem domain (video); most of the real heavy computational lifting in my applications happens across a C API boundary, but I still push enough bits that having e.g. unsigned ints (gazes balefully at Gosling) is a requirement.
Wow, it's come along nicely since the last time I saw it. I seem to recall they didn't even have macros and were doing everything with eval. As somebody with no interest in scientific programming, Julia still seems like a very well thought out language. I'll have to try writing an application with it soon.
Julia is aiming at Matlab while Clojure is aiming at JVM/Lisp. If you're used to using Matlab for scientific computation then at some point Julia will become a much better option. Much of their work is around lifting heavy numerical components up into the runtime which are wasted on a lot of the programming one would do in something like Clojure.
Julia isn't (currently) targeted as a general-purpose programming language, but there's nothing in the core language that's specialized for scientific computing. While the linear algebra libraries etc. are part of the standard library, they're either written in Julia or are wrappers for external projects. Its closest relative in terms of paradigm might be Dylan (http://en.wikipedia.org/wiki/Dylan_programming_language), which Apple intended as a lispy general-purpose language for the Newton. IMHO, making it useful in the domains where people use Clojure is really just a matter of library support (and whether you can tolerate 1-based indexing).
While that's true, I can't believe that given its numerical bent it'll take anything less than an incredible push to move it to a more general playing field.
Quite different. I think a good way to imagine Julia is as a science-flavoured Go with dynamic types.
Well, no, that's not quite right. Really, it's more like someone took Matlab and made it a real general-purpose language, and built it close enough to C to be generally fast, not just fast at a few things. Matlab actually has more in common with Clojure, being a JVM language.
Matlab is not a JVM language; it just has some built-in ability to call Java, and some of the IDE is implemented in Java. You can launch Matlab without booting a JVM: matlab -nojvm
I bought a Clojure book and started going through it, and I have some limited experience with ELisp and Racket. I really like lispy (is that even a word?) languages, so I tried to get into Clojure, also because I really like the ideas behind ClojureScript and the various libraries for it. So far, my experience has been mixed, I guess because I didn't read enough, or maybe it is my unfamiliarity with anything jvm-related (having never been a Java guy), as that's also something I constantly run into when I'm doing Scala.
I'll try to write a bit more in Clojure, but so far Julia felt easier to get into, and I really like the ability to use types so that stupid errors don't result in runtime issues.
There're way too many interesting languages these days :)
I'm actually playing with Clojure and Julia right now and I'm running into the same problem. I've never used Java, and a lot of the Clojure material assumes you're coming in with at least a little bit of Java background. That said, Clojure is a much more mature language; Julia is still very new, its APIs are changing rapidly, etc.
Agreed about too many interesting languages though :)
There are a few threads on SO that try to summarize what you need to know if you haven't programmed in Java before (and, yes, it's a decent amount), e.g. what a JAR is, the classpath, -server vs. -client, -Xmx and -Xms, GC options etc. Alternatively, ask on Reddit, IRC or the mailing list.
Not exactly a trivial task... Incanter uses several sizable Java libraries (such as Parallel Colt) under the hood, so a port would first need to either find suitable replacements or rewrite that code as well.
Definitely not a trivial task--though it may be a worthy task. Incanter has moved from PColt to JBlas, and is moving toward core.matrix which will abstract the linear algebra stuff. If anything, I'd guess that Racket might be an easier ask than Clojure due to the JVM weirdness around boxing / unboxing of numbers passed to and from the underlying jblas (it's entirely possible I don't understand what I'm talking about re boxing / unboxing, though I've heard tell that the JVM makes some aspects of numerical work very challenging).
Livescript looks like it fixes some of the warts of Coffeescript while also raising the level of abstraction.
Julia is something I've already been looking at. I'm a bit torn on it -- it has vastly fewer libraries than Scipy and R so I don't know if I'm ready to "wear the hair shirt". At this point in life I'm more concerned with doing stuff with existing libraries than building the libraries myself.
Elixir I'm less excited about, because I'm not so excited about Erlang. I feel that Scala provides all of what I'd want from Erlang, along with better sequential performance.
The incredible PyCall.jl package allows use of existing Python libraries from Julia with automatic object translation, including no-copy use of NumPy arrays:
I hope that some Elixir features make it into Erlang, but I guess I'm one of those rare people who likes the Erlang syntax a lot better. I don't really understand the hate for it; it seems so clear and specific. Anyway, if people want choices for an Erlang-like experience without the Prolog syntax, check out LFE:
The thing I don't like about Erlang's syntax is that often I have to change the character at the end of the line when I need to move a line. It makes refactoring more tedious. I have no problem with it when reading code :)
Are you referring to Erlang or Elixir? Erlang has QLC (Query List Comprehensions) which work like that for Mnesia, for instance. I couldn't find information on this in the Elixir docs but I probably didn't know where to look.
The language I'm most interested in currently is Kernel: mathematically underpinned by the vau calculus, with a smart creator and steward, and a better abstract model for macros than even Racket's stellar syntax-parse. Implementations abound.
Racket's macro system and Kernel's fexprs are pretty fundamentally different, so I don't think the comparison is very apt. In particular, Racket's macros can be entirely compiled away.
While true, I was comparing macro APIs from a usability perspective and should have avoided the term 'abstract model'.
My point was the mental model I refer to when using $lambda is simpler and easier to understand than my mental model for syntax-case, syntax-rules, syntax-parse and friends. The very fact that 'Racket's macros can be entirely compiled away' complicates my mental model which must now accommodate 'phases', compile-time/run-time dichotomies, require for-syntax, etc.
When looking at languages like LiveScript (or CoffeeScript) I say to myself: "Javascript has been around for 18 years, how long will this be around for?"
Not saying not to use it, but my use case has to overcome that question.
One of the nice bonuses of targeting lowest-common-denominator JavaScript is that it makes things extremely compatible, and durable.
Even if all of the world's copies of the CoffeeScript compiler disappeared tomorrow afternoon, in between sips of tea, your compiled code would still run on every JS runtime (back to IE6), and would still be compatible with all other JavaScript libraries, and future versions of CoffeeScript as well.
With Coffeescript (I don't know about Livescript) the compiled output is close enough to hand-written Javascript that, if Coffeescript suddenly disappeared into a black hole, you could use the compiled output as source if needed.
I usually introduce CoffeeScript as "shorthand for best-practice Javascript".
The big weakness of CoffeeScript, IMHO, is currently the weak organizational structure of the community. I don't see a clear path for where the project is going, and already there are forks/parallels like LiveScript and IcedCoffeeScript, both with their own strengths.
The output is usually more readable than a lot of jQuery plugins I've seen ;-)
I regard CoffeeScript and its cousins as tools rather than frameworks or languages.
It's very human-readable. Look for yourself -- there are side-by-side examples of CoffeeScript and the JS it generates on the front page: http://coffeescript.org/ . The generated JS is actually a great way to learn some best-practice ways of using JavaScript that avoid common pitfalls. I started using CoffeeScript shortly after reading "JavaScript: The Good Parts", and the ideas from that book are basically automatically implemented.
It's readable (admirably so), but I don't know about reading the output for best practices -- it's not idiomatic JS and not meant to be (however readable), it's meant to be a formal equivalent of the CoffeeScript syntax.
The way that it creates closures for enclosing variables for packages is idiomatic and best practice. The way that it handles looping is idiomatic for avoiding certain kinds of common pitfalls. The way that it creates "that" variables for temporarily holding on to "this" for a while is idiomatic. There are a bunch of things like that that it does automatically.
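For readers who haven't looked at the output, here is a hand-written, hedged sketch of two of the patterns described above (the names are illustrative; CoffeeScript's actual output uses `_this` for its fat-arrow functions):

```javascript
// 1. An immediately-invoked function wraps a "package" so its
//    locals don't leak into the global scope:
var counter = (function () {
  var count = 0; // private to the closure
  return {
    increment: function () { return ++count; }
  };
})();

// 2. Capturing `this` in a local variable so a plain callback can
//    still reach the object it belongs to:
function Ticker() {
  this.ticks = 0;
  var _this = this;
  this.onTick = function () { _this.ticks += 1; };
}
```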
> The way that it handles looping is idiomatic for avoiding certain kinds of common pitfalls.
Unless you're using native (or shim'd) map and similar constructs instead.
(Or, for that matter, if you're otherwise used to writing code where block scope isn't the rule.)
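For context, the looping pitfall in question is the classic shared-`var` closure; a minimal sketch:

```javascript
// With `var`, every callback created in the loop closes over the
// same `i`, which has already reached 3 by the time anyone calls them:
var broken = [];
for (var i = 0; i < 3; i++) {
  broken.push(function () { return i; });
}
console.log(broken.map(function (f) { return f(); })); // [ 3, 3, 3 ]

// Using map instead, each callback gets its own binding of `n`:
var fixed = [0, 1, 2].map(function (n) {
  return function () { return n; };
});
console.log(fixed.map(function (f) { return f(); })); // [ 0, 1, 2 ]
```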
> The way that it creates "that" variables for temporarily holding on to "this" for a while is idiomatic.
Unless you're using bind instead.
I suppose it's true enough that people can in fact learn good things from the CS compiler output. But the JS that CS writes is not necessarily the JS an experienced dev would write or would have to write to achieve the same goals -- even taking into account the principles behind "best practices."
The same can be said for any higher-level language being compiled into any lower-level language or machine code. It's similarly true that an optimizing C compiler won't always write the best code in a best-practice way.
Also true: no matter how good something is, there'll always be someone eager to put it down.
> an optimizing C compiler won't always write the best code in a best-practice way.
Aaand, therefore, you might not suggest to people that a good way to learn to write good asm would be to study the output of a C compiler. However educational that experience might be.
> Also true: no matter how good something is, there'll always be someone eager to put it down.
Not sure where this is coming from unless you think we're having a conversation about the merits of a language, instead of the merits of learning another language from the output of a transpiler.
Surprisingly. Where a construct really has no JS analogue, the structure is still mostly maintained and the extra bits are packed into a function that appears right below the code block.
Julia is in a weird place. It improves on the speed of R and Python, but both of those languages have a strong head start. Also, it will be some time until Julia can provide equivalent support for GUI and web programming.
Julia fits well as the next step from Matlab. I've already seen groups of engineers across various disciplines that have turned to Julia instead of Matlab.
How is currying better than partial function application (particularly with keyword arguments)? If your function isn't commutative, then it seems to give greater importance/flexibility to the first arguments.
As someone who has just this week started serious study of Erlang, I'll have to look at Elixir as well. Ending statements with periods does take some getting used to!
There is a book on Elixir coming out from Pragmatic Programmers sometime, written by Dave Thomas, which might generate a certain buzz too. This may be a good time to dive into it a bit more.
A big part of the original idea with CoffeeScript was to exhaustively annotate the compiler so that folks could take it, and run off in more radical directions. Whereas CoffeeScript is intentionally conservative (no standard library beyond what JavaScript offers is allowed, for example), the fact that LiveScript is a fork off Coco, which is a fork off CoffeeScript, is a wonderful tangent.
There are static and optional typing flavors, asynchronous-flow-control flavors, macro-oriented flavors, functional ones, contract ones, and of course the rewrite of the basic language itself. In open source, there's enough spotlight for all of 'em.
http://en.wikipedia.org/wiki/JavaScript#Birth_at_Netscape
Edit: It seems it is intentional:
"LiveScript was one of the original names for JavaScript, so it seemed fitting. It's an inside joke for those who know JavaScript well."
http://livescript.net/#name