Syntax looks very clean, though it reads more like CoffeeScript mixed with Ruby than anything else. "Inspired by Haskell" is going a bit far: you took pattern matching and `where` and none of the semantics.
Object patterns are absolutely awesome. Every OOP/FP hybrid ought to have that, especially if you're not going to support algebraic data types. That point is worth bringing up, though: are they missing because of the presence of objects? I'm concerned about the interaction between pattern matching, duck typing and Java's static typechecker. Based on a quick glance, it's hard to see what that interaction is going to be like. At the very least, it does raise the question: what makes the pattern matching on lists and other built-in types special? In Haskell, for example, there's nothing magic about the cons operator and lists; it's just regular pattern matching on algebraic types.
I suspect your magic Nothing type is also necessary to make object pattern matching powerful, and I have some concerns about this kind of null safety. I wonder if programs are going to wind up turning into a "gray goo" of Nothing, and what the debugging story is for that case.
Exception handling looks a bit odd to me. Maybe it all works out in the end, but it looks like you're going to wind up with code like this:
    foo(x) except handler
    where
      handler(e) =>
        IOException : e.stuff
        ...
I think this is going to wind up being a bit gassy for local exception handling, where in Java you would use an inline try/catch that avoids the intermediate name. Is it going to look better with closures? It's probably fine, but I dunno; most languages use more syntax in this area.
Overall, this is fairly impressive. It looks a bit like the greatest hits of Ruby and ML. The JVM has been host to a surprising number of interesting languages lately, and this is clearly one of the better recent offerings. Definitely worth a deeper look.
> Haskell programmers may be familiar with this as a do block, and for Schemers, this is the equivalent of a begin sequence. However, unlike Haskell, in Loop, a sequence is guaranteed to execute in order.
A do block is syntactic sugar over >>= (bind) and is monadic in nature. And the IO monad (which is the relevant one here, since the example prints things) is precisely the construct that guarantees the parts execute in order. That's kind of the point. Getting this wrong makes me not trust the language. He shouldn't claim to have been "inspired by Haskell" if he doesn't really know it.
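To back that up, here's a sketch of the desugaring in plain Haskell (using Maybe rather than IO so the equivalence is easy to check; IO do-blocks desugar through exactly the same (>>=), and the names here are mine):

```haskell
sugared :: Maybe Int
sugared = do
  x <- Just 3
  y <- Just 4
  return (x + y)

-- what the do-block above is sugar for
desugared :: Maybe Int
desugared =
  Just 3 >>= \x ->
  Just 4 >>= \y ->
  return (x + y)
```

Both evaluate to `Just 7`; the ordering in IO falls out of how >>= threads the world, not out of the do-notation itself.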
Also, in Haskell the pattern [x:xs] matches a list with one element, where that element is itself a list with at least one element; so [[1,2,3]] (== (1:2:3:[]):[]) gets matched with x = 1 and xs = [2,3]. That differs from Loop's pattern matching. But [x] still matches a list with only one element. What does [x:[]] match? If you are going by [x:xs] matching [1] with x = 1 and xs = [], then surely [x:[]] will match as well.
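For reference, here is what Haskell itself does with these patterns (Loop's semantics may differ, and the function names here are mine, for illustration):

```haskell
-- [x:xs]: a singleton list whose one element is a non-empty list
firstOfInner :: [[Int]] -> Maybe (Int, [Int])
firstOfInner [x:xs] = Just (x, xs)
firstOfInner _      = Nothing

-- [x:[]]: a singleton list whose one element has exactly one item
innerSingleton :: [[Int]] -> Bool
innerSingleton [x:[]] = True
innerSingleton _      = False
```

So in Haskell, `firstOfInner [[1,2,3]]` is `Just (1,[2,3])`, and [x:[]] matches only things shaped like [[a]], i.e. a singleton whose element is itself a singleton.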
Also, if you are making a functional programming language, you know, one that uses recursion instead of for loops, why do you call it Loop? I didn't see one loop in any of the examples. It's just silly.
EDIT:
Oh, and if Nothing really is a subtype of everything, then 5 + Nothing should typecheck, right? Not "It is a type error to attempt to compute with Nothing." Since Nothing subtypes everything, Nothing is an Integer. I don't understand. Is + special in hating Nothing? Nothing.add(5) would return Nothing, right? So why not Nothing + 5? Oh, but wait, Nothing is also a subtype of String, so Nothing + 5 means string concatenation, right? You did say that in a + b, where a is a string, b gets cast to a string, didn't you? It's inconsistent, and I don't think it has the proper semantics.
I think it's quite misleading to list Haskell in the title: the only thing Loop and Haskell have in common, as far as I can find, is pattern matching, and that's not even exclusive to Haskell; other languages have it. Besides, it's not clear from the intro that Loop's pattern matching is as powerful as Haskell's (especially with all the extensions). Finally, pattern matching is really syntactic sugar over the case/switch construct present in many, many languages. I did like the type-patterns, though.
Pattern-matching means that the branching primitive not only dispatches to different code based on an input tag, but also that it places different values of different types in scope according to the branch.
    switch (ptr) {
      case NULL: ... handle null case ...
      default:   ... use ptr as if it weren't NULL ...
    }
This is unsafe -- because nothing prevents you from using ptr in the "NULL" case, and the compiler does not give you anything in the non-NULL case.
In Haskell:
    case ptr of
      Nothing -> ... can't use ptr as a value here;
                     it's wrapped with Maybe ...
      Just x  -> ... pattern-matching gave us "x" of the
                     correct type. We can now safely use it.
As Tyr42 has so astutely observed, I meant that pattern matches in function declarations, let-bindings and list comprehensions are desugared into the case expression. Also, it would be kind of pointless to compare C's switch and Haskell's case. In your example, Haskell probably 'wins' primarily because of its more powerful type system and algebraic data types (and the absence of pointers, hehe). That said, it's not like you can't emulate pattern matching (including placing nested values/subtrees in scope): https://gist.github.com/2832755 (it's in JavaScript, because I felt like it, but I'm pretty sure you can devise something similar in C). Of course, it's plain ugly and doesn't carry any of the nice static guarantees the Haskell type system gives you (like warnings about incomplete patterns)... but, hey, it works.
Anyway, the point of my comment was that pattern matching is hardly a feature characteristic of, or exclusive to, Haskell (ML/OCaml/F#, Coq and Scheme have it, and probably other functional languages too). Although it's still the one I enjoy every day :)
In a dynamically typed language, whether you do pattern matching or boolean-blind branching doesn't matter that much. You'll catch any error at runtime anyway.
In the presence of static typing, "emulating" pattern matching defeats the purpose, because the purpose of pattern matching is removing boolean blindness. With the "emulated" code you still get runtime-failing boolean-blind code.
By the way, pattern-matching is indeed very much related to sum types (part of "Algebraic data types"), which C lacks. To have pattern matching in a statically typed language, you would need to have sum types as well.
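To make the last two points concrete, here's a minimal Haskell sketch (type and function names are mine): a sum type plus pattern matching binds the right values only in the right branches, so there is no boolean-blind access to a value that might not be there.

```haskell
-- a sum type: every Shape is exactly one of these alternatives
data Shape = Circle Double
           | Rect Double Double

area :: Shape -> Double
area (Circle r) = pi * r * r   -- r is in scope only in this branch
area (Rect w h) = w * h        -- w and h only in this one

describeMaybe :: Maybe Int -> String
describeMaybe Nothing  = "nothing here"    -- no value available to misuse
describeMaybe (Just n) = "got " ++ show n  -- n bound with the right type
```

Compare this with `if isJust m then ... fromJust m ...`: the boolean test and the extraction are separate, and nothing stops you from calling fromJust in the wrong branch.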
He said pattern-matching was sugar for "case/switch construct present in many, many languages". That implies pattern-matching adds only syntax to the game, and that it doesn't add any useful things beyond the "switch" you find in C or Java, for example.
Sure, and I'd do the same if he said it was just sugar over "case", but he explicitly said "case/switch found in many languages", which makes it pretty definite IMO.
Perhaps this isn't the place for the question, but I'll ask anyway: is there any JVM-based language that refuses to compile code that isn't directly implemented in that language? For example, one that would deny the use of java.io or java.lang?
The reason I ask is that Clojure and Scala both suffer from the ability to call legacy code. While some think this is a good idea, it makes the whole scalable, safe/functional paradigm completely unstable. Is there any language out there that forces the developer to use only things written in that specific language itself, since that would be the only safe platform?
I haven't heard of one that does this. In fact an oft stated selling point of JVM based languages is the ability to call and interact with existing Java code and libraries.
The name is unfortunate, especially when the documentation sometimes omits to capitalize "Loop", which introduces even more confusion than there already is.
The name is unfortunate indeed. There were languages like LOOPS (Lisp Object Oriented Programming System), which evolved into CLOS, and Common Lisp's loop macro is a kind of special DSL of its own. The oldest LOOP language may be from a 1967 article by Meyer and Ritchie (DMR, as in UNIX): a kind of mini-language for computing primitive recursive functions and dealing with computational complexity.
> Being functional in nature, Loop doesn't have any of the baggage of the host platform (Java)
I don't see much that's functional in this language, but this statement made me laugh. It looks like a lot of Java's baggage is still alive and well in this language, like exceptions (quite un-Haskell-like).
It has them, but I don't see them being used very often.
More commonly just a Maybe will do. And really, monadic exceptions aren't like java exceptions.
I would call them un-Haskell-like.
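For what it's worth, "monadic exceptions" in the Either style look like this (a hedged sketch; the names are mine): the failure is just an ordinary value threaded by (>>=), not a separate control path the way Java exceptions are.

```haskell
safeDiv :: Int -> Int -> Either String Int
safeDiv _ 0 = Left "divide by zero"   -- the 'exception' is just a value
safeDiv a b = Right (a `div` b)

compute :: Int -> Either String Int
compute n = do
  x <- safeDiv 100 n                  -- a Left short-circuits the rest
  y <- safeDiv x 2
  return (x + y)
```

`compute 5` is `Right 30`, while `compute 0` is `Left "divide by zero"`; no stack unwinding, no checked-exception annotations, and the type signature tells you failure is possible.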
Looks great. I wonder why the author went for a specific syntax for interned strings, as opposed to doing what Java does (automatically interning String literals).
> In computer science, polymorphism is a programming language feature that allows values of different data types to be handled in a uniform manner.
This may be ad hoc polymorphism (often synonymous with overloading) or parametric polymorphism (as in ML or Haskell), but it's unclear what they actually do without some language semantics.
And to be more complete: OO-style polymorphism is usually referred to as "subtype polymorphism", and the parametric polymorphism dons talked about is like C++ templates or Java generics.
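In Haskell terms (a quick sketch, identifiers mine), the distinction looks like this: parametric code never inspects the type it's given, while ad hoc (type-class) code picks an implementation per type.

```haskell
-- parametric: works uniformly for all types; can't look inside a or b
swap :: (a, b) -> (b, a)
swap (x, y) = (y, x)

-- ad hoc: one implementation per type, chosen at the call site
class Pretty a where
  pretty :: a -> String

instance Pretty Bool where
  pretty b = if b then "yes" else "no"

instance Pretty Int where
  pretty n = "int " ++ show n
```

Subtype polymorphism (the usual OO kind) is the third flavor, and it's the one the JVM gives Loop for free.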