Show HN: Caramel – a modern syntax for the lambda calculus (github.com/maiavictor)
130 points by LightMachine on Sept 27, 2015 | hide | past | favorite | 22 comments



This is a neat project which makes the λ-calculus more accessible. The λ-calculus is incredibly useful for designing and analyzing language features, but it can be a bit of a pain to work with in part because it quickly becomes verbose. Papers and proofs usually end up developing their own shorthand, but that tends to be rather ad-hoc and is rarely implemented programmatically.

In my view, this makes this project a great platform for playing around with language design. It should be reasonably easy to take the existing base and extend it with additional features (exceptions, continuations) or types. I've long suggested writing a handful of λ-calculus interpreters, from untyped to simply typed to System F to dependent types, as a great way to thoroughly learn programming language concepts; Caramel gives you a great starting place for this, coupled with an accessible, readable syntax.
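To make the "write a λ-calculus interpreter" suggestion concrete, here is a minimal sketch of an untyped interpreter in Python. It uses naive (non-capture-avoiding) substitution, so it assumes all variable names are distinct; a real implementation would need alpha-renaming or de Bruijn indices:

```python
# A minimal untyped lambda-calculus interpreter with normal-order reduction.
# Terms: ("var", name) | ("lam", name, body) | ("app", fun, arg).

def subst(term, name, value):
    """Naive substitution: replace free occurrences of `name` with `value`.
    Assumes bound names never clash with free names in `value`."""
    tag = term[0]
    if tag == "var":
        return value if term[1] == name else term
    if tag == "lam":
        if term[1] == name:  # binder shadows the name; stop here
            return term
        return ("lam", term[1], subst(term[2], name, value))
    return ("app", subst(term[1], name, value), subst(term[2], name, value))

def normalize(term):
    """Reduce to normal form, contracting the leftmost-outermost redex first."""
    tag = term[0]
    if tag == "app":
        fun = normalize(term[1])
        if fun[0] == "lam":  # beta-reduction: (\x. body) arg -> body[x := arg]
            return normalize(subst(fun[2], fun[1], term[2]))
        return ("app", fun, normalize(term[2]))
    if tag == "lam":
        return ("lam", term[1], normalize(term[2]))
    return term

# (\x. x) y  reduces to  y
ident = ("lam", "x", ("var", "x"))
print(normalize(("app", ident, ("var", "y"))))  # ('var', 'y')
```

Extending `normalize` with a type checker (simple types, then System F, then dependent types) is exactly the progression described above.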

Also, it would be great to see an online version of this, perhaps compiled with GHCJS. I've played around with trylambda[1] which is fun, but the lack of syntax quickly gets in the way. Caramel would be a great alternative.

[1]: https://trylambda.com/


I can remember filling page upon page of paper completing worked examples by hand on the CS course I did. Oddly enough, these laborious exercises actually made me really interested in λ-calculus and the related SK combinators.

I think it actually takes some effort with these things to realise that they really are that simple and yet so powerful.


Would love to see this compile to BLC: https://en.wikipedia.org/wiki/Binary_lambda_calculus


One of my favorite IOCCC winners (for most functional):

         Int L[A],m,b,*D=A,
          *c,*a=L,C,*U=L,u;s
           (_){u--&&s(a=*a);}
            char*B,I,O;S(){b=b
             --?b:m|read(0,&I,1
              )-1;return~I>>b&1;
               }k(l,u){for(;l<=u;
                U-L<A?*U++=46^l++[
                 "-,&,,/.--/,:-,'/"
                 ".-,-,,/.-,*,//..,"
                ]:exit(5));}p(Int*m){
               return!*U?*m=S()?U++,!S
              ()?m[1]=p(++U),2:3:1,p(U)
             :S()?U+=2:p(U[1]++),U-m;}x(
            c){k(7*!b,9);*U++=b&&S();c&&x
           (b);}d(Int*l){--l[1]||d(l[d(*l),
          *l=B,B=l,2]);}main(e){for(k(10,33
         ),a[4]-=m=e-2&7,a[23]=p(U),b=0;;e-2
        ?e?e-3?s(D=a),C=a  [3],++1[a=a[2]],d(
       D):c?D=c,c=*D,*D=    a,a=D:exit(L[C+1])
      :C--<23?C=u+m&1?O      =O+O|C&1,9:write(m
     ||(O=C+28),&O,1)+        1:(S(),x(0<b++?k(0,
    6),U[-5]=96:0)):(          D=B?B:calloc(4,X))
   ?B=*D,*D=c,c=D,D[            2]=a,a[++D[1]]++,D
  [3]=++C+u:exit(6)              )e=L[C++],u=L[C];}

It's Tromp's binary lambda calculus interpreter. Explanation and sample programs: http://www.ioccc.org/2012/tromp/hint.html


Done.


Awesome, thanks!


I don't think I understand the reasoning behind the ADT. Isn't a 'type' just a set of predicates applied to arguments at compile time and/or at runtime? Why not just define a Caramel convention where you specify typed functions using a fancy argument list, in this case, a list of lists of predicate functions? Untyped functions would take a different shape of argument.


The point of ADT syntax is to allow deriving many functions on different data structures automatically. For example, lists and some functions can be defined as:

    cons      = (x list cons nil -> (cons x (list cons nil)))
    nil       = (cons nil -> (id nil))
    map       = (f list cons -> (list (comp cons f)))
    head      = (list -> (list (a b -> a) nil))
    tail      = (list cons nil -> (list (h t g -> (g h (t cons))) (const nil) (h t -> t)))
    zip_with  = (f a b -> ((left a) (right f b)))
        left  = (foldr (x xs cont -> (cont x xs)) (const []))
        right = (f -> (foldr (y ys x cont -> (cons (f x y) (cont ys))) (const (const []))))
    length    = (flip comp const)
The very same definitions could be written more tersely as just:

    List     = #{Cons Type * | Nil}
    cons     = (Ctor 0 List)
    nil      = (Ctor 1 List)
    head     = (Getter 0 0 List)
    tail     = (Getter 0 1 List)
    zip_with = (ZipWith List)
    length   = (Size List)
Where Ctor/Getter/ZipWith/Size are proper derivers. This is possible because all the information you need is there, but it is very hard to come up with those derivers. I currently only have some very bloated derivers for Scott-encoded structures, and none for Church-encoded structures yet.
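For readers unfamiliar with the encoding behind these definitions: a Church-encoded list is its own right fold. As an illustration (not the author's code), the same idea can be written with plain Python lambdas, where `cons`/`nil` mirror the Caramel definitions above:

```python
# Church-encoded lists: a list is a function taking `cons` and `nil` cases.
cons = lambda x: lambda lst: lambda c: lambda n: c(x)(lst(c)(n))
nil  = lambda c: lambda n: n

# map and length fall out of the fold directly.
cmap   = lambda f: lambda lst: lambda c: lambda n: lst(lambda x: c(f(x)))(n)
length = lambda lst: lst(lambda _: lambda acc: acc + 1)(0)  # Python ints for readability

# Convert to a native Python list so we can inspect results.
to_py = lambda lst: lst(lambda x: lambda xs: [x] + xs)([])

xs = cons(1)(cons(2)(cons(3)(nil)))
print(to_py(xs))                         # [1, 2, 3]
print(to_py(cmap(lambda x: x * x)(xs)))  # [1, 4, 9]
print(length(xs))                        # 3
```

A deriver in the Ctor/Getter sense would generate `cons`, `nil`, and friends mechanically from the `#{Cons Type * | Nil}` declaration instead of writing them by hand.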


Coincidentally, if you're thinking of adding pattern matching to the underlying calculus (not just as syntactic sugar), the book I mentioned in another comment (Pattern Calculus) is worth a look. The author explores what a language could look like if patterns were first-class and used to express arbitrary computations.

At the very least, it could be fun to have an extended version of the system (or a plugin of some sort, perhaps) using his ideas about pattern matching.


Cool, thanks for the explanation. I guess I'm a little confused by the desire to "derive many functions on different data-structures automatically". What's the independent variable in this meta-function? It makes little sense to "automatically" define a type when one could argue that programming is the act of defining types! I guess my claim is that I want to express a type as a named list of predicates that fold into a single predicate over 'and'.


They use .mel extension, which reminded me of the old https://en.wikipedia.org/wiki/Maya_Embedded_Language


Also the Story of Mel[1]. And wow, just realized that particular piece of hacker folklore predates me by almost a decade :P.

[1]: http://www.catb.org/jargon/html/story-of-mel.html


Same. Are they still using it, or is it largely replaced by Python? It's been more than a decade since I wrote my last mel script (or used Maya, for that matter.)


I don't know really, but I was under the same impression, Python taking over. That wouldn't be surprising, since it's a more general, more expressive, better-known language that has been embedded in lots of other software (Blender comes to mind).


Had never heard of the lambda calculus before. Very cool.

Does it have any practical applications? Or is it mostly a theoretical / conceptual tool?


It's mostly a theoretical tool: it's an incredibly simple programming language, which makes it easy to design and evaluate language features in isolation. That's a powerful property!

At the same time, practical functional languages are often surprisingly limited extensions of an underlying lambda calculus, perhaps with types. In fact, Caramel is more or less what an untyped Haskell or purely functional subset of Scheme would look like with fewer parentheses and macros.
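To give a feel for how far "just functions" goes, here is a sketch (in Python lambdas, purely for illustration) of the Church encodings that Caramel's sugar ultimately expands into; booleans and numbers are themselves functions:

```python
# Church booleans: a boolean selects one of two branches.
true  = lambda t: lambda f: t
false = lambda t: lambda f: f

# Church numerals: the number n applies a function n times.
zero = lambda s: lambda z: z
succ = lambda n: lambda s: lambda z: s(n(s)(z))
plus = lambda m: lambda n: lambda s: lambda z: m(s)(n(s)(z))

# Decode a numeral by applying Python's +1 n times to 0.
to_int = lambda n: n(lambda x: x + 1)(0)

two = succ(succ(zero))
print(to_int(plus(two)(two)))  # 4
print(true("yes")("no"))       # yes
```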


>which makes it easy to design and evaluate language features in isolation.

Where can I see a demonstration of this? For example, what does it mean to evaluate something like decorators in Python, and how would one go about doing that with lambda calculus?


Decorators in Python are basically special syntax for higher-order functions, so I don't think there's much there to evaluate. The lambda calculus is usually used when you want to ignore syntax.
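Concretely, the `@` syntax is nothing more than function application, which is the core operation of the lambda calculus:

```python
# A decorator is just a higher-order function applied to another function:
# `@trace` below means exactly `double = trace(double)`.
def trace(f):
    def wrapped(x):
        print(f"calling {f.__name__}({x})")
        return f(x)
    return wrapped

@trace
def double(x):
    return 2 * x

# The same thing with no decorator syntax at all:
def triple(x):
    return 3 * x
triple = trace(triple)

print(double(5))  # logs the call, then prints 10
```

So from the lambda-calculus point of view there is nothing new to model: both forms are the application `trace f`.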

Introductory resources on programming language theory usually have examples, after giving you a thorough grounding in the basic untyped lambda calculus. In particular, this is how they usually introduce type system features, but there are interesting things like continuations that don't necessarily involve types. Types and Programming Languages by Benjamin Pierce is a great resource, but you can probably find free lecture notes to the same effect. (I'd look them up, but I'm on my phone so it's a bit of a hassle.)

The most interesting non-introductory (but still self-contained) example I've seen was Barry Jay's Pattern Calculus book. He introduces pattern matching and first class patterns, first without types and then with, all building upon the lambda calculus. His ideas cover functional, imperative and object-oriented features, and it's fascinating to see a whole language developed piece-by-piece like that. If you're curious about PL theory and want to dive pretty deep, I think this is a book worth looking at.


Theories of grammar that compile natural language to the lambda calculus are popular in computational linguistics. For instance, this won the best paper award in EMNLP 2015, the most recent top-tier conference in the field:

Broad-coverage CCG Semantic Parsing with AMR

Yoav Artzi, Kenton Lee and Luke Zettlemoyer

http://yoavartzi.com/pub/alz-emnlp.2015.pdf


You'd rarely want to write lambda calculus directly for practical purposes, although Caramel looks like a very nice shorthand. But it's the fundamental basis of what functional programming is all about, and its equivalence to the Turing machine means it's useful both as an analytical tool and in the context of language design. Functional languages can be seen as lambda calculus implementations with some extensions to make them easier to work with (types, constants, etc).


Y Combinator is named for the Y combinator of the lambda calculus.
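For the curious: the Y combinator produces fixed points, which is how the pure lambda calculus expresses recursion without named functions. In a strict language like Python the classic Y diverges, so this sketch uses the eta-expanded variant (often called the Z combinator):

```python
# Z combinator: a call-by-value fixed-point combinator.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Recursion without the function ever referring to itself by name:
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
print(fact(5))  # 120
```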


Are you going to bootstrap your Caramel compiler?



