X server in 5000 lines of Common Lisp (github.com/pyb)
136 points by luu on Dec 6, 2013 | 66 comments



Cool, though it's going to be egg on OP's face when someone else implements this in 12 lines of JavaScript


VNC in JS is done[1] (the VNC display is ~700 lines), so X might actually be doable...

[1] https://github.com/kanaka/noVNC


and subsequently in 0 lines of js.


The OP was already in 0 lines of js.


AKA no.js

My hot tip for the future of programming.


One of the prerequisites for using this is, apparently, an already-running X server.


So by implementing none of the hard bits, LISP wins again!


Lisp featured support for garbage collection, exceptions done right, macros, optional typing, and native compilation. In fact, I would argue that many in the Lisp community were so obsessed with the hard bits that they overlooked building simple things that see everyday use in the real world.


Quite a few of the bits were implemented in Lisp, both simple and hard. Some of those bits are now lost.

For example the TI Explorer Lisp Machine came with an X11 server written in Lisp. On my Symbolics Lisp Machine I used the usual MIT X11 server written in C - this was possible because the Symbolics Lisp machine had a C compiler.


I was under the impression that those machines evaluated Lisp forms in hardware. How could a C compiler work? Does it translate your code into Lisp s-expressions, or is there a bytecode layer in between?


This is an instance of a C-to-Lisp compiler that translates to s-expressions: https://github.com/vsedach/Vacietis


You could translate to either Lisp or raw instructions for the Lisp machine, but it doesn't really matter. The main thing that you do differently from a normal C compiler is translate pointers into pairs of a buffer and an index into the buffer. Then pointer arithmetic still works as far as it is required to work by the C spec.
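
A minimal sketch of that representation in Common Lisp (the names C-POINTER, POINTER+ and DEREF are made up for illustration, not taken from any particular compiler):

    ;; A C pointer becomes a buffer plus an index into it.
    (defstruct c-pointer
      buffer     ; array standing in for a region of C memory
      (index 0)) ; offset into that array

    ;; Pointer arithmetic: same buffer, shifted index.
    (defun pointer+ (ptr n)
      (make-c-pointer :buffer (c-pointer-buffer ptr)
                      :index (+ (c-pointer-index ptr) n)))

    ;; Dereference, i.e. *ptr.
    (defun deref (ptr)
      (aref (c-pointer-buffer ptr) (c-pointer-index ptr)))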


What do you think a compiler does? The target instruction set can be any Turing-complete language. Compiling C to Lisp is no different from compiling Lisp to C (e.g., GNU Common Lisp), or compiling ML to Java (which is easy enough that students do it in 300/400-level CS courses), and so on.


You make it sound so easy. How hard is it to go from a GC language to a non-GC language? What methods do you use to fill in for the GC?


It's the runtime environment that provides that. And GCs for C are pretty easy to come by :)


Well, the naïve way is to allocate a big block of memory for your C program and manage it using malloc/free like normal (basically, use the LispOS equivalent of sbrk(2)). Alternatively, since pointers are pointers, nothing stops you from having malloc(3) call out to the Lisp runtime's GC. (free(3) is a no-op.) That's basically how the Boehm GC works. I'm not a systems programmer so I'll admit this is a bit hand-wavy. Also, note that malloc/free are not part of the core C language, per se, but the standard library (a/k/a the C runtime). Even if they were in the core language spec, you as the compiler author can implement them however you choose. They aren't magical.
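
A rough sketch of that second approach (C-MALLOC and C-FREE are hypothetical names, not from any real runtime):

    ;; malloc hands out a fresh, GC-managed byte vector...
    (defun c-malloc (nbytes)
      (make-array nbytes :element-type '(unsigned-byte 8) :initial-element 0))

    ;; ...and free does nothing: the Lisp GC reclaims the buffer
    ;; once no live pointer reaches it.
    (defun c-free (buffer)
      (declare (ignore buffer))
      nil)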


How would a language's support for garbage collection make it any harder to compile a non-garbage-collected language into it? Why would you think that?


There's no code to hint at how garbage collection should be done, since it's automatic. You essentially have to include a library that emulates the source language's GC, so it's not a direct translation from one language to another. At least that's what I was thinking.


Not really. None of the relevant (= you could buy them and use them) Lisp Machines evaluated Lisp forms in hardware - the evaluator was a program, too. They all used a processor which provided a special instruction set and Lisp data structures. For example, the processor of the Symbolics Lisp Machine was mostly a stack machine running compiled Lisp code. You can compare it to running JVM instructions in hardware. Thus the Symbolics Lisp Machine had C, Ada, Pascal, and other languages implemented for it.


Hmmm, as I recall eval was in microcode.

But so, as I recall, was the virtual machine that interpreted the bytecodes the compiler produced.

Here I'm referring to what I remember being told about the CADR's microcode (it had a LOT, with options for more if you were willing to buy more $$$ static RAM), but the others were all direct descendants and I gather did pretty much the same things.


Most of the code on the Lisp machine was compiled. An interpreter, which would evaluate source code was rarely used.


It's not that it doesn't implement the hard bits, it just re-implements the hard bits on top of the hard bits. It requires an OpenGL context, which generally requires you to have an X server running.


It should have been written using lazy evaluation. That way, it could use itself recursively as its own X server. /sarcasm


I think Haskell has the market cornered on tying the knot. http://www.haskell.org/haskellwiki/Tying_the_Knot


Can this restriction be removed using EGL?


In principle yes


Well, the natural thing to do is compile the lisp to javascript and use webgl!

Would be interesting to see how much is needed to get it running with some other GL context. Also -- maybe this'll run under OS X and Windows without requiring an X server for the GL context?


Well, according to the Readme, it needs a "decent OpenGL driver", which then means "unfortunately, a running X server :) (to create a GLX context)"


I actually left this project alone after I got the very basics running... It's very very slow and rather buggy. I guess a profiling run could help a lot though.


Wait, you need an X server to run it? What's the point then?


It looks like it just uses an existing X server as a way to create an OpenGL context, not as an X server per se. Perhaps it would be possible to port it to some non-X method of producing an OpenGL context, such as EGL.


Yes that's the idea


So it's a really hard way to run an X server on top of an X server?


Because this glosses over the really complex parts. Still, if those were written in Lisp too, this would comprise a complete, self-sufficient X server (though how many lines of code would that take?).


To me, 5000 lines of C code is a LOT of code. Linus's first release of Linux was only 10,000 lines of code. Given the much higher expressiveness of Lisp, I think 5000 lines of Lisp code is not all that much to boast about. I can't pull up a reference here, but according to one study, the productivity of some well-known programmers was measured at 10K LOC/year (yes, it's a very loose measure of "productivity"). So writing 5000 lines of good code may require ~6 months, i.e., not a trivial amount of work. Although I agree it is cool to have an X server in Lisp.


5000 lines isn't really a lot for a real-life project in any language. Besides, a lot of the code here seems to be in the data.

E.g. https://github.com/pyb/zen/blob/master/data/request-specs.li... which describes the request syntax from the X spec, alone accounts for nearly 1000 lines, and it's not really possible to trim it down.


Here's Xorg's nested X server if you want to see how much C code it takes.

http://cgit.freedesktop.org/xorg/xserver/tree/hw/kdrive/ephy...


Agreed. I was surprised how much code the Slime implementation has; there are source files several kloc in size. From what I saw, I can say that "Lisp expressiveness" did not help much there.


Isn't Lisp really hard to understand? Or is it just because I'm unfamiliar with the syntax?


I guess it's you who is just unfamiliar with the syntax. Do you find (+ 2 2) harder to understand than 2+2?

If the code is well indented, IMHO it's quite understandable. For example, what do you think of this Lisp code snippet (it's not the best possible code, just some stuff I have at hand to show without you needing to know the context):

    ;; takes two lists l1 and l2 of the same length, and returns (l1[0] l2[0]
    ;; l1[1] l2[1] ...)
    (defun mix (l1 l2)
      (if (null l1) nil
          (append (mix (cdr l1) (cdr l2)) (list (car l1) (car l2)))))
EDIT: formatted the code as per gcr suggestion.


On HN, use four spaces to indent code blocks, like this:

    ;; takes two lists l1 and l2 of the same length, and returns (l1[0] l2[0]
    ;; l1[1] l2[1] ...)
    (defun mix (l1 l2)
      (if (null l1)
          nil
          (append (mix (cdr l1)
                       (cdr l2))
                  (list (car l1)
                        (car l2)))))
(Also, for the record, you should say (append (list ...) (mix ...)) or else you'll reverse the result list, which I don't think is your intent)
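
For reference, with the arguments swapped as suggested, so the pairs come out in order, one way to write it is:

    (defun mix (l1 l2)
      (if (null l1)
          nil
          (append (list (car l1) (car l2))
                  (mix (cdr l1) (cdr l2)))))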


This shows off one of Lisp's (IMHO) unfortunate anachronisms, namely that your "other car is first".

CDR and CAR refer to machine instructions on the IBM 704, which had (by today's standards) a somewhat quaint handling of "words" in memory, roughly allowing CAR to refer to a data element at the head of a linked list, and CDR to refer to the pointer to the rest of the list (or the empty list) (as well as a CONS instruction to build a word in memory from four parts, including the two 15-bit values that CAR/CDR refer to)[1].

Lisp then extended this by allowing programmers to combine c(a|d)+r, to get the equivalent of second, third, first-of-second-sublist, etc. (a couple of concrete examples follow below the code). I'm not convinced the succinctness is worth it -- apparently experienced Lisp coders disagree. The TL;DR is that if you don't know Lisp, this code is the same as the above, and is a bit more intuitive:

    ;; takes two lists l1 and l2 of the same length, and returns (l1[0] l2[0]
    ;; l1[1] l2[1] ...)
    (defun mix (list1 list2)
      (if (null list1)
          nil
          (append (mix (rest list1)
                       (rest list2))
                  (list (first list1)
                        (first list2)))))
Other languages might use head and tail, rather than first and rest. I'll agree that list1 and list2 might not offer much more readability than l1 and l2 -- at least not with the comments given.
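
To make the composed accessors mentioned above concrete:

    (cadr  '(a b c))          ; the CAR of the CDR, i.e. the second element => B
    (cddr  '(a b c))          ; the CDR of the CDR => (C)
    (caadr '((a) (b c) (d)))  ; the first element of the second sublist => (B C)'s CAR, i.e. B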

[1] Most of this is paraphrased from wikipedia, and what I remember from various times I've tried to get started with lisp, without ever really getting entirely comfortable with the language. http://en.wikipedia.org/wiki/CAR_and_CDR

[edit: I didn't notice the comment about reversing the list, but I think it is rather obvious that (append rest list(first)) does indeed reverse the order...? I actually wondered if append had a strange order of parameters while I "rewrote" the code :-) ]


Basically nobody should write such code. As a style guide: use CAR and CDR on cons data structures. Use FIRST, REST, SECOND, ... on lists.

Actually, on day two of Lisp, fifty years ago, these patterns were abstracted away. Recursion over lists has been replaced by higher-order functions in many cases.

    (defun mix (list1 list2)
      (mapcan #'list list1 list2))
Above maps the function LIST over the two lists and concatenates the results.
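
For instance, at the REPL this interleaves the two lists in order:

    CL-USER > (mix '(a b c) '(1 2 3))
    (A 1 B 2 C 3)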


Hm, am I right in thinking that numbers are defined to be atoms in Lisp? And also that there are limits to how one can use macros, in that you'd not be able to a) rename FIRST, as in (FIRST list), to 1 (or '1): (1 list) or ('1 list); and b) that you couldn't modify the syntax to "decompose"/parse a slice syntax like (1:2 (list-of-lists)) (here 1:2 would mean, for example, the first element of the second sublist/cons)?

Now, you could of course write an interpreter for such a language in Lisp, but am I correct in assuming that there's no reasonable way to get a (standard'ish) Lisp to compile an expression like (1:2 ...) to the equivalent caadr(?)-like expression?

I saw there's a library for Clojure that adds "slice notation" -- but it does so by adding a (slice x y list) macro, which of course is fine -- but it would seem that (x:y list) would give similar benefits to cadr/cddr etc. -- allowing selected subsets to be highly optimized, for example? (As well as offering concise syntax.)


Functions like CADDR are not much used in modern Lisp.

I would not write `slice` as a `macro` or special syntax. I would use the `SUBSEQ` function, which already exists.

    CL-USER 7 > (subseq '(a b c d e f) 2 4)
    (C D)
The simplest way to introduce a special syntax for it is to use different parentheses:

    {1 '(1 2 3 4)}  -> 2
But the default Lisp style is this:

    (functionname arg0 ... argn)
where `functionname` is a real symbol. It may be longer on the character level, which may be a problem for some users - not for me.


I suppose what I was really wondering, but cleverly hid in formulating my comment, was "what are read[er] macros?".

That'd be how you could redefine "{" and "}" to work something like this?

At least I came across the following that seems to indicate that:

http://letoverlambda.com/index.cl/guest/chap4.html

http://jlongster.com/2012/02/18/its-not-about-macros-its-abo...


Reader macros allow you to write parser functions for characters. So you could write a reader macro for { such that it collects all enclosed content until a } character. Then it could transform that content and return the transformed result. The result is always Lisp data, not a string. How you transform the enclosed text is up to you.

Common Lisp itself tries to minimize its usage of characters. It is expected that users or libraries want to make use of characters like {, }, [, ], and many others.
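
A minimal sketch of that idea, wiring { and } to the {index list} syntax shown upthread (READ-BRACE is just an illustrative name):

    ;; Read everything between { and } and expand into an NTH call.
    (defun read-brace (stream char)
      (declare (ignore char))
      (destructuring-bind (index list-form)
          (read-delimited-list #\} stream t)
        `(nth ,index ,list-form)))

    (set-macro-character #\{ #'read-brace)
    (set-macro-character #\} (get-macro-character #\)))

    ;; Now {1 '(1 2 3 4)} reads as (NTH 1 '(1 2 3 4)) and evaluates to 2.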


Interesting, by looking up mapcan, I came across the Simplified Lisp Reference:

http://jtra.cz/stuff/lisp/sclr/mapcan.html

Also linked from the resources-section of:

http://gettingstartedwithcommonlisp.com/

I don't think I've come across the simplified reference before -- it appears to be a good resource for those of us starting out with Lisp -- striking a balance between all of CLOS and the sometimes simplistic presentation of textbooks.


There is also a reference one can print out:

http://clqr.boundp.org

A redirect service for Common Lisp documentation:

http://l1sp.org/html/

A search engine:

http://lispdoc.com


Thanks for the tip on the indentation, and for checking my code - you are right that I got the order wrong in (append (mix ...) (list ...)), but in this particular case I'm not interested in how my list is ordered.


I think it's easier to reason about in the long run although the initial period of wondering why you don't just set up a data structure and bang on it in place is a little bumpy.

The syntax is always[0] (procedure argument-1 argument-2 ...) where the procedure is applied to the arguments.

About a half-hour into this lecture, the syntax is explained: http://ocw.mit.edu/courses/electrical-engineering-and-comput...

[0] - almost always. Macros look the same and often get treated the same but they are very different.


The syntax is very minimalistic, at the cost of a much greater semantic burden. For humans, parsing Lisp code often relies on consciously understanding the words and the AST structure, not on reflexes tied to a visual memory of characters and indentation. So I think that Lisps are at odds with human perception (someone may consider this an advantage: it repels people who code with their cerebellum instead of their cerebral cortex). But once you have a picture in your head of what the code is about, it's not a big deal.


Lisp's syntax is the easiest thing about it to understand. In fact, Lisp's syntax is so easy to understand, that even Lisp can understand its own syntax.
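
For instance, the reader will hand Lisp source back to you as ordinary list data:

    CL-USER > (with-input-from-string (s "(defun add (x y) (+ x y))")
                (read s))
    (DEFUN ADD (X Y) (+ X Y))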


Also building many things on top of ~persistent cons cells / linked lists. Found it quite awkward the first time I saw it.


LISPers know the syntax is clunky but they are OK with it because it makes it super easy to create macros.


I never found it clunky, at worst blurry (only one visible syntactic marker) but it was pretty beautiful and timeless.

Besides macros, with non-string-buffer editors you get very, very fast and calm typing - or I should say low-level refactoring.

Finally, it has this weird quality of really looking like static data (relation v1 v2 v3) most of the time, and not like a cryptic (by the number of delimiters: {} () , ;), state-dependent imperative notation. It makes you think about how to interpret data rather than how to send commands to a machine.


German is really hard to understand. I imagine the user lispm would have something to say about that.


It's easy, the trick is to learn it as your first language ;)


It's also easy if you live in Germany or in a German-speaking environment. Being immersed in German-speaking surroundings helps - learning from training material without other Germans around is less motivating, so it looks difficult. The difficult part is actually the motivation, not the language. For somebody from an English-speaking country, learning German is not really difficult - English has lots of Germanic roots. It's just that many people in English-speaking countries don't learn any foreign language to a competent level, not even the easier ones, so everything looks difficult.

What really would be a bit more 'difficult' is a language with a different alphabet (Russian) or a language which is different on many levels (Japanese).

It also depends on at what level you want to be able to read German. If you want to be able to read philosophers (Kant, Schopenhauer, Hegel, Jaspers, Habermas, ...) then the subject matter is also difficult - it would help if you understand/know the concepts already, since they are often independent of the language.

English is also difficult. Parts of the language are easy, but the language is huge. I can read many English texts quite fluently, but there are novels where I am at 10% reading speed (compared to English texts from typical news sources) when I want to understand 95% of the words.


I think getting to a decent understanding of German should not be too hard, but some nuances are. Even some native speakers struggle with parts of the grammar, like dass/das, Dativ/Genitiv, etc. I am a native German speaker who learned English as his second language and is now learning Greek, including a different alphabet, as his third.


That's not a specialty of German. Lots of native English speakers struggle with English.

http://www.theguardian.com/education/2013/oct/08/england-you...

Literacy, for example, seems to be quite good in some countries whose languages would look difficult to me. The UK, OTOH, is not happy with its results.


No, you just have to try...and of course practice if you want to get any good at it.


Apparently this is 3 years old.


Good luck grokking that mess.


Banging on file descriptors looks pretty much the same no matter what language you do it in. I expect that a Python equivalent would be pretty much the same length.


It's hard to grok low-level code like this in any language. I'd take common lisp over heavily-templated C++ or freestyle Perl any day.



