Hacker News
Radiance: a flexible web-application environment for Common Lisp (shirakumo.github.io)
82 points by zeveb on April 7, 2017 | 18 comments



I love seeing new development in the Lisp space, but this offers very little over the very stable, mature, and consistently supported and updated Hunchentoot beyond a whole bunch of architectural opinions. It does nothing to address WebSockets or realtime in general, which would help to bring the Lisp web stack out of the 90s.


Hmm- gonna get flamed here, no matter what I say, so... I'll try to be a bit delicate, but not going to put too much effort into delicacy.

I understand loving CL. CL was, I think, the first programming language I really loved passionately, for itself. I loved programming in C passionately before that, but in retrospect it was more a matter of having gazed from afar at the machine for so long, and suddenly realizing that C would bring me much closer to her than BASIC had. C was just an intermediary, though I did not understand that at the time. We have good relations to this day, but I'm afraid I do not love her, and never really did. I just used her to get close to the machine.

I still love the machine, but, as they say, familiarity breeds contempt, and it eventually became apparent that we would be better off at some remove from each other. No judgment- it's just that I happen to be fairly abstract, and the machine likes to pretend to be concrete (on some level I suppose she is, but I'm pretty sure we never dug that deep in our relationship.) I suppose that that was, perhaps not so co-incidentally, right around the time I met CL again.

I'd flirted with CL in University AI classes, even worked with her a bit in a computer vision lab, and I'd always thought she was attractive, if a little off-beat. But then I fell for her, obsessively, and for a while she was all I could think about.

I cherish our time together. I was such a bumpkin when we met, and she challenged me at every turn. "Why," she'd ask, "must a byte have eight bits?" And I'd be forced to admit that she was right: a truly inclusive computing culture ought to accept bytes of all different sizes. I won't get into our discussions of filesystems here.

And I was introduced to her friends. What a lively bunch they were, and what a smart bunch. I remember I once asked her friend Erik for the time and... well, actually he suggested that I might want to start carrying a watch or just go home and kill myself, but then he explained some things about time that I carry with me to this day. He was the most provocative of CL's friends, but many of her other friends were not only brilliant, but perhaps monotonically increasingly old and disappearing.

The great thing about CL is that it has a standard that is hard to change. The worst thing about CL is that it has a standard that is hard to change. For many years the former overcame the latter. I'm inclined to think that that is no longer the case.

So, pretty much, why CL?


CL allows us to write code the way we think about a problem--and then bring life to that way of framing the problem. We can come up with an ideal way of describing a solution and then make a language work that way. I say a language because the target of our code might be C or JavaScript (these days that is more often the case for me than targeting CL itself, cf. 4500 recent lines of Lisp that turns into 8000 lines of terse JavaScript).

Our ability to reason correctly about a system is limited by how complex it is. And I posit that complexity in code is best measured in number of symbols (because lines can be arbitrarily long, and longer symbol names can actually be helpful). So a system that reduces the number of symbols necessary to express a solution increases the size of a solution about which we can successfully reason. Just as computers are a "bicycle for the mind", homoiconicity+macros (of which I posit CL is still the best practical implementation) is a "bicycle for the programmer's mind".

Lisp provides an optimal solution for thinking of programs as nested sequences of arbitrary symbols. Sequences that can be transformed (and must be, for a computer to evaluate them, unless we hand-write machine code!). Common Lisp provides an optimal set of built-in operators for composing operations and transformations on its fundamental data types (atoms/cells/lists/trees). Other languages might provide better implementations of particular paradigms or whatever, but CL is the best language for implementing macros. Other Lisps make macros "safer" and miss the point.
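To make the homoiconicity point concrete, here is a minimal sketch in portable Common Lisp (the macro name is hypothetical): the macro body manipulates its arguments as ordinary list data and returns freshly built code.

```lisp
;; NTH-POWER is a hypothetical macro: it receives N and EXPR as
;; unevaluated list structure and constructs new code from them.
(defmacro nth-power (n expr)
  (let ((g (gensym)))                ; gensym avoids evaluating EXPR n times
    `(let ((,g ,expr))
       (* ,@(loop repeat n collect g)))))

;; (nth-power 3 (+ 1 1)) expands into
;;   (LET ((#:G42 (+ 1 1))) (* #:G42 #:G42 #:G42))
;; before the compiler ever sees it.
(print (nth-power 3 (+ 1 1)))        ; prints 8
```

The transformation happens at macroexpansion time; the runtime only ever sees the expanded multiplication.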

As Vladimir Sedach wrote earlier on Hacker News[1]:

"The entire point of programming is automation. The question that immediately comes to mind after you learn this fact is - why not program a computer to program itself? Macros are a simple mechanism for generating code, in other words, automating programming. Unless your system includes a better mechanism for automating programming (so far, I have not seen any such mechanisms), _not_ having macros means that you basically don't understand _why_ you are writing code.

This is why it is not surprising that most software sucks - a lot of programmers only have a very shallow understanding of why they are programming. Even many hackers just hack because it's fun. So is masturbation.

This is also the reason why functional programming languages ignore macros. The people behind them are not interested in programming automation. [Milner] created ML to help automate proofs. The Haskell gang is primarily interested in advancing applied type theory.

Which brings me to my last point: as you probably know, the reputation of the functional programming people as intelligent is not baseless. You don't need macros if you know what you are doing (your domain), and your system is already targeted at your domain. Adding macros to ML will have no impact on its usefulness for building theorem provers. You can't make APL or Matlab better languages for working with arrays by adding macros. But as soon as you need to express new domain concepts in a language that does not natively support them, macros become essential to maintaining good, concise code. This IMO is the largest missing piece in most projects based around domain-driven design."

[1] https://news.ycombinator.com/item?id=645338
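Sedach's point about expressing new domain concepts can be sketched with a hypothetical macro (all names below are made up) that adds a small declarative construct the language lacks natively, a command dispatch table:

```lisp
;; DEFCOMMAND-TABLE compiles a list of ("command" result) pairs into a
;; dispatch function -- a tiny domain concept expressed directly,
;; instead of hand-writing the same COND every time.
(defmacro defcommand-table (fname &body clauses)
  `(defun ,fname (cmd)
     (cond ,@(loop for (name expr) in clauses
                   collect `((string= cmd ,name) ,expr))
           (t :unknown))))

(defcommand-table dispatch
  ("ping" :pong)
  ("version" "1.0"))

(print (dispatch "ping"))      ; prints :PONG
(print (dispatch "nope"))      ; prints :UNKNOWN
```

The macro keeps the domain notation concise while expanding to entirely ordinary code.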


Excellent points.

Unfortunately, the people who treat programming only as a way to pay the bills - with a six-figure salary - which is to say the vast majority of professional programmers working today, do not want to understand _why_ they are writing code. The commoditization of software engineering that companies like Google [1] enthusiastically support and promote is also directly responsible for the obliteration of the entire field.

In a world where geniuses like Alan Kay are almost unheard of and Tim Berners-Lee ends up receiving the Turing award (next: Stroustrup/Rob Pike), there really isn't a lot of hope for languages like Common Lisp to proliferate. They're simply too meta, and they demand a lot more from you than settling for whatever gives an immediate sense of reward.

[1] http://www.flownet.com/gat/jpl-lisp.html


I think you're massively overstating your case when you say "optimal." There is no evidence that anything CL provides is optimal according to any rigorous metric. I could just as easily claim Scheme is "more optimal", since it's a much more minimal implementation with the same capabilities. Also, I think you are mistaken to equate fewer symbols with easier reasoning; e.g. I find it much easier to reason about complex programs when they are written in a strong static type system, which CL lacks. The type system lets me leverage the computer to help me reason about the program, by type checking.


Then the computer is doing the reasoning, not (just) you. Static typing will certainly help you reason about a larger system, I do not dispute that. I am talking about increasing the capability of the largest symbol system you operate on in your mind without relying on the computer. If the meaning-density of the symbols is higher, your effective intelligence goes up.


> these days that is more often the case for me than targeting CL itself, cf. 4500 recent lines of Lisp that turns into 8000 lines of terse JavaScript

I'm very interested by this. Can you say more about the project? Or, even better, share some code?


Some people are just very much drawn to ML and/or Lisp. For some people it just maps in a very nice way onto how they imagine a solution to a problem.

I don't think much more of this happens than in other languages; it is just that there is a culture within those communities that promotes it: "hey wow, I just did something cool. Look".

As a lisp guy I stare in envy at the cool things people do in ML.


Re: WebSockets, there's hunchensocket, clws, websocket-driver, etc:

https://github.com/joaotavora/hunchensocket

https://github.com/3b/clws

https://github.com/fukamachi/websocket-driver

I haven't used any of them though.


I loved playing around with CL but I never could get my mind around that damn packaging system. Maybe I'll give it another try sometime. Has the community gotten any better, or is it still the online equivalent of a private clubhouse for PhDs to smoke their pipes in?


Common Lisp packages are "weird" because they don't express the same concept popular languages do. A CL package is not a Java package. I know I tripped over this at the beginning.

Some helpful links:

http://www.flownet.com/gat/packages.pdf

http://www.gigamonkeys.com/book/programming-in-the-large-pac...

As for community, it's fine now. Lots of friendly and helpful people there.
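The core concept is short enough to sketch (package names below are made up): a package is a namespace mapping names to symbols, and same-named symbols interned in different packages are simply distinct objects.

```lisp
(defpackage :pkg-a (:use :cl) (:export #:foo))
(defpackage :pkg-b (:use :cl) (:export #:foo))

;; Two symbols that share the name "FOO" but live in different packages
;; are different objects:
(print (eq 'pkg-a:foo 'pkg-b:foo))        ; prints NIL

;; Each symbol can carry its own, independent definition:
(setf (symbol-value 'pkg-a:foo) :from-a
      (symbol-value 'pkg-b:foo) :from-b)
(print (symbol-value 'pkg-a:foo))         ; prints :FROM-A
(print (symbol-value 'pkg-b:foo))         ; prints :FROM-B
```

Once you see packages as symbol namespaces rather than module containers, most of the "weirdness" evaporates.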


Common Lisp packages are overly complicated in addition to being different.

I designed a package system which shares features with the CL one but is considerably simpler. It captures the salient program organization use cases.

In addition, I expose the registry of packages via a dynamically scoped variable. This allows programs to easily set up sandboxing. Untrusted code cannot use package prefixes to gain access to functions that you don't want it to, because you control what package prefixes refer to.

http://www.nongnu.org/txr/txr-manpage.html#N-000CFF61


They still have a major flaw: Common Lisp packages are not hierarchical (in contrast to Java), which makes it hard to export some symbols to only a subset of packages rather than to all of them.


You can import all symbols from a package with "use" or explicitly import specified symbols with "import" on a package by package basis.
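For illustration, a sketch with made-up package names showing the two mechanisms side by side: `:use` inherits every exported symbol, `:import-from` cherry-picks.

```lisp
(defpackage :util (:use :cl) (:export #:greet #:vers))
(in-package :util)
(defun greet () "hello")
(defun vers () 1)
(in-package :cl-user)

;; USE inherits every exported symbol of UTIL:
(defpackage :app-a (:use :cl :util))
;; IMPORT-FROM pulls in only the symbols you name:
(defpackage :app-b (:use :cl) (:import-from :util #:greet))

(print (app-b::greet))                 ; prints "hello"
;; VERS was not imported, so APP-B has no symbol of that name:
(print (find-symbol "VERS" :app-b))    ; prints NIL
```

`util:vers` of course remains reachable from anywhere via its explicit package prefix.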

I am not following your actual use case, can you explain further?


CL supports multiple dispatch in its generic functions.

Multiple dispatch leads to ambiguities. If we call (fun derived-obj derived-obj) and the two available methods are (fun base derived) and (fun derived base), which one is called?

Now here is the irony: CL will decide this for you, rather than throw an error. However, if you use multiple packages with same-named symbols, you get annoying errors in your face.
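Concretely, for the (fun derived-obj derived-obj) case above: the standard sorts applicable methods by comparing specializers left to right (the default argument precedence order), so the method more specific on the first argument wins. A sketch:

```lisp
(defclass base () ())
(defclass derived (base) ())

(defgeneric fun (x y))
(defmethod fun ((x base) (y derived)) :base-derived)
(defmethod fun ((x derived) (y base)) :derived-base)

;; Both methods are applicable; CLOS breaks the tie by left-to-right
;; argument precedence instead of signaling an error:
(print (fun (make-instance 'derived) (make-instance 'derived)))
;; prints :DERIVED-BASE
```

The tie-break is documented and deterministic, which is exactly the behavior being contrasted with package name clashes here.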

That's a philosophical inconsistency pointing at too many chefs in the kitchen. It can't be the work of a single designer who reflects properly on what he or she is doing and has a clearly defined attitude toward ambiguous situations in programming.


What happens is well specified, see [1] and more specifically [2].

I don't understand where packages even enter the picture though. Symbols/packages and method dispatch are entirely separate concepts that are hardly related. You certainly won't get any weird errors from one in relation to the other. Furthermore, same-named symbols in different packages are just different symbols. The fact that their name is the same is irrelevant.

[1]: http://www.lispworks.com/documentation/HyperSpec/Body/07_ff....

[2]: http://www.lispworks.com/documentation/HyperSpec/Body/07_ffa...


> What happens is well specified

Nobody said it wasn't.

> Symbols/packages and method dispatch are entirely separate concepts

They are examples of different domains which have referential ambiguities. The object system resolves them. For instance, given (defclass base (d1 d2 ..) ...), base is considered slightly more of a d1 than a d2 in situations where this causes an ambiguity. The tie is resolved and life moves on.

Multiple package use is also a kind of "inheritance", yet it blows up in your face when there are ambiguities caused by clashes (which "FOO" symbol do you want? The one from package "A" or "B"?)
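A sketch of that clash with made-up package names, assuming standard-conforming behavior: inheriting two same-named symbols signals a name-conflict error, and `:shadowing-import-from` is the explicit tie-breaker.

```lisp
(defpackage :p1 (:use :cl) (:export #:foo))
(defpackage :p2 (:use :cl) (:export #:foo))

;; Using both packages signals a (correctable) name-conflict error
;; rather than silently preferring one FOO:
(print (handler-case (defpackage :p3 (:use :cl :p1 :p2))
         (error () :conflict-signaled)))   ; prints :CONFLICT-SIGNALED

;; The programmer must break the tie explicitly:
(defpackage :p4 (:use :cl :p1 :p2)
  (:shadowing-import-from :p1 #:foo))
(print (eq 'p4::foo 'p1:foo))              ; prints T
```

So unlike method dispatch, there is no documented automatic tie-breaker; resolution is pushed onto the user.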

All these behaviors are specified, but according to different philosophy: just resolve things with a documented tie breaker, versus diagnose the situation.

If it's OK for base to be considered slightly more of a d1 than a d2, why isn't it OK for, say, a later package use to just shadow an earlier one?


The community is very diverse; just ignore the people you don't like.

Was Quicklisp around when you last tried it? If not, you should check it out again.

What were your stumbling blocks wrt the packaging system?



