NewLisp (newlisp.org)
120 points by qwertyuiop924 on Oct 7, 2016 | 142 comments



From a Lisp POV, the only thing wrong with NewLisp is the name.

It flouts McCarthy's wish that no language be called just Lisp. Well, okay, it's not literally called Lisp; it has the "New" in front. On the surface, it seems to be sticking to the letter of the wish. Does it conform to the spirit, though? Is a common English adjective enough? Is "New" a sufficient qualifier?

The mere prefixing of "New" looks like it's claiming to be a new version of something that is just Lisp, unqualified. A "New Lisp" is plausibly something that was "Lisp" before, and was then enhanced to create a new version derived from "Lisp". Lots of projects get rewritten, or otherwise revamped and called "New Whatever". But NewLisp certainly has no such lineage.

One historic dialect was called NIL which stood for "New Implementation of Lisp". But it went by the name NIL, with the acronym just being an explanation for that name (as far as I can tell).

Change the arrogant name, and you will not hear another criticism from Lisp programmers. Make it LutzLisp or whatever. It doesn't matter what it is if it's called LutzLisp; that's not the kind of name that claims to be a new, improved version of Lisp.

Breaking, or skirting, a sacrosanct rule is not how you play along in the broad Lisp family.


Well, okay, it's not literally called Lisp; it has the "Common" in front. On the surface, it seems to be sticking to the letter of the wish. Does it conform to the spirit, though? Is a common English adjective enough? Is "Common" a sufficient qualifier?

Yes. Yes it is.

("new" is more common than "common", but not incredibly so: https://books.google.com/ngrams/graph?content=common%2Cnew&y...)


I've never seen a rewrite or next generation of any program or function called "common". But new is common: ntalkd -- new talkd; nvi -- new vi; nftw -- new version of the ftw POSIX function (file tree walk); ncurses -- new curses; ncal -- new cal program; nawk -- new awk, ...

Basically Lutz calling his project NewLisp is a form of trolling.


Common Lisp was supposed to provide a "common" successor to the various Lisp platforms developed especially at MIT's project MAC, SAIL, and the Lisp machine vendors spun off from these.


When the next Lisp comes out, this one won't be "new" anymore so it'll be just "Lisp".


>Change the arrogant name, and you will not hear another criticism from Lisp programmers.

I don't think anyone else cares about that, or even went that far in their syllogism anyway.


Agreed. Most people just freak out over the speed (being interpreted, it's closer to PicoLisp than SBCL), although I recall a pretty major design decision that most Lisps haven't used in decades and that many would consider a flaw. I do like that it is similar to Python in that it is easy to set up and comes preloaded with a lot of libraries.


I wonder if the reason that a Lisp hasn't made it into the mainstream is that Lispers can't shut up about pedantic naming rules that are completely and totally unimportant.


It's important because we have to distinguish between many dialects. If any of them are just Lisp, it starts to get confusing.

Despite what the CLers say, we all have an equal claim to the name.


> Despite what the CLers say, we all have an equal claim to the name.

Agreed, but who cares? Is this what you want the Lisp community to spend our time on?

> It's important because we have to distinguish between many dialects. If any of them are just Lisp, it starts to get confusing.

I don't think it's important. I'm not confused, and if you understand what's going on enough to disagree with it, you aren't confused either.

More to the point, neither being confused about what dialect is being discussed nor arguing about semantics helps anyone, but at least confusion doesn't waste enormous amounts of time and energy, divide the community, and derail productive discussions.

Look at this thread. I'd like to talk about ORO memory management, which is pretty interesting, but instead we're talking about semantics. The top response is about terminology. Is this what you want?


> I'd like to talk about ORO memory management, which is pretty interesting

I'd like to see talk about ORO memory management which is not of the following form: "NewLisp isn't really Lisp because of its <expletive deleted> ORO memory management; you cannot replace real Lisp with that kind of cruft ..."

That is why I have tried to articulate what I suspect it is that actually bugs Lisp people about NewLisp. It's not the fexprs or the lists working differently and so on; trying to frame it technically is misguided.

If you have some problem with the trolling behavior of a project, but you try to frame it in technical terms, that's a waste of time.


I hate that. Seeing as I'm a Schemer, I also hate the other classics: it's not really a Lisp because false and nil aren't the same; it's not really a Lisp because it doesn't have *real* macros; and it's not a Lisp for various reasons which amount to *it's not CL or any of its predecessors*.

My usual responses to each of these, respectively, are: come off it, even McCarthy says that was totally by chance, and may not have been the best idea; pretty much every implementation has real macros, and even the original Lisp didn't have real macros; and finally, come on, Rainer, we've had this discussion like three times already.


But I don't think the project is trolling, and I don't have any problem with the project's behavior. I'm curious about ORO. You're the only one who's looking for a fight here.


No, not especially. I think terminology matters, but not that much, and not compared to the more interesting aspects of NewLisp. That's why I didn't upvote Kaz. I'd downvote him, but I'm OP, so I can't.


Naming a project in such a way as to troll people is not a matter of "terminology".

Terminology issues consist of inter-language conflicts among definitions of technical terms like "list", "scope", "variable" or "macro" and such.

The proper noun that Lutz Mueller chooses to call his project is not a definition of a technical term.

To me, NewLisp is uninteresting. I already know how to avoid sharing issues in memory management in C programs by mallocing a copy of everything and duplicating it; that bores me. Lots of programs in POSIX environments do that with strings: strdup everything you receive, and cheerfully free it knowing that nobody else has that pointer. NewLisp is just doing "strdup for lists". It might as well work using strings, just like the "Lisp in sed" implementation: https://github.com/shinh/sedlisp . Why use objects, when pictures of objects provide a reasonable facsimile?

I know how fexprs work, and they bore me also. Once you allow them, you can kiss goodbye the future prospect of having a compiler. As an implementor, if I want to introduce a new special operator in an interpreter (a decision not to take lightly), I can write it in the hosting language (such as C), and not as a fexpr. By writing such an operator, I get essentially the same development experience as if I wrote a fexpr, without inflicting harm on the language. If interpretation is a wart, then interpreted code being dispatched to interpret code is a tuft of hair growing out of the wart.


> Naming a project in such a way as to troll people is not a matter of "terminology".

1. I see no reason to believe NewLisp was named to troll people. It only has the effect of trolling you because you have created a subset of the English language in your head which you view as "correct" without any basis in reality.

2. In the English language, "terminology" can be applied to the names of programming languages and people will know what you're talking about. It can also be applied to things like "list", "scope", etc.; generally people use context clues to figure out what you're talking about.

> To me, NewLisp is uninteresting. I already know how to avoid sharing issues in memory management in C programs by mallocing a copy of everything and duplicating it; that bores me.

Thank you for sharing. I'm sure you're very knowledgeable and smart and whatnot, have a cookie. Now can we who are interested and curious about this have a conversation without you butting in, please?

I'd like to know:

What are the performance implications of doing ORO universally? The tradeoff seems to be that copying takes time, but it also allows you to treat everything as stack-allocated, so it saves time on heap management. What's the average case performance like? In what cases is this a good tradeoff? In which cases is this a bad tradeoff?
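Even something crude would be a start. Here's a rough micro-benchmark sketch (time and sequence are newLISP built-ins; the sizes are arbitrary), though it only measures raw copy cost, not the knock-on effects on heap management:

    ; time 1000 plain assignments of a 100k-element list; under ORO each
    ; assignment copies the list, so this approximates the copy overhead
    (setq big (sequence 1 100000))
    (println (time (setq also-big big) 1000))   ; returns the time in milliseconds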


> Now can we who are interested and curious about this have a conversation without you butting in, please?

Judging by the root post, it appears I started this thread on the subject of the trollish behavior of the NewLisp project. You can collapse it and use another one.

> I see no reason to believe NewLisp was named to troll people.

I have visited the NewLisp website a couple of times in the last 15 years or so and found trollish content. Example:

https://web.archive.org/web/20041010181233/http://newlisp.or...

Quote: "Don't read books about LISP, if you want to learn newLISP, most deal with Common LISP or Scheme, two different older standards of LISP. These books teach you many concepts unnecessary to learn newLISP, which does some things in a very different way than the older standards. Try to understand newLISP on it's own terms."

There you go!

* Don't read books about Lisp.

* Lisp and Scheme are older standards (NewLisp is the new standard, hence the "New").

Newer version of last sentence:

https://web.archive.org/web/20080726054533/http://www.newlis...

"newLISP does things much differently from the older standards, in ways that are more applicable to today's programming tasks."

Thus, pre-existing Lisps don't apply very well to today's programming; here is a new standard to fix it. We're bringing back dynamic-scope-only and fexprs because today's programming once again requires a Lisp from 1960.

"More applicable" isn't trollish enough, so the webmaster added the text "and on a higher level closer to the problem at hand" sometime by 2012, which stands today. (The "old standards" are low-level languages, incapable of the same level of abstraction offered by NewLisp). At least the "don't read books" remark was removed, though.


Somewhere down the original thread, somebody posted some benchmarks, IIRC.

Not as helpful as I'd like, but hope it helps nonetheless.


...That depends on whether you think interpretation is a wart: if you don't need the perf, it isn't, and a lot of us don't need it.

And you have no evidence that that was Lutz's intent.


> Despite what the CLers say, we all have an equal claim to the name.

Only actual Lisp dialects have a claim to the name.

That's why Scheme is called Scheme and not Lexical Lisp. That's why Logo is called Logo and not Simple Lisp. That's why Javascript is called Javascript and not WebScheme. That's why Racket is called Racket, and not NewScheme.


> Only actual Lisp dialects have a claim to the name.

Frankly, I don't give a fuck about anyone's "claim". If people can use a word to move an idea accurately from their brain to other people's brains, communication has occurred and I'm happy.

If I say "NewLisp" and my audience knows what interpreter I'm talking about, I have succeeded in my goal.


And your definition of what makes a Lisp is pretty strange. For one thing, Racket and Scheme are definitely in the lisp family.


It's not my definition. I just happen to agree with a lot of others, many who don't post here. It's also not new. I just remind people of a view you might never have heard of, because you were never involved in Lisp meetings, Lisp conferences or Lisp standardization efforts. Lisp is old, and many younger people don't know much of its history and evolution.

See for example Pitman, Lambda the ultimate political party

http://www.nhplace.com/kent/PS/Lambda.html

Published in Lisp Pointers (a regular Lisp magazine, published in the 80s/90s) in 1994.

> Lisp users are accustomed to being able to write programs that take program texts in different dialects (or older versions of the same dialect) and process them to bring them up to date. No single common linguistic feature supports this. And yet, Lisp users are accustomed to expecting that there will be some way to accommodate the needs of compatibility and translation.

That is what makes Lisp, in my opinion. Sharing of code, implementations, libraries, books, tools, ...

A language spectrum serves a diverse community. It is not a dead body of abstract and vague principles. It grows out of sharing and communication.

Kent knows the Lisp and Scheme landscape very well. He worked as developer, in standardisation efforts (both Lisp and Scheme), as an implementor, editor, author, etc.


I read Pittman. The thing is, the Scheme community may not be able to share code directly, but we are able to, and often do, share ideas. In addition, many Schemers are active in the CL community and vice versa, like Pittman. Many of the same people were involved in the standardization effort, and the creation of both.

A language spectrum does indeed serve a diverse community. One we Schemers are a part of.

I know my history, I just disagree with you.


His name is Pitman. Kent Pitman.

> Many of the same people were involved in the standardization effort, and the creation of both.

I doubt that this is the case for R7RS.

> The thing is, the Scheme community may not be able to share code directly, but we are able to, and often do, share ideas.

That's my point. There are few ideas shared and no code. Today the various spinoffs of Lisp have their own communities. I don't say that it is bad or good, it's just the way it is. That's not only the fate of a programming language family, it also happens in other areas, that domains are divided into groups which have problems understanding each other and which are not able to track or use the work of other groups. The groups have different preferences and their topic gets an independent life.


Sorry. I always get names wrong.

As for R7RS, Sussman, at the very least, was on the committee. Several other Lispers as well, IIRC, but it would take too long to check.

All the Lisps have different communities. That's why the languages are different. There's some overlap, but if the communities were identical then all the languages would be the same.

But then, you could say the same of C, C++, and Java, which don't all share code, but are all in the same family.


Sussman is a Schemer.


To be fair, he's done some lisp work. But that's true enough.

Steele was on the committee for many years, although not this one.

I'd give you a list of CLers on the committee, but I honestly can't dig through 12 people's online profiles just to prove a point right now.


I still don't see anything here which indicates to me that I should give a shit about NewLisp being called "NewLisp", or about referring to CL/Scheme/Racket/whatever as "Lisps".


Are you expecting a sound technical argument why designers should show respect at least for the sources where they cribbed most of their core ideas? Not piss into the pool where they learned to swim (even if merely the "doggy paddle")?

Sorry, I don't have that argument.


Kaz's point is that NewLisp's name is confusing, but it's not, or at least not incredibly so, so I don't see that point. I think terminology's important, but that particular instance isn't. As for why people complain about calling things Lisps, it's a mystery to me.

Now complaints about calling Racket Scheme, OTOH, I get. Scheme has a specification which Racket doesn't comply with: they changed the name for a reason when they started to deviate. Calling Racket Scheme is like seeing a language that's kind of similar to C and just calling it C.


how is that any different than Common Lisp?


In a number of ways. "Common" is not that common of an adjective for software projects, compared to "new". Common Lisp is actually derived from dialects that trace their lineage to Lisp (the Lisp, unqualified) standardizing what is common among them. And the full name is actually "ANSI Common Lisp"; "Common Lisp" and "CL" are just nicknames.

Common Lisp people insisting on "Lisp" denoting "Common Lisp" are simply wrong, and that's that.


> Common Lisp people insisting on "Lisp" denoting "Common Lisp" are simply wrong, and that's that.

I don't think that's really true. After all, 'Common Lisp is actually derived from dialects that trace their lineage to Lisp (the Lisp, unqualified) standardizing what is common among them,' to quote someone. The intent of the Common Lisp standardisation effort was to standardise a common Lisp that programmers could rely on.


a Common Lisp. Anybody want to tell the Schemers that they're not Lisp (even though Scheme diverged from MacLisp, much the same as common lisp did), because you said so, for no real reason?


TXR looks more interesting (from your profile).


> you will not hear another criticism from Lisp programmers

From what I've seen of the lisp community, one of their favorite activities is criticizing variants of lisp that aren't their preferred one.


I made one and have scarcely had any negative reaction from Lispers, including the Usenet newsgroup comp.lang.lisp.

Maybe that's because I didn't go around saying that it's an awesome, new, improved, easy-to-use dialect for the modern world that's going to replace outdated, academic, hard-to-use, stuffy Common Lisp and its ilk.

NewLisp in particular has attracted vitriol, and it's because of the arrogant trolling.

NewLisp is no worse than, say, AutoLisp, yet AutoLisp doesn't attract flaming every time it is mentioned.


[flagged]


What does http://boo-lang.org/ have to do with this?


If you gather three Lisp programmers, you can have at least four different factions.

It's the nature of the thing - a selfish language. It allows you to have things exactly your way, and that's the way it's going to be.

P.S. Emo Philips nailed it down: http://splitframeofreference.blogspot.pt/2013/04/the-greates...


Paul Graham on newLISP, 12 years ago: http://lambda-the-ultimate.org/node/257#comment-1889

With hindsight, turns out that none of newLISP, Goo or Arc set the world on fire.


I love Lisp and I'd use it for everything if I could. But the Lisp community spends too much time talking about how awesome it is instead of making good frameworks that anyone can use, even kids just out of college. It doesn't matter how beautifully made your project is. New programmers make trends. If the framework can't be picked up and run without spending years in the Lisp community it won't take off.

Look at Clojure. They offered the kids that started with Java something new but with a flair that was familiar. If you know Java, Clojure isn't too hard.


I've been working on one:

https://github.com/rongarret/ergolib

The beginnings of a book using this:

https://github.com/rongarret/BWFP

Hello-world looks like:

    (require :ergolib)
    (defpage "/hello" "Hello world")
    (ensure-http-server 1234)


You don't hear about it, but it happens. There's awful (http://wiki.call-cc.org/eggref/4/awful), and Artanis (http://web-artanis.com) for Chicken and Guile respectively. CL and Racket probably have something, I just don't know what as I don't know them as well.


But clojure is also a lisp unless you are using lisp to mean Common Lisp only?


Comments like this are why I hate discussing Lisp.

Nobody cares except language zealots.


Yes we do. Language matters. If you can't say what you mean precisely, in a way that everyone can understand, then you've got a problem.


"Be conservative in what you emit, and liberal in what you accept" (or words to that effect).

Say what you mean precisely. But if you can't tolerate less than precise language from others, then you have a problem. It's your problem. Yes, they should speak more precisely, but you need to learn to live in the world we've got, not the one you wish we have. That means dealing with the people we've got, not the people you wish we had.


I do. But if you're just going to say that terminology doesn't matter, I'm sorry to say I don't agree.


No, I'm not going to say that. Terminology matters. Accurate use of terminology matters.

But there's a threshold for when to correct vs. when to just let it slide, and it seems to me that your threshold is too low (at least on this issue).


Entirely possible. It may have something to do with the fact that I'm getting sick of CL users insisting that CL is the only real lisp.


Ah. Sounding off at the Nth annoying person, when you let the first N-1 slide. Been there...


Using Lisp interchangeably with Common Lisp is a frequent cause of confusion in Lisp-related conversations, so I'm hardly a language zealot for mentioning it.


I genuinely would love to see apps written in lisp. It's all Javascript nowadays, with about 5-8 frameworks dominating this sad landscape.

Imagine how refreshing it'd be to see Lisp driving webapps, front and back-end.

Not even saying that should be the one size fits all, I'm saying, anything else...


I've personally worked on a Lisp backend; it was probably worse than most of the JS sites you lament.

Of course, it's probably not Lisp's fault that that project was a mess, but the ultimate level of flexibility that CL provided let bad ideas worm their way down to the compilation level, making it really hard to work your way back out.

Honestly, you've never felt tech debt pain like unwinding reader macros.


Which is why the rule of thumb with any sort of macro is: use minimally and tastefully, and do not write one until there is a proven need.


Netscape had originally planned to use Scheme as their scripting engine. But then Java took off and they wanted something that looked similar... So we ended up with a ten day hack.


Eich ascribed the result less to internal Netscape politics and more to the Scheme interpreter taking longer to implement than expected and there needing to be some scripting language in the browser that shipped - it was either JS or something that he described as being very like PHP avant la lettre.

IIRC this happened in comments on jwz's blog - cite on request, though you'll have to excuse me for waiting until I'm back at a machine with a keyboard and a big display before I go hunting it down again.


No, I never started implementing Scheme. Where did you read that?

Netscape didn't "plan" to do Scheme in any deep or detailed way, either. It was a gleam in the eyes of jg & mtoy when they recruited me. By the time I got there the Java deal was on, and MILLJ was the management edict.


> Where did you read that?

Evidently nowhere! Apologies for the misrepresentation, and thanks for clearing it up.


This reminds me however of HTML4 mentioning Tcl scripting.


https://www.jwz.org/blog/2010/10/every-day-i-learn-something...

duskwuff:

So, since I have to ask: What would we be stuck with if JS hadn't happened?

Brendan Eich:

Something like PHP only worse, which Netscape's VP Eng killed fast (in early July 1995, if memory serves; I did JS in early-mid-May) since it was "third" after Java and JS. Two new web programming languages were hard enough to justify...

also:

Ten days to implement the lexer, parser, bytecode emitter (which I folded into the parser; required some code buffering to reorder things like the for(;;) loop head parts and body), interpreter, built-in classes, and decompiler. I had help only for jsdate.c, from Ken Smith of Netscape (who, per our over-optimistic agreement, cloned java.util.Date -- Y2K bugs and all! Gosling...).

Sorry, not enough time for me to analyze tail position (using an attribute grammar approach: http://wiki.ecmascript.org/doku.php?id=strawman:proper_tail_...). Ten days without much sleep to build JS from scratch, "make it look like Java" (I made it look like C), and smuggle in its saving graces: first class functions (closures came later but were part of the plan), Self-ish prototypes (one per instance, not many as in Self).

and:

Scheme was the ideal admired by the JS supporters at Netscape, and the trick used to recruit me. But (see above) the "look like Java" orders and the intense time pressure combined to take out a lot of good ideas from Scheme that would've taken longer (not much longer, sigh) to implement. One example: just the C-like syntax makes tail calls harder (or less likely: consider falling off the end of a function returns undefined, so the last expression-statement before such an implicit return is not in tail position).


Thanks!

You'll want to wrap that jwz.org link in a referer stripper, for the benefit of those not already savvy about the NSFW redirect he has configured for people who go there from here.


...aaand I can't edit it. Apologies, all...


It's really hard to believe that implementing a Scheme or Lisp would be slower than implementing JavaScript, so I would indeed be interested in your cite.

My recollection is that Eich's bosses wanted something that looked like CJava++.


See your sibling comment (if you haven't already).


Hmmm, I guess my takeaway from that is that it had to look like C, and that iron constraint combined with the time pressure to mean that good features from Scheme were left out, not that implementing JavaScript rather than Scheme was due to time pressure.


Yup. (And it would just figure that, the one time I describe something based on a half-decade-old recollection and don't throw in an "as I recall" qualifier, it'd be the time I misremember in a significantly erroneous way.)


I wrote a web framework for newLISP a few years ago. Actually it's not as fancy as Ruby on Rails, because its focus is more on templating, routing and data handling. But it still does its job and I'm using it every day. Even as a REST backend for our mobile apps in my company AppTruck.

Demo here: http://dragonfly.apptruck.de


> It's all Javascript nowadays, with about 5-8 frameworks dominating this sad landscape.

This is only even a perception in the Silicon Valley echo chamber, and it's not even true there. I worked at a contracting firm that did apps last year and we never once wrote one in JS.


"Imagine how refreshing it'd be to see Lisp driving webapps, front and back-end."

Get web browsers to embed a Lisp in them, and this will happen.

Though even then, Lisp will never be nearly as popular as Javascript due to its unfamiliar syntax.


We have Chibi through emscripten, and SPOCK compiles to JS. Both implement most of R5RS.


Kind of like how Hacker News was written in a Racket-based language?


Well, Clojure/ClojureScript?


Yet, functional programming is as in vogue as ever.


It's perpetually going to be huge Real Soon Now.


It's as huge as it is adopted by new C# features


That's just the latest in a long list of languages. This is a process that has been going on for nearly as long as lisp has existed - so nearly as long as people have been programming.


The kinds of things PG was promoting were call/cc for web interaction flows and macros to create a DSL for the application. These are heavily Lispy ideas and not hugely functional.


He wasn't promoting call/cc. I know he wasn't because neither CL nor arc actually support the construct.


Though the runtime arc is built on does support call/cc.


one of the runtimes. And it doesn't, not really, not anymore. It does support delimited continuations, but those are different.


I'm quite sure Racket still supports call/cc.

https://docs.racket-lang.org/reference/cont.html#%28def._%28...


[flagged]


You don't like Lisp? That's fine. There will always be people that don't. They'd get used to it if they tried it, IMHO, but that's their choice.

But leave autism out of it.

Some of my best friends and family are on the spectrum, and they don't deserve having their condition turned into an insult on a web forum. Furthermore, you make yourself look immature by using it as an insult.

I know people in their 20s who can't go to college because of their disability: It's not funny.


It may be an arguably insensitive word-choice, but I wasn't making fun of anyone with a disability. I was using the word 'autistic' to describe the nature of the lisp style of languages. There's not really a good alternative word.

Did not mean to offend, and there was no attempt at humor there.


There are many good alternative words. In fact, given how wide the spectrum of autism is, it's not really very descriptive of what you mean (actually, what do you mean?)

Glad to hear that you didn't mean to offend. But you wouldn't call something you didn't like "gay," or "Jewish." So why autistic?


Not the same thing. I was actually using the meaning of the word. (See 2.) [1]. Some people are fine with this usage of the word, some aren't.

[1] http://www.dictionary.com/browse/autistic?s=t


Oh. You know, I wouldn't be surprised if not many people knew about that...


Yet another reason not to get offended by such things.


He's got the right to say it, and I've got the right to feel offended.

And since you ask, no, I didn't flag the comment. That was someone else.


> He's got the right to say it, and I've got the right to feel offended.

Nobody's questioning that. The point is rather that being offended by such things is (IMHO) silly, especially given that it betrays a certain degree of ignorance. On some level, taking offense is a choice, and citing your right to do so is equivalent to doing something just because you can. More to the point, his word-choice does adequately illustrate his argument, even if you find it distasteful.

Playing the role of the vocabulary police is more often than not a case of making a mountain out of a molehill.


I was unaware of the other definition, otherwise I wouldn't have voiced my distaste quite as harshly.

I work with the facts I'm given. If I suspended judgement until I had all the facts, I'd never get anything done.

And playing vocabulary police isn't a role I favor either. But if you're calling something autistic, even if you're technically correct, in a context like this, congratulations, you've annoyed me. And sometimes I choose to voice that annoyance. Not often, but every once in a while.


If you want to use a Lisp for scripting, embedding it into an application and having great interoperability with C then you also have Guile Scheme (https://www.gnu.org/software/guile/) as an option. I don't have comparative benchmarks but I've never had any issues with its performance.


I wish every programming language homepage had a small code example. I know this is a LISP-like language and I can expect it to look like any other LISP. But I still appreciate getting a glimpse of the language without having to hunt through documentation for a tutorial.

A small code example can tell you way more than a language description. An example is the PL equivalent of the picture is worth a thousand words adage.

Additionally, it is not a given that a LISP will use S-expressions. It could use M-expressions, or even G-expressions (from a LISP-like language I helped develop that nobody has ever heard of).

Go, Rust, Ruby, Python all have examples on their homepage. Oddly, Swift does not.


If you love small code examples, then check out newLISP running inside a browser. That REPL is pretty fun.

http://www.newlisp.org/newlisp-js/


Did you work on Ginger? I'd love to see what that looked like :)


Yup, it was Ginger. The syntax was like a mix of Python and Scheme.

   define fib (n)
     if (= n 0) 0
     elsif: (= n 1) 1
     else: (+ (fib (- n 1)) (fib (- n 2)))
    
   println (fib 20)


The slew of differences don't seem very "designed", meaning some corner of the world won't end if life wasn't that way. They seem more like a consequence of "my pet lisp interpreter as I began implementing it". I know that smell 'cos I've been through it myself and subsequently redesigned and reimplemented things.


Exactly. After playing with different Lisp dialects for well over a decade I've come to the conclusion that Common Lisp's decisions are, generally, reasonably well-thought-out and not often terrible. I've not found a Lisp better-suited for the name.

I'd love to see a new version of Common Lisp fixing some of the decisions which were terrible, but I don't see that happening.


Why not, by the way? I've always wondered why the CL community hasn't come up with a new, improved standard for such a long time. Other old languages like Fortran and Ada are continuously updated. Why not CommonLisp?


As a lisp noob, I found many of common lisp's function names to be extremely arcane.


Eh. I lean more on the Scheme side of things, myself. One namespace, less historical cruft, and macros that are hygienic by default. It's a pretty good deal.


I actually think that the single namespace and weak macros are design mistakes.

The lack of (as much) historical cruft is probably a Good Thing.

call/cc is a great teaching tool but a terrible thing for use in an implementation supporting production software: it's goto without the limited scope of TAGBODY.

I don't blame Scheme for not learning from Common Lisp, since the former predated the latter; indeed, Common Lisp learned from Scheme.


The single namespace is quite useful, and if you think the macros are weak then you've only been looking at R5RS and not at real implementations: implicit rename, explicit rename, syntactic closures, and syntax-case are all as powerful as CL macros, and a good deal safer (although implicit rename is kind of expensive, and explicit rename requires explicit safety, so syntactic closures or syntax-case are the ways to go)

Call/cc can indeed be useful in real software, although it's uncommon. However, teaching is part of Scheme's domain, so it's still a good thing either way. If you have an especially fast implementation of call/cc (like Chicken), you can use it to implement delimited continuations, which are safer, and can express common uses of call/cc far more elegantly. Racket (it should be noted that Racket isn't a scheme, but it's close enough that it's kind of "in the family" as it were) and Guile (which IS a scheme) have these as part of the core language.


Scheme was developed earlier than Common Lisp. But that's not the whole picture.

The mainline of Lisp, which Common Lisp and the early Scheme are (partly) based on, existed earlier: Lisp 1.5 and then Maclisp. Scheme was then developed over several decades. It took 13 years to get Common Lisp from an idea to the ANSI Common Lisp standard.

Sometimes the same people were involved.

    1973 Maclisp
    1974 Lisp Machine Lisp
    1975 Scheme
    1981 CL development started
    1984 Common Lisp the Language published
    1985 Scheme R2RS
    1986 Scheme R3RS
    1989 Common Lisp the Language II published (with CLOS, error handling, ...)
    1990 IEEE Standard for Scheme published
    1991 Scheme R4RS
    1994 ANSI CL Standard published
    1998 Scheme R5RS 
    2007 Scheme R6RS
    2013 Scheme R7RS small


Including Bayes in the standard library is cool, but newLISP must be one of the slowest LISP interpreters out there.

It's certainly usable - usually on par with Python... But picolisp, Gambit, SBCL are more on par with D or Rebol.

So when newLISP claims to be great for embedding, I feel cautious.


For me Picolisp is a very nice Lisp implementation; easy to understand, enthused founder who has been at it since the 80s. Which is amazing in its own right. And a solid community. Won't go away any time soon. Gambit is the same age, but I find the source code far harder to read. That's a matter of taste I guess.


F-Expressions are awesome fun.

Gambit's internals do kind of look like black magic half the time, but the author implemented environments because I asked [0], which is very cool.

The more experimental lisps, the better we can make the future. I love how the lispish community views PLT, and always tries out someone's new idea. (Bayes integration in newLISP is cool. LFE is also cool.)

[0] https://github.com/gambit/gambit/issues/204


Ditto on Fexprs, and PicoLisp!

Xtlang in Extempore is a whole lot of fun [1].

Echolisp has amazing built-ins and runs in the browser [2].

Gambit is great to get C code out of.

I like LFE, but I have to learn more OTP/BEAM to really use it.

Wasp Lisp is a lot of fun for me, and for playing with distributed programs.

[1] http://extempore.moso.com.au/

[2] http://www.echolalie.org/echolisp/


Hadn't heard of Wasp [0] before. Looking forward to trying it out!

[0] https://github.com/swdunlop/WaspVM


Have you tried Kernel?


Fexprs should be fun, it stands for fun-expression after all... Right?.. I am not a lisper though...:D


No, although what it stands for has been lost to the ages.

They're functions whose args are passed unevaled.

It kills a whole variety of compiler optimizations, but in an interpreter, there's no reason not to have them.
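In newLISP they come via define-macro; a minimal sketch (my own toy example, not from the docs):

    ; define-macro creates a fexpr-style operator: the arguments arrive
    ; unevaluated, and you decide when (or whether) to eval them
    (define-macro (my-unless condition body)
        (if (not (eval condition)) (eval body)))

    (my-unless (> 3 4) (println "3 is not greater than 4"))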


My only criticism (not really criticism) is that Windows is only supported through Cygwin. I will say you can tell the community is very competent. 9/10, the picolisp example on RosettaCode is the most readable while being in the top 5 shortest for LOC (kind of hard to beat J in LOC).



How much of an issue does fixed point over floating point become in picolisp?


The whole newLISP semantics (everything dynamic, no closures) allows it to be one of the fastest pure interpreters out there. IIRC it is even significantly faster than CPython (on microbenchmarks).

The reason for that lies in the fact that newLISP's interpreter does not have to do as much pointer chasing as other interpreters which implement some kind of lexical scoping. In dfsch I found out that about 80% of the time is spent in symbol-lookup-related code and had to resort to some ugly tricks (inlining constant references in the AST, micro-optimizations in lookup code...) to get its performance to be somewhere between CPython and newLISP.


D is more like C++ and I guess SBCL isn't too far from that realm in some cases (certainly slower in most). Rebol, python, and picolisp are all interpreted though and quite slow by comparison.


SBCL really isn't that slow [0], being on par if not faster than Java in most things.

Python is an order of magnitude slower [1] than most things, and that was my point. newLISP is slow.

I do have to take exception to you calling interpreted slow - because you've called Python interpreted, I'm guessing bytecode interpreted fits your view. Picrin [2] is bytecode interpreted, and isn't slow. [3]

By Rebol I mostly meant Red, which isn't exactly slow, as it aims for 1-3x the speed of C. (Though I can't find any decent set of benchmarks for it currently).

[0] https://benchmarksgame.alioth.debian.org/u64q/lisp.html

[1] http://benchmarksgame.alioth.debian.org/u64q/python.html

[2] https://github.com/picrin-scheme/picrin

[3] http://www.larcenists.org/benchmarksGenuineR7Linux.html


> Picrin is bytecode interpreted, and isn't slow. [3]

Those measurements seem to show Picrin at-least an order of magnitude slower than Larceny?


Yep. Larceny is one of the fastest compiled Schemes.

Doesn't mean Picrin is slow, though. It matches Chicken in most of the stats, and Chicken uses the same process (Cheney on the MTA), but Chicken is compiled. Picrin is bytecode interpreted.


In NewLisp's memory management model, I didn't understand if one-reference-only limits what the programmer can do. http://www.newlisp.org/MemoryManagement.html

Does it mean it's impossible for more than one variable to refer to a variable already referenced? If yes, is this a problem in practice? If no, is this the way of the future?
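For example, I'd expect (just going by the document, I haven't tried it) that plain assignment copies, so mutating one variable can't affect another:

    ; my guess at the copy semantics: b gets its own copy of the list,
    ; so pushing onto b leaves a untouched
    (setq a '(1 2 3))
    (setq b a)
    (push 99 b)
    (println a)   ; (1 2 3)
    (println b)   ; (99 1 2 3)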


This comes up every time newLISP is introduced somewhere. I'm not a developer who ever tried to create or invent a new language, so my knowledge is really limited regarding which memory management is advanced or dumb.

BUT I've used newLISP in many web business applications, which are still used by hundreds of users, under load, and I've never run into trouble. Developing and coding such big projects has been great (we used newLISP contexts a lot).

I'm trying to say: in practice I don't care about internal memory management if the language does its job. And in my opinion newLISP is a good tool to solve real world problems.


A brief skim of the document suggests:

- single-reference-only for expressions (so, static object lifetime); deletion is delayed until a stack slot gets reused, allowing return values

- assignment to variables or values passed into functions are done by copy

- EXCEPT certain types are passed into functions by reference as an optimisation

So, basically, values are owned by either the stack or a variable. Assignment to a stack cell destroys what was there before. To keep a stacked value, it needs to be copied into a variable.

That all seems pretty intelligent, and I might steal the idea; except that I don't see any mention of how cyclic data structures or mutable data structures are implemented. Does anyone know how these are handled?


For mutable structures you use the "context" concept, which is also used to build a form of OOP.

For cyclic data structures, I'm not sure. I think one can pass a function, a list or a context to a function that expects a list, so that might do the trick.
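Roughly like this, if memory serves (a small sketch, so treat the details as approximate): a context is a named namespace that holds mutable state and is passed around by reference rather than copied.

    ; a context as a mutable namespace; Counter:total persists across calls
    (context 'Counter)
    (setq total 0)
    (define (bump) (inc total))
    (context MAIN)

    (Counter:bump)
    (Counter:bump)
    (println Counter:total)   ; 2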


From Wikipedia:

> Sharing of sub-objects among objects, cyclic structures, or multiple variables pointing to the same object are not supported in newLISP.

Dang.


Ah, Newlisp -- the Isagenix of programming languages. Sooner or later, every general-programming community will have to deal with a deployment of Newlispoids eager to convince you how great the language is and that finally, programming is fun again. As if programmers hadn't enjoyed themselves much more using "old" Lisps that embraced advanced features like lexical scoping and hygienic macros.


It's interesting. I'm not a fan of dynamic scoping, but it's okay. ORO looks no worse than Rust's borrowing, so that's not a worry, aside from cyclic data structures.


> newLISP is LISP reborn as a scripting language: pragmatic and casual, simple to learn without requiring you to know advanced computer science concepts.

How is this different from LISP?


If you play with the examples it becomes more apparent, but you can use it alongside GNU utils like grep, with pipes, on the command line more easily than a lot of traditional Lisps that have a slower startup (e.g. Clojure). Kind of like Perl.
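Something like this, for example (assuming I remember the -e switch right, which evaluates an expression and prints the result):

    ; sums one whitespace-separated line of numbers from stdin, e.g.
    ;   echo "3 4 5" | newlisp -e '(apply + (map int (parse (read-line))))'
    (apply + (map int (parse (read-line))))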


Clojure is not a "traditional lisp".

And Guile was my Perl in the late nineties. Never experienced startup time issues, even on Pentium-class hardware.


There are ways around the slow startup of Clojure such as

http://leiningen.org/grench.html


From that website:

> Grenchman lets you run Clojure code quickly. By connecting to a running nREPL server you avoid JVM startup time, streamlining your development workflow.

It's irritating to see the same claim repeated again and again. It's not JVM startup time, it's Clojure startup time. Whether it's because JVM is ill-suited for the kind of code Clojure uses or for other reasons I don't know, but JVM itself starts up 100x faster than Clojure on JVM.

Moreover, a daemon running in the background is just a workaround. To use it you need to write your app in a certain way so that reloading works, and you still need to start the daemon at some point.

I've been working with Clojure for a few months full-time and it was kind of ok, however, for command line utilities, almost anything is better than Clojure. Python, JS with Node, Racket, Ruby, GNU Smalltalk, even Io - all these start up much faster than Clojure. ClojureScript with Node would be an option, but then ClojureScript (and Node). In short: if you need to write a cmd line utility program you should avoid Clojure as much as possible.


I don't agree. Planck (https://github.com/mfikes/planck) is quick and great for command line utilities written in ClojureScript. It runs on macOS' built-in JavaScriptCore. There's also an alpha version that runs on Ubuntu.

Please check the Planck User Guide's section on Scripts http://planck-repl.org/scripts.html


There are a few solutions for scripting with Clojure

http://blog.fikesfarm.com/posts/2015-08-01-planck-scripting....

Personally I wouldn't script in Clojure. If something is simple enough to be a script I'd use bash or Python and leave Clojure for writing non-trivial programs for which startup time is not an issue


Just play with it in your browser ;-)

http://www.newlisp.org/newlisp-js/


It would be interesting to see some speed benchmarks.


And regarding hardware benchmarks this is fun too, seeing on how many platforms this scripting language is running.

(Source: http://www.newlispfanclub.alh.net/forum/viewtopic.php?f=9&t=...)

    0.33  ; 2.7GHz Intel Core i5 iMac, 64 bit newLISP 10.3.2 - cormullion
    0.45  ; 2.4GHz Intel Core i5 MacBook Pro, 64 bit newLISP - itistoday
    0.5   ; 2.2Ghz AMD Phenom(tm) 9550 Quad-Core Processor 64-bit on Linux IPv4 - pjot
    0.55  ; Windows XP at AMD Phenom II X2 545, 3 GHz - Cyril
    0.6   ; 2 x 3.2 GHz Quad-Core Intel Xeon - joe
    0.63  ; FreeBSD at NFSHOST poss. 2.8 GHZ CPU - lutz
    0.7   ; FreeBSD at NFSHOST probably the same on a bad day - cormullion
    0.71  ; Mac OS X 2.0 GHz Intel Core 2 Duo, 64-bit version of newLISP - cormullion
    0.75  ; MacBook Pro 2.4 GHz Intel Core 2 Duo running 32-bit newLISP - hilti
    0.8   ; AMD 64 3200+ - newdep
    0.89  ; zLinux (for the IBM mainframe) - jopython
    0.9   ; Mac OS X 2.0 GHz Intel Core 2 Duo, 32-bit newLISP - cormullion
    1.00  ; Mac OS X 1.83 GHz Intel Core 2 Duo - Lutz
    1.1   ; Windows Vista 64 at Intel Pentium D 940, 3.2 Ghz - kazimir
    1.36  ; Pentium 4, 2Ghz running Ubuntu 9.04 - robert gorenc
    2.24  ; Sun Sparc 1350MHZ processor - jopython
    3.25  ; Windows XP at Intel Pentium III, 800 MHz - Cyril
    3.40  ; Nokia N900 at 950 MHz - hilti's "numbercruncher"
    5.15  ; Nokia N900 at 700 MHz
    5.37  ; Raspberry Pi 900 mHz (overclocked with the raspi-config tool) - Hilti
    5.44  ; Mac OS X 1GHz PowerPC G4 (eMac) - cormullion
    6.72  ; Raspberry Pi 700 mHz 256 MB RAM - Hilti
    9.52  ; Sun Sparc Ultra-2 - lutz
    13.7  ; Nokia N810 armv61 - newdep
    30.64 ; Pentium 90, running DamnSmallLinux - robert gorenc
    50.0  ; Intel Pentium 120 - P54CQS - 120MHz - xytroxon



Speed of development or speed of execution? :)

I don't think anybody anywhere has ever chosen a Lisp language because it's fast.


Actually I know of a couple projects that have played with using CL as an intermediate language because of the performance they got compiling with SBCL.


Some of the CL and Scheme compilers are pretty fast, especially compared to similar languages.



