
The repls you mention are not like lisp repls. You're being downvoted because your comment makes it sound like you've never programmed a lisp but have strong opinions nonetheless.



Not the OP, but could somebody summarize HOW the lisp REPLs are different? I've written a limited amount of Clojure and Common Lisp just to play around, and I don't recall any difference between the Clojure REPL and the REPL I get for, say, Kotlin inside IntelliJ IDEA.

Maybe the ability to send an expression from the IDE into the REPL with one keybinding, but I cannot say that's not possible with the Kotlin one right now, because that's not what I use it for.


Watch this video on the Lisp interactive development approach. I recorded it especially to answer this question:

https://www.youtube.com/watch?v=JklkKkqSg4c


Thanks, I forgot about this aspect of live program editing. Whether or not it's possible elsewhere (or however close plain quick live reload gets to this), it's definitely not a first-class citizen the way you presented it. It also reminds me of Pharo (or maybe just Smalltalk, I've only played with Pharo), where you build the program incrementally "inside out".

It does make me wonder how applicable this way of programming is to what I do at work, but that is more because of the technologies and architectural choices, where most of the work is plumbing together stuff that is not local to the program itself. And maybe even for that, with the edges mocked out, it would make sense to work like this.

Again, interesting video that made me think. Thanks.


Yes, Smalltalk/Pharo also support this.

Being able to interactively update code in response to an error, without leaving the error context, and being able to restart stack frames (not just land in a “catch” or at the top level, as in most languages), is one of the key features that makes REPL-driven development possible. Or at least that’s how I see it.

It’s not something you always need to use, but it can be handy, especially for prototyping and validating fixes.
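A minimal Common Lisp sketch of that workflow (DIVIDE and REPORT are made-up names, and the frame-restart command is the one SLIME/Sly offer on SBCL):

  (defun divide (a b)
    (/ a b))

  (defun report (a b)
    (format t "result: ~a~%" (divide a b)))

  ;; (report 10 0) signals DIVISION-BY-ZERO and drops you into the
  ;; debugger inside the failing call, stack intact. Without unwinding,
  ;; you can redefine DIVIDE at the REPL (say, to guard against a zero
  ;; divisor) and then restart the DIVIDE or REPORT frame so execution
  ;; continues with the fixed definition.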


There's a person above saying that it's about being able to mutate program state from the REPL, which is also possible in any REPL for a language with managed memory.


Not just from the REPL, but from the REPL in the context where the error occurred, without having to structure the code ahead of time to support this. It’s not always an important distinction, but it’s handy when prototyping or if the error is difficult to reproduce.

There are some other affordances for interactive programming, such as a standard way to update existing instances of classes. I’m sure you could implement this sort of functionality in any language, but this is universal and comes for free in Common Lisp.
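For instance (a minimal sketch; POINT and its slots are made-up names), redefining a class at the REPL updates existing instances in place, and UPDATE-INSTANCE-FOR-REDEFINED-CLASS lets you customize how that happens:

  (defclass point () ((x :initarg :x) (y :initarg :y)))
  (defparameter *p* (make-instance 'point :x 1 :y 2))

  ;; Later, at the REPL, redefine the class with an extra slot:
  (defclass point () ((x :initarg :x) (y :initarg :y) (z :initform 0)))

  ;; The existing instance picks up the new slot on next access,
  ;; without being recreated:
  (slot-value *p* 'z)   ; => 0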

CL also has other interesting features such as macros, multiple dispatch, compilation at runtime, and being able to save a memory snapshot of the program. It’s quite unique.
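Two quick sketches of those (COLLIDE, SHIP and ASTEROID are names made up for illustration): multiple dispatch via generic functions, and compiling a freshly built function to native code at runtime:

  (defgeneric collide (a b))
  (defclass ship () ())
  (defclass asteroid () ())
  (defmethod collide ((a ship) (b asteroid)) :ship-hits-asteroid)
  (defmethod collide ((a asteroid) (b ship)) :asteroid-hits-ship)

  (collide (make-instance 'ship) (make-instance 'asteroid))
  ;; => :SHIP-HITS-ASTEROID

  ;; COMPILE builds native code at runtime in implementations like SBCL:
  (funcall (compile nil '(lambda (x) (* x x))) 7)   ; => 49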


CL condition system + REPL = god mode. Your software crashes? Do you go back and set a breakpoint? No, because you’re already in the stack trace in the REPL, exactly where the crash occurred. You fix the code, reload it, and tell it either to run from where it left off or to restart from an earlier point.
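A minimal sketch of the "restart from an earlier point" part (PROCESS-ITEM, PROCESS-ALL and RETRY-ITEM are made-up names): a RESTART-CASE gives the debugger a named re-entry point, so after fixing the code you can resume right where the loop stopped:

  (defun process-item (item)
    ;; placeholder that sometimes fails, just for the sketch
    (if (evenp item) item (error "cannot process ~a" item)))

  (defun process-all (items)
    (dolist (item items)
      (loop
        (restart-case (return (process-item item))
          (retry-item ()
            :report "Retry processing this item.")))))

  ;; (process-all '(2 3 4)) breaks on 3 and the debugger lists
  ;; RETRY-ITEM among the restarts. Redefine PROCESS-ITEM at the REPL,
  ;; pick that restart, and the DOLIST continues from the failing item.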


Flask and Django have the exact same functionality - I've already said that this thing you guys keep talking about is just a matter of catching exceptions.

https://flask.palletsprojects.com/en/2.3.x/debugging/

https://docs.djangoproject.com/en/dev/ref/settings/#debug


That is definitely not the same. I write a lot of python code and the interpreter / interactive development is just not as good as it is in Common Lisp.

To my knowledge there’s no real “mainstream” language that goes all in on interactive development. Breakpoints and tracebacks are perfectly cromulent ways to debug, but it’s really not the same, sadly.


exceptions unwind the stack in all languages I know except in CL
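In CL, HANDLER-BIND runs the handler before any unwinding happens, so the whole stack and its restarts are still live. A small sketch using the standard DIVISION-BY-ZERO condition and the USE-VALUE restart name:

  (handler-bind ((division-by-zero
                   (lambda (c)
                     (declare (ignore c))
                     ;; the stack below the error is still intact here;
                     ;; we could inspect it or pick any available restart
                     (invoke-restart 'use-value 0))))
    (restart-case (/ 1 0)
      (use-value (v) v)))
  ;; => 0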


and Smalltalk ;-)!


good to know! but the point still stands :D


The fact that I’ve never seen a CL lover who can explain this adequately is quite concerning in itself


>you've never programmed a lisp but have strong opinions nonetheless

i've written racket and clojure (and mathematica, which is a lisp). not multiple 10kloc but enough to understand what the big ideas are. claiming i just haven't written enough lisp is basically the logical fallacy of assuming the premise.


But Racket and Clojure are very different from Lisps such as Common Lisp that embrace the idea of a lively, malleable and explorable environment, which is arguably the biggest idea.


> “mathematica is a lisp”


http://xahlee.info/M/lisp_root_of_wolfram_lang.html

http://xahlee.info/M/lisp_vs_WolframLang.html

> WolframLang has all the characteristics of LISP:

seems you either don't know what lisp is or you've never written mathematica


The content on those pages clearly explains the differences.

Mathematica is a symbolic language based on 'rewriting'. There are other examples - Prolog, a logic language, would be one. Most other computer algebra systems are in this category too, similar to Mathematica: Macsyma/Maxima, Axiom, ...

> WolframLang has all the characteristics of LISP

It has many, but there are a lot of differences, too.

The big difference is the actual engine. Mathematica is based on a 'rewrite system'. It translates expressions by applying rewrite rules.

Lisp evaluates expressions either with an interpreted evaluator or by running compiled code. Lisp has macros, but those are expanded before the code is compiled or run. The practical effect is that in many Lisp implementations essentially all code is compiled, including user code. Mathematica relies on C++ for that instead: most of its UI is implemented in C++, where many Lisp systems would implement it in natively compiled Lisp.

Thus the computation is very different. Using a rewrite system for programming is quite clunky and inefficient under the hood. A simple example would be to look at how lexical closures are implemented.
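For comparison, a lexical closure in Lisp is just compiled code that captures its environment, with no rewriting of program text involved (a minimal sketch):

  (defun make-counter ()
    (let ((n 0))
      (lambda () (incf n))))

  (defparameter *c* (make-counter))
  (funcall *c*)   ; => 1
  (funcall *c*)   ; => 2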

Another difference is that Mathematica does not expose the data representation of programs to the user all the time, whereas Lisp programs are also on the surface written as s-expressions (aka symbolic expressions) in text.

The linked page from the Mathematica book also claims that Mathematica is a higher-level language, which is true. Lisp is lower level, and languages like the Wolfram Language can be implemented in it. That's one of its original purposes: it's an implementation language for other ('higher-level') languages. Sometimes it already comes with embedded higher-level languages; CLOS + the MOP (the meta-object protocol) would be an example of that.


> Another difference is that Mathematica does not expose the data representation of programs to the user all the time, whereas Lisp programs are also on the surface written as s-expressions (aka symbolic expressions) in text.

I have already addressed this: FullForm

https://reference.wolfram.com/language/tutorial/Expressions....

>Thus the computation is very different. Using a rewrite system for programming is quite clunky and inefficient under the hood. A simple example would be to look at how lexical closures are implemented.

You're skimming a couple of paragraphs without actually knowing much about Mathematica. It's absolutely not the case that Mathematica is purely a redex system; it's just that it's very good at beta reduction because it has a strong focus on CAS features.


> I have already addressed this: FullForm

No you haven't addressed it. The "Wolfram Language" user typically does not write code in FullForm. It's used as an internal representation.

> it's just that it's very good at beta reduction

and not so good at compiling code...

https://reference.wolfram.com/language/ref/Compile.html

See "Details and Options"


>The "Wolfram Language" user typically does not write code in FullForm. It's used as an internal representation.

I have no clue what you're talking about - it's an available primitive and I use it all the time.

>and not so good at compiling code...

Lol I am 100% sure that the majority of lisps cannot be aot compiled.


> Lol I am 100% sure that the majority of lisps cannot be aot compiled.

Ahead-of-time compiling has been the principal method in mainstream Lisps going back to the 1960's. The Lisp 1.5 Programmer's Manual from 1962 describes ahead-of-time compiling.

The curious thing is how can you be "100% sure" in making a completely wrong statement, rather than some lower number, like "12% sure".


>The curious thing is how can you be "100% sure" in making a completely wrong statement, rather than some lower number, like "12% sure".

The reason is very simple and surprisingly straightforward (but it requires some understanding of compilers): dynamically typed languages that are amenable to interpreter implementations are very hard to compile AOT. Note that I have emphasized AOT - ahead of time - from the beginning, but this does not preclude JITs.

But in reality I don't really care about this aspect - it was the other guy who for whatever reason decided to flaunt that clisp can be compiled when comparing it with Mathematica.


For someone playing with Mathematica, you have a curious intellectual process. To be clear, I'd rather have you doing that than hocking loogies at cars from an overpass.


> I have no clue what you're talking about

That's not good. Try again.

In Lisp, adding two numbers looks like this in source code: (+ 1 2)

  CL-USER 41 > (+ 1 2)
  3
If I quote the expression and evaluate it, the result is (+ 1 2)

  CL-USER 42 > (quote (+ 1 2))
  (+ 1 2)
Thus in Lisp the textual representation of code and code as data are the same.

Not so in the "Wolfram Language": a + b has a FullForm which looks different. The user does not write ALL of the code in FullForm notation.

Source notation

  a + b
FullForm

  Plus[a, b]
Lisp:

Source notation

  (+ a b)
FullForm

  (+ a b)
Can you see the difference?

> Lol I am 100% sure that the majority of lisps cannot be aot compiled.

I'd expect that they can. That's a feature since 1962. SBCL for example does AOT compilation by default, always.

  * (disassemble (lambda (a) (+ a 42)))
  ; disassembly for (LAMBDA (A))
  ; Size: 36 bytes. Origin: #x7006DC83B4                        ; (LAMBDA (A))
  ; B4:       AA0A40F9         LDR R0, [THREAD, #16]            ; binding-stack-pointer
  ; B8:       4A0B00F9         STR R0, [CFP, #16]
  ; BC:       EA030CAA         MOV R0, R2
  ; C0:       8B0A80D2         MOVZ R1, #84
  ; C4:       3CAA80D2         MOVZ TMP, #1361
  ; C8:       BE6B7CF8         LDR LR, [NULL, TMP]              ; SB-KERNEL:TWO-ARG-+
  ; CC:       DE130091         ADD LR, LR, #4
  ; D0:       C0031FD6         BR LR
  ; D4:       E00120D4         BRK #15                          ; Invalid argument count trap
  NIL
Looks like native ARM64 code to me.


> FullForm Plus[a, b]

How can I make this any more clear? You are able, in Mathematica, to write Plus[a, b] with your own fingers on your own keyboard and it will be interpreted as the same thing as a+b

> I'd expect that they can.

Clisp is not the only lisp - I can name 10 others that cannot be compiled.


If we count everyone's one-weekend project that evaluates (+ 1 2) into 3, then there are probably thousands of Lisps that cannot be compiled. So what?


Then the person should spend another weekend and implement a compiler for it.


> You are able, in Mathematica, to write Plus[a, b] with your own fingers on your own keyboard and it will be interpreted as the same thing as a+b

Sure, but it is not Mathematica's InputForm:

https://reference.wolfram.com/language/ref/InputForm.html

The majority of code is not written in FullForm. In Lisp, 100% of the code is written as s-expressions.

> Clisp is not the only lisp - I can name 10 others that cannot be compiled.

Typical Lisps and Lisp dialects can all be compiled: Common Lisp, Emacs Lisp, ISLisp, Scheme, Racket, ...

Which Lisps can not be compiled?


>Racket

Do you really know what you're talking about here?

https://docs.racket-lang.org/raco/make.html

>The raco make command accepts filenames for Racket modules to be compiled to bytecode format.

That's not a compiler...

I don't claim to be an expert on lisp, so with further googling I find

https://racket.discourse.group/t/chez-for-architectures-with...

which has some discussion about this and that native backend.

Suffice it to say I am not any more confident that being compilable is somehow intrinsic to lisp.


From the Racket documentation:

https://docs.racket-lang.org/reference/compiler.html

"18.7.1.2 CS Compilation Modes

The CS implementation of Racket supports several compilation modes: machine code, machine-independent, interpreted, and JIT. Machine code is the primary mode, and the machine-independent mode is the same as for BC."

CS is the new implementation of Racket on top of the Chez Scheme runtime. Chez Scheme is known for its excellent machine code compiler.

"Machine code is the primary mode"

> Do you really know what you're talking about here?

Read above.


If you have time to research Lisp implementations until you gather ten that don't have compilers, you might want to take a few seconds to visit https://clisp.cons.org to find out what Clisp means.


> Lol I am 100% sure that the majority of lisps cannot be aot compiled.

    CL-USER> (defun foobar (x) (1+ x))
    FOOBAR
    CL-USER> (disassemble #'foobar)
    ; disassembly for FOOBAR
    ; Size: 35 bytes. Origin: #x5365BF44                          ; FOOBAR
    ; 44:       498B4510         MOV RAX, [R13+16]                ; thread.binding-stack-pointer
    ; 48:       488945F8         MOV [RBP-8], RAX
    ; 4C:       BF02000000       MOV EDI, 2
    ; 51:       488BD3           MOV RDX, RBX
    ; 54:       FF14251001A052   CALL QWORD PTR [#x52A00110]      ; SB-VM::GENERIC-+
    ; 5B:       488B5DF0         MOV RBX, [RBP-16]
    ; 5F:       488BE5           MOV RSP, RBP
    ; 62:       F8               CLC
    ; 63:       5D               POP RBP
    ; 64:       C3               RET
    ; 65:       CC10             INT3 16                          ; Invalid argument count trap
    NIL
    CL-USER> 
There you go: #'FOOBAR is AOT-compiled down to four MOVs, a CALL, two MOVs, a CLC, a POP and a RET.


> "seems you either don't know what lisp is or you've never written mathematica"

Meanwhile, you brought up examples from Mathematica docs that talk about head/tails (car/cdr) but by that logic, Python is a Lisp too because you have:

    list[0]
and

    list[1:]
Maybe your Clojure/Racket experience wasn't enough to teach you what the essence of Lisp was. From your first link:

"Mathematica expressions are in many respects like LISP lists. In Mathematica, however, expressions are the lowest-level objects accessible to the user. LISP allows you to go below lists, and access the binary trees from which they are built."

That right there is telling you that Mathematica is not a Lisp.

Edit: Corrected the Python list example.


I'm sorry but are you really going to pretend like car and cdr are not core to lisp?

>list[0] and list[-1]

That is not car and cdr; closer would be list[0] and list[1:] if lists were cons in python.
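For reference, a quick sketch of what those correspond to in Lisp:

  (car '(1 2 3))   ; => 1         roughly list[0]
  (cdr '(1 2 3))   ; => (2 3)     roughly list[1:]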

>Mathematica expressions are in many respects like LISP lists. In Mathematica, however, expressions are the lowest-level objects accessible to the user. LISP allows you to go below lists, and access the binary trees from which they are built

This is a quote from 1986. I wonder if the language has changed much since then

https://reference.wolfram.com/language/tutorial/Expressions....


Read PG's "Roots of Lisp" and you'll understand what I mean.


I believe that is an argument from authority (if I remember correctly).



