Not all programmers are alike (a rant on Clojure). (scottlocklin.wordpress.com)
92 points by winestock on Sept 12, 2012 | 84 comments



Labview beats the snot out of anything else for building, say, custom spectrometer control and data acquisition systems. I know, because I had to do this. I’ve seen mooks try to do the same thing in C++ or whatever, and laugh scornfully at the result.

What's so great about LabVIEW for spectroscopy? I ask because, as part of my MS thesis, I built a Fourier transform spectrometer. After messing around for a few days with some LabVIEW code someone else had written for a similar task, I wrote the control code in C in a couple of hours, including the time it took to learn the (deprecated) LabWindows C API. I have no doubt that LabVIEW has improved since I did my thesis (almost ten years ago), but it's not obvious, a priori, that graphical programming should be better for controls.


It's not that Labview is great for spectroscopy. It's that it is great for building things to control large, complex scientific apparatus (mine was an X-ray Fourier spectrometer; the X-ray beamline was also controlled in this way), manage streaming data from dozens of sensors, do the analysis, and rapidly change things around if you stick new hardware in it or need to run things differently. There was C and DSP programming involved on a VxWorks VME crate where things needed to go fast or talk to unsupported hardware, but controlling the whole mechanism was done via Labview. There was no other sane way to manage the complexity.


Oh no, graphical programming languages again. Slightly OT but I think you made the right choice. I'd like to refer you to a series of posts detailing the practical issues from a few days ago:

http://news.ycombinator.com/item?id=4496494


While any LabView code I've ever seen (admittedly not much) looked like a horrible mess to me, I disagree with your view on graphical programming languages.

I've done a reasonable amount of programming in Max/MSP and while, again, most code I've seen online was a horrible mess, I attribute that to the fact that these languages usually target non-programmer audiences who, sadly, don't get taught about proper encapsulation, abstraction, naming conventions (i.e. use actually descriptive names!) and software patterns.

My own code read (to me, though others have commented on it too) almost like something you'd write on a whiteboard to show non-programmers how the software works. I encapsulated each chunk of code that does a single task into its own component, so an individual component (i.e. diagram) never had more than a few high-level "things" connected together, and you can tell, at a glance, what does what. Of course, this means that there are many layers of components, from very high level all the way down to low level, but the point is that with proper encapsulation, visual languages are no harder to understand, read or manage than textual ones, and I personally feel that, in many ways, they're actually easier.

I also found debugging Max/MSP code to be the best debugging experience that I've ever had, due to the fact that you can visually see where data comes from and flows, and you can intercept it at any point to see what's happening.

I also found Max/MSP a dream for experiment-driven development and code design, and far superior to using the Python and Clojure REPLs (the two languages I use most at the moment). I think one of the reasons is that while I'm still experimenting with concepts and design/algorithm ideas, I do not need to name things; I can just lay down a few components, connect them and see what happens. In textual languages, everything needs a name, so often you end up with a, b, x etc. I should point out, though, that when I design software (architecture or algorithms), I do it with boxes and lines, so graphical languages may just be a good fit for how I think and I guess not everybody is like that.

Don't even think of keeping things tidy, wires will be everywhere, and the auto-arranger doesn't work well.

With proper software engineering practices, this is not an issue. Without proper software engineering practices, we have the same issues in textual languages.

Modern Tools

This is a very good and important point, but it's not inherently a graphical-language problem. Things like version control and diffs, sure: they just work in textual languages. But in general, all languages start off with tooling problems. I do believe that good diff tools could be developed for graphical languages, that they would be at least as good as those for textual languages, and that writing a git-compatible file format is obviously possible too (in fact, Max/MSP files look a tiny bit like JSON, IIRC). This of course doesn't change the reality that currently there definitely is a tooling problem, so you are very right! But it need not always be that way.

My only complaint about Max/MSP is that I'm one of those people who hates the mouse... I spend most of my days in a keyboard-centric window manager using vim and a keyboard centric web browser, so graphical languages aren't exactly ideal from that point of view.


I've also spent a fair amount of time working in Max/MSP and I would support nearly every statement you said here, even though I'm not a programmer that thinks in graphical terms.

The 'wires going everywhere' thing is potentially a problem, but I usually took it to mean that it was time for more encapsulation or better code organization, which is, I think, more or less what you were suggesting. On the other hand, lots of artists create amazing work whose patches are so messy you'll think you're going to have a heart attack just looking at the rat's nest. But hey--these are programming amateurs and they're doing amazing things, really fast! I know some musicians who could cook up amazing stuff in Max so fast that I would still be thinking about how I would structure the code and they'd already be demoing it. It's also important to remember that much of what we focus on as professional programmers is less important to artists, for whom cost of implementation is far more important than reuse.

Regarding version control, yes, modern Max patches are in fact serialized to disk as JSON. There's an option you can set that causes the program to perform a sort on the structure on save, hopefully minimizing the diff. I worked w/ Max patches in git for years, and it was fine, though I never really tried to do anything even remotely interesting like merge from another branch; I treated my Max patches more like binaries in version control that happened to be stored really efficiently.
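The sort-on-save trick generalizes to any tool that serializes to JSON: emit keys in a canonical order so that semantically identical documents produce byte-identical files, and version-control diffs stay small. A minimal sketch in Java (the StableSave class and its strings-only JSON are made up for illustration, not Max's actual format):

```java
import java.util.Map;
import java.util.TreeMap;

// Serialize a map with its keys in sorted order, so that two maps with
// the same content but different insertion order produce identical
// bytes on disk. (Deliberately minimal JSON: string values only.)
final class StableSave {
    static String toJson(Map<String, String> m) {
        // TreeMap iterates keys in sorted order, independent of the
        // insertion order of the original map.
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, String> e : new TreeMap<>(m).entrySet()) {
            if (!first) sb.append(",");
            sb.append('"').append(e.getKey()).append("\":\"")
              .append(e.getValue()).append('"');
            first = false;
        }
        return sb.append("}").toString();
    }
}
```

With this in place, saving a patch after merely moving boxes around (which reorders nothing semantically) leaves the file untouched, which is exactly what a line-oriented diff tool needs.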

In the end, much of the work I would do in Max involved writing a lot of things in C/Java/Javascript and wiring it up, and the reason is, as you alluded to, the mouse. It's just not as productive to move boxes around for me as it is to write text, but boxes and lines make really great glue code for putting together inputs, outputs, and a set of algorithms into a single work, especially if that work is time-related in some way.


these are programming amateurs and they're doing amazing things, really fast!

It's also important to remember that much of what we focus on as professional programmers is less important to artists for whom cost of implementation is far more important than reuse.

These are two things I did not touch on in my comment at all, but you are of course correct and I feel it's important enough to mention again. The fact is that people without programming backgrounds are doing amazing things, and they seem to be finding it much easier than with textual languages. Graphical languages are very successful both in audio (Max/MSP, Reaktor, Synthmaker etc) and graphics (Voreen and Blender embed a visual language), where they are used successfully by a lot of non-technical people. I've also seen them used in game engines, both for shading/effects/materials and for scripting of game logic[1]. As you mentioned, however, the goals and priorities of these people are generally not the same as those of professional programmers, so often writing once - but quickly - is much more important to them than re-usability and modularity, and that's perfectly ok if that's what makes sense for them.

boxes and lines make really great glue code for putting together inputs, outputs, and a set of algorithms into a single work

Graphical languages make for fantastic glue code/dependency injection. XML has been traditionally used to externally glue components together and IMHO is often more complex than doing it in the code itself and not really all that much more flexible at the end of the day. Graphical languages are IMHO ideal for this as they show at-a-glance, at a high level, how components in the system are connected, yet still allowing the lower level nitty-gritty algorithms to be managed in a textual language, which may be more optimized for those tasks.

[1] http://www.unrealengine.com/features/kismet/
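The glue-code/dependency-injection point can be made concrete in a few lines: wiring done in code, at one visible assembly point, rather than in external XML. Source, Sink, and Pipeline below are hypothetical names, just a sketch of the pattern:

```java
// Constructor injection: each component receives its collaborators
// instead of looking them up, so the whole object graph is wired
// explicitly in one place -- the textual analogue of boxes and lines.
interface Source { String read(); }
interface Sink { void write(String s); }

final class Pipeline {
    private final Source source;
    private final Sink sink;

    Pipeline(Source source, Sink sink) {
        this.source = source;
        this.sink = sink;
    }

    // The "nitty-gritty" lives inside the components; this is pure glue.
    void run() { sink.write(source.read()); }
}
```

Wiring it up is one line, e.g. `new Pipeline(() -> "hello", System.out::println).run();` -- the connections are as visible as wires in a patch, without an XML file describing them at a distance.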


If you're interested in writing software, labview is a bad choice. If you're interested in solving problems on a computer, labview is often the only choice. Things like it will probably eventually make your job obsolete, as more people can draw flow charts than can deal with correct programming practices.


What's so great about LabVIEW for spectroscopy?

LabVIEW probably has a library to talk to the spectrometer.

Apart from that, LabVIEW has very little advantage.


He had me until "Clojure is popular because Paul Graham is an excellent writer." Did he just confuse Paul Graham for Rich Hickey? I don't get that point otherwise. pg is a lisp guy but I don't see the Clojure connection.

Anyway his overall point seems to be that Clojure isn't the only programming language, or the best one for every case. Well, ok. No one worth taking the time to talk to would ever make that assertion, so it seems like wasted bluster.


I've talked to a lot of people about how they got into Clojure, and a fair number credit Paul Graham's essays for selling them on the idea of using a Lisp. I think there's a pretty clear causal link.


For me it was reading SICP which used LISP (scheme) that made me pay attention to clojure.


Clojure did emerge at the right time; it may all be correlated. Maybe Hickey felt a Lisp revival coming when pg wrote his essays and decided to build the one he always wanted. That's a question I'd ask him if I could.


So now pg is not only responsible for Clojure's success, he's the reason it exists in the first place? As a lisp, anyway? This is getting pretty ridiculous, here.


I shouldn't have used 'when', it can be read as an implication, but I just meant around the time pg talked about lisp.


Clojure is my favorite language, and I love Rich Hickey, but I don't think I would have sought out something like Clojure if not for PG's essays.

There was a tweet from Nathan Marz a while back that I felt summed it up nicely: "Paul Graham set the bait, Rich Hickey reeled me in". Maybe you're right that I think it's more common than it is because it's true for me, but maybe you think it's less common than it is because it's false for you?


Well, I am quite sure that among HN folk there is a high occurrence of people who decided to give Clojure a shot because of pg's lisp advocacy. This is his website, after all. But HN isn't the world.

But yeah I'm definitely prepared to entertain the possibility that I undervalue pg's essays in the world of general purpose computing. It may be that indeed Rich Hickey should be paying Paul Graham royalties but I truly doubt it.


The point is that pg popularized LISP, and Clojure is the closest extant LISP to the ideal that pg describes in http://paulgraham.com/hundred.html, http://paulgraham.com/arclessons.html, and #6 of http://paulgraham.com/ambitious.html.


Also, I seem to recall pg said that Clojure was (probably) the new Python w.r.t. The Python Paradox.


I definitely went the path of PG essays -> exploring lisps -> Clojure, and I suspect I'm not alone.


>pg is a lisp guy but I don't see the Clojure connection.

Pg -> promoting Lisps. Clojure -> a Lisp. Pg -> influential. Clojure -> got traction on HN and startup crowd.


This is getting pretty close to the Chewbacca defense.

http://en.wikipedia.org/wiki/Chewbacca_defense


I thought it was pretty obvious. I had a lisp book before I read Graham's essays. I only seriously looked into lisp after PG convinced me to.


It's pretty obvious to you because it's true for you. I guess the irony is lost on you that you've just written a rant about how all programmers aren't the same, and in that rant painted with broad strokes, claiming Clojure is only popular because we've all fallen under the sway of the Paul Graham hypnotoad.

Not true. Some of us have fallen under the sway of the Rich Hickey hypnotoad. :) Anyway, point being that Clojure is gaining traction because it has some objective strengths that converge in a way that isn't common in the realm of programming languages.


I believe I covered that. Though if it were not for PG, everyone would think RH was weird for doing it in Lisp.


You keep saying "everyone" when you mean "I".


And you keep wanting people to mean only "I" when they mean "some".


I did too. The book was On Lisp, by Paul Graham.


There is also the matter that “programming” is an overly broad word, kinda like “martial arts.” A guy like “Uncle Bob” who spends his time doing OO whatevers has very little to do with what I do. It’s sort of like comparing a guy who does Tai Chi to a guy who does Cornish Wrestling; both martial arts, but, they’re different. My world is made of matrices and floating point numbers. His ain’t.

This. The attitude the OP describes is very common on HN as well, and can't be attributed to Uncle Bob only: the idea that programming means web, cloud, and databases. Other influential writers, like Jeff Atwood, make exactly the same mistake[1]. He can't imagine anyone doing any other martial art than Tai Chi.

A bit sad, really.

[1] http://www.codinghorror.com/blog/2009/08/all-programming-is-...


This is common on HN for a reason. The typical Y Combinator startup and the average HN reader do web programming for a living. This is HN's target group and there is nothing wrong with that.

Jeff Atwood has a point that desktop apps are dead. I would not build another business app on the desktop. Users are now used to being able to access their data and applications from anywhere, and without having to install any software.

There must be other forums dedicated to scientific computing and similar topics. These topics do find a place in HN from time to time. However, they are not the norm.


>Jeff Atwood has a point that desktop apps are dead.

Do you mean this literally? Or just business apps? I mean it doesn't seem likely that web or mobile apps are going to replace Photoshop, or Illustrator or Maya in the next decade.


Business apps. However, mobile apps have become the new desktop apps - at least until HTML5 becomes good enough.

But my point was that the typical readers on HN are interested in the kinds of posts here (it is a self-serving argument) - which is mostly about building web apps that help people communicate better, share content, connect people, remove middlemen, manage businesses, buy and sell stuff, increase productivity and consume content.

Scientific computing, low-level programming, language research, theoretical computer science and the vast array of other computing-related stuff that we don't talk about here - they are extremely important. But they don't find a place here, for a reason, and that is just fine.


It's one thing, and perfectly fine, for certain programming topics not to have a place here (although I find it a bit sad); it's quite another to talk as if they don't exist or are irrelevant to programmers.


There are already web and mobile versions of apps like Photoshop etc. Adobe themselves created an online image manipulation program and there are several really good iPad apps too. So, yes, it's likely.


I agree with you on the level that it is a very limited view, but the OP states that "Most people who consider themselves programmers are employed effectively selling underpants on the internet using LAMP."

I was slightly offended when I read this - I'm 'only' a web developer, but I don't believe I'm any more or less useful or talented than anyone who say, programs kernel drivers.

I'm sure the people who program kernel drivers would disagree with me, on the other hand.

It's a topic which has bugged me for an age now: am I a real programmer? Am I actually making any difference to the world? And the answer to both of those, perhaps, is no. But on the other hand, with the exception of Tim Berners-Lee, Linus Torvalds and Dennis Ritchie, all of us are pretty much interchangeable, i.e. you could remove us from the team and replace us pretty easily with someone else.


What bugs me is perhaps even more fundamental:

All those "superstar" programmers started somewhere.

Whether they were a teenage-hacker prodigy or not is irrelevant, they were young, inexperienced, and foolish programmers at one point.

Yet once they age (and gain knowledge, or perhaps more importantly, _practical_ experience) they suddenly forget that their journey had a beginning.

When you're on the road, everything feels like the middle. So I think it's important to keep perspective, and realize that (irrespective of age) their journey has just been longer than yours.

---

The other thing is: I think a programmer's fundamental job is to make computers useful for people.

At the end of the day: we provide meaningful abstractions that allow others to get computable tasks done.

I don't think it really matters how you define "get stuff done."

To the end user: the computer is a blackbox.

The fact that they can buy underwear online is just as amazing as the fact that they can turn around and plug in their digital camera. Thanks to the magic of device drivers; they can then download pictures of themselves in their fancy new underwear.

So, no, I don't think a web developer is any less of a programmer. For the same reason I don't look at a heart doctor and think: "he is clearly less of a doctor than the neurosurgeon down the hall."

Just my $.02


> To the end user: the computer is a blackbox.

What about programmers writing programs for other programmers; e.g., IDEs and compilers?


What in Clojure is biased towards web programming (or any other domain)?


This isn't a "rant on Clojure". It's a rant, all right, and Clojure is prominently featured, but the rant is about "Uncle Bob" Martin's assertion that Clojure is or should be "the last programming language". Which I take to be hyperbole, not meant to be taken literally. But then I haven't watched the video.

Clojure actually gets quite a decent nod here, which, given Locklin's overall snarky tone, is saying something.


I /have/ watched the Skills Matter video, months ago. What I took away from the talk wasn't that Clojure is the last programming language, but that:

- Programming language innovation is rather cyclical.

- Frameworks can be re-implemented; language ecosystems and frameworks are very important, but one should not exclude a language because of its lack of a framework feature (I think he was picking on .NET and Java here, suggesting that enterprises who select C# or Java because of the frameworks are missing the point)

- Polymorphism is achievable in Clojure, so many popular programming techniques are achievable.

I thought the talk was thought-provoking and interesting. Whilst it certainly advocated Clojure, I certainly took the conclusion with a pinch of salt. The entire talk is tinged with humour. It's supposed to spark debate and help push people out of their comfort zones, IMHO.


It was hyperbole. Uncle Bob is opinionated but definitely isn't stupid.


I don't get all the anger. Most popular writers are opinionated, and talk to their "crowd". If Uncle Bob (or Joel, or Jeff, or DHH) had to preface all discussion with "does not apply to people who do scientific programming, make games, or Albanians" we'd probably end up missing most of their points. It seems you understand the writer's context, so be happy that the answers for your neck of the coding woods are different.

Similarly, many posts around here are targeted at people in (at least) the US, often California, and usually SF or the Valley. I don't post "Noooo! It doesn't apply to me because I live in South Africa!" every time, because I understand what the context is. Something doesn't have to be universally true for it to be true in its context, and often useful for us outside of it.

In any case, thanks for the alternate opinion - it's good to be reminded that there are universes outside of ours.


That's true. But once Bob Martin posits that "Clojure could be the last programming language," he's crossing pretty much all programming domains. The original post is right in taking him to task for that statement.


I don't agree. The post comes across as insanely angry. You listened to a podcast and are reduced to ranting? Bad sign... might be time to go play with the dog or something.

The weird thing about this is that it probably wouldn't happen if he were listening to Uncle Bob (or whomever) in person. He'd raise a point, and probably get some reasonable qualification, and move on. But when the talk has already happened, and all you can do is read/listen and stew, people have an amazing ability to parse and reparse the phrasing, treating it like someone's last word and final position forever.

Maybe that makes sense for PG's essays; he seems pretty careful with words. But I don't think most people are that careful.


If you had asked me about Uncle Bob and spending an hour with him, I could have spared you most of it. He is very opinionated, and often really quite wrong.


FWIW, I don't think he has the slightest clue what leiningen is. It is hardly "basically a shell script which Does Things". It is written entirely in Clojure. All the shell script does is tie it together and bootstrap things. I'm fine with criticism, but not blatantly inaccurate information.


Oddly, the bit you call out as critical of Clojure, and use to say he has no clue what he's talking about, is the bit where he says that a tool written in Clojure "works brilliantly" and is an example of how Clojure is more useful to people than Common Lisp.

Read in context, I took that "shell script" statement as trying to describe Lein from the user's perspective-- which would make sense, given that his point there is its utility.

For what it's worth, I've never used Lein, and had no presuppositions about it. When I read the paragraph, I carried away the idea that Lein was written in Clojure and acts like a well-done command-line tool.


I didn't say it was critical of Clojure. It was misinformation and I called it out as such. Regardless of perspective, incorrect is incorrect.

Furthermore, he did clearly state that lein's design is "Oogly" and seemed to be using his shell script statement as an example.

EDIT: Re-reading your post, I'd like to point out that my response here is solely about his remarks about leiningen and not the post in general. I'm passing no judgement on what he knows and doesn't know in general and merely pointing out that Leiningen is absolutely not just a big ol' shell script like he implies. Whether he meant it that way or not, I felt someone needed to point out that it isn't true.


It is oogly (thanks, mostly, to java), and it is a shell script.

  scott@thinkpad ~/j64-701/bin $ file /usr/bin/lein
  /usr/bin/lein: Bourne-Again shell script text executable

I picked the word "lein" instead of "leiningen" for a reason. That's the part that people use.


In that case, I don't understand the point of what you said at all.


Object Orientation doesn't imply "single dispatch / message passing" style of Object Orientation. Common Lisp's CLOS and Clojure's multimethods and protocols are also object oriented, albeit in a different style.

Of course CLOS is far more powerful than what Clojure has, but I wouldn't be so quick to discard OO in Clojure.

OO doesn't imply mutability: even in Java it's encouraged (per Bloch) to create "functional objects" -- objects that create other objects instead of mutating internal state.
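Bloch's "functional object" idea (the Effective Java advice to favor immutability) is simple to sketch. The Money class below is a hypothetical example, not from the book:

```java
// A "functional object": all fields final, and operations return new
// instances rather than mutating internal state. No alias can ever
// observe a change, which is what makes such objects safe to share.
final class Money {
    private final long cents;

    Money(long cents) { this.cents = cents; }

    // Returns a fresh Money; 'this' is left untouched.
    Money plus(Money other) { return new Money(this.cents + other.cents); }

    long cents() { return cents; }
}
```

After `Money b = a.plus(tax);`, `a` still holds its original value -- the same contract Clojure's persistent data structures give you by default.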


It's always difficult to pinpoint the essence of a paradigm that doesn't have a formal definition. But I disagree with you. In my opinion, message passing is central to OO thinking. It's not a style. It is the essence if there is one at all.

Sending a message to one particular object isn't just a method dispatch mechanism. It also defines "self" and hence what data can be accessed without breaking encapsulation - another principle of OO.

Now, I'm not saying Clojure doesn't support OO. It does. What I'm saying is that whenever you go beyond thinking in terms of passing a message to one individual object you are using features that are not object oriented on a conceptual level. You could even do that in Smalltalk and it still wouldn't be OO.


I think we can agree to disagree here.

What about a language like Perl, where any reference (not just a reference to a hash) can be blessed and be used to dispatch methods? Would you say OO Perl is not OO?

(I think you could argue either way).

Personally I think CLOS is a very interesting mode of doing OO. You could always define multimethods based only on the first argument. Some parts of CLOS (method interceptors) have also made it to Java in the form of AOP (and in fact the author of AspectJ is one of the authors of CLOS) -- which provides "magic" for Java frameworks such as Guice.

While I hate to Biblethump, Alan Kay has spoken very favourably of CLOS despite CLOS not being message passing. If you prefer, you can think of CLOS as a "meta OO" which lets you implement different OO behaviour, including message passing.

Back to Clojure: deftype and defrecord do provide encapsulation, by the way. In types, fields declared ^:unsynchronized-mutable can only be accessed via the self/this pointer.

I do agree that Clojure's protocols/multimethods aren't a full-blown CLOS, but they could (sensibly) be called a form of OO (even if it's a form of OO that differs greatly from Java and C++).


This article is OK when defending the "right tool for the right job" statement in the middle of the present Clojure hype that is going on, but this perfectly coherent discussion just gets swallowed by the author's apparent underground syndrome.


What exactly do you mean by underground syndrome?


The habit of disdaining something that recently gained a considerable amount of popularity.

I agree with the article and the author seems like a reasonable and experienced programmer, but the Common Lisp comparisons just felt like hipsterism to me.


I do write for the same magazine as the moustachioed founder of hipsterdom, and my favorite Lisp is very obscure (Lush), but I fail to see any hipstery things in what I wrote. I'm a redneck and I lift weights. I don't use Lush for the reasons that skinny hipsters brag about being cool before it was cool; I use it because it's good at numerics.


Lush really seems pretty neat; actually, your post motivated me quite a lot to take a closer look at it.


Dark corners at present: compiling 64-bit C++ code into it. Otherwise, it's my favorite thing.


Ahhh, I see, thanks for the explanation, makes sense.


He/she probably means that people are more likely to listen to Uncle Bob, since he's a well-known consultant, than to Scott Locklin, which is a shame because he really makes some good points.


> Common Lisp native ASDF is probably very well designed, but it is practically useless to anyone who isn’t already an ASDF guru. ASDF should be taken out back and shot.

This sounds like the best you can say is "it's probably very well designed for the wrong purpose"; and I'm not sure that can be usefully distinguished from "poorly designed".

(I have dabbled with both CL and Clojure, but never used either ASDF or Lein.)


The whole argument is moot. Quicklisp exists now, and you can install anything with it, with one command.


Haven't you realized by now that rants about Common Lisp are based on a weird amalgamated reality where Erik Naggum rants on Usenet, the standard still has unanswered questions and Lisp Machines caused the AI Winter?


And for people who don't know what Quicklisp is: it is like "apt-get" for Common Lisp.

(If you don't know what "apt-get" is you can ignore this thread.)


But unlike .deb it appears to be yet another way of smuggling code onto a box without visibility for the sysadmins or supporting dependencies on anything written in any other language.


Quicklisp is still just a layer on top of ASDF. It's still there, you just don't have to deal with it as much.


ASDF2 is a beautiful work of pragmatic engineering.


> R* and kd-trees are preposterously slow on the JVM compared to the old libANN C++ library, or naive kd-tree implementations. Factors of 100k to 1E6. I may be wrong, but I’m guessing trees confuse the bejeepers out of the JVM

Is somebody here knowledgeable enough to comment further on this? Might this be due to excessive allocations and indirections? (It was also one of Bjarne's objections to Java: composition of classes is always by reference.)


My R* trees in Java run within 5x the speed of the C++ versions, maybe 3x. I just use object pools and primitive data elements without "accessor" functions.

If you write Java as if the garbage collector and malloc actually worked as advertised, you deserve to have your software run dog slow. Nobody has ever invented a garbage collector or memory defragmenter that works both for the usual CRUD Java workloads and for data-heavy or scientific computing tasks.
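The object-pools-and-primitives approach the parent describes can be sketched as parallel primitive arrays, with integer indices standing in for child pointers. This is a deliberately simplified 1-D tree (a real kd-tree cycles through split dimensions), and FlatKdTree is a made-up name:

```java
// Tree nodes stored "struct of arrays" style: one allocation for the
// whole tree, no per-node objects, no pointer chasing through the heap.
// An int index replaces an object reference; -1 means "no child".
final class FlatKdTree {
    final double[] key;   // split value per node
    final int[] left;     // index of left child, or -1
    final int[] right;    // index of right child, or -1
    int size = 0;

    FlatKdTree(int capacity) {
        key = new double[capacity];
        left = new int[capacity];
        right = new int[capacity];
    }

    // Insert into a 1-D binary search tree (the degenerate kd-tree case,
    // kept 1-D for brevity); returns the index of the subtree root.
    int insert(int node, double value) {
        if (node == -1) {
            key[size] = value;
            left[size] = -1;
            right[size] = -1;
            return size++;
        }
        if (value < key[node]) left[node] = insert(left[node], value);
        else right[node] = insert(right[node], value);
        return node;
    }

    boolean contains(int node, double value) {
        while (node != -1) {
            if (value == key[node]) return true;
            node = value < key[node] ? left[node] : right[node];
        }
        return false;
    }
}
```

The search loop touches only two flat arrays, which the JIT and the cache both handle far better than a chain of individually allocated Node objects.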


++APL. It seems like it would be a great language for writing shaders in.


APL? Does anyone use this anymore?

APL was actually the first real programming language I learned, 35 years ago! It was very cool and mind-bending, and has hugely influenced the way I think today. APL has also been hugely influential on the world, but after I learned Lisp, I'd never go back to APL. Though I do wish there were an APL embedded DSL for Lisp.

I wasn't aware that anyone still uses APL for anything real. Back when I used it, it didn't even have arrays of strings. Though you could make a twelve-dimensional array of characters instead, for whatever that might be worth.


Dyalog is the most popular implementation; it's alive and kicking, and can do COM and .NET on Windows. Dyalog is closed source and proprietary.

There's also J (http://jsoftware.com), which is the APL designer's "fix all the deficiencies" iteration. It only uses ASCII characters, and takes APL to the extreme -- e.g., +/ % # (that is: plus-slash-percent-hash) is a complete unary function that computes averages. J is open source.

And there's also K (http://kx.com), which is popular in the financial world. The relation is K:J like C:Ada, I think. Whereas J strives to be pure, mathematically complete and extremely well defined and rounded, K throws away almost every language redundancy (e.g. it only has nested vectors, which are also used to represent matrices), and does everything to be superfast. In the latest revision of the language, K4, they incorporate a database into the language. There's an open source implementation of K3 called Kona which you can find on GitHub.


It's extremely important in the financial industry.


I have a friend who's a quant at a large pension fund, and although they don't write any new APL applications, they still rely on and update existing APL applications.


Though I do wish there were an APL embedded DSL for Lisp.

I've been working on something like this in Racket, but I'm still exploring the design space (though I worry that I'll spend too much time doing this and never actually be satisfied with anything).


After Prolog ("what do you mean it just figures it out?"), APL made me think about how I approached solving problems. It looks like something out of a wizard's grimoire in a fantasy novel and forces you to think in groups instead of iterations. It would be damn fun to write shaders in it as opposed to stilted C.


Yikes, one minute you're reading about the latest obsession of "Uncle Bob", next you're on a right-wing site defending the Nazi's view of modern art. There has to be some kind of mix between a "Bacon number" and Godwin's law…

By the way, was it Bob Martin who did the TDD sudoku solver? I'm always mixing up my XP evangelists.

And regarding last languages, I'm always reminded of the transputer/4GL/Prolog hype of ages past. Although one might argue that Lisp itself is probably a good candidate - but given the wide variety of existing and possible languages that could theoretically be called by that name, this isn't saying a lot.


Ron Jeffries did the Sudoku Solver.


Good to hear that more folks are bringing some balance to this kind of discussion.


I like this guy's style. He is right: Clojure won't be the last programming language (and it's my favorite language); the last/hundred-year programming language will be an artificial intelligence.


Except he misses the part where nobody actually thinks that.


Are people really paying that much attention to the languages being used? Most of my problems are finding and getting the right libraries to work with my apps. Learning new APIs or new libraries takes up more time than worrying about language issues.


Clojure isn't even a Lisp; it is a bunch of misconceptions with Lisp-like syntax.

An attempt to make a so-called Lisp by breaking the code-is-data concept, or by using some YAML-like notation instead of s-expressions with annotations (to describe a representation where we need only a structure), without proper recursion, is something as far from Lisp as, say, Python.

there is more - http://karma-engineering.com/lab/blog


Here is what I mean:

this is idiomatic Lisp:

  (define (keep pred l)
     (cond ((null? l) '())
        ((pred (car l))
         (cons (car l) (keep pred (cdr l))))
        (else (keep pred (cdr l)))))
this is nonsense:

  (defn keep
    "Returns a lazy sequence of the non-nil results of (f item). Note, this means false return values will be included.  f must be free of side-effects."
  {:added "1.2"
   :static true}
  ([f coll]
   (lazy-seq
    (when-let [s (seq coll)]
      (if (chunked-seq? s)
        (let [c (chunk-first s)
              size (count c)
              b (chunk-buffer size)]
          (dotimes [i size]
            (let [x (f (.nth c i))]
              (when-not (nil? x)
                (chunk-append b x))))
          (chunk-cons (chunk b) (keep f (chunk-rest s))))
        (let [x (f (first s))]
          (if (nil? x)
            (keep f (rest s))
            (cons x (keep f (rest s))))))))))
When one breaks the underlying ideas, he ruins the spell.



