More seriously though, I find these particular choices interesting. For one thing, they're indicative of his experience as a "greybeard" programmer [1]: someone who was still active until very recently but started programming 50 years ago. He was always trying to bring forward some computer science ideas from the 60s that never really made it into the modern zeitgeist. His tool choices seemed to represent that: old tools with some foundational flexibility that allowed them to stay relevant through the decades.
[1] and not the curmudgeonly kind who will only use what they learned so many decades ago, as demonstrated by his selection of more modern tools for other tasks
Make, particularly, stands out for me. As an embedded/firmware programmer, it's a core tool of mine that I've had to begrudgingly learn. Make is awful, but it's irreplaceable.
At its core, it's really a mix of three things:
1. A language to define a dependency graph ("target: dependencies") and how the nodes are connected ("rules")
2. A functional scripting language to factorize the declaration of the graph
3. A graph traversal worker to dispatch and perform those rules that are required (including a sophisticated "job server" to maintain the concurrent job limit across child make processes with no developer intervention!)
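To illustrate the three layers, here is a toy Makefile (illustrative only; the paths and variable names are made up for this sketch, not taken from any real project):

```make
# 1. Dependency graph: "app" depends on its object files.
# 2. Factorization: the object list is computed from the source list,
#    and a single pattern rule covers every .c -> .o edge.
# 3. Traversal: running "make -j4" walks this graph with up to four
#    parallel jobs, and the job server keeps that limit holding even
#    across recursive make invocations.

SRCS := $(wildcard src/*.c)
OBJS := $(SRCS:.c=.o)

app: $(OBJS)
	$(CC) -o $@ $^

%.o: %.c
	$(CC) -c -o $@ $<
```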
I haven't encountered a good replacement tool that properly tackles all 3 of these feature sets. (Granted, I haven't done an in-depth analysis of the field...)
I used it on a project with many thousands of files and hundreds of (mostly dynamic) build rules. This was a literate program, and most build steps required the extraction of code from the source docs --- including build rules themselves --- and I still maintained sub-second updates.
I have! Tup was high on my list of potential replacement candidates (despite being initially put-off by the cheekiness of its homepage).
In my experiments though, Tup failed hard on point 2, lacking a good language to factorize the dependency graph declaration. (and to be clear, I think Make's is pretty terrible by itself, before GMSL or Guile extensions come into the picture)
Edit: I see tup supports Lua extensions, which may cancel my complaint above.
What language and/or system were you writing in? I discovered literate programming in 2003 and was excited by it (still am). But it never took off and the tooling was poor (Leo, the Literate Programming Editor, is the one I remember - and shudder thinking about)
Edit: some dead links there. This "bootstrap" directory contains the custom tangler and the custom rule processor. This 10K, plus the tiny configs in the root, are the only code in the project not in documents.
Have you considered just writing your own? The dependency graph should be a DAG, and traversing a DAG is not hard. If you use a language like Python, this will easily give you point 2: you can express abstractions in a much saner language than make.
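As a rough sketch of what that entails, here is a minimal make-like rebuilder in plain Python. Everything here (the `GRAPH` dict shape, the `build` function) is invented for illustration, not an existing library's API:

```python
def build(target, graph, built=None):
    """Depth-first traversal of a DAG: rebuild every dependency of
    `target` before running the target's own action.

    `graph` maps target -> (list of dependencies, action or None).
    """
    if built is None:
        built = set()
    if target in built:          # already handled on this run
        return built
    deps, action = graph.get(target, ([], None))
    for dep in deps:             # recurse into dependencies first
        build(dep, graph, built)
    if action is not None:       # leaf sources have no action
        action()
    built.add(target)
    return built

# A tiny hand-written graph standing in for "app: main.o util.o".
order = []
GRAPH = {
    "app":    (["main.o", "util.o"], lambda: order.append("link app")),
    "main.o": (["main.c"], lambda: order.append("cc main.c")),
    "util.o": (["util.c"], lambda: order.append("cc util.c")),
    "main.c": ([], None),
    "util.c": ([], None),
}

build("app", GRAPH)
print(order)  # dependencies run before the target that needs them
```

Because the graph is an ordinary Python value, point 2 comes for free: you can generate entries with comprehensions or functions instead of make's macro language. (A real tool would also compare file mtimes to skip up-to-date targets, which this sketch omits.)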
Frankly, I have much better things to do than writing my own ad-hoc, bug-ridden implementation of Lisp^h^h Make.
I did look at SCons years ago, which is a Python-based build system, but as I recall it ended up being the worst of both worlds by attempting a declarative syntax that was marred by its imperative warts.
I doubt it (if we're talking about the same thing since I see no mention of SLIME nor distel in the article). SLIME was developed as a replacement for ilisp which got a little long in the tooth.
ilisp was developed for Emacs to get a similar experience as developing on Lisp machines and now we're so far back in the past that I do not think that way of working was inspired by Erlang.
Agreed. I've used Vim as my editor for as long as I can remember, and combine it with make if a project needs some kind of compilation or post-processing step. The flexibility of using it alongside a powerful general-purpose text editor means you can basically do anything, without the need for an IDE. Even writing mobile apps, and you get a better understanding of what's going on "under the hood" as well.
I'm validated!