The history of programming languages is one of my personal interests, and I find it fascinating how, 25 years on, many of the most popular languages are still slowly incorporating core features of functional languages like OCaml: records, enumerated (sum or variant) types, pattern matching and destructuring, true tail calls, type inference.
To be clear, I’m in favor of these changes, and I don’t denigrate any existing language for not having them. I just still marvel at seeing HN headlines like “Python to get match statement”, “Record Types in Java”, and “Tail Call support in Rust”, and appreciate all the more the things OCaml (and before it, ML) have gotten right for so long.
Keep in mind that a language like Rust would've been effectively unusable in its current form 25 years ago due to the RAM and CPU requirements of the compiler.
As CPU speed and RAM increase, you can build smarter compilers and afford things like generalized type inference, bigger compilation units, and higher-level constructs that rely heavily on compiler optimization for good performance.
Many people think that Rust and C++ are too slow to compile today; imagine what it would've looked like on a Pentium II CPU with 128MiB of RAM!
C++ 25 years ago (1996) was terrible. I used to have to keep a large C++ application going on GCC, Sun's compiler and Microsoft's compiler and it was a nightmare. Using templates with classes would invariably cause one or other of the compilers to crash, so we had to concentrate on the simplest subset of the language that worked. STL was very new and not supported on all platforms so we ended up reimplementing a lot of that in our own classes. Build times were glacial and the resulting binaries were enormous. Multiple string types. And horrific things like CORBA's C++ bindings ...
I used C++ in 2001 with ROOT, and it was a long wait most of the time. Depending on your codebase and the libraries involved, making compiling and linking fast was an art form.
25 years ago it wasn't C++11, but C with classes. Less TMP noise, small compilation units instead of the source code of entire libraries being #include'd.
lol or MFC, I checked that out and subsequently pulled the power plug on my pc. Granted I'm probably not a very good programmer. heck my comment is probably not even on topic in this thread.
No. I was born in the 90s. in the 2000s I was mostly doing .Net and Java, started doing C++ seriously in 2010s until I discovered Rust. Have I got the C++ history wrong?
I was using both templates and the STL heavily in 2002. I started learning C++ in '94 or '95, I can't remember. Turbo C++ did not have templates; before the '98 standard it was mostly C with classes, and lots of unnecessary inheritance and overloading.
Edit: I should add that the 2002 experience was on a corporate project which ultimately had many millions of lines of code, so not just some crazy guy in a basement. Lots of crazy people were in the basement with me.
> I find it fascinating how, 25 years on, many of the most popular languages are still slowly incorporating core features of functional languages like OCaml
Yes! OCaml was my first programming language, and I have been baffled at how the languages I learned subsequently tended to miss what I considered basic utilities like proper pattern matching and tail calls (happily, those ideas are now finally entering the mainstream).
I had several years professional experience, including functional programming, and still found it pretty rough going for a while. And even other professionals have often not even heard of it.
My observation (and, I believe the authors of How to Design Programs have some data about this) is that FP is only “hard to learn” if you already know how to program: you have to unlearn things like “for loops” and various techniques that rely on mutation and learn to think in terms of the abstractions FP provides you.
So the biggest barrier I found when learning OCaml was that there just were not a lot of resources or organic examples around.
What there is is usually for a specific course, or is published by like Jane Street or some shit and highly domain-specific as well. I just found it hard to dig myself out when I ran into trouble.
I definitely agree that the FP concepts are no more difficult, or even simpler, to learn. I used to teach introductory programming to adults; at first we taught loops, then map/filter etc., and told students to use their preference.
Almost universally they used map, up to and including synthesizing the entire concept of iteration without loops. When presented with a loop, I once saw a student skim the internal logic, see that it was doing an array push inside an if, and call it "a filter loop." To him the filter was the concept and the loop the tool, which I think is backwards from how a lot of us see it.
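For what it's worth, the student's two readings look like this side by side (a sketch in Python; the data and threshold are made up for illustration):

```python
values = [3, 1, 4, 1, 5, 9, 2, 6]

# the loop reading: an array push inside an if
kept = []
for v in values:
    if v > 3:
        kept.append(v)

# the filter reading: name the concept directly
filtered = [v for v in values if v > 3]

assert kept == filtered == [4, 5, 9, 6]
```

Same result either way; the question is just which one you see as the idea and which as the mechanism.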
I discovered computer science in preparatory classes (a French thing that comes before engineering school, you can think of it as the first years of college) in France where it was the default programming language (I believe they now start with Python and include Ocaml only if you take some options).
Interestingly, it was the students with previous programming experience who had the most difficulties learning the language.
I have not found the reverse to be true: the OCaml compiler is very strict, but it makes you attentive to details and types in a way that transfers well to other languages.
OCaml was also my first programming language. First year uni in Brazil, 2006! Thanks to a very tough French professor.
The class hated it/him. Type errors were nothing short of terrifying. `function has type (a->(b->c)->(b->d)) but should have type (a->b->(c->b)->d)` or such. If you passed the class (a big if), you'd move on to C++ and OOP and never see Ocaml/functional again.
I appreciated it for giving me so much understanding of mathematical thinking: functional programming, recursion, induction. Later on I took some classes and did research using SML.
Fast-forward ~12 years, I get into web dev, and what do I see? Functional programming trending!? Kinda funny... I want to get back into it somehow, probably via Clojure this time. It just resonates with my thinking, maybe a product of being primed for it all those years ago.
Exactly, started university in Paris in 2004 and OCaml was used to teach data structures for instance.
First year was Scheme and C; second year had Java, OCaml, more C, some C++ if you wanted, and MIPS. Third year added Ada and .NET if I remember correctly.
I'm not trying to be oppositional or take anything away from what you said about OCaml, but I'm interested in the evolution of programming languages and watching how the best ideas are borrowed or stolen and incorporated into new languages, sometimes with additions that improve upon the original. That's why I always check out new languages that are posted on HN. I like to see if the author has a new idea and then which language picks up on it. It's interesting to ponder such things as what will be the C++ response to Rust?
While not from a lesser-known language, the evolution of await/async from Haskell to C# to Dart via Erik Meijer is a good example that comes to mind. I'm a working programmer without a college degree, so I can't vouch that I have the correct interpretation of the computer science. See "Confessions of a Used Programming Language Salesman: Getting the Masses Hooked on Haskell" by Erik Meijer.
Programmers generally pick platforms and not languages. C became popular because of Unix and then Windows. JavaScript came with the browser. PHP came with shared hosting. Go sort of came with the "cloud". Java was made by an OS company, and it got new life with Android. So did Kotlin. Swift is for iOS, etc.
There are exceptions like Python and Ruby, but I think the "default" is that a given language is not popular, and there have been almost no platforms written in functional languages. (anticipating the objection: Lisp isn't functional :) )
That's really not the question. Any language that we've all heard of is pretty much by definition a wild success, comparatively. Most languages stay very small.
There are very large first-mover advantages and network effects to programming languages. Programmers want to learn languages existing code is written in; companies want to use languages that lots of programmers already know. Languages that get big for historical reasons tend to stay that way.
Mainstream adoption wasn't the main goal for a lot of functional languages when they were created: the aim was to develop ideas about how programming could be better. The surprising thing, really, is that people write real production code in some of these languages, despite how different they are from what most programmers already know. We shouldn't treat not getting as big as Java as some sort of failure.
Among other things, there was heavy investment in marketing the languages that came out at the same time as the likes of OCaml. In more recent years, we've also seen that backing by a major company that can help build the ecosystem is a major factor, as happened with Go. Finally, some languages flourished because they offer access to a platform that would otherwise be inaccessible, as JS does.
Overall, though, functional languages are progressing today. They may not show exponential growth, but neither did Python, and look where it is now.
Ecosystem. It's not always about the language features, it's about how quickly I can get something working. Having a large, thriving ecosystem is a big part of that.
I’ll add an alternative answer to the variety you’ve gotten: functional programming is often given a bad name for not really corresponding to the way computers actually work. When you’re writing C, it can feel like the code you’re writing corresponds directly to the actual physical processes and components of the machine.
The higher level you get, the less you feel that (no malloc in Java, for example), and functional languages are fully based on a model of computation that isn't the one we physically have in the real world. This can also lead to speed and memory problems, though I think at this point the compilers for these languages are good enough that this is less of a concern than it may once have been.
It’s an interesting example of the Blub paradox as well (Paul Graham’s answer to a similar question, “why doesn’t everyone use Lisp”).
What mainstream languages are you thinking about? Java is obviously the one that comes to mind when talking about market dominance, but it was created/supported by extremely wealthy corporations who invested hundreds of millions to push for its adoption.
Do you have others in mind? I'm sure we can find fairly similar circumstances for most of them.
Much as people credit Sun's marketing for Java's success, I think it would have succeeded even without them. Java successfully brought garbage collection to the majority of working developers at the time who were all C and C++ developers. This was a huge step forward that solved a massive pain point. Today this is far less of a compelling argument since most languages are GC'd and C++ is far less dominant, but Java was the right language for the time.
There was another big factor: remember that the initial ISO C++ release is from 1998, and while C was standardised in 1990, many compilers in the mid-'90s were still catching up to it.
Then POSIX still left too much room for each UNIX flavour to decide how to actually implement certain features, e.g. signals.
So Java, despite being initially interpreted, felt like how portable code should be, coupled with a rich library.
I recall a quote from an anonymous industry heavyweight when Java came out (paraphrased now because I can't remember exactly where I saw it): "Thank goodness, now I don't have to learn C++."
> Java successfully brought garbage collection to the majority of working developers at the time who were all C and C++ developers.
Isn't this what people refer to as marketing? Other languages were garbage collected before Java. Common Lisp and SML both existed a decade before Java. There were also Smalltalk and Eiffel, but for them I've heard that Java being free really helped (which may or may not be called marketing, but is still related to Sun money in a way).
- crowd maturity (zero devs cared about safety in the Web 2.0 era, cue the WordPress cheese; people flocked to PHP or Ruby because it was fun and easy; only later did people start making solid specs for their languages)
- business-less ethos (I'd bet $10 FP folks have neither the desire nor the skills to promote and sell their work)
- culture shock (FP is somehow very rooted in math culture; how many times in the last 20 years did someone at a meeting or in an OO class say "explicitVerboseVariableNamePlease"? compare that to
fold f z [] = z
fold f z (h:t) = fold f (f z h) t
)
Add to that various second-order effects, like the fact that complex and hard-to-sell things don't attract funding; unlike WordPress, you get less exposure. Maybe for the best, because mainstream business interest would surely distort the original paradigm quest to fit whatever money-heavy requirement lands on someone's desk that day.
fold is a library function that is very general, and would be built-in in most other languages (if it existed at all). You wouldn’t write most functions like that.
A long variable name only makes sense when the variable is specific. In fold, you have “a function”; nothing else can be said about it. And since working with “a function” is very common, Haskell programmers have converged on a shorthand for it: f. Sure, it’s harder for an entirely new programmer to understand, but every language has practices like this that need to be learned: take “i” in a for-loop in C.
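To make the fold above concrete, here it is transliterated to Python so the recursion is explicit — a sketch, not the idiomatic spelling, which in Python's standard library is functools.reduce:

```python
from functools import reduce

# the fold from the comment above, written out recursively:
# fold f z []    = z
# fold f z (h:t) = fold f (f z h) t
def fold(f, z, xs):
    if not xs:
        return z
    h, *t = xs
    return fold(f, f(z, h), t)

# summing a list with each version gives the same answer
assert fold(lambda z, h: z + h, 0, [1, 2, 3, 4]) == 10
assert reduce(lambda z, h: z + h, [1, 2, 3, 4], 0) == 10
```

The short names survive the translation: `f` is still "a function", `z` "a starting value", and nothing more specific can be said about either.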
That's quite removed from reality. In Java 8+ you'd have to import a BiFunction class to be able to talk about a fold. Don't underestimate the mainstream.
because they didn't have millions of dollars for marketing budgets, or that one thing that could let startups make millions off their work.
If the French government had realised what they had in OCaml and made it the standard programming language in all of France they would be a software powerhouse by now.
Back when I was a student (circa 2000), Caml was one of the "official" languages for teaching computer science in the first few years leading to engineering school; in my case, it also happened to be the first language taught to everyone in my engineering school. So I guess the academic world was trying its best. I remember joking with a TA that no one would ever write a line of Caml professionally; I can't wait to be writing Reason or Caramel at a gig :)
Functional languages are harder to learn than imperative languages, a big factor that I didn't see mentioned in this comment thread.
With imperative code, you can just spell out a for loop which makes intuitive sense. With functional code, you need to do so recursively, so you need at least some understanding of recursion, and perhaps even category theory for more complex examples or languages like Haskell.
That kind of added complexity over an imperative language alone is why I surmise that functional languages haven't reached as much popularity.
You really don’t need to understand category theory to understand Haskell, or even recursion for simple examples. Recursion is frequently used to implement control structures that are either built-in or missing in imperative languages; users of these control structures don’t need to understand how they work.
Which is simpler:
for (int i = 0; i < n; i++) array[i]++;
or
map (+ 1) list
Admittedly, most modern languages have something like foreach, but one of Haskell’s major advantages is that foreach can be implemented as a library function.
For most people, the first one is simpler. The second seems simpler to us as we've used functional languages, but there's a lot going on there. What is map? It's a function that acts on a structure using a function, ie a higher order function. What is (+ 1)? An anonymous function that will be used. Contrast this to the more natural idea of, for each item in the list, increment that value.
You can argue exactly the same for the first example: what is for? Why is for a special construct and not a function? What is (int i = 0; i < n; i++)? What is array[i]++? What is n here?
The functional approach allows you to only care about the what ("add 1 to each element of the list") instead of the how ("create a number that goes from 0 to the size of the list minus one (because our language is 0-indexed, because arrays are in fact directly related to how memory is laid out), then for each value of this number, take the value that's at this index in the array and increment it").
I'd argue that the functional approach is easier to learn, especially if you haven't had prior experience with programming. You also said earlier that functional is harder to learn because you have to use recursion to make a for loop. That's not really true; you can just use functions like map, which are already there. And when you start tackling recursive data structures like trees, recursion is easier to understand.
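The two examples from upthread, put side by side in one language for concreteness (Python, with made-up data; note map returns a new list rather than mutating in place):

```python
array = [1, 2, 3]

# imperative: spell out the iteration, the index, and the bound
result = []
for i in range(len(array)):
    result.append(array[i] + 1)

# functional: say only what happens to each element
mapped = list(map(lambda x: x + 1, array))

assert result == mapped == [2, 3, 4]
```

The "what vs how" distinction is visible even here: the second version has no index, no bound, and no order of operations to get wrong.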
The biggest thing impeding OCaml for the average developer is the situation with the standard library, and multicore.
It's not exactly a struggle to get Base or Core working, but it's way less streamlined than it really ought to be. I see things like F# picking up some popularity and it strikes me as a bit of a tragedy on Ocaml's part. .NET is pretty bad to work with, imo. I know C# people will hate that, and indeed if I was meaning C# then they'd be right to feel that way. The fact is, there are few FP languages with a deep platform (like JVM, .NET, js/browser) already, and even fewer pleasant enough to work with on a daily basis.
I've been blessed to use OCaml for some of my statistics work, and I'm watching the OWL project (ocaml.xyz) very closely; hopefully it will come together nicely.
Still, I see the quickstart process of F#, Scala, Clojure, Elixir, even Haskell, and it's just much less headache than Ocaml for the first 10-15 minutes.
Ultimately, I don't know. I use Python a LOT, and I REALLY enjoy it. A lot of what people don't like about Python just doesn't get in my way that much. Really the only thing that I could see myself yearning for in the language is pattern-matching (which we're getting soon) and some form of piping with better lambda support. I really miss |>, <| in the way I build my tools, but in the end I get along just fine.
I'm glad that SO many languages have taken inspiration from Ocaml, and I'm sure that as time goes on, many more still will draw water from the plentiful well of its beautiful type inference, compiler architecture, domain modeling capabilities without annotations, et al.
Finally, I think it remiss to sorta colloquialize projects by a sum of their most recent decisions. It's not difficult to look around and go "Man, ReScript is sort of a shit-show, ReasonML doesn't really offer anything except different syntax at this point, and there's typescript anyhow, which has a vastly superior dev experience in just about every modern code editor." But that doesn't do justice to how incredibly epic it is that a project is still alive and competitive at a deep, academic level, across continents, for 25 years. I could munge around my emails and find projects starting near me that are using things like Coq (now called something else, I think), it's very much still alive for important technical work even throughout the Numpy-pocalypse.
Anyhow, this comment is a mess. Congratulations Ocaml team. Keep on truckin'. <3
As a newcomer, it’s a fantastic language, but the tooling and ecosystem story is really lacking. I really think the language needs more hype in order to get more people to do things for it. The versioning story for dependencies is almost nonexistent, the interaction between the package manager (opam) and build tool (dune) is filled with issues, there are no conventions for how you set up a project (you can put your files in any folder), stdlib vs Base? Poor documentation, poor compiler errors, poor management of patches (opam pin) and of different compiler versions (opam switch creates a local repo in all your projects), etc.
Coincidentally I’ve been writing up an OCamlbyexample page that hopefully I can post here at some point to motivate people to learn the language:)
I'm an actual OCaml programmer, almost daily, and I don't understand this objection. We use autotools and make as the build tool (but you could use ninja/meson, cmake or whatever), and don't bother with opam/dune. We properly package over 100 OCaml packages in Fedora. For anything else not covered by a package we mostly call into C libraries, which is very easy to do from OCaml.
I guess it depends on exactly what you're trying to do, but we've developed a large amount of complex software in OCaml this way.
You roll your own build system instead of using the built-in tools and you don't understand why people wouldn't want to do that? This isn't a typical situation in Rust or even Haskell these days. Come on.
Rarely are people arguing that a situation like this makes writing software impossible. The question is whether it adds friction, and in adding friction impedes adoption. I think it clearly does, and that tooling is one of the highest-impact features a language can prioritize to improve adoption. If the story for building and packaging OCaml software is "go make yourself proficient in the C build tooling ecosystem first," of course that makes it harder to adopt (and you're not the first person in the community to give this answer, either). This hostility to beginners is one of the major things that irks me about the OCaml community, and OCaml is probably my favorite language.
Is it "being hostile to beginners" to encourage them to learn simple universal tools to build any software in any environment?
When I was a beginner, trying a new language was actually easier: you just installed a compiler. Now you need a "platform", for some even a special dedicated machine.
I wonder who benefits from this silo-ing, but I doubt it's the beginners.
I'm not talking about beginning programmers, I'm talking about people who are new to your language community. You can still just install and use ocamlc/ocamlopt, or rustc, or ghc, but the bar has been raised. People picking up a new language today for production use expect it to have a cohesive story for builds, packaging, tooling, and testing out of the box.
>I wonder who benefits from this silo-ing, but I doubt it's the beginners.
Everyone benefits from the better tooling we have now. The siloing is a problem that still has not seen a satisfactory solution. Regression in user experience is not acceptable.
It baffles me how much younger devs seem to be unable or unwilling to consider anything other than "communities". I guess the last 20 years of research in web marketing did that to us.
Being part of a "language community" is not the only way to be a programmer. Not knowing your way around outside the opinionated build-tool black box is, quite frankly, the hallmark of being a beginner, if not tech-illiterate, in my book.
When I originally asked what the point of Dune was, the answer I received from the people pushing that project at the time was that I shouldn't worry; it was just a tool to speed up getting a workable workflow when teaching first-year students. It made sense then; I thought at some point beginners would naturally outgrow that walking aid. But look at what happened instead!
Mind you, today's experience with ocamlc/ocamlopt for the seasoned programmer is not unchanged either. Because of that new "community first" principle, all the tools tend to work only within the ecosystem, at the expense of cooperating with the larger distribution. Again, younger devs, or devs used to "lesser operating systems", never experienced life under a proper system-wide software distribution, so they have no idea what they are missing. Of course opam looks great once Debian is broken (and until, let's hope, Guix or Nix takes over). An example of such a regression: querying the type of an OCaml expression from any editor used to be trivial from the easily parsable annotation file produced by the compiler. That has now been replaced by a memory dump that's easier for merlin. And merlin will not even be able to answer that simple query unless you provide it with a full-blown configuration. Good luck setting that up without the assistance of the dune build tool. Good luck setting that up if your project involves anything other than plain and simple OCaml.
Ach, I'm probably starting to sound unfair. Indeed, this tendency toward infantilism is in no way specific to OCaml; OCaml has probably been resisting it longer than most, even. And it's not specific to programming languages either, nor even to technology. Pardon my rant; a grumpy old man can still feel passionate about computers at times. Rest assured he is being transferred to another trade :)
You’re just assuming a lot of stuff I didn’t say. I have a lot of respect for your perspective on software and always love hearing from more experienced programmers. I think universal tools are great for a lot of stuff, but they’re just not as productive as cargo and stack are. Have you tried really getting into one of these? Having package management, toolchain management, builds, testing, formatting, linting, and IDE support out of the box with one tool is really cool, and it’s not something that the distribution-centric tools ever provided.
I will say that adapting to change is part of the human experience and especially part of the programmer experience. I’m 22, but I’m almost entirely self taught as a developer and have only a year of college. I’ve been reading older programmers (and older people) lamenting the new way of things for at least ten years already, it seems to be a constant. What you now call infantilism will be the old way 30 years from now, and the kids writing software then will be doing stuff that confuses me, I’m sure. But this is how progress happens, and change is okay.
> You’re just assuming a lot of stuff I didn’t say.
That's because I took that opportunity to rant more largely about the whole industry.
I frequently ask myself how likely it is that this perception that the computing industry is regressing, shared by many of the older devs who were passionate about it, is just normal grumpiness and confirmation bias. Ultimately, I believe tech evolution is more dynamic than simple progress or regress.
I've witnessed the creation and then the demise of personal computing, unrestricted computer networks, free and open software distributions, unbiased search engines; I hope I will not be around when the Linux kernel momentum is lost and personal general-purpose computers become once again inaccessible. There were of course already a lot of deficiencies when I started my career; the big "Software Crisis" was already well documented, some big corporations were slowing down innovation as hard as they could, and surely many established programmers routinely wrote miles of buggy software on inefficient hardware. But there was a growing, vivid, shiny trend, easy to spot in all this gloominess: a Unix revival on microcomputers led by young engineers who just wanted to do the right things. Ok, that momentum is gone now and I'm looking for the next big wave that would push us forward. Meanwhile: business, politics, conventions, ignorance and laziness are eroding that culture.
This is how I picture things, at least; in this picture progress is not automatic, nor is regress.
Please do not believe progress is automatic, or you won't fight for it.
At worst, if that is wrong and progress is indeed automatic, what's the loss?
I don't believe progress is automatic; that's why I like hearing perspectives from people who have been around longer. When I hear the narrative from lots of older people that everything has gotten worse since their youth, though, I think you can understand my skepticism. Especially given that your opinions are not universal among people who have a lot of experience.
Anybody today can, in seconds, download dozens of production-ready language runtimes and get started writing programs with a great IDE experience, for free! No cost at all. And this is now a fundamental assumption of software development.
I don't take it for granted because I have some idea of how far we've come, but I read people like you complaining about not being able to work with new build tools and I'll be honest, I assume that you've been left behind technologically and haven't kept up. I'm sure that's unfair, but you don't give these tools credit for their upsides (using new languages is much easier than it used to be, and development with them also scales much better), and you still haven't really explained the downsides fully. Even the C and C++ community is slowly moving in the direction of package managers and integrated tooling.
Sure, when I was younger I remember trying every single package from my distro, or that was featured on Slashdot or appeared on freshmeat.net, etc. These days my default attitude toward a new piece of tech is a shrug, and the list of techs that I (willingly or not) do not learn and skip over grows larger with time. I do not believe that's because I'm becoming lazy, though; fatigue sure is a thing, but we're living at a time when our universe is expanding faster than the speed of learning anyway. And I believe you too are staying ignorant of most novelties.
Do I stick to my tools longer than necessary before acknowledging true progress?
Possibly, but not always. I've pushed Ruby over Perl/PHP/Python and nginx over Apache, I adopted systemd quite early for some of its practical merits, I pushed for containerd over Docker and for Nix, then Guix, over Debian, and language-wise I enthusiastically explored Mercury, ATS, and Rust way before it was a thing (then decided against it). So although it's true that I would not feel confident in a conversation with young JS programmers, I could still name a long list of interesting new techs they have never heard of! :)
One of the hardest things in a software dev's job used to be learning to say "no" to product designers and management. Nowadays it's saying "no" to shiny new techs. Many times this pays off, since most tech novelties shine only for a brief moment before being superseded by another. The price to pay is arriving a bit late to the party from time to time. One has to be very passionate and picky not to end up stranded in an isolated ecosystem.
You mentioned IDEs many times. Beware that they are often such isolated ecosystems themselves. You would not believe how strongly Java devs thought no one would ever need to venture outside of NetBeans... no, I mean Eclipse. No, VS Code. Meanwhile, I'm still wondering what problem those are trying to solve; do I have a problem I haven't diagnosed? It must not be the speed of writing or navigating code, given I'm usually among the fastest around.
In programming as well as in real life, things own you as much as you own them.
I can tell you that in my case, older folks who still program can be VERY picky about which new tools they elect to invest time in learning. When we do elect to "keep up", we do it in the general case, and I'll tell you why from my point of view: there have been times in my life when I've chosen to dive into the details of a tool, only to be disappointed in what I discovered, and now that time is spent; I cannot get it back. When we're young, time is cheap.
Look, "keeping up in detail" is the domain for youth because youth has the luxury of the spare time to do it.
Besides, being left behind is actually the destination for all of us simply as the result of our own mortality. I'm suspicious that you might think that the fact some folks "haven't kept up" indicates an error on their part rather than a very valid choice.
Heh, and in my case I'm just not that good a programmer, so there's that.
These language-specific toolchains are often not better. They're complex and opaque, stuff things into odd places in my home directory, don't work together, and don't work with system packages -- golang is actively hostile to being properly packaged, making it difficult to keep track of what's installed and of security updates.
Now if you're saying that the autotools/make/cmake/meson tooling is hard, I sort of agree, but many people are familiar with C build tools already. I don't see how it's easier or harder than learning language-specific tools.
>Now if you're saying that the autotools/make/cmake/meson tooling is hard, I sort of agree
That's not really the main problem to me but it is a problem.
>many people are familiar with C build tools already
Maybe if they're C++ programmers. I don't know anyone outside of the C++ community who has ever used anything aside from Makefiles and language-specific tooling.
>I don't see how it's easier or harder than learning language-specific tools.
The language-specific tools are typically very well integrated with the default testing, packaging, and editor tools ecosystem. The C build tools are not going to be, at least not for OCaml. Language-specific tools show users the best experience a language can offer. More advanced users can always choose to use something else, but the advantages of the whole community using the same tool are hard to beat as well.
I guess we know different developers. I work with Linux / C programmers, and everyone is familiar with the basic C build tools. C tools also deal with testing, and there's a very clear and well-travelled path from ./configure && make && make check && make DESTDIR=.. install to RPMs and Debian packaging.
That's the process for building and installing packages though. I'm talking about writing new ones, on a system with nothing else installed.
I'm omnivorous, but I get paid for web development and that's what most people I know are most interested in. And like I said, they understand Makefiles and make, but autotools and Meson are not "basic C build tools"; they're both extremely complex and relatively niche. I'm not saying you shouldn't use the tools you use, but you should understand that they are not popular or well understood outside of your niche of Linux programming with C. I'm sure these programmers could learn how to use them, but most are going to choose not to, even if that's the easiest way to use the language. They will pick something else.
If you browse job listings, you might be surprised to find that the vast majority of jobs these days are not related to C or Linux development.
Companies are mostly building web and mobile applications, along with the corresponding web servers on the backend. JavaScript, Swift, Go, Java, Python - these are the kinds of languages developers are working with, and none of those languages normally are used with C build tools.
I've done a few small projects in OCaml. It's a brilliant language that I deeply enjoy using. But the situation for newcomers and those who don't have established patterns is bad. It's hard to do anything nontrivial from scratch, and the community doesn't seem to have any consensus around dependency management, builds, project structure, and so on. Make and autotools may work for you, but it's not like your setup is on the ocaml.org homepage, reproducible by an application developer in five lines of shell commands.
It's common now in some language ecosystems for the default tooling to just generate all this stuff for you, and you move on to writing code. So if you're e.g. a JavaScript developer who is interested in functional programming, and you're used to npm or whatever, it's a difficult start.
The commenter I'm replying to uses an entirely different toolchain, and the frequently-recommended Cornell book (https://www.cs.cornell.edu/courses/cs3110/2021sp/textbook/) uses ocamlbuild and utop and makes no mention of Dune. If you (understandably) happen to land in the official OCaml manual at https://ocaml.org/manual/index.html instead of the page you linked to (which I can't find from ocaml.org, by the way, but you're right that the ocaml.org tutorials do mention Dune and opam), then you don't see Dune or opam mentioned at all. I get what you're saying, but I don't think you could disagree with the premise that there are parts of the OCaml ecosystem that a newbie would land in and be confused or misled by here.
That's fine, but if you go to https://ocaml.org/ and click the big 'Install OCaml' CTA at the top of the page, it takes you through installing opam, dune, and ocaml-lsp-server. And as per the latest OCaml Survey, it looks like the majority of the community are using opam and dune: https://docs.google.com/forms/d/1OZV7WCprDnouU-rIEuw-1lDTeXr...
Do some of the doc pages need to be updated? Yes. Do some universities take a long time to update their course materials? Yes. Does this mean there's no consensus? No.
There appears to be a rough consensus but it seems to have emerged relatively recently. I learned OCaml about two or three years ago and at the time the best tooling was Tuareg and Merlin in Emacs. Fortunately for me I love Emacs, but I found the Dune documentation poor and overall I got the sense that the OCaml production ecosystem (vs academic) is largely dictated by Jane Street these days, but the community is kind of quiet about the extent to which Core is the "real" standard library and the default stdlib is an academic curiosity. For one thing, the community seemed to be in a transition period between ocamlbuild and Dune, as the official resources recommended ocamlbuild and the community said everyone uses dune.
Yes, this is changing, and these changes have been ongoing since about 2013, when opam was created. But it's true that they're accelerating; I think it's reaching critical mass. So yes, a few years ago your points would have been valid, but today they are less so.
Dune itself is only a few years old. And there has been quite a lot of work on consolidating ocaml.org as a clear entry point for newcomers, but that is still ongoing work.
The C-ecosystem build tools you refer to are pretty far behind those of pretty much every other language. And most developers don't want to wait for their OS to package a new version of a library before they can start using it (not to mention this style of packaging isn't typically available on Windows/macOS).
Really? You can put your opam package files and dune files wherever you want and it'll automagically work. Submodules can be defined far away from the library and it's fine. If your opam packages are empty, it's fine; dune doesn't care about versions of packages anyway. If you want to vendor/patch a library, just drop it anywhere in your project. Yet dune seems to be unaware of executables and libraries when you want to run or build something (you need to pass paths).
That's mostly because dune is 'composable', so it accepts project components anywhere in the tree. But the components themselves are laid out in a certain way:
- A dune file in a specific directory makes that directory a component
- An .ml file in the directory with the same name as the component (in the dune file) becomes the main module of the component
- Any submodules aliased inside the main module are properly wrapped and namespaced.
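To make the convention above concrete, here's a minimal sketch of a library component (all names are made up for illustration):

```
; mylib/dune — this file makes the mylib/ directory a component
(library
 (name mylib)                 ; internal name; modules are wrapped as Mylib.*
 (public_name myproject.lib)) ; name consumers use via opam/findlib

; mylib/mylib.ml (optional) — the main module; if present, it controls
; which submodules (e.g. Mylib.Util from util.ml) get re-exported
```

With this in place, `dune build` run anywhere in the project picks the component up, and other components depend on it via a `(libraries ...)` field.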
Sure, but Cargo and go modules/vendors are composable too but there’s still a forced convention that doesn’t allow you to place your files wherever you want.
Oh, and the way name and public_name work is also not great. Basically, you never know what is being imported or used, because it ultimately depends not on how the folder is named but on the name or public_name in a dune file. For example, in Rust you can import what's in folder a/b by doing a::b, but in OCaml you import a.b or a_b, whichever is set as name or public_name.
Python's pattern matching is not the same as OCaml's. It's a statement, not an expression. But I like your line of thinking: create a variant of Python with ML-like features without sacrificing compatibility too much.
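To illustrate the statement-vs-expression distinction: in OCaml, `match` evaluates to a value, so its result can be bound or returned directly (toy example, names made up):

```ocaml
type shape = Circle of float | Rect of float * float

(* [match] is an expression: the whole thing evaluates to a float,
   so it can be the body of [area] with no assignments in each arm *)
let area s =
  match s with
  | Circle r -> Float.pi *. r *. r
  | Rect (w, h) -> w *. h

let () = Printf.printf "%.2f\n" (area (Rect (2.0, 3.0)))
```

In Python, `match` is a statement, so each arm has to assign to a variable or `return`; you can't write `x = match ...` and get a value back.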
An effort in this direction is a transpiler I'm working on. So far it sticks to a strict subset of python3. But I'm open to incorporating ML like features in an incompatible way if there is a compelling argument.
By sticking to a statically typed subset we've already given up some compatibility.
I mean, obviously you're right. I guess I just meant that it's good enough for what I'm doing in Python. People don't write verified provers or deep numerical projects in Python; they'd use OCaml. For digging through a little SQLite collection, parsing CSV files, and correlating a few simple features in dataframes, Python works fine, and the pattern matching we'll be getting will make it easier to write THAT sort of code.
I don't see Python growing much more functional. There are two decades of resisting FP with much vigor behind it, and I expect another two decades of fighting over what little FP has grown into Python thus far.
I'd love to see your transpiler work! It sounds pretty cool.
I keep an eye on mys-lang (@github/mys-lang/mys) and mypyc, to see what's going on in that realm of Python.
Compatibility is a double-edged sword, as I'm sure you realize.
Thanks for the link. I'm thinking along the same lines and might derive some inspiration from Haxe on how to evolve python syntax to support features like GADT and FP that can be easily transpiled to other languages.
It looks like Haxe was designed from day 1 to be transpiler friendly, whereas I'm trying to leverage python's popularity and language ecosystem to achieve similar goals.
Sticking to a strict subset of Python has the advantage that all the scripts are directly executable by the Python interpreter. If we add some features (say, pattern matching similar to OCaml's), we lose that.
Perhaps some macro/library features can bridge that gap. Or, over time, Python evolves to be more like OCaml.
IMHO, OCaml's decision not to encode strings in a Unicode encoding by default is the correct one. OCaml treats strings as just bytestrings, leaving the programmer to take care of the specific encoding, with the help of sophisticated and powerful libraries like Camomile.
Compared to my experience with Python 3 and Ruby, OCaml is so much better. On the rare occasions I need to do something Unicode-y, I'll reach for Camomile, and the rest of the time I'm passing around strings/bytes unimpeded by silly language decisions.
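A small sketch of what byte-transparent strings mean in practice (assuming the source file is saved as UTF-8):

```ocaml
(* OCaml strings are immutable byte sequences, not code-point sequences *)
let s = "héllo"  (* "é" occupies two bytes in UTF-8 *)

let () =
  (* String.length counts bytes, so this prints 6, not 5 *)
  Printf.printf "%d\n" (String.length s);
  (* indexing yields individual bytes (type [char]), not code points *)
  Printf.printf "%C\n" s.[0]
```

The upside is exactly what the comment describes: strings pass through I/O untouched, and Unicode-aware iteration is opt-in via a library rather than forced on every string operation.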
After that, I would recommend the new (beta) version of Real World OCaml — it gets a lot more in depth about more advanced language features: https://dev.realworldocaml.org/toc.html
I recall being exposed to OCaml in an intro programming languages class in college. Learning recursion was pretty mind-blowing, but ironically it put everyone in the class on the same footing: many who had programmed in OOP were just as baffled as those who had never programmed. Looking back, it was a good gateway into discrete math in general.
<rant>
After working professionally with OCaml for a few years, it's kind of a shame that the language is 25 years old and hasn't become popular.
The documentation of libraries is awful, there's the Async/Lwt split, the ecosystem is small, and the IDE support is flaky (it breaks every now and then when you upgrade, and doesn't work properly in VS Code).
My diagnosis (especially after talking to people in the Rust community) is that there's a weak sense of community and weak leadership. The response to conflict is "screw this, I'll do it my own way" instead of making decisions as a community. It's kind of the Wild West: people going in different directions, doing what they want. There's no alignment in focus or effort to really drive things forward.
We are moving away from OCaml. If you are considering it for medium or large production systems, I urge you to stay away from it.
</rant>