8 months of OCaml after 8 years of Haskell in production (2023) (chshersh.com)
264 points by droideqa 11 days ago | 277 comments

Firstly, this doesn't mention the thing that bothered me the most in my few encounters: how baroque and fluid the tooling is.

I have regularly run into code that can only be compiled by a rather narrow range of ghc versions; too old or too new and it just won't.

> Haskell probably has the most elegant syntax across all languages I’ve seen (maybe Idris is better because dependently typed code can become ugly in Haskell really quickly).

I have a general dislike for ML-type syntaxes, and Haskell is my least favorite of the ML-type syntaxes I have encountered. Presumably there is some underlying set of principles that the author would like maximized that I assuredly do not.

> There’s utter joy in expressing your ideas by typing as few characters as possible.

Disagree; if I did agree, then I would probably use something from the APL family or maybe Factor.


I agree. I have limited experience with FPs, and I can get the hang of what is happening in OCaml examples relatively easily, but Haskell examples feel more like something akin to Forth - if I really needed to, I could parse it with my brain, but cryptic one-liners like that feel more like "puzzles" than an actual useful program.

I feel like there's nuance here that a lot of people don't mention.

There are the actual puzzles that have been minified beyond recognition, and one has to work backwards to figure out what clearer code the author started from. I don't think anyone likes this code, but it gets written by people who are more used to writing code than reading it.

There is also the incredibly clear, straightforward code that could have come from any other language. If you manage a sane production Haskell codebase, you'll have a lot of this.

In between those two, there is code that is clear but which you wouldn't find in any other language. This is code that uses well-known Haskell idioms that anyone who works with a lot of Haskell code will recognise, but which looks like a puzzle to someone who has not worked with a lot of Haskell. These are things like

    guard (age person >= 18) $> beer
to return Just beer to someone of age, but Nothing to someone underage;

    Uuid.fromWords <$> getRandom <*> getRandom <*> getRandom <*> getRandom
to return a randomly generated UUID;

    maybe (compute_fresh seed) pure cached_value
to use a cached value if it exists, computing it from scratch if it did not; or

    traverse_ $ \elem -> do
        error "stub"
to create a function that loops through a collection and executes a side effect for each element.

These look like nonsense to people not familiar with the idioms, but they appear frequently enough that they are no longer puzzles.
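For anyone who wants to poke at these, here's a minimal, self-contained sketch of the first, third, and fourth idioms using only base (Person, beerFor, and useCached are made-up names for illustration):

    import Control.Monad (guard)
    import Data.Functor (($>))
    import Data.Foldable (traverse_)

    newtype Person = Person { age :: Int }

    beerFor :: Person -> Maybe String
    beerFor person = guard (age person >= 18) $> "beer"

    -- use the cache if present, otherwise "compute from scratch"
    useCached :: Maybe Int -> IO Int
    useCached = maybe (pure 42 {- compute_fresh stand-in -}) pure

    main :: IO ()
    main = do
      print (beerFor (Person 17))        -- Nothing
      print (beerFor (Person 30))        -- Just "beer"
      useCached Nothing >>= print        -- 42, "computed from scratch"
      traverse_ print [1, 2, 3 :: Int]   -- one side effect per element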


Maybe this is a bit off-topic but I have to say: I don't _get_ FP languages. I get the idea of having functions with no side-effects. I think that's a great thing. But this can be achieved in every language.

Sure, the statements in the function will have side-effects on local variables, but as long as the function isn't too long, and you can comprehend all of it at a time, that's not a problem.

Your examples of "clean code" aren't any cleaner than what can be found in any imperative language.

Using C++ (with std::optional), for reasons of familiarity.

    (person.getAge() >= 18) ? std::optional("beer") : std::nullopt;

    UUID::fromWords(getRandom(), getRandom(), getRandom(), getRandom());

    cache.has_value() ? cache : (cache = computeFresh(seed));

    for (auto& elem : collection) {
        modify(elem);
    }

Other languages can't help you track that a function that transitively calls 100 other functions is side-effect free; Haskell can.
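A minimal sketch of what that tracking looks like (names made up here):

    double :: Int -> Int      -- pure: no IO possible anywhere in its call tree
    double x = 2 * x

    report :: Int -> IO ()    -- effectful: IO has to show up in the type
    report x = print (double x)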

> UUID::fromWords(getRandom(), getRandom(), getRandom(), getRandom())

This ignores that getRandom() is a side-effecting function (it has to be, because it returns something different every time). And that's not just a "local variable". UUID.fromWords val1 val2 val3 val4 works in Haskell too; the "extra syntax" is to be able to do all that in IO (i.e. with side effects). If you want it in a form that is more recognisable, you could write it as

  getUuid :: IO UUID
  getUuid = do
    val1 <- getRandom
    val2 <- getRandom
    val3 <- getRandom
    val4 <- getRandom
    return (UUID.fromWords val1 val2 val3 val4)
but that's unnecessarily verbose if you know the idiom.

Your cache example is similarly side effecting.


I wrote Haskell for a couple years and I _never_ figured out the indentation rules. The Haskell people seemed confused that it was even possible to find it confusing

and then there’s “do” notation. I had three people tell me “it’s just a simple mechanical transformation from lambdas” oh yeah? how? they couldn’t explain that either


    mySubroutine = do
      foo <- mkFoo
      bar <- mkBar foo
      mkBaz bar
    
    mySubroutineV2 =
      mkFoo >>= \foo ->
        mkBar foo >>= \bar ->
          mkBaz bar

Note that this is the same desugaring as for normal let

  let x = y in F   <===>   (\x.F)(y)
just replacing function application with a different function (>>=).

This is not the case in Haskell, because let-bindings can be self-referential, while lambdas cannot (without using a fixed point combinator). Also, in most functional languages, a let-binding causes type generalisation, which is not the case for a lambda.

Not exactly the same: `x` is given a polymorphic type (in F) in Haskell (restricted to values in ML), whereas the unannotated let-over-lambda will give `x` a monomorphic type.
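A small illustration of the generalisation point (a sketch, plain GHC without extensions):

    ok = let id' x = x                      -- id' :: a -> a, generalised
         in (id' (1 :: Int), id' 'a')       -- usable at two different types

    -- rejected without higher-rank types: a lambda-bound id' gets one monotype
    -- bad = (\id' -> (id' (1 :: Int), id' 'a')) (\x -> x)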

You can desugar let bindings that way, but often (as I believe is the case in GHC) you have a `Let` term in your AST.

I'll agree about the weird Haskell indentation rules.

But do notation really is just syntactic sugar for a chain of >> and >>= applications (plus "fail" for invalid pattern matches). It's not usually pretty or understandable to write it that way, but it's a very simple translation. If the people you talked to couldn't explain it to you, I think they maybe didn't understand it well themselves.
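For reference, the translation is roughly this (sketched from the Haskell Report, eliding how refutable patterns invoke fail):

    do { e }                  =  e
    do { e; stmts }           =  e >> do { stmts }
    do { p <- e; stmts }      =  e >>= \p -> do { stmts }
    do { let decls; stmts }   =  let decls in do { stmts }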


The reason people can’t remember it on the spot is that it’s really not necessary to understand how do-notation desugars in order to use it.

People usually check it once, see that it is correct and forget about it.


Do-notation is not all that different from (if anything, it is simpler than) the async/await syntax that's gotten popular in a bunch of mainstream languages. I don't know if that makes it easy or difficult in an absolute sense, but it isn't any worse than Python/JavaScript/etc.

The indentation rules are definitely a mess though :/


To understand "do" you need to understand the underlying >>= and return operators for your monad.

If you can use >>= and return to do it then "do" is indeed a mechanical transformation.

It is like await if you understand JS promises.


> and then there’s “do” notation. I has three people tell me “it’s just a simple mechanical transformation from lambdas” oh yeah? how? they couldn’t explain that either

Scala's for/yield (which is pretty similar) you can literally do an IDE action in IntelliJ to desugar it.

I forget the precise Haskell syntax, but do { a <- x; b <- y; c <- z; w } is sugar for something like x >>= \a -> y >>= \b -> z >>= \c -> w. Hopefully the pattern is clear. Were there cases where you actually couldn't see how to desugar something, or are you just nitpicking about them calling it a mechanical transformation without being able to explain the precise rules?


I wrote a tutorial about do notation. Maybe it helps https://elbear.com/a-user-guide-for-haskell-monads

> I _never_ figured out the indentation rules

I'm glad I'm not the only one. The white-space as syntax in some cases is very confusing. It took me a while to figure it out.


It is the same as in any ML-derived language, or Python.

I also don't see the issue, when there is tooling that complains about broken indentation.


> I _never_ figured out the indentation rules

Weird. Haskell's my preferred language and I thought there was only one indentation rule - if something's further to the right it's part of the same expression, and if something's down it's a new expression.

  f
   x
    y
  g z
... With a slight addendum that you don't need to write 'let' on consecutive lines, e.g.

  let x = ...
      y = ...
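In terms of the report's layout algorithm, those snippets are read roughly as if the braces and semicolons were written out:

    { f x y ; g z }

    let { x = ... ; y = ... }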

That’s funny. I actually love Haskell syntax; it’s by far my favorite language to look at.

People often say that before they realize how complex and messy real-world Haskell looks. The indentation-based rules aren't even defined anywhere except via imperative code in the implementation.

Hmm, isn't this the definition? 10.3 of https://www.haskell.org/onlinereport/haskell2010/haskellch10...

> The effect of layout is specified in this section by describing how to add braces and semicolons to a laid-out program.

I wasn't actually going to read that in detail to see if it makes any sense, but it looks very detailed :).


That's because real-world Haskell tends not to be idiomatic, and it involves a lot of directly telling GHC what to do. Mathematicians don't have to (and rarely, if ever, do) write efficient code. GHC is a really advanced and impressive compiler, but that really doesn't amount to much. At some point the language just devolves into understanding what arcane compiler internals you need to invoke, and it's more like a bunch of C macros with a fancy runtime.

Nothing forces it to be messy. A well-tended Haskell codebase is great. On the contrary, I would say that a poor Haskell codebase is painful, but far from being as painful as a poor Java/JS codebase.

Anecdotally, in my years of Haskell at separate places, I maybe had to debug in the REPL once or twice. I wish I could say that of my current Java gig.


My experience is similar, and in addition I would say that I have come across badly written, difficult to understand Haskell code, but it was always straightforward to refactor it into something clearly written.

I've written Haskell code in production and I still find the syntax the most eloquent. And you don't have to know the indentation rules; you just pick them up from usage, just like you don't have to know the grammar rules.

Maybe it's because I code Python professionally, or because I use an autoformatter (ormolu) for Haskell, but I have literally never had an issue with the indentation rules.

If that's true, that's such a disappointment. I always thought of Haskell as the pinnacle of mathematical discipline in programming (though too hard for me to wrap my head around!).

Such a pinnacle would be Agda. Haskell has mathematical holes.

Haskell is 30 years old at this point. It was pretty cutting edge in the 90s, but the pinnacle has climbed ever higher.

Oh sir we call that "operational semantics"

You have earned your Martini.

That's interesting that you dislike ML syntaxes when they're generally thought to be the cleanest. What syntax families do you generally like?

Also, Haskell isn't really ML-syntax. I love MLs but find Haskell syntax pretty ugly.


I also dislike them. And I wouldn't disagree about them being "the cleanest", but cleanliness of syntax is only one concern. I also need to be able to mentally parse it, and on that front OCaml is very bad in my experience (I haven't used Haskell much so I can't comment on that).

I've said this before but if you remove all punctuation and capitalisation from English it will make it look "cleaner", but it will also make it awful to read.

Look how "clean" text is if you get rid of all the spaces!

https://en.wikipedia.org/wiki/Scriptio_continua#/media/File%...

Clearly brilliant.


The argument you make about literature is one I have made before, but I don't think it applies here. To me, languages with clean syntax aren't just removing things arbitrarily. They remove things that aren't needed or they otherwise have large amounts of consistency.

Let's look at three implementations of a doubling function in modern languages where at least two of them are considered "clean" syntaxes.

F#:

    let double x =
        2 * x
Python:

    def double(x: int) -> int:
        return 2 * x
Rust:

    fn double(x: i32) -> i32 {
        return 2 * x;
    }
Tying back to my original point, are the brackets, annotations, semicolons, and return really needed? From my perspective, the answer is no. These are the simplest functions you can have as well. The differences get multiplied over a large codebase.

Are people here really making the argument that F# and MLs don't have clean syntax? All three of these functions have the same IDE behavior in terms of types being shown in the IDE and also in the type-checking process, with F#'s and Rust's type-checking happening at the compiler level and Python's happening with MyPy. People might argue that I left off type annotations for F#, and that's true, but I did so because they aren't required for F# to compile and still have the same IDE features as the other languages. Even if I did include them, it would look like:

F# with type annotations:

    let double (x: int) : int =
        2 * x
which is still cleaner.

Idiomatic Rust would look like this:

    fn double(x: i32) -> i32 {
        2 * x
    }
The reason Rust requires types here is because global inference is computationally expensive and causes spooky action at a distance. The only real difference between your F# and this is the braces, which I do personally think makes code easier to read.

Of course there’s more overhead in smaller functions. I’d never write this code in Rust. I’d either inline it, or use a closure, which does allow for inference, and dropping the braces given that this is a one liner:

    |x| 2 * x
Different needs for different things.

    // Rust
    let double = |x| 2 * x;
    double(3);

    // F#
    let double = (*) 2
    double 3

    -- Haskell
    let double = (*2)       
    double 3

    #  Python
    double = lambda x: 2 * x
    double(3)

If I am understanding that |x| 2 * x in Rust is an anonymous function, another F# equivalent is fun x -> 2 * x.

It's a closure, not just an anonymous function, though it of course has an empty environment so it's not a significant difference in this case. Rust doesn't have syntax for anonymous functions, only closures.

The Haskell and F# examples above use partial application.

This also works:

  let zeroCount = numbers |> Seq.filter ((=) 0) |> Seq.length

I thought that the Rust playground was requiring me to use return, but I was mistaken. However, it actually lends support to my original point, because I would wager that dropping the return was part of the influence of ML on the beginnings of Rust.

Expression orientation certainly comes from that general space.

> Tying back to my original point, are the brackets, annotations, semicolons, and return really needed? From my perspective, the answer is no. These are the simplest functions you can have as well. The differences get multiplied over a large codebase.

The answer is no for this trivial function. As your code goes beyond this it gets harder for humans to parse and so the syntax becomes more necessary.

My earlier analogy is pretty perfect here actually. Do you really need spaces in "thank you"? No clearly not. Does that mean you shouldn't use spaces at all?

> Are people here really making the argument that F# and MLs don't have clean syntax?

No. You are literally replying to a comment where I agreed that it is "clean".


I love most entries in the Lisp family.

I find Python to be quite readable as well.


If you like Python and Lisps, then I'm surprised you don't like F#. It is much more concise than Python, even for pure OOP programming.

> when they're generally thought to be the cleanest syntaxes

That might be true for academics. But most engineers don't consider ML syntax to be the cleanest, since most don't know any ML language.


MLs aren't really academic languages aside from Standard ML. F# and OCaml are very pragmatic languages. The cleaner parts of Rust came from ML.

What syntaxes do engineers find clean? I don't understand the distinction you're making.


I assume they mean:

  * Python, Ruby, C#, Java, Go-style languages?
I imagine most developers operate in neither ML languages nor Lisp-style languages.

The most advanced "FP" trick that I imagine most developers use is Fluent-style programming with a whole bunch of their language's equivalent of:

  variable
    .map do ... end
    .map do ... end
    .flatten
Addendum: or maybe LINQ in C#.

Addendum 2:

And even the fluent-style trick in those languages tends to follow a similar pattern. Using Kotlin and Ruby as examples, since those are my (edit: main) languages,

  variable
    .map do |i| something_with(i) end
    .map { |i| something_else(i) }
    .flatten
shows familiar tricks. The dot operator implies that the thing preceding it has an action being done to it; and curly braces or the "do" keyword both imply a block of some sort, and so a quick glance is easy to follow.

In Kotlin, this gets a little bit more confusing (yes, really) because it's common to see:

  variable
    .map { somethingWith(it) }
    .map { somethingElse(it) }
    .flatten()
And now there's this magic "it" variable, but that's easy enough to guess from, especially with syntax highlighting.

Anything more advanced than that and the cognitive load for these languages starts to rise for people who aren't deep in them.

When you start working with a new language, that does increase difficulty, and may even be so much of a barrier that developers may not want to hop over it.


When you start working in a new language, the best thing to do is to get a book and familiarize yourself with any constructs or patterns you are not familiar with. Expecting things to be similar to what you’re used to (which lowers the learning curve) is a fool’s errand.

To learn a new language, you need a reason to learn that new language and you need sufficient drive to go through with it.

Having familiar constructs not only makes it easier to code-switch between languages (people who work on multi-language projects know that pain pretty well), but also decreases the barrier to entry to the language in the first place.


> most engineers don't consider ML syntax to be the cleanest, since most don't know any ML language.

This is like saying "most uncontacted Amazonian tribes don't like Shakespeare, because they've never read it". Sure, but why would we care about their opinion on this topic?


They might think Shakespeare's stories are silly even if they did read them. In fact, there is at least one widely publicized instance where exactly the same thing happened with a culture that wasn't exposed to Shakespeare before:

https://law.ubalt.edu/downloads/law_downloads/IRC_Shakespear...

The same idea is probably true with programmers who have grown used to C-like syntax or even Python-like or Ruby-like syntax. Syntax is at least in great part a cultural thing and your "cultural background" can affect your judgement in many cases:

1. Are braces good? Some programmers find them noisy and distracting and prefer end keywords or significant whitespace, but other programmers like the regularity and simplicity of marking all code blocks with braces.

2. Should the syntax strive for terseness or verbosity? Or perhaps try to keep a middle ground? At one end of the spectrum, Java is extremely verbose, but a lot of Java engineers (who have generally been exposed to at least one less-verbose language) are perfectly OK with it. The trick is that the main way that code gets typed in Java used to be copy-paste or IDE code generation (and nowadays with LLMs typing verbose code is even easier) and reading and navigating the code is done with the help of an IDE, so a lot of the effects of having a verbose language are mitigated. Diffs are harder to review, but in the Enterprise app world, which is Java's bread and butter, code reviews are more of a rubber stamp (if they are done at all).

3. Lisp-like S-expression syntax is also highly controversial. Many people who are introduced to it hate it with a passion, mostly because the most important syntax feature (the parentheses) is repeated so often that it can be hard to read, but advocates extol the amazing expressive power, where the same syntax can be used to express code and data and "a DSL" is basically just normal code.


I think Ada's verbosity is one of its strengths. Cannot say the same for Java, however.

There's "verbosity as descriptive clarity", and there's verbosity as in "I already said this, why do I have to repeat myself?" Ada unfortunately still has some of the latter, otherwise I would agree.

I mean, it is supposed to be verbose for descriptive clarity.

Java is really not that verbose if you steer clear of the 25+ year old conventions that were defined to justify expensive proprietary tooling. Leave all fields package-scoped or public final, so you don't need getters and setters. On JDK21+ use records and pattern matching. Go procedural + lambda; don't be afraid to use static methods. When OO is required, prefer composition over inheritance. Adopt a proper annotation processor library such as AutoValue to eliminate the remaining boilerplate.

For me, my problem with Haskell's syntax is less rooted in the fact that it's in the ML family. It's more to do with just how horrific complex type aliases can end up being thanks to currying. Imagine we have type aliases that are built up out of function signatures, made up of more aliases of function signatures, and this is a tower some 20 layers deep. If you're working with types at different levels of this tower of garbage at the same time, you very quickly end up in associativity hell, and the type signatures you have to work with are beyond unhelpful.

It's a combination of the flaws of currying as a concept and the complexity of Haskell's type system.


Compiler version issues are a pain. For most programming languages I use, I like to constantly update toolchain and library versions. I fight that tendency with Haskell. “cabal freeze” can be your friend.

I now find that using some combination of Claude, Gemini, and ChatGPT makes using Haskell more pleasant for processing compiler error and warning messages, suggesting changes to my cabal or stack configuration files, etc.


Like a moth to flame, I'm drawn to Haskell every 6 months or so. What drives me to it is its ability to express concepts (once you're used to it) parsimoniously - like the point-free style in the example. Monads, I can understand/manage. But by the time I hit Monad Transformers (and the n different packages available), the excitement turns into a headache.

It's also a bummer that you need the containers package for basic data structures. So, batteries not included, unlike Python. This also means that you need to use a package manager (stack or cabal) even for non-trivial programs, where a one-line ghci command would have been more ergonomic.

That said, learning Haskell has had a very positive effect on how I think about and structure the programs I write in other languages ("functional core, imperative shell"). I don't think I'll ever program in Haskell in any semi-serious capacity. But these days, I get my daily functional fix from using jq on the command line.


Back a decade+ ago when I was still using Microsoft in a corporate environment, I found F# and loved it because it changed how I thought about programming (in a very good way) with its being very concise, allowing me to naturally express algorithms, and giving the "functional core, imperative shell" functionality one can't get in the old-and-new-school VS.NET solution sets that comprise a tiered client-server WCF-based model (thanks in large part to the utterly brilliant Juval Lowy's IDesign approach).

Unfortunately, F# was very much a second-class citizen in the VS.NET space, so it was only useful for banging out tight C# classes in F# first, then translating them over to C#. (My fundamental db access classes were developed in F# then translated to C#; the crux was a parallel pair of isomorphic classes that did design-time swappable DB2 and SqlServer access.)

Beyond my refusal to use Microsoft for my OS anymore, it looks like F# has transitioned away from the original Don Syme authored minimal OCaml-based v2, into much more automagical v3+ stuff that I'm not interested in. They're a brilliant design team, but well beyond what I need or want for my needs.

At the end of the day, it's hard to keep things simple, yet effective, when the tool designers we depend on are always one-upping themselves. Somehow, it seems like a natural divergence as design tooling expands to meet our expanding needs as devs. What's good is that our perspective gets levelled-up, even if the tools either out-evolve our needs/desires or just plain fail to meet them.


I love F#, but haven't used it in a few years.

"more automagical v3+ stuff"

What is going on with this? Any examples?


Perhaps type providers? Sadly they didn't really take off; they work but remain a niche feature. At least I'm not aware of any other feature that fits the description more closely. https://learn.microsoft.com/en-us/dotnet/fsharp/tutorials/ty...

Or perhaps Computation Expressions? But those are an integral part of F# and one of the key reasons why it's so flexible and able to easily keep up in many areas with changes in C# - something that requires bespoke compiler modification in C# is just another CE* in F#, using its type system naturally.

* with notable exception being "resumable code" which is the basis of task { } CE and similar: https://github.com/fsharp/fslang-design/blob/main/FSharp-6.0...

If you want to get back to F#, it has gotten really easy to do it because .NET SDK is available in almost every distro repository (except Debian, sigh) - i.e. `brew install dotnet`/`sudo apt install dotnet9`.

I have also posted a quick instruction on compiling F# scripts to native executables in a sibling comment: https://gist.github.com/neon-sunset/028937d82f2adaa6c1b93899...


Yeah, I think it was type providers.

Thanks for the links and help. That's really excellent.

Still, I'm old enough to remember "embrace and extend and extinguish" so I keep Microsoft off my linux boxen. I reject their corporate mandate and the directions their profit-motive has taken them. Besides, I wouldn't ever want my Unix systems to depend on their software's security.


This criticism could be very well placed against Google nowadays.

While you're at it, please ask the Golang team to change YouTube policies.

You may also want to avoid Rust, Java, TypeScript, C and C++. For they too are """tainted""". For good measure, VS Code must be avoided at all cost and, which is terrible news, probably the Linux kernel itself.


Yeah, I'm not a fan of Google at all. I like companies that keep "don't be evil" in their Mission Statement.

No Rust here.

Never cared for Java, its generics were crap, then .NET v2 was here with F# and, welp, Java's boxed ints were crap, too, so nope.

No TypeScript. Javascript was built in a week (or something); ubiquity is not to be confused with good design. So, no node, too, in case you were curious.

C and C++?!? I used C in 1990 in OS class recompiling the Minix kernel for assignments. No Microsoft there, bro.

VS Code is not welcome in my home, either.

No Linux kernel? No biggie, I prefer OpenBSD anyway. Does it run vi, C/C++, and Python3? Of course it does. I'm good to go, dude.

I hope you enjoy your adware version of Windows, which is going to be ALL versions of Windows.


For better or worse, it's time to realize it's not the 90s, 00s or even 10s anymore.

I use macOS as a main device with Linux and Windows for verifying and measuring behavior specific to their environment, while doing so from VS Code. I have a friend who has his home server farm hosted on FreeBSD, still using .NET. Oh, and luckily most of the tools I use are not subject to ideological and political infighting.

I like when the technology can be relied on being supported in the future and be developed in a stable manner without sudden changes of project governance or development teams. The added bonus is not having to suffer from the extremely backwards and equally sad state that Python and C/C++ find themselves in tooling-wise. Python is somewhat fixed by uv, C/C++ - not so much.


Do you also keep Microsoft contributions to Go, the Linux kernel, clang, Rust, Java, Mesa, nodejs, Chrome, and LSP out of your Linux box?

Contributions? I don't monitor my box at that level.

But rust, java, nodejs, and Chrome are nowhere to be found (IIUC).

I don't even care what Mesa is.

But Microsoft products are not present.

Many (most?) for-profit corps are not my friends. They have the right to exist; I'll leave them at that.


Unless you boot to terminal, while taking advantage of Microsoft contributions to the Linux kernel, Mesa is the thing that allows your graphics desktop to take advantage of your graphics card.

Are you sure none of the binaries you use were implemented in Rust?

By the way, the Rust contributions to the Linux kernel are mostly driven by Google and Microsoft employees.


Maybe something to do with autodiscovering, autogenerating, and then utilizing interfaces over the net. I never used it, but watched a couple of the F# guys' vids about it. That's the extent of my involvement, but that was 10+ years ago, so my memory is a bit fuzzy.

That is always bound to happen, languages are products, like anything else.

Even Scheme and Go communities had to learn this.


> It's also a bummer that you need the containers package for basic DS. So, batteries not included

Batteries are included, because containers ships with every distribution of GHC, and you don't need to use stack or cabal to expose it:

    % ghci
    GHCi, version 9.4.8: https://www.haskell.org/ghc/  :? for help
    ghci> import Data.Map
    ghci> singleton 5 "Hello"
    fromList [(5,"Hello")]
Please try the same at your end and if it doesn't work for you then you can let me know and I'll try to help you resolve the issue.

Whoa, is this a recent addition? I could have been out of touch for a while, TBH.

I'm not sure, but I don't remember a time when it didn't work like this.

I feel like to use even the most basic Python tools I need to use a package manager or find solutions for mismatched dependencies.

Is this problem unique to me?


It all depends. For a quick script that I can write in 15 mins that uses an HTTP client, JSON and joins two APIs together, the standard library is OK.

To be fair, it could be almost a bash script (using curl and jq or something like that), but I'm more comfortable with Python.

So for one-off (or almost one-off) scripts, the standard library is worth it, to avoid dealing with dependencies.


As the comment below alludes to, how many people are just going to import requests as r and then break something else instead of just hitting the client?

In the case of Python, both you the developer and the users need all these packages installed. So it's trouble in perpetuity. The saving grace with Haskell is that it compiles to a native binary, so only the developer takes the bullet.

Yeah. I feel nobody uses the standard library. Even for stuff like HTTP there is requests.

Being conservative in the language extensions you use/allow in an organization is pretty important to maintaining usability of Haskell as a language.

Do we ever use TypeFamilies and DataKinds? Sure, but it's very rare.

https://www.simplehaskell.org/ is a pretty reasonable set of guidelines (though we opt for a more conservative approach ourselves)


Seems similar to Perl, where there are about 20 ways to do any given thing, and 15 of those are deprecated or not recommended by the community.

It usually follows that the fancier stuff is done for a reason, not just artistic expression.

So it's not that they're not the best way, it's just that not everyone knows how to do it that well.


Nowadays there are "language editions" (e.g. GHC2024) packing many stable language extensions together. It's definitely safe to turn those on, from a maintenance point of view.

Edit: Link to docs: https://ghc.gitlab.haskell.org/ghc/doc/users_guide/exts/cont...
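Concretely, opting in is a single pragma per module, or the default-language field in the .cabal file (GHC2024 needs a fairly recent toolchain, GHC 9.10+ if I remember correctly):

    {-# LANGUAGE GHC2024 #-}
    module Main where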


Yeah - we definitely consider the language edition as we revise our internal language extension rules... but we still find ourselves a little more conservative than even GHC2024.

Nice to see some folks encouraging how to solve problems with less fancy features/language extensions (especially in a commercial context), considering not all Haskell coders are programming language/compiler nerds :)

I made a similar transition (1 year Haskell, 2 years OCaml) and this pretty much matches my experience. Some more points:

1. Compiler speed: a clear win for OCaml

2. OCaml's module system is more explicit (always qualified Map.find). Haskell's type class system is more ergonomic: the instance is found through its type, so no explicit annotation is needed (e.g. fmap fmap fmap, where the second must be the Reader's fmap); see the sketch after this list.

3. > If I come to an existing OCaml project, the worst thing previous developers could do to it is have poor variable names, minimal documentation, and 200+ LOC functions. That’s fine, nothing extraordinary, I can handle that.

Though it's not common, a functor-heavy codebase does give you a headache. On the other hand, unifying type class instances across packages is no fun either.

4. OCaml's mixture of side effects and pure code tends to encourage using that mixture in libraries and codebases. So expect more speed and more crashes.
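To illustrate point 2 on the Haskell side, a minimal sketch: the instance that runs is picked from the type at the use site, no qualification needed:

    xs = fmap (+1) [1, 2, 3]    -- list instance: [2,3,4]
    mx = fmap (+1) (Just 1)     -- Maybe instance: Just 2
    g  = fmap (+1) (* 2)        -- Reader ((->) r) instance: \x -> x * 2 + 1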


My problem with OCaml is that it's in an awkward spot where the runtime offers very little, so it can't really be compared with most Haskell use cases where you want green threads (with preemptive scheduling), transactional memory, and so on. The runtimes of the languages aren't really equipped to do the same things, basically.

So that leaves OCaml in a spot where it instead competes with more bare runtimes, i.e. compiled languages like Odin, Zig and Rust.

In terms of straightforward control of behavior, OCaml loses handily to both Odin and Zig. Likewise with how much effort you have to put in to get good performance out of what you're doing; despite what OCaml enthusiasts say, it's not magic fairy dust. You still have indirection in terms of expressing your desired allocation patterns, etc., in OCaml that you wouldn't have in Odin/Zig, making it less appropriate for those situations.

So, OCaml's final upside: language simplicity... It's not remotely small or simple enough to be compared to Odin. Zig (despite their best efforts) is still a simpler and leaner language than OCaml as well. Zig also has the interesting quirk that it can express functors just with its `comptime` keyword (returning structs with parameterized functions, all done at compile time) in a much more straightforward way than OCaml can, making it almost a better OCaml than OCaml in that particular respect.

Given the above, it seems to me that there's always an [obviously] better language than OCaml for a task; it can't really win, because whatever you're doing, there's probably something that's better at your most important axis, unless the axis is "Being like OCaml".

I liked writing OCaml with BuckleScript, though, compiling it to JS and using OCaml as a JS alternative.


It has almost 30 years of history and an ecosystem, while those are barely at 1.0.

OCaml's ecosystem is significantly smaller and less well developed than Haskell's, and since you've given up on tight and clear control of allocation strategies by choosing OCaml, Rust looks like a very good alternative, and it has an absolutely massive ecosystem in comparison to OCaml.

Again, the main issue with OCaml is that it really doesn't have an axis it's even in the top 5 of, except maybe compile times.

I understand people who like OCaml. There's a lot that's good about it, but it just doesn't have any edge. There's almost no way to pick OCaml and objectively have made a good choice.


You lost me; weren't we talking about Zig and Odin versus OCaml?

Also, OCaml is mature enough for Jane Street, Cisco, Docker, among others.

Which most likely will never pick either Zig or Odin.


I'm pointing out exactly the same thing I was in my original post: There is no axis on which OCaml is top 5. If you want control, there are better languages, if you want ecosystem, almost every language beats it. It's practically speaking never the best choice for anything. Basically the only reason anyone would ever choose it (for pretty much anything) is because they like it.

Zig and Odin beating OCaml, ok.

    (…on Stripe API…)
    OCaml: 1 (last change was 8 years ago, so it’s more like zero)
I’ve been using OCaml for some time, primarily for my pet project, but also for some minor at-work utilities. One such thing was interacting with Datadog which, unsurprisingly, doesn’t provide an OCaml SDK.

In short: the experience was great. Not only was implementing the provided specs with OCaml tooling fast, but when I got to the rate limiting, I was able to refactor the code in around 20 minutes, and it felt like a magic transformation.

My takeaway from that experience is that I wouldn’t use library availability as a hard argument. Every external library is a liability, and if the language provides comfortable tooling and helpers, then having your own code at a use-adequate level of abstraction (instead of a high-level kitchen sink) might be preferred.

For a high number of external integrations I would use something like Go instead, which has exactly that, but as an API implementer I’d prefer to use OCaml.


Isn't what you are describing the curse of Lisps? That is, it is so easy to just roll your own that there are no consensus third-party libraries, and since there aren't, there is little adoption?

There are libraries like cl-ppcre or alexandria. The fact is most features are so trivially implemented that you don’t need that many external libraries. Most are glue code anyway. And with Lisp, as soon as you can translate the data structures, you can use the available functions in the standard packages.

So you may want that GitHub SDK, but in truth you’re only going to use 2 to 5 functions. Why not get an HTTP package and add the few lines of code for these?


> Both Haskell and OCaml have kinda barebones standard libraries. […] Haskell doesn’t include Map and HashMap;

The Haskell standard library was split off into smaller parts. Map used to be part of the standard library. To date the containers package (which contains Map) is still pre-installed alongside the GHC compiler. So it should be considered part of the standard library.

Check out the documentation for GHC 3.02 https://downloads.haskell.org/~ghc/3.02/docs/users_guide/use... it clearly shows a FiniteMap type being provided.


I have some sympathy for "divide and conquer" but for me, libc and libm and libffi seem like very good splits. libgnuc is just having a lend, libobjc is beginning to feel like I'm being trolled and when I learned about stdlib I decided to stop worrying.

There was a time when the fastest way to resolve circular dependencies in the library chain was to simply add -lthing multiple times in sequence, so that a subsequent library could be sure the name lists of the prior ones were loaded, including the dependencies to the "right" of it in the list.

Taking something as fundamental to what FP is like map/reduce and saying "this can live in an adjunct library" feels like somebody took divide and conquer a little too far.


This is Map as in Data.Map.Map, not as in Data.Functor.fmap.

> Taking something as fundamental to what FP is like map/reduce and saying "this can live in an adjunct library" feels like somebody took divide and conquer a litte too far.

What are you talking about?


> The Haskell standard library was split off into smaller parts. Map used to be part of the standard library. To date the containers package (which contains Map) is still pre-installed alongside the GHC compiler. So it should be considered part of the standard library.

See those words "to date" and "considered". Not "is"; "is considered", "to date". That's what I am talking about.


Sure. But what does Map have to do with map and reduce?

To be less snarky: this might be a bit confusing to non-Haskellers.

The Map they are talking about here is a key-value-store datatype. It has nothing to do with the 'map' function.


I think the title is misleading. This isn't really about either language in production environments. As other commenters mentioned, a post about production would cover topics like whether there were any tooling / dependency updates that broke a build, whether they encountered any noticeable bugs in production caused by libraries / run time, and how efficiently the run times handle high load (e.g., with GC).

This is more about syntax differences. Even then, I'd be curious how well both languages accommodate themselves to teams and long term projects. In both cases, you will have multiple people working on parts of the code base. Are people able to read and modify code they haven't written -- for example, when fixing bugs? When incorporating new sub components, how well did the type systems prevent errors due to refactoring? It would be interesting to know if Haskell prevents a number of practical problems that occurred with OCaml or if, in practice, there was no difference for the types of bugs they encountered.

This blog post feels more like someone is comparing basic language features found in reviews for new users rather than sharing deep experience and gotchas that only come from long-term use.



The Meta post is particularly interesting. Thanks for sharing!

Not an expert, but I was paid to write OCaml and Haskell for several years. There are pros and cons but my conclusion is that Haskell has cool features but is too complex. I can iterate faster in OCaml, with the same safety benefits. Learning curve is also less steep for beginners. I think it's also more suitable for programming in the large thanks to the module system.

I'm pretty sad that OCaml isn't more popular than it is.


I think OCaml needs a face lift to make it look like Rust.

I believe they did that and called it Rust.

No. I meant literally, leaving all the semantics intact.

Fun fact: the first Rust compiler was written in OCaml

That's kind of a thing already, the ReasonML syntax (not to be confused with ReScript, which is a fork of both this syntax and the toolchain). The dune build system supports ReasonML out of the box, I think.

There's one nebulous property of Haskell that I've never found in other languages – not even F#, which is near OCaml in PL space: Haskell never gets in the way of what I want to express. In many languages I have an idea for what code I want to write, but then some quirk of the language forces me to express it differently lest it look weird. Haskell never really feels clumsy that way.

I know this is not objectively true because there are many cases in which Haskell also forces me to write things a different way than I would have intended (e.g. due to behaviour around resource allocations) but they don't hurt as bad, for some reason. I don't know what it is!


For me, it's the non-strictness.

That helps because it lets you shuffle code around a bit without worrying about computing things in the wrong order.

First class effects and type system enforced purity have made my life as a programmer so much better. They dramatically reduce the size of the state space that must be reasoned about, and having all context being declared in a function definition makes it trivial to really grasp what any given function does.

I do agree with the points about language extensions (and I have certainly cursed my fair share of operator heavy point free code), but until someone makes something better (maybe that thing is even lean4?) Haskell still brings me more joy than any other production ready programming language.


> I feel that I can focus on actually building stuff with this language.

This is very much how I felt when using Rust as a previous production Haskell user. I enjoyed aspects of Haskell, and I'm still very impressed with it, but it seems to call up a kind of perfectionism that I don't experience nearly as much with other languages.


IME, the trick with perfectionism is knowing when to apply it. I.e., don't do it at the beginning but at the moment when the initial version starts falling through. Also, do it on the general area the issue lies in (not "everywhere").

Absolutely. But with Haskell in particular, there are endless expressiveness micro-optimizations like "how can I write this point-free", "is there a combinator that would allow expressing this differently", "how can I express more of this in the type system"...

I am sure it's possible to write Haskell and just not fall into that trap, but I found that Haskell was especially prone to nerd-sniping the micro-optimizing part of my brain, in a way that other languages aren't.


This for me sums up the library ecosystem and the type of folks who like using Haskell as a hobby. When I write production code, I aim to be as clear showing a good working design that's easy to understand (reading the issue/PR descriptions if necessary). The code style is chosen to maximize human comprehension. What I'm not doing is playing a game of one-upmanship with (virtual) others.

The only parts I'm really interested in optimizing are the bits that matter: factoring, naming, data structures, algorithms, queries (for databases), and minimizing abstractions that don't pay their weight.

I spent some time using F# out of curiosity about both it and OCaml, and found that it was very easy to use, with the exception of (mutable) arrays.


I've used both of the languages previously for my own fun and private mini-projects. From the perspective of a clueless outsider, OCaml was so, so much better to start with than Haskell. I've always been fascinated with Haskell, but it took me around 5 years of attempts to finally get the tooling to work. Some parts always failed me, especially the language server and VS Code interaction (hardly something to blame on Haskell itself, though). Finally I braced myself and with some hacks (like bash code in Haskell comments in Setup.hs) made it all work. Still, to this day HLS will repeatedly hang or not start, and the only solution is to restart VS Code. Of course, there could be a problem with my setup etc.; maybe I configured it a "slightly incorrect" way. I'm just rating my experience.

OCaml was everything opposite. I made two OCaml tours in recent years, and both times it literally just worked (tm). Granted, I've been using it less than Haskell, but the starting-out experience is just heaven and earth. The only issue I have with OCaml tooling is that ideally I'd like to run the language server for real-time hints from the compiler, but also be able to invoke my program interactively. Unfortunately, it seems you either have to run "dune build --watch", or you can build and run, but not both, as there is some locking happening.

As far as the languages themselves go, I'd say Haskell is more "fun", in the sense that it has a lot of features and reads a lot nicer (unless it's point-free code). Monads are pretty fun, although when I finally got through monad transformers I started feeling "I wish we had no monads tbh". OCaml feels much more barebones, syntactically less appealing and somewhat clunky. On the other hand, there is a kind of spartan appeal to it.

Honestly, I like both of the languages a lot and wish for them to continue their development. I can certainly see myself using both in the future.


> Unfortunately, it seems you either have to run "dune build --watch", or you can build and run, but not both, as there is some locking happening.

This is annoying, yes, but for some use cases you can use `dune exec --watch`, which builds and restarts the executable.


Compiler messages:

The big difference here is that the OCaml compiler has a lot less work to do. It's not that the Haskell error messages are inadequate (they are actually pretty good), but the amount of compiler features and type gymnastics makes the errors deeper and more complex. For example, if you get the parens wrong around a >> or >>=, you'll get some rather cryptic error that only hits home once you've seen it a few times, as opposed to "did you mean to put parens over there?"
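A hypothetical example of that kind of mistake: (+) binds tighter than (>>), so the error lands on a seemingly unrelated type class instance:

    main = putStrLn "sum:" >> print 1 + 2
    -- parsed as: putStrLn "sum:" >> ((print 1) + 2), so GHC says something
    -- like "No instance for (Num (IO ()))" instead of pointing at the parens
    -- intended: putStrLn "sum:" >> print (1 + 2)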


Maybe it's because I haven't used languages like these in the past, but I hardly think this is elegant, much less readable. I would hate my life trying to parse code like this in a 10k LOC codebase.

strSum = sum . map read . words


It's actually very readable once you get the hang of it. The transition from imperative paradigms to Haskell can be tough, but once you've overcome this barrier, the code reads effortlessly. In your example case: split the string into a list of words, that is, tokenize based on spaces. Then map the read function onto this, which will parse each of the "words" in the list into some type. Annotations would likely be needed here. Then sum this list.

I much prefer this over 10 levels of class indirections or procedural style.
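For what it's worth, with the annotation mentioned above the whole thing stays a one-liner (a sketch):

    strSum :: String -> Int
    strSum = sum . map read . words
    -- strSum "10 20 12"  ==>  42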


Isn't it the case that anything is very readable once you get the hang of it? I think the problem is exactly in the "get the hang of it" part. For some languages, that's really difficult, while for others it's almost trivial. From my own experience, Haskell is very very difficult to "get the hang of" despite my multiple attempts, while something like Javascript is trivial for me (interestingly enough, except when it uses too much of the new FP patterns!) even when I do not program on it daily.

This is like complaining that Spanish is so hard to learn while English is actually quite intuitive. Of course it’s easier to understand what you already know.

Not at all. I tried to make the point of distinction clear by saying that I do not use JS, nor Haskell, daily, but JS is more readable, without a doubt. So it's more like saying "English is more readable than French to a Spanish speaker" (the analogy makes much less sense, but I'm trying to correct yours). I think we can agree that everyone's a priori is what they know in the world, which is common language... and everyone is familiar with "recipes", or step-by-step instructions... which is the same as imperative code, not functional.

Don't get me wrong, I like FP and have been trying to get into it for a long time. But currently I strongly believe FP as commonly done in Haskell is just too far from what we expect even before we start writing code. Combining functions and chaining Monads just seems to me to be extremely hard to do and understand, and I don't need to do any of that in "lesser" languages. However, I am finally "getting it" with newer languages like Flix and Unison - they let me just use `let` and stuff like that which makes the code trivial again, while being basically purely functional.


> JS is more readable, without a doubt

In what sense?

Haskell's: strSum = sum . map read . words

in JS would be: const strSum = str => str.split(' ').map(Number).reduce((a, b) => a + b, 0);

For a person who's not already a JS programmer, the first one would be more readable (without a doubt): it literally reads like plain English, "sum of mapped read of words".

Haskell's version is more "mathematical" and straightforward. Each function has one clear purpose. The composition operator clearly shows data transformation flow. No hidden behavior or type coercion surprises.

Whereas JS version requires knowledge of:

- How Number works as a function vs constructor

- Implicit type coercion rules

- Method chaining syntax

- reduce()'s callback syntax and initial value

- How split() handles edge cases

So while the JS code might look familiar, it actually requires more background knowledge and consideration of implementation details to fully understand its behavior. JavaScript is a far more complex language than most programmers realize. BTW, I myself don't write Haskell, but I deal with JavaScript almost daily and I just can't agree that JS is "more readable" than many other PLs. With TypeScript it gets even more "eye-hurting".


> in JS would be: const strSum = str => str.split(' ').map(Number).reduce((a, b) => a + b, 0);

It's funny to me that you quote the FP-like version of that in JS.

The more traditional version would be more like this:

    function strSum(str) {
        let words = str.split(' ');
        let sum = 0;
        for (const word of words) {
            sum += new Number(word);
        }
        return sum;
    }
I do sincerely think this is more readable, no matter your background. It splits the steps more clearly. Doesn't require you to keep almost anything in your head as you read. It looks stupid, which is great! Anyone no matter how stupid can read this as long as they've had any programming experience, in any language. I would bet someone who only ever learned Haskell would understand this without ever seeing a procedural language before.

I don't even know where to start, your biases here are so explicit.

- The assumption that "verbose = readable" and "explicit loops = clearer"? Seriously?

- The suggestion that "looking stupid" is somehow a virtue in code? "Simple" I can still buy, but "stupid"... really?

- You're using new Number() - which is actually wrong - it creates a Number object, not a primitive;

- Your `sum +=` is doing not a single op but multiple things implicitly: addition, assignment, potential type coercion, mutation of state;

- for loops are another layer of complexity - iterator protocol implementation, mutable loop counter management, scoping issues, potential off-by-one errors, break/continue possibilities, possible loop var shadowing, etc. Even though for..of is Javascript's attempt at a more FP-style iteration pattern and is safer than the indexed loop.

You clearly underestimate how natural functional concepts can be - composition is a fundamental concept we use daily (like "wash then dry" vs "first get a towel, then turn on water, then...").

Your "simple" imperative version actually requires understanding more concepts and implicit behaviors than the functional version! The irony is that while you're trying to argue for simplicity, you're choosing an approach with more hidden complexity.

Again, I'm not a huge fan of Haskell, yet the Haskell version has:

- No hidden operations

- No mutation

- Clear, single-purpose functions

- Explicit data flow

You have just demonstrated several key benefits of functional programming and why anyone who writes code should try learning languages like Haskell, Clojure, Elixir, etc., even though practical benefits may not be obvious at first.


> Haskell is just too far from what we expect

Who's "we"?

I spent years writing JavaScript, PHP, and Ruby. I thought Haskell was weird and hard, and probably not practical in the real world.

As it turned out, I was just being a fool. Once you actually learn it, you realise how silly the opinions are that you had of it before you learned it.

Advent of Code is running right now. Why don't you just try learning the language?


I learned the language more than 10 years ago. No, it's not for me. Please don't assume that because somebody doesn't find Haskell readable the person must be ignorant.

> it's more like saying something like "english is more readable than french to a spanish speaker"

On the contrary, French is much more readable than English to a Spanish speaker, because French is much more similar to Spanish than English is.

Same with your JS example: I would guess it is much more similar to what you are used to.


Maybe "French is more readable than Korean" would be a better analogy. Sure, if you’re already familiar with the Latin alphabet, but Hangul is clearly a better writing system.

> I do not use JS, nor Haskell, daily, but JS is more readable

I’m guessing you do use languages that are very similar to JS. Like a Spanish speaker saying “I don’t speak Italian or Chinese but Italian is way easier.” If you wrote F# every day you would probably find Haskell syntax quite intuitive.


I was trying to make the point that no, it's not that at all. But I guess it's a very hard point to make, and even though I am convinced that I'm right and this has nothing to do with familiarity, I can't find any serious research showing it either way.

I know a dozen languages well. Everyone here thinks it's just ignorance, but that's not the case. There's just no way that, for me, Haskell and similar languages are readable in any sense just because they're more concise. If that were the case, Haskell still wouldn't be close to the most readable; something like APL or Forth would be. I've tried for more than 10 years to be like you guys and read a bunch of function compositions without any variable names to be seen, plus a few monadic operators, and think "wow, so easy to read"... but no, it's still completely unreadable to me. I guess I am much more a Go person than a Haskell person, and I am happy about that.


It is a matter of familiarity. Just because you've been coding in multiple languages before doesn't necessarily make you "a better programmer" (in the sense that you've developed good instincts to quickly mentally parse different pieces of code) - you could have been using programming languages of similar paradigms. It took me a few months of writing Clojure (a niche language) to start seeing things in a different light - I too, just like you, used to think that imperative constructs are more readable and easier to reason about. I was wrong.

There's no such thing as a "Go person" or a "Haskell person"; all programming languages are made up. Nobody has "more natural inclination" for coding one way than another. Just get your ass out of the comfort zone, try learning a new (to you) language - give it a heartfelt attempt to use it - that may change your life.

Just to be clear - I'm not saying Haskell is easy, oh no, not at all. I'm just saying that it stops being so intimidating and weird after a while.


> and everyone is familiar with "recipes", or step-by-step instructions... which is the same as imperative code, not functional.

Everyone is familiar with "I don't know how exactly, but generally it would be this way..., we can discuss specifics later", which is the same as reading the above pointfree notation (sum . map read . words) verbatim instead of imperatively inside-out: something is a sum of all parsed values of space-separated words.


Yes, but JS is very similar to other languages you know.

It's a matter of "density".

The more dense the code, the more there is to unpack until you are deep in the language.

It's OK for languages to be more verbose and offer structural cues (braces) as this often helps in human parsing of logic.


> In your example case: split the string into a list of words, that is tokenize based on spaces.

You've made a common mistake: you're wiring your listener's thinking to the imperative inside-out approach that you're used to. Instead, it should be explained like this: "strSum = sum . map read . words" is "a sum of all parsed values of the original input of space-separated words". The reason to avoid inside-out explanations is that in Haskell you're allowed to move from general ideas to specifics, and you can sprinkle `undefined` and `_` over the specific details while thinking about the general ideas and interfaces.
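A minimal sketch of that general-to-specific style (the helper names here are made up for illustration): write the shape first and stub the details.

  strSum :: String -> Int
  strSum = sum . map parseOne . tokenize
    where
      -- stubs keep the program compiling while you work out the shape
      parseOne :: String -> Int
      parseOne = undefined  -- later: read
      tokenize :: String -> [String]
      tokenize = undefined  -- later: words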


I really like the FP paradigm, but could you all stop using weird abbreviations and random characters as substitute for operations?

You don't do programming with chalk on a blackboard, for crying out loud. Ideally, you are using a good IDE with syntax completion. Therefore, readability matters more than the ability to bang out commands in as few keystrokes as possible.


Iverson's Notation as a Tool of Thought defends the opposite idea (and explains the reason for APL): https://news.ycombinator.com/item?id=25249563

It's about phase transitions. When you understand the system, shorter symbols are easier/faster to reason with. If your primitives are well thought out for the domain, this notation will be the optimal way of understanding it!

On the other hand, longer names help onboard new people. Theoretically, you could avoid this issue by transforming back and forth. Uiua, e.g., lets you enter symbols by typing out their names. Typing "sum = reduce add" becomes: "sum ← /+". If you transform it back...

Imagine if you could encode the std lib with aliases!


I originally studied social sciences, so my training uses maths but its core is words. I dislike symbol-only notation. Real words trigger different parts of my brain. I am very good at memorizing the content and flow of texts but bad at keeping symbols in my head. I have the same issue with language: e.g. I have no problem with pinyin, but written Chinese characters are taxing for me.

I second this, most programmers seem to be fixated on the idea that all code should show at every moment how data types and their values are being passed and tossed around, and they simply ignore or refuse to realise that you can omit it and think in terms of functions fitting the slots.

If you're familiar with Haskell, this is something you can just look at and parse without thinking. It's all basic Haskell syntax and concepts (function composition and partial application). I haven't touched Haskell for a few years and I didn't have any trouble interpreting it as "strSum is a function that takes a single string argument, splits it by whitespace, interprets each chunk as a number, and returns the sum".

I’m not familiar with Haskell and I could read almost all of that from it. No idea how you can tell it splits on whitespace, though.

I guess the more succinct the code, the more the reliance on understanding what a function actually does - either through experience, or by reading the docs. The words function is simply:

  words :: String -> [String]
So that

  words "foo bar baz"
  -- Produces: ["foo","bar","baz"]
In my experience, both the blessing and the curse of Haskell's incredibly succinct expressiveness is that, like other specialised languages - for example, Latin for succinctly expressed legal terms - you need a strong understanding of the language and its standard library (much like the body of commonly used legal terms in law circles) to participate meaningfully.

Haskell, and languages like Go (which anybody with a bit of programming experience can easily follow) have very different goals.

Like many in this discussion, I too have developed a love/hate thing with Haskell. But boy, are the good parts good ...


I recently learned that things like goroutines aren't naturally written with buffers and channels. Granted, anyone who reads the original documentation would likely do it correctly, but apparently that's not how they are intuitively written. So while it may be easy to read, it might be harder to write than I was assuming.

So maybe there's a difference where Haskell has an advantage? I mentioned it in my previous comment, but I don't know Haskell at all; if this is "the way" to do splits by word, then you'll know both how to read and how to write it. Which would be a strength on its own, since I imagine it would be kind of hard to do wrong, since you'll need that Haskell understanding in the first place.


It all comes down to knowing the FP vocabulary. Most FP languages share the names of the most widely used functions, and if you're well versed in Haskell you'll have an 80/20 ratio of understanding them all, where the 20% part is the language-specific libraries that expand and build upon the 80% of shared FP vocabulary.

As a Haskell noob, I had the same problem a few times. Essentially: there's a function to do what you want to do, but good luck finding it!

Someone thought "words" was the perfect name, and it wasn't me!


https://hoogle.haskell.org/ can help you find the function that you're looking for.

As for "words"... yes, possibly not the best name. But it's also so common that everyone who has ever written any Haskell code knows it, much like Java's System.out.println.


Not quite as helpful, but this reminded me of: https://wordly.org/wordle-games/haskle "a wordle clone for learning haskell functions"

Yeah, this language probably has a lot of Stack Overflow questions. This is basically like taking someone's personal dotfile and trying to reason about it.

Yeah, that's where you just have to know what the "words" function from the standard library does.

Compare it to a bashism like

  find . -name '*.py' | sed 's/.*/"&"/' | xargs  wc -l
But instead of using | to tie the different functions together you're using . and the order is reversed.
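To make the reversed order concrete, here's the same pipeline shape in Haskell, reusing the strSum example from this thread:

  -- shell pipes read left-to-right:          a | b | c
  -- Haskell composition reads right-to-left: c . b . a
  strSum = sum . map read . words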

> Maybe because I haven’t used languages like these in the past [...]

Yes, that's definitely the case.

If you know what each function above does, including the function composition dot (.), then this is like reading English — assuming you know how to read English.


The “map read” part is what throws me off. I think it’s because parens are optional, or simply not required.

There are other functional languages as well, like the one in the article, and like Elixir, where readability isn't sacrificed.

I still think readability is atrocious in this language. Sure I can get used to it, but I’d never want to subject myself to that


Parentheses are not really optional; they're just used differently than in other languages. Other languages use parentheses for function application and grouping; in Haskell they're just for grouping.

    wordsPerLine = filter (>0) . map (length . words) . lines
Funnily enough, parentheses are actually optional in Elixir, although using pipe syntax without them produces a warning. The following is valid in both Haskell and Elixir:

    length [1,2,3]

Whenever you see something like

  apply foo bar (lol wat)
in Haskell, and it confuses you, simply mentally replace it with

  apply(foo, bar, lol(wat))
To translate into a more popular syntax (e.g. JavaScript).

Problem solved:

    strSum = sum . parsed . words
             where
                parsed = map read

In our codebase we enforced usage of `>>>` instead, which composes forward instead of backwards:

    strSum = words >>> map read >>> sum
For most people this then becomes "Apply `words` to the input argument, pass the result to `map read` and then `sum` the results of that".

I don't think `.` is super complex to read and parse, but we had people new to Haskell so I thought it prudent to start them off just with `>>>` and keep it that way. Most things are read left-to-right and top-to-bottom in a codebase otherwise so I don't see why not.

Edit:

I also told everyone it's fine to just spell out your arguments:

    stringSum sentence =
        sentence
        & words
        & map read
        & sum
In the example above `&` is just your average pipe-operator. Not currying when you don't have to is also fine, and will actually improve performance in certain scenarios.

Edit 2:

The truth is that there are way more important things to talk about in a production code base than currying, and people not using currying very much wouldn't be an issue; but they'll have to understand different pointer/reference types, the `ReaderT` monad (transformer), etc., and how a `ReaderT env IO` stack works and why it's going to be better than whatever nonsense theoretical stack with many layers and transformers that can be thought up. Once you've taught them `ReaderT env IO` and pointer types (maybe including `TVar`s) you're up and running and can write pretty sophisticated multi-threaded, safe production code.
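For the curious, here's a minimal sketch of the `ReaderT env IO` pattern; the `Env` record and its fields are made up for illustration:

  import Control.Monad.IO.Class (liftIO)
  import Control.Monad.Reader (ReaderT, asks, runReaderT)
  import Data.IORef (IORef, modifyIORef', newIORef)

  -- hypothetical environment: immutable config plus a mutable counter
  data Env = Env { appName :: String, hits :: IORef Int }

  type App = ReaderT Env IO

  bumpAndGreet :: App ()
  bumpAndGreet = do
    name <- asks appName
    ref  <- asks hits
    liftIO $ modifyIORef' ref (+ 1)  -- mutation lives behind an IORef (or TVar)
    liftIO $ putStrLn ("hello from " ++ name)

  main :: IO ()
  main = do
    ref <- newIORef 0
    runReaderT bumpAndGreet (Env "demo" ref)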


In a production application you generally don't write code like that. I find it tends to be the opposite problem where you often see giant `do` blocks performing all sorts of monadic effects.

You especially wouldn't use `read`.

You can see it as a pipeline where the output of the right-most function is plugged into the input of the function to its left.

It’s not inelegant, it’s just unfamiliar to you.

1. Change . to |

2. Reverse

Now you have:

words | map read | sum

Or..

$ cat words | map -e read | sum


Yes, but the notation of the dot, plus such function names, plus optional parens sure makes it read like English. That's great, but it'll be a nightmare when you are also dealing with strings that have similar English in them.

|> is a common way of writing the pipe in Haskell, so

     words |> map read |> sum
IHP uses it a lot.

Would you propose the same change for nested function calls y = f(g(h(x))), changing it into y = x | h | g | f ?

It can look nice. It is like asking: passive or active voice? It depends on what you are writing and what makes sense to the reader.

I do like pipelines though!


Hey,

Author here

Happy to answer any questions! At this point, I have 18 months of OCaml experience in production.


Great write-up, and I must say a surprising conclusion and a good case for OCaml. I also love Haskell, but using it for a practical project I came up against some of the pain points mentioned here. If I could suggest some English grammar fixes to this great article: the word order needs to be flipped around in the interrogative sentences.

Wait, why doesn't the standard library have docs at all for this version I use?

Instead of:

> Wait, why the standard library doesn't have docs at all for this version I use?

And

How can Haskellers live like that without these usability essentials?

Instead of:

> How Haskellers can live like that without these usability essentials?


Happy to hear you enjoyed the article!

Thanks for the suggestions! English is not my first language, so mistakes can happen. I hope it's not too off-putting, and people can still learn something interesting from the articles.

I'll incorporate your suggestions :)


Your English is great and clear! Those kinds of mistakes (not flipping the word order in interrogative clauses) are so common that in the coming years they may become the new standard for English grammar! But for now I will keep trying to correct them. ;-)

Both Haskell and OCaml are fantastic. The degree to which advancement in mainstream programming languages these days is just copy-pasting ideas from either of them is really astonishing.

I thought (based on some posts I've read on HN) that Lisp was the "culprit" for that.

Missing from this post: string_of_int, int_of_string, +, +., etc. That alone is a massive turn-off for me; I'd rather write C at that point. Any modern language needs some kind of polymorphism to make user-defined types feel like first-class citizens.

> string_of_int, int_of_string

That didn't bother me so much because I speak Spanish and can read French, and OCaml is of French origin. `string_of_int` is a bad English translation; it should have been `string_from_int`.

I like F# where I can use the `int` or `string` functions:

    let myString = "2024"
    let myInt = int myString
    let myStringAgain = string myInt

The OCaml standard library added an Int module with a to_string function (so Int.to_string) and a generic printer ages ago. There is also a (+) operator for Float in the Float module, which you can open in any scope if you so wish.

OCaml obviously supports polymorphism and has an extremely expressive type system. The fact that operators are not polymorphic is purely a choice (and a good one).


As the other commenter mentioned, it's a mistranslation. I read string_of_int as "string out of int" to make it better.

Interestingly enough, OCaml has a great polymorphism story in its OO system. Because it is structurally typed with nearly automatic inference, you can in fact write completely generic code like `x#y(z)`, which magically "just works" for any x that has a method y that accepts z - all inferred and statically type-checked.

> Because it is structurally typed with nearly automatic inference, you can in fact write completely generic code like `x#y(z)`, which magically "just works"

aka let's mix two jars of jam and shit via this funnel and see what happens.


On the contrary, static and structural typing are a match made in heaven.

Interesting. Why doesn't the standard lib use that for the examples I listed?

Because those types are not object types, so they don't have methods associated with them at all. This is unlike, say, CLR languages in which all types are object types.

There's been research on modular implicits for OCaml to solve this more generally, but that's not landing upstream anytime soon.


Haskell provides by far the best refactoring experience of all languages I’ve ever used.

Also, opam nowadays finally works on Windows, which is kind of great; Haskell used to have an advantage there.

So much of this article, and the comments below, can be summarized by one of Rich Hickey's aphorisms:

> Everything is hard to read until you learn to read it.


> It’s not exciting to write a GitHub API client and parse tons of JSON.

It's buried in our monorepo, so not very accessible, but we just open-sourced our product, which is written in OCaml, and we have a GitHub client that is generated from the OpenAPI schema.

It is separated out from any I/O so it can be used in any I/O context.

https://github.com/terrateamio/terrateam/tree/main/code/src/...


I like this:

> A great standard library is a cornerstone of your PL success.


Massive amounts of headache could have been avoided by using “OCaml but with ecosystem and great tooling support” known as F# :)

I got turned off of F# because it seemed like knowing C# libraries and tooling was assumed if one wanted to do anything non-trivial, kind of like some of the functional JVM languages always assume some amount of Java knowledge and use of Java tooling. F# seemed nice, but it didn't seem like a real stand-alone language. Unlike Elm or Purescript, where one should also know JavaScript and its tooling, I don't find learning all the C# and Java stuff independently compelling enough to use F#, Scala, Frege, etc.

"C# libraries"

Isn't this just .NET?

Think this was a feature. F# has access to all of the existing libraries, plus those made for F#.


It's a feature but also sort of an anti-feature. To do anything productive you will need to reach for a lot of .NET libraries, and those will force you to write code differently than you would if you could live in blissful pure-F# land. This results in a mishmash of styles.

Something as simple as calling a .NET function that doesn't have F# bindings forces a change (e.g. `someLibraryFunc(arg1, arg2)` instead of `f arg1 arg2`).

This gets worse as libraries force you to instantiate objects just to call methods which could have been static, or use ref parameters, etc.

I say this as somebody who loves F# - you do absolutely have to know .NET style (which really means C# style) in addition to F# style. It's extremely pragmatic if you're already familiar with C#, but I'm not sure how wonderful it is if you come in cold.


> ref parameters

I actually like the way F# does refs more! byref<'T> aligns more closely with internal compiler terminology and makes it more clear that this is something like a pointer.

Having to perform tupled calls instead of curried ones is indeed an annoyance, and an even bigger one is the prevalent fluent style, which results in

  thing
  |> _.InstanceMethod1()
  |> _.InstanceMethod2()
  |> _.Build()
Ugh, I usually try to shuffle around what I find annoying into bespoke functions (which are thankfully very easy to define) or even declaring operators. It also helps that while selection is not vast, F# tends to have various libraries that either provide alternatives or wrap existing widely adopted packages into a more functional F#-native API e.g. Oxpecker, Avalonia FuncUI, FSharp.Data, etc.

" if you could live in blissful pure-F# land"

Yep. I love F# too, wish I could stay in blissful F# land.

Wish MS would just release a .NET redone in F#. Huge task, with no payback. But it would be nice.


The F# compiler is so slow... a big turn-off coming from OCaml, where compile times are fast, even faster than Go or Haxe.

Relatively yes, but F# also happens to be faster than other FP languages and has access to a vast ecosystem unlike Haxe. I doubt the public looking at FP languages would appreciate Go :)

F# includes an optimizer that performs e.g. lambda inlining. Then, .NET compiles the assemblies themselves to a final application, be it native executable or otherwise, so I feel like relative compiler slowness is not a dealbreaker. It is also relatively quick at starting for executing F# script files with `dotnet fsi` (I'm happy that F# is included in standard .NET SDK, much different to e.g. Clojure or Scala), it's slower to start than Python or Elixir shell but I can live with that.

This was also a good opportunity to write down a small workflow to use F# interactive to quickly build small native console applications thanks to FSharpPacker:

https://gist.github.com/neon-sunset/028937d82f2adaa6c1b93899...

In the example, the sample program calculates SHA256 hashes for all files matching the pattern at a specific path. On my M1 Pro MBP it does so at up to 4GB/s while consuming 5-10 MiB of RAM.


Well, Haxe has access to whatever it compiles to. If I target JS, I get the entirety of the JS ecosystem, for good and bad.

F# is indeed fast, thanks to the work MS has put in. But so is OCaml; it is close to C when written in perf-first mode. Having said that, I rarely need the speed of C for what I'm building, as bottlenecks tend to be IO anyway.

Finally, OCaml 5+ got multicore (domains) and effects, which really are a better abstraction than monads will ever be (IMHO).


Not all Clojure REPLs have slow startup. Clojure is a hosted language; it sits atop another PL. For instance, the REPL of nbb (which runs on Node) starts instantaneously, and babashka (for bash scripting) is also very fast.

F# is good because MSR used to hire a bunch of the top GHC devs and pay them to work on F# part-time. They put all their actual passion into Haskell.

I tried that about 10 years ago and spent 2 hours trying to open some files under Linux.

Then I gave up. Have things improved?


The below should work as is:

  sudo apt install dotnet9 # or dotnet-sdk-9.0 if the repo doesn't have dotnet9 metapackage
  dotnet new console --language F#
There is also a package for Avalonia that lets you write GUI applications in F#: https://funcui.avaloniaui.net

I didn't use it 10 years ago but I've been using it for the last 4 years on mac and linux exclusively.

Microsoft seems to be prioritizing "cloud" on all their developer products (rather than just windows). I don't feel disadvantaged by NOT using dotnet on windows.


Vastly. You were probably using Mono. .NET is now fully cross-platform and very good at being so.

Very much so.

"You may find a solution in Haskell. But often you’ll discover too many solutions, you won’t know which one to choose."

Will this eventually also become a problem with Rust?


Depends on the domain. For some, there are already de facto standard crates that the ecosystem has concentrated around. For others, like web frameworks, there are many competing choices.

What about performance? I wrote my thesis in OCaml and my recollection was that it had an amazing native code generating compiler that not infrequently output code that was almost as fast as C if you wrote it the right way. The story I heard about Haskell was far different at the time (admittedly decades ago).

As with all garbage-collected languages, optimizing comes down to removing allocations; in Haskell this means strictness and the choice of data structures. You can make C-speed programs, but you may need to work for it, and you'll also need to know the compiler/evaluation model you're working against.

Sure, I think what I noticed was that even idiomatic OCaml code was relatively fast, maybe 2-3x slower than C, but plenty fast enough. Whereas I was under the impression that idiomatic Haskell was far more likely to have unexpected and harder to fix performance issues (e.g. requiring more of an architectural rewrite) because of its lazy evaluation model.

The problems brought about by lazy evaluation (laziness when you don't want it) do not require architectural rewrites. It's mostly just profiling while setting a very low stack size limit, and then discovering which part of your code triggers a stack overflow. It can be solved by adding the right amount of seq (which sequences evaluation) or ! (strict patterns). Maybe you'll also change a few incorrect uses of foldl into foldr. Even if you need to change from a lazy Map to a strict Map, the change isn't disruptive; it's just changing an import and you're done; all the functions still work. No architectural changes needed.
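To make that concrete, here's the classic thunk build-up illustration (not from the comment above, just the standard foldl-vs-foldl' and bang-pattern sketch):

  {-# LANGUAGE BangPatterns #-}
  import Data.List (foldl')

  -- foldl (+) 0 builds a chain of unevaluated thunks and can blow the
  -- stack on large inputs; foldl' forces the accumulator at each step
  -- and runs in constant space
  sumStrict :: [Int] -> Int
  sumStrict = foldl' (+) 0

  -- the same effect via bang patterns on the accumulators
  mean :: [Double] -> Double
  mean = go 0 0
    where
      go :: Double -> Int -> [Double] -> Double
      go !total !count []       = total / fromIntegral count
      go !total !count (x : xs) = go (total + x) (count + 1) xs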

Just curious - what is the niche for these languages, and what's the motivation to choose one?

They’re not meant to be niche languages. They’re meant to be general purpose languages. They’re trying to use functional programming to achieve a high degree of correctness with short, readable programs.

Compilers are a good one. I know Rust's original compiler (before being self hosted) was implemented in OCaml. Darklang's compiler was in OCaml as well.

rustc targets LLVM IR by default, and I daresay the bulk of the optimisation and assembly-lowering work is done in LLVM. Which is solidly C++.

Not niche, Jane Street uses OCaml (and they contribute a lot to the compiler, too), it is "a research-driven trading firm where curious people work together on deep problems".

More about it here: https://www.janestreet.com/what-we-do/overview/


> where curious people work together on deep problems

'curious people': people who got jaded by academia and were attracted by the six- to seven-figure salaries at Jane Street.

'deep problems': Buy X units of Y instrument at A exchange, and sell Z units of said instrument at B exchange, and do this often enough that said company makes a pile of money for itself and its employees (mostly itself, given it can afford to pay its employees six to seven figures).


Jane Street is just one example, OCaml is widely used elsewhere, too. For example, the first compiler for Rust was written in OCaml, too.

I mentioned Jane Street because it uses OCaml for high-frequency trading, and because they are huge contributors to OCaml.


I thought the main draw of OCaml is that it has imperative features such as for-loops, instead of trying to be purely functional like Haskell. This probably made OCaml easier to pick up for people coming from other programming languages.

For-loops and mutation are not really encouraged; loops are often implemented with recursion, as in normal FP. The big difference to me is that OCaml doesn't try to be pure, so you can perform side effects as needed, instead of forcing the user to jump through hoops just to print to stdout.

Agreed. The main draw of OCaml is that it isn't pure like Haskell. Haskell's purity is really what makes it weird (and great).

(2023) in case it looks familiar

I was thinking about the error message comparison and for me that difference expresses succinctly why I have little interest in using Haskell. Indeed, I have actually started viewing any language with long, detailed and subtle error messages as a red flag. These types of error messages indicate to me that the compiler is trying to do too much and the language would benefit from simplification, especially with respect to trying to infer the programmer's unstated intentions. It of course feels magical when the compiler can infer everything for you, but increasingly I see this as black magic.

Anyone built simple (but not trivial) projects with Haskell or OCaml with source that I can look at?

I built a compiler and interpreter for a statically typed but fully inferred functional programming language in OCaml a while ago: https://github.com/coetaur0/nox The project is fairly simple, but there are non-trivial parts such as the type-system's implementation that might be worth looking at.

I'm sure a lot of people here have much better examples, but I wrote some basic regular expression and finite automata algorithms in Haskell a long time ago:

https://github.com/jl2/Compiler-Algorithm-Code/tree/master/h...

I tried it out, and after renaming fold -> foldr it still builds and seems to work. The main function takes a regex as a command-line argument and creates a finite-automaton graph using GraphViz's dot.

In the Compiler-Algorithm-Code/haskell directory:

    make
    ./test "(foo)+(bar)*(foo+)" | dot -Tpdf -ofoobarfoo.pdf && xdg-open foobarfoo.pdf

There are LLVM Kaleidoscope (toy compiler) implementations in both Haskell and OCaml:

https://github.com/sdiehl/kaleidoscope https://github.com/arbipher/llvm-ocaml-tutorial

The Haskell one is nice. I can say nothing about the OCaml one, since I found it via a Google search.

I had a try at implementing a Kaleidoscope compiler in OCaml myself but did not finish it. It was fun to write, though.


I wrote this like ~10 years ago as a "Hello World++" type demo (basic key/value server) in Haskell. It's about 200 LoC, with a Haskell and Python client. http://github.com/wyager/neks

Some time ago I made a CHIP-8 emulator in Haskell: https://github.com/EzequielRamis/chisito. I suppose the state management may be easier in OCaml.

I released a game using OCaml bindings to the Raylib library. I had never written OCaml before and I didn't spend very much time refactoring, so the code is pretty messy and maybe isn't the best example of the language. But some of it turned out pretty nice - the first ~90 lines of this file detect collisions between two shapes using the Separating Axis theorem: https://github.com/mega-dean/hallowdale/blob/main/src/collis...

This should be short enough to read: https://github.com/Artamus/git-split/

https://github.com/matthiasgoergens/Div7 is a simple one that you might like.

This seems more on the trivial side to me.

Yes, depends on where you draw the line.

XMonad is a bit bigger: https://github.com/xmonad/xmonad


I wrote a Lox compiler and interpreter in OCaml a few years ago: https://github.com/gaultier/lox-ocaml

No idea how it holds up, it was my first try at a compiler, but it’s quite small. I was following the Crafting Interpreters book.


You can check my (not finished) example of GitHub TUI built in OCaml:

https://github.com/chshersh/github-tui


An unfinished command-line client for Hacker News:

https://github.com/LucianU/hn-reader


The Haskell examples at the beginning are a bit weirdly chosen. You wouldn't want to write code like that except in scripts etc. because it would crash the program if you encounter bad input.

The first example could be more idiomatically written as:

  import Text.Read (readMaybe)

  strSum :: String -> Maybe Int
  strSum = fmap sum . sequence . fmap readMaybe . words
(You'd also probably want to avoid String and use Text instead.)

For more complex parsing scenarios, the various parser combinator libraries can take a while to get used to (and I wish the community would standardise on one or two of them), but they're extremely powerful.


Let's take that further: you wouldn't want to use your code above either, because it would be impossible to tell why the parser failed. Which is going to be really frustrating once it fails. And if it doesn't fail the first version is fine.

Sure, you can improve this by e.g.

  import Data.Either.Combinators -- from the either package
  import Text.Read (readMaybe)

  strSum :: String -> Either String Int
  strSum = fmap sum . sequence . fmap tryReadInt . words
    where
      tryReadInt w = maybeToRight ("not an integer " ++ w) (readMaybe w)
This keeps the bulk of the method the same. Being able to go, e.g., from Maybe to Either with only a few changes (as long as your code is sufficiently general) is one of the nice things you get from all the Haskell abstraction. You can't really do that if you start with exceptions (unless you're in IO).

Hm... for anything real, you would need to provide good error reports, i.e. why did it fail to parse. Code like this looks pretty, but once you add in good error reports, I feel like it tends to look almost exactly the same as in an imperative language?

For a real use case, consider using a parser library. If you ignore the boilerplate import stuff, it's IMHO rather short and elegant - and not very imperative:

  {-# LANGUAGE OverloadedStrings #-}

  import Data.Text (Text)
  import Data.Void (Void)
  import qualified Data.Text as T
  import Text.Megaparsec
  import Text.Megaparsec.Char
  import qualified Text.Megaparsec.Char.Lexer as L

  type Parser = Parsec Void Text

  numParser :: Parser [Integer]
  numParser = L.decimal `sepBy` space1

  -- just for demonstration purposes - this will print an ugly debug string, it can be customised
  main = putStrLn $ show $ parse numParser "some-source-file.txt" "10 20  30 x"

  -- this would print "Right [10, 20, 30]"
  -- main = putStrLn $ show $ parse numParser "some-source-file.txt" "10 20  30"

  strSum :: String -> Either String Int
  strSum = fmap sum . mapM readEither . words

That doesn't exactly give a good error message, though:

  > strSum "10 20 x"
  Left "Prelude.read: no parse"
But yeah, I didn't remember "mapM". mapM f = sequence . fmap f, which is what I used.

> I’m interested in building stuff, not sitting near my pond on a warm summer day, thinking if TypeFamilies + DataKinds would be better than GADTs for making illegal states unrepresentable.

I feel differently. I would rather sit by the pond on a summer day rather than build stuff


Engineer vs mathematician. Haskell is the schizophrenic product.

> If I come to an existing OCaml project, the worst thing previous developers could do to it is have poor variable names, minimal documentation, and 200+ LOC functions. That’s fine, nothing extraordinary, I can handle that.
>
> If I come to an existing Haskell project, the worst thing previous developers could do… Well, my previous 8 years of Haskell experience can’t prepare me for that

This is kind of like Go vs C++, or <anything> vs Common Lisp. The former is a rather unsophisticated and limited language, not particularly educational or enlightening but good when you need N developers churning code and onboard M new ones while you're at it. The latter is like tripping on LSD; it's one hell of a trip and education, but unless you adopt specific guidelines, it's going to be harder to get your friends on board. See, for example: https://www.parsonsmatt.org/2019/12/26/write_junior_code.htm...


Go is good for onboarding people onto a project, but not much else.

There's a reason Google is migrating Go services to Rust:

https://www.theregister.com/2024/03/31/rust_google_c/

> "When we've rewritten systems from Go into Rust, we've found that it takes about the same size team about the same amount of time to build it," said Bergstrom. "That is, there's no loss in productivity when moving from Go to Rust. And the interesting thing is we do see some benefits from it.

> "So we see reduced memory usage in the services that we've moved from Go ... and we see a decreased defect rate over time in those services that have been rewritten in Rust – so increasing correctness."

That matches my experience: Go services tend to be tire fires and churn developers on and off teams pretty fast.


You'd expect a rewrite to take less time than development of the original system from scratch. So I'm not sure this is actually as favorable a result for Rust as it's presented.

Isn't Go's concurrency model an advantage over other approaches?

When it exactly fits your problem, yes. But it's not like you can't express that model in Rust (in a more cumbersome way) when you need to.

OCaml is not an unsophisticated language. It inherits the features of ML and has first-class modules, which are not present by default in Haskell (they exist in Backpack). Not having first-class modules leads to a lot of issues.

Also, there is a better story for compilation to the web.


OCaml's type system is quite janky and simplistic compared to Haskell's. The first-class module system is fairly nice, although it leads to an annoying problem where you now have two "levels" to the language (the module level and the normal level). This is arguably analogous to Haskell having a "term-level language" and a "type-level language", where the type system is more prolog-y than the term language. Also, Haskell's type system is powerful enough to do most of the things you'd want the OCaml module system for, and more. I do occasionally miss the OCaml module system, but not most of the time.
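As a rough flavour of that claim, a Haskell type class can play the role of an ML-style module signature (a toy sketch, not a full functor encoding):

  -- a type class standing in for a module signature
  class Queue q where
    empty :: q a
    push  :: a -> q a -> q a
    pop   :: q a -> Maybe (a, q a)

  -- one "module" implementing it
  instance Queue [] where
    empty = []
    push x xs = xs ++ [x]
    pop []       = Nothing
    pop (x : xs) = Just (x, xs)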

Conversely, the OCaml module system is powerful enough to do all the things you would want to do with Haskell's, except the OCaml module system is nice to use.

Anyway, the issue has nothing to do with relative power. The issue is that the Haskell community encourages practices which lead to unreadable code: lots of new operators, point-free style, fancy abstractions. Meanwhile, the OCaml community was always very different, with a general dislike of fancy things unless they were unavoidable.


> except the Ocaml module system is nice to use

This comment doesn't lead me to believe you've ever worked in an ocaml shop. It's only "nice to use" for trivial use cases, but quickly devolves into a "functorial" mess in practice

> the Ocaml community was always very different with a general dislike of overly fancy things when they were not unavoidable

This is the exact thing that people always say when they are coping about their language being underpowered.


If by "encourages" you mean "has features", then yes. The typical Haskell shop doesn't really encourage complex feature use; it's the people learning online, who don't actually have to live with their solutions, who do. That's what seems to draw (some) people to Haskell.

Learning a “pure” language is a lot like tripping on LSD.

The people who do it can’t stop talking about how great it was, but also can’t really explain why it was so great, and when they try it just sounds ridiculous, maybe even to them. And then they finish by saying that you should drop acid too and then you’ll understand.


The reality is that people want what you produce when you're sober, not while you're having fantasy hallucinations.

> also can’t really explain why it was so great

I like it when

  assertTrue (f x)  -- passes in test
means that

  assertTrue (f x)  -- passes in prod

Is there a language where that isn’t the case?

Approximately all of them. The property is "referential transparency", and it's such a sensible thing to have that people assume they already have it (per your question).

The "test/prod" was an unnecessary detail - there's really nothing saying that f(x) will equal f(x) in most languages in most circumstances! It can return different things on repeated calls, and it can behave differently if two threads call into it at once.

It's a major part of the reason people don't see the appeal of Haskell. They think they already have "type-safety" and "functional stuff" and "generics" and "null-safety" - but it's really not the same.
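To spell out what referential transparency buys you, a small illustrative example (assumed, not from the comment above):

  f :: Int -> Int
  f n = n * n

  -- these are always interchangeable in Haskell; in most languages the
  -- rewrite is only safe if you know f touches no clock, global, or
  -- shared mutable state
  pairA :: (Int, Int)
  pairA = (f 3, f 3)

  pairB :: (Int, Int)
  pairB = let y = f 3 in (y, y)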


Haskell isn't all that pure.

What do you mean by that? All functions in Haskell are pure unless you explicitly use unsafePerformIO or similar (which you rarely ever have to do).

They can still have side-effects like non-termination.

But I didn't mean purity in that formal sense. I meant that Haskell is plenty pragmatic in its design.


To me, "pure" means referential transparency: same input, same output. So an `Int -> Int` function will return the same result for the same argument. Similarly for `Int -> IO Int`: the function (action) will return an Int after interacting with the outside world, with `IO` tracking the fact that this is the case.
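A small illustration of that distinction, assuming randomRIO from the random package:

  import System.Random (randomRIO)

  double :: Int -> Int  -- pure: the same input always gives the same output
  double x = x * 2

  roll :: IO Int        -- an IO action: each run may produce a different Int
  roll = randomRIO (1, 6)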

Lambda calculus is as pure as can be, and also has terms that don't normalize. That is not considered a side effect.

A better example of impurity in Haskell for pragmatic's sake is the trace function, that can be used to print debugging information from pure functions.
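For reference, `trace :: String -> a -> a` lives in Debug.Trace and can be dropped into otherwise pure code:

  import Debug.Trace (trace)

  -- prints its message (as a side effect) when the result is forced
  double :: Int -> Int
  double x = trace ("double called with " ++ show x) (x * 2)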


> Lambda calculus is as pure as can be, and also has terms that don't normalize. That is not considered a side effect.

Many typed lambda calculi do normalise. You can also have a look at https://dhall-lang.org/ for something pragmatic that normalises.

> A better example of impurity in Haskell for pragmatic's sake is the trace function, that can be used to print debugging information from pure functions.

Well, but that's just unsafePerformIO (or unsafePerformIO-like) stuff under the hood; that was already mentioned.


> They can still have side-effects like non-termination.

You can still have total functions that don't finish in a humanly/business-reasonable amount of time.


Yes?

Just like pure functions can use more memory than your system has. Or computing them can cause your CPU to heat up, which is surely a side-effect.


It doesn't have great support for Dependent Types

what does that have to do with purity?

Nothing, but arguably a language with dependent types is more Haskell than Haskell

"You mean you're going to make a copy of that every time?"

Haha, can't tell if you're joking or not.

For anyone else reading - you don't need to make a copy if you know your data isn't going to change under your feet.

https://dev.to/kylec32/effective-java-make-defensive-copies-...


I was half-joking. I wasn't aware Java was promoting "defensive copies" :D


