This is way too many needlessly rude words for what it's actually trying to say, but the underlying issue is one I've experienced.
I find it really interesting how the success of a programming language (or project) is a combination of multiple factors that aren't necessarily correlated -- for example, the Haskell community is full of genuinely nice people while the Linux kernel community is full of screaming. Or how lisp is famously great except that there are multiple incompatible implementations; there's nothing about lisp itself that causes that. And similarly in this case: there's nothing inherent about Haskell that required the versioning mess it seems to have.*
In all these cases there are some weak correlations: perhaps the research origins of Haskell led to a culture of respectful disagreement, perhaps the flexibility of lisp invites more differences of opinion, perhaps extra-powerful static typing leads to more breaking APIs on version bumps. But that's not enough to fully explain it. A lot of it is just how things turned out.
In an imaginary world I'd be able to just read the language spec and have enough info to judge whether to base my next project on something, but it turns out there are all these "soft" factors that matter just as much.
* You could imagine for example a different design where you get a separate copy of all needed libraries for each project so that they never stomp on each other (like what the various fooenv tools do for ruby/python/etc.). It has different tradeoffs, sure, but it at least superficially seems it would reduce frustrations like in this post.
Yeah, up to a point, the maturity of a language's ecosystem matters much more than the quality of the language. That's the main reason I still choose Python for my web-related projects. Haskell, Clojure, Go, Racket or Arc might all be more powerful languages in principle. But in Python or Ruby, I can pretty much assume there's already a library to do anything I want to do with minimal fuss. For these niche or up-and-coming languages, not so much.
I'm disappointed to hear the Haskell toolchain is so problematic, as I was looking forward to trying Yesod.
Don't let this discourage you from trying Yesod. The Yesod team has made it really easy to install a core of isolated packages for the framework. The problem comes from trying to upgrade. I know that sounds insane (and it is), but you should try it out; it has quite a lot to offer.
I find Python to be one of my least favorite languages in terms of the design of the language, but the one I end up using the most for the reasons you mention.
You would think by this point all languages would have standardized on a basic library. I mean, if you're not starting off with what Python (or even Mono for that matter) already has, then you're just making things needlessly complicated for yourself.
Everyone goes crazy over syntax and idioms and OMG GROUNDBREAKING NEW IDEAS (not) when really 85% of a language depends on the library and toolchain.
The differences between standard libraries, programming idioms, communities and ethos often play a large role in defining a new language, so surely sharing a standard library would in many ways defeat the point of trying something new?
A programming language is not just a spec and a syntax, it's also a way of seeing the world - "The limits of my language mean the limits of my world.", so to get a new way of looking at things, you also need to reinvent the basics to some extent, and rethink how (for example) basics like errors, threads or strings are handled.
Personally, I think it's useful to have a multipolar programming landscape, even if it sometimes leads to duplicated work, because it can give you a fresh perspective on old problems.
I'm hardly an advanced Cabalist, but it does the job nicely for me, and is quite stable at that too. Like virtualenv, it can also create a "sandboxed" GHC, so even system-level updates to GHC don't wreak havoc in your projects.
cabal-dev is close to giving you a separate copy of all needed libraries for your projects. It just doesn't do it for dependencies of your dependencies.
"It just doesn't do it for dependencies of your dependencies"
Yes, it does. cabal-dev can install a fully hermetic package database if you ask it to, but by default it only installs packages that aren't already present in your global package db.
I don't understand why he wanted to reinstall things so often; it would have been much easier (and certainly faster) to just install libraries once.
The first problem is that he didn't read any of the Cabal documentation. He complains that, after upgrading his Cabal binary, it still reports that an update is needed. I'm almost certain this is because he hasn't put the Cabal binary directory (~/.cabal/bin) on his $PATH. While this is arguably something that Cabal should detect and warn about, it's also not a requirement unique to Haskell -- any package manager that installs to a user's home directory will need them to change their $PATH.
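To make the $PATH mechanics concrete, here's a minimal shell sketch of why the stale binary keeps winning until the install directory is prepended. The throwaway directory and the fake `cabal` script are stand-ins for ~/.cabal/bin and the real binary, not anything Cabal itself ships:

```shell
#!/bin/sh
# A throwaway directory stands in for ~/.cabal/bin (hypothetical setup).
demo_bin=$(mktemp -d)
printf '#!/bin/sh\necho new-cabal\n' > "$demo_bin/cabal"
chmod +x "$demo_bin/cabal"

# Prepend the install directory, exactly as one would with ~/.cabal/bin.
PATH="$demo_bin:$PATH"
export PATH

# `command -v` shows which binary the shell will actually run now.
command -v cabal    # should print "$demo_bin/cabal"
cabal               # prints: new-cabal
```

Without the `PATH` line, `command -v cabal` would keep resolving to whatever older binary was already on the path, which matches the behavior the post complains about.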
Continuing on, he installs cabal-dev (good!), then proceeds to not use it. Instead, he tries to jam all of Yesod (a notoriously version-picky package) into his global package directory.
Then he uses cabal-dev to build his new site, and here's where things get weird. It should absolutely not install anything other than what his particular package needs as dependencies. If it's installing the world again, then there is something horribly misconfigured on his system.
Finally, I strongly object to Yesod being considered a "success story". While web frameworks are currently the hip thing, and Yesod is indeed a web framework, to my knowledge it's gained no significant userbase in part due to the huge nest of conflicting library versions that it depends on.
Here's what he needs to do:
# Clean up whatever mess he's currently made
$ rm -rf ~/.cabal ~/.ghc
# Re-initialize Cabal
$ export PATH="$HOME/.cabal/bin:$PATH" # optionally add to .bashrc
$ cabal update
$ cabal install cabal-install cabal-dev
# Assuming that Yesod is using standard Haskell packaging properly, this
# should install Yesod in its own little sandbox so it won't destroy the rest
# of the system.
$ mkdir my-yesod-project
$ cd my-yesod-project
$ export PATH="$PWD/cabal-dev/bin:$PATH"
$ cabal-dev install yesod
# Now Yesod can be used without fear
$ yesod init .
$ cabal-dev install
Ideally, getting a basic setup going shouldn't take a deep-read of the docs of my package manager. This is absolutely a weakness of the current Haskell ecosystem; perhaps not as bad as presented in the post here, but something that deserves improvement.
Contrary to the thesis of the post, the Haskell community seems to be well aware of this, and several fixes are being actively worked on -- just nothing particularly standardized yet. Meanwhile, cabal-dev lets people get work done.
>>Ideally, getting a basic setup going shouldn't take a deep-read of the docs of my package manager.
>Please name one package manager which installs binaries to the user's home directory, but doesn't require updating the user's $PATH.
Do you even realize how little those two statements follow each other? The whole red-herring argument here is kind of annoying, but I'll address it anyway: every package manager I've used that installs binaries (or anything, really) in the user's home directory (or any directory, really) does make me update the path -- but it also tells me this, and gives me sane errors when that hasn't happened.
In my experience, other package managers (Python's and Ruby's, in my case) behave just like Haskell's. Specifically, they do not seem to inspect the user's $PATH variable to determine whether their default installation directory is present.
Having used both Haskell's package manager cabal and Python's pip and easy_install, I'd say Cabal has a long way to go before it's anything like pip. You can see cabal moving in the right direction, but pip is leagues ahead of it right now, especially when using virtualenv.
1. npm: When installing packages with the --local flag, the npm package manager walks up from the working directory, looking for a node_modules directory. Once installed there, packages can be required by package name in dependent scripts. A common use case is to put a node_modules directory in your home directory to keep multi-project development dependencies out of your global system node install.
2. Ivy/Maven. Dependencies are placed in ~/.ivy, and are available for use without changing environment variables.
3. rubygems. A ~/.gems directory is created that provides dependencies by ruby version, if you wish.
Now, running CLIs still requires an addition to $PATH. But package managers do exist that organize dependencies under the user's home directory without explicit environment-variable edits.
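The walk-up lookup described for npm in point 1 can be sketched in a few lines of shell. `find_node_modules` is a hypothetical helper to illustrate the idea, not npm's actual implementation:

```shell
#!/bin/sh
# Sketch of the lookup npm is described as doing: starting from a directory,
# climb toward / and report the nearest node_modules directory found.
find_node_modules() {
    dir=$1
    while [ -n "$dir" ] && [ "$dir" != "/" ]; do
        if [ -d "$dir/node_modules" ]; then
            echo "$dir/node_modules"
            return 0
        fi
        dir=$(dirname "$dir")
    done
    # Fall back to a node_modules directly under the root, if any.
    [ -d "/node_modules" ] && echo "/node_modules"
    return 0
}

# Demo in a throwaway tree: the search climbs past proj/src and proj
# and lands on the node_modules at the top.
demo=$(mktemp -d)
mkdir -p "$demo/proj/src" "$demo/node_modules"
find_node_modules "$demo/proj/src"   # prints "$demo/node_modules"
```

Because the nearest directory wins, a per-project node_modules shadows the one in your home directory, which is what keeps projects isolated from each other.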
Sounds like pip (popular package manager for Python) avoids this problem because you don't use pip to install or upgrade itself. If using the package manager to install updated versions of itself is an intended use case, then I agree with others: it should check your path after the install and make sure the newly installed binary is the one that will get used. Otherwise the design is making it too easy to do the wrong thing.
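A post-install check of the kind suggested here could be as simple as the following sketch. The directory name mirrors Cabal's default ~/.cabal/bin, and `warn_if_shadowed` is a made-up name, not an existing Cabal feature:

```shell
#!/bin/sh
# Hypothetical post-install check: is the install directory on $PATH?
INSTALL_DIR="$HOME/.cabal/bin"

warn_if_shadowed() {
    case ":$PATH:" in
        *":$INSTALL_DIR:"*)
            echo "ok: $INSTALL_DIR is on \$PATH" ;;
        *)
            echo "warning: $INSTALL_DIR is not on \$PATH; an older binary may still be used" ;;
    esac
}

warn_if_shadowed
```

A couple of lines like this at the end of `cabal install cabal-install` would have turned the post's mystery into an actionable warning.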
While it is true that some of those problems are real, this level of rudeness is just not acceptable. Especially the accusation of "inbreeding".
One claim that is wrong is the supposed lack of "semantic versioning". There is the Package Versioning Policy (http://www.haskell.org/haskellwiki/Package_versioning_policy), and most packages now adhere to it. Granted, its prescriptions differ from "semantic versioning", and some people disagree that Haskell packages should specify such restrictive versioning boundaries all the time.
(The PVP essentially says that for a version a.b.c.d, every change of a or b means that the API might have changed. Now it might be that most packages are not even affected by the change...)
I admit, not having published a Haskell package yet, that I've only done a shallow reading of the PVP -- is it actually incompatible with semantic versioning, or orthogonal? That is, is it possible in the general case to adhere to both? If not, are they incompatible in most cases or very few?
This is the biggest difference (quote from semver.org):
"Major version X (X.y.z | X > 0) MUST be incremented if any backwards incompatible changes are introduced to the public API. It MAY include minor and patch level changes. Patch and minor version MUST be reset to 0 when major version is incremented."
Meanwhile the PVP lets you get away with just updating the minor version there.
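The difference boils down to which version components mark the breaking boundary: the first component under semver, the first two under the PVP. A rough shell sketch, with hypothetical helper names:

```shell
#!/bin/sh
# Sketch of the contrast: semver's breaking boundary is the first version
# component; under the PVP it is the first two. Helper names are made up.
semver_breaking() {
    [ "$(printf '%s' "$1" | cut -d. -f1)" != "$(printf '%s' "$2" | cut -d. -f1)" ]
}
pvp_breaking() {
    [ "$(printf '%s' "$1" | cut -d. -f1-2)" != "$(printf '%s' "$2" | cut -d. -f1-2)" ]
}

# A 1.1 -> 1.2 bump may signal an API change under the PVP ...
pvp_breaking 1.1.0 1.2.0 && echo "PVP: API may have changed"
# ... while semver reads the same bump as backwards compatible.
semver_breaking 1.1.0 1.2.0 || echo "semver: compatible"
```

So a PVP-style minor bump can carry what semver would insist is a major change, which is exactly the gap the quoted semver rule forbids.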
Oh, I realize that PVP does not require everything semantic versioning does. It sounds like PVP requires nothing semantic versioning forbids, and vice versa? In which case, you could (and, I would assert, should) follow both.
The problem with packages and Haskell is well known; there are a few camps currently trying to solve the issue. Personally (although the author went about things in a rather roundabout way), I can feel for the author. I LOVE Haskell -- I cannot think of a better language for prototyping crazy ideas, and I work in it every chance I get. That being said, it's awful to get up and running with cabal. The suggestion which has been posed to me is "simply use cabal-dev"; when I do this, I wind up with crazy huge project files because of repeat installs. I sincerely hope that someone much smarter than myself can come up with a solution for this; I think it is a primary reason why Haskell is not more widely adopted.