Haskell aims for simplicity. That is, it favors the reader/maintainer of code over the writer. It's a different trade-off, and not an obviously worse one.



I don't really consider Haskell an example of simplicity; I consider something like Elm or Go simple. Haskell is... correct?

But this is a problem with these discussions, because my definition is different from yours and both could be argued to be correct. Is this like Hickey's easy vs. simple? Is my definition incorrect?


I meant it in the objective sense of information-theoretic complexity. I'm not sure what other measurements would be objective in comparison.


Is there any research which shows that Haskell is superior in terms of information-theoretic complexity?

I'm not familiar with the term. The only thing I can find on it in reference to Haskell is this:

https://en.wikibooks.org/wiki/Haskell/Algorithm_complexity

which isn't really related to this:

> That is, it favors the reader/maintainer of code over the writer.

What material is there which shows that Haskell is simple in terms of information-theoretic complexity, and superior to other languages that were (or weren't) designed for that?


Being more rigid about semantics and the formalization of program meaning means it is simpler to answer questions about what a given piece of code does than in an alternative “pragmatic” language that is riddled with exceptions and lacks a strong type system.

If your job is to take a piece of code written by someone else and either fix a bug or add a feature, it is objectively easier to do when you have strong guarantees about input types and format, exceptions generated, referential transparency, side effects, execution model, etc.
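
To make that concrete, here is a rough Haskell sketch (parseAge, readAgeFromFile, and the ParseError type are invented for illustration, not taken from any real codebase): the possibility of failure and the presence of I/O are both spelled out in the types, so a maintainer gets those guarantees from the signatures alone.

    -- Hypothetical example, invented for illustration.
    import Data.Char (isDigit)

    data ParseError = Empty | NotANumber String
      deriving (Show)

    -- Pure: no hidden I/O, no hidden exceptions. Failure is part of the
    -- return type, so you see it without reading every call site.
    parseAge :: String -> Either ParseError Int
    parseAge "" = Left Empty
    parseAge s
      | all isDigit s = Right (read s)
      | otherwise     = Left (NotANumber s)

    -- Anything that touches the outside world is marked IO in its type.
    readAgeFromFile :: FilePath -> IO (Either ParseError Int)
    readAgeFromFile path = parseAge . filter (/= '\n') <$> readFile path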

You also get shorter and simpler programs, in the algorithmic-complexity sense, when you have strong first-class features for modularity, interface definition, and constrained polymorphism.
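
A minimal sketch of what I mean by interface definition and constrained polymorphism (the Pretty class and its instances are made up for the example; it is not a standard library class):

    -- Hypothetical type class, invented for illustration.
    class Pretty a where
      pretty :: a -> String

    instance Pretty Int where
      pretty n = "int:" ++ show n

    instance Pretty Bool where
      pretty True  = "yes"
      pretty False = "no"

    -- One short, generic function; the constraint is visible in the type,
    -- and the compiler rejects any argument type without a Pretty instance.
    prettyAll :: Pretty a => [a] -> String
    prettyAll = unwords . map pretty

Here prettyAll [1, 2, 3 :: Int] gives "int:1 int:2 int:3", while prettyAll "abc" is rejected at compile time because Char has no Pretty instance.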

Haskell has all of these properties, as good as and often better than “production” languages, and has since the '90s. Haskell is objectively better on all counts.

However, what matters is not objective truths but subjective realities. Haskell is also “different” in a way that is only endearing to mathematicians and CS theorists (which are really the same in the extremes). You can hire someone and train them on Haskell, but it'll take a lot of time and money to get them to similar comfort levels, and not everyone is willing. And with Rust, which carries over many of Haskell's benefits to the imperative world, you can get 80% of the benefit for 20% of the cost. So why bother?

If I had a time machine, though, I think dropping Haskell 98 back prior to the invention of FORTRAN would have put us in a much more desirable alternative history. One where code mostly works as advertised, security is based on proofs of correctness rather than sandboxing, requirements and intentions are more clear and explicit, etc. Too bad we live in the world we do.


Two of the core tenets of Python are: simple is better than complex, and code is read much more often than it is written.


It amazes me that someone could write down that principle and then design an untyped programming language.


Dynamic types can be more problematic to modify, but many of us find them easier to read: assume that the code actually works and does something reasonable, then skim for the gist of it (without having to wade through a bunch of extra detail).

Assembler is untyped (just bytes and words). I'm not big into Python, but I'm pretty sure it has types; they're just late/runtime bound.

Dynamic types are probably not a good choice for an army of idiots, but if dynamic types were so completely unworkable, you would think that they would disappear, eh?

That said, I'd rather see avionics written in Ada than Python, but not every problem needs that level of scrutiny and pain.


> assume that the code actually worked and does something reasonable

That's almost never the case if you're in a situation where you're reading code.


Why is dynamic typing (with reliance on duck typing) any harder to read, or more complex?


    def add(a, b):
        return a + b
In a dynamically-typed language, you can't actually know whether this dead-simple function will throw an exception until you know the entire call graph leading up to where it was called. That's fine in small scripts, but really freakin' hard if you have call stacks 10 levels deep.


Sure, but it is both exceedingly readable and exceedingly simple. It is pretty much pseudocode. Which is what we are discussing, not type safety in huge codebases.


That's true in small functions, but it's also true for statically-typed languages with type inference. In OCaml, the same function would also be

    let add x y = x + y
Looks reasonable to me, and this is statically type-checked.


If I were making an argument in this case, I'd say that several other languages deliver the "readability" benefit while also delivering orders-of-magnitude higher performance and type-safety guarantees. This implies that one case is write-only, instant legacy code, while the other will be highly maintainable going forward if the codebase and team need to scale in size.


Sure. That doesn’t mean it achieves those goals very well.



