Dijkstra on Haskell and Java (2001) (chrisdone.com)
142 points by karshan on Jan 18, 2015 | 116 comments



I don't see why they don't just teach C, x86, and Haskell everywhere. If you know the concepts behind these, you can quickly grok any other programming language. Also, I never got the whole hype behind OOP. IMHO it's self-explanatory, nothing you need an entire course on to be effective and to understand OOP programs. Memory layout and management, advanced pointer concepts, and monads are a slightly different matter.

If I had to teach someone OOP I'd say: OOP is like the real world if it were made of puppets and every puppet could pull another puppet's strings. Pull a string and a dog barks and wags its tail, causing a cat to meow. The key is just not tangling up your strings.


and monads

I really don't understand why people have such issues with monads. I don't have a Haskell background and originally built up most of my functional programming skill in Python (going somewhat against the grain), and I find monads to be a fancy, scary term for an extremely simple concept. I think if they were named something friendlier, perhaps people's eyes wouldn't immediately glaze over and they could realise that, actually, monads are just a simple strategy for controlling how things are executed/computed. Computational or execution strategies, if you will. Let's just call them strategies and see if they are less confusing and scary?
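As a hedged illustration of the "strategies" framing (my example, not the commenter's): the standard Maybe monad encodes the strategy "stop at the first failure", so each step runs only if every earlier step produced a value. The names `safeDiv` and `calc` are made up for this sketch.

```haskell
-- The Maybe monad's "execution strategy": stop as soon as any step fails.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

-- Each step runs only if every previous step produced a value.
calc :: Int -> Int -> Int -> Maybe Int
calc a b c = do
  x <- safeDiv a b   -- Nothing here short-circuits everything below
  y <- safeDiv x c   -- never reached if the previous step failed
  return (y + 1)
```

So `calc 10 2 5` is `Just 2`, while `calc 10 0 5` is `Nothing` — the "strategy" of propagating failure is handled by the monad, not by the code in `calc`.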


I don't know if I can agree with this. I've invested a significant amount of time in learning about monads (and functors, applicatives, monoids, and more from the bestiary of commonly used category theoretic structures in Haskell) and I find them quite challenging to grasp in their full generality. While specific use-cases are readily understood, getting to the heart of what a monad "is" has taken me time, and I honestly don't believe I've properly grasped it yet.

So I think there's a spectrum of understanding and effective use when it comes to monads and other algebraic structures like these, and I'm skeptical when people say, "oh yeah, a monad is just X, it's simple." I've said that myself in the past and I was wrong, and so were most of the people I've heard say that.

To put it another way: when I hear people who know what they're talking about say it's simple, they are talking about its structure, not about understanding what it is and how it is used. In that sense it is quite simple. But as Euclid said, there's no royal road to geometry (er, or monads...).


Absolutely there's a spectrum, and I certainly don't fully understand the full set of category theoretic structures (I'm slowly trying to learn, but it's slow going). I just think that monads as programming constructs are quite simple. I don't doubt that in a full mathematical context there is a hell of a lot more to them, and the mathematical reasoning behind how they work is surely beyond my understanding, but from a how they work, how to use them, how they interact with the language and what they enable you to do point of view, they aren't difficult at all.


...but from a how they work, how to use them, how they interact with the language and what they enable you to do point of view, they aren't difficult at all.

I guess we'll have to agree to disagree here: I have found it challenging to grasp all of these things (even simply within the context of programming language usage). I guess that is, in the end, rather subjective.


For me the block for a long time for monads was understanding how they made IO possible.

That is, the basic ideas for things like a Maybe or similar were obvious and intuitive to me, and every software dev who's been around for long has done such things, even if they never think about what parts they could refactor out into something reusable.

But the leap from that to how it makes IO possible without side effects I just did not grok. It really only clicked reading something or other that described (admitting it was lying a little bit) the IO monad as something that threaded the state of the world through the program; that is, that world state becomes both an (invisible) input and output for functions, and so effects on it are fully encapsulated by the param(s) and return type of the function.
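That "threading the world" intuition can be sketched in code. This is a toy model only — GHC's real IO is not implemented this way, and every name here (`World`, `MyIO`, `putLine`) is invented for illustration. The "world" is reduced to a log of output lines that each action takes in and hands back.

```haskell
-- A toy model of IO as a function from a world state to a value plus a new world.
newtype World = World [String] deriving (Eq, Show)

newtype MyIO a = MyIO { runMyIO :: World -> (a, World) }

instance Functor MyIO where
  fmap f (MyIO g) = MyIO $ \w -> let (a, w') = g w in (f a, w')

instance Applicative MyIO where
  pure a = MyIO $ \w -> (a, w)
  MyIO f <*> MyIO g = MyIO $ \w ->
    let (h, w')  = f w
        (a, w'') = g w'
    in (h a, w'')

instance Monad MyIO where
  -- Bind threads the world: the output world of one action is
  -- the input world of the next.
  MyIO g >>= f = MyIO $ \w ->
    let (a, w') = g w
    in runMyIO (f a) w'

-- An "effect": appending a line to the world's log.
putLine :: String -> MyIO ()
putLine s = MyIO $ \(World ls) -> ((), World (ls ++ [s]))

program :: MyIO ()
program = putLine "hello" >> putLine "world"
```

Running `runMyIO program (World [])` yields a final world of `World ["hello","world"]` — the effects are fully captured in the (invisible) world parameter and return value, exactly as the comment describes.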


That's a fair observation, and I admit I didn't consider that when I wrote my comment. It is pretty difficult to visualise and leads to a lot of "but you still interact with the outside world, so how can it be pure" kinds of confusion. It's very hard to visualise how the program essentially gets wound up and then, at runtime as IO occurs, gets unwound. Some of the other monads are definitely simpler to understand!


> controlling how things are executed/computed.

No... see... you understand a monad. That is not the same thing as grokking "Monads" in the abstract sense.

For instance, I don't think the State monad really has anything to do with how things are 'executed/computed'.


Sure it does - you execute code in the context of a monad. In the case of the State monad, your code executes in the context of a stateful environment.
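To make that concrete, here is a minimal hand-rolled State monad (the real one lives in Control.Monad.State; this self-contained sketch just shows code "executing in the context of" a threaded value, here an Int counter — `fresh` and `labels` are invented example names):

```haskell
-- A minimal State monad: a computation is a function from a state
-- to a result plus an updated state.
newtype State s a = State { runState :: s -> (a, s) }

instance Functor (State s) where
  fmap f (State g) = State $ \s -> let (a, s') = g s in (f a, s')

instance Applicative (State s) where
  pure a = State $ \s -> (a, s)
  State f <*> State g = State $ \s ->
    let (h, s')  = f s
        (a, s'') = g s'
    in (h a, s'')

instance Monad (State s) where
  State g >>= f = State $ \s -> let (a, s') = g s in runState (f a) s'

get :: State s s
get = State $ \s -> (s, s)

put :: s -> State s ()
put s = State $ \_ -> ((), s)

-- Code that runs "in the context of a stateful environment":
-- an Int counter handing out fresh labels.
fresh :: State Int String
fresh = do
  n <- get
  put (n + 1)
  return ("x" ++ show n)

labels :: State Int (String, String)
labels = do
  a <- fresh
  b <- fresh
  return (a, b)
```

`runState labels 0` gives `(("x0","x1"), 2)`: the do-block reads as ordinary sequential code, but the counter is threaded through it by the monad.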


I really don't understand why people think monad is a "fancy scary term". It's a five-letter easily-pronounceable word. What makes it fancy or scary? I think people assume that they're scary, and then confirmation bias themselves into making them hard to learn, despite them being a reasonably simple concept overall.

As for calling them strategies… if we called them that it would make learning and understanding the intermediate and advanced concepts significantly less accessible and more difficult. We should call things what they are.


Hmm, if we called them what they are we'd call them "type wrapper functions" and talk about "type wrapper function chains"; "monad" carries zero metaphorical/associative meaning. I think that's the problem with the name and a large reason people find them so unapproachable. Most other terms in computer science carry significant associative meaning that helps in learning them: pointers have to do with pointing at things, functions carry the mathematical meaning of a set of inputs mapping to a set of outputs, etc.


You're demonstrating that you don't know what monads are. "Type wrapper function" could mean any number of things (lack of specificity is a bad trait in naming), but it couldn't mean a monad: monads consist of three things, whereas "type wrapper function" implies one thing.

Names don't need to be metaphorical. The purpose of a name is not to teach you what the thing is. For that, you should have a proper explanation from someone who knows. Names have a different, very important purpose: uniquely specifying the thing being named. "Monad" does that very effectively.
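For reference, one common reading of the "three things" is: a type constructor, return, and bind, subject to three laws. A minimal sketch (the class name `MyMonad` and method names are made up to avoid clashing with the Prelude):

```haskell
-- A monad is (1) a type constructor m, together with
-- (2) "return" and (3) "bind", obeying three laws.
class MyMonad m where
  ret  :: a -> m a                   -- return
  bind :: m a -> (a -> m b) -> m b   -- (>>=)

-- Example instance: Maybe.
instance MyMonad Maybe where
  ret = Just
  bind Nothing  _ = Nothing
  bind (Just a) f = f a

-- The laws (equations the instance must satisfy; not compiler-checked):
--   ret a `bind` f        ==  f a                              (left identity)
--   m `bind` ret          ==  m                                (right identity)
--   (m `bind` f) `bind` g ==  m `bind` (\x -> f x `bind` g)    (associativity)
```

The point stands: "type wrapper function" names only the constructor part and says nothing about bind or the laws.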


> I really don't understand why people think monad is a "fancy scary term".

The reason could be that the term "monad" comes from category theory, and that is a topic not everybody is too familiar with...


Lack of familiarity doesn't make it fancy, and for a software engineer it shouldn't make it scary either. One of the largest parts of our job is learning unfamiliar things.


We should call things what they are.

Sure - I wasn't seriously suggesting we rename them, just that in my personal experience, newcomers get scared off seemingly by the name alone before ever actually finding anything out about them. A friendlier name may avoid this for newcomers. It definitely wouldn't be worth it for more advanced practitioners where referring to them exactly and in the context of their category theory roots is much more useful than friendliness.


I'm not sure how worried I am about the class of newcomers that will be scared off by a word that they don't know. PHP is friendly to newcomers, and the result is thousands of insecure websites built by people who think they know what they're doing but don't. Software engineering is a professional discipline. I don't understand the desire to pander to the lowest common denominator. When's the last time you saw a physics practitioner propose renaming vector spaces to be friendlier to the students?

(For what it's worth, you may not have been serious about renaming monads, but many others have seriously proposed it. In most languages other than Haskell they've been successful.)


I don't understand the desire to pander to the lowest common denominator.

Whether people want to learn about monads is up to them and I don't have any skin in the game, but I hear a lot of people complaining that monads are so difficult to understand or use. I guess I was really just observing, in a roundabout way, that I feel a lot of that is down to monads appearing more complicated or scary than they are and that people are put off by it, kind of how people are put off Lisps due to the parentheses.


Yep, on that point I agree with you completely. I don't at all understand why people find their appearance complicated or scary, but I agree that it's more appearance than reality.


> What makes it fancy or scary?

It's an abstraction, and abstractions are hard. In a way, it's similar to why beginners tend to have trouble with recursion (if you try to understand a recursive program by stepping through it without using abstractions like preconditions and postconditions).


Why would calling them anything different have any effect on learning other concepts?


Because the associated concepts, for example Kleisli categories, are easy to find if your search is seeded with the word "monad", but very hard to find if it's seeded with the word "strategy".


How about F#'s Computation Expressions? Not exactly the same, I'm told, but close.


They're called computation expressions in F# and not monads because they're not exclusively used to implement monads. The term is more generalised to capture the much larger scope of the feature.

Here's a breakdown of the difference:

http://tomasp.net/blog/2013/computation-zoo-padl/index.html


Excellent! Thank you!


I was kinda thinking of F#'s Computation Expressions when I posted my comment.


They actually do this at the University of Chicago. That's exactly the introductory sequence.

The first class in the major is in Racket (Standard) or Haskell (Honors).

The second class in the major is in C, and also covers basic UNIX tools.

The third class is in x86 assembly and C, focused on understanding low-level systems.


I wish I was there. My curriculum at Carleton University is

  1. Intro to CS in Python ending with a taste of Java
  2. Java
  3. C (intro to UNIX)
  4. C++ (design first, implement after, waterfall, etc.)
Along with 3 there is a data structures course taught in Java [0], and with 4 there is web dev taught in Node.js.

0: Following this text: http://opendatastructures.org/ Taught by its author, Dr. Pat Morin.


Actually, the first class is now Java and C++. Morin isn't teaching Python any more.

"Functional" programming is taught in third year (COMP 3007), but you don't get too deep into it.

My issue with the listed classes isn't the chosen languages. A commenter mentioned that you can learn a lot of functional (and other) concepts in a procedural or object oriented language. My issue is that I basically learned the same concepts over and over again, just using different programming languages. You learn to iterate over a multidimensional array in Java, and then you do so again with C, but you have to deallocate and reallocate, and then you do it in C++, but you create an object for the array. It's all the same stuff.


Aha. Do you have an outline of a better way?


In that sense imperative programming is equally hyped. It too is like the real world, first you do this then you do that. Similarly there is nothing fancy about functional programming since algebra and calculus already introduce the ideas of composing smaller things to get bigger things.

What I'm getting at is that computation exists on several planes of abstraction, and while covering C, x86, and Haskell might give you a good overview, it will still leave things hidden, because there is also Prolog and the branch of computation that Prolog leads to. Most schools settle for teaching their students the theory and some marketable skills by sticking to well-known languages like Java and Python.


> If you know the concepts behind these, you can quickly grok any other programming language.

"Grokking" a programming language is just a means to an end, and far from being the most important means (well, for interesting problems, at least). Most (though not all) of the important concepts in CS have absolutely nothing to do with the choice of language.

At my university they taught Scheme (SICP) in Intro to Comp Sci, and C -- both in the first year -- and all the rest was just data structures, algorithms, computational complexity, operating systems, numerical algebra, compilation, and electives in vision/graphics/PLs/DSP/whatever. We could do the exercises in whatever language we chose (usually C). In short they taught us just theory and one practical PL (maybe Java, too, for concurrency problems) so that we could actually do the exercises, and boy did we have a lot of good programmers who came out of that school. I owe to my university the realization that the programming language is one of the least important things in CS and software engineering.


> I don't see why they don't just teach C, x86, and Haskell everywhere

At my university in Portugal back in the mid-90's, we had:

Pascal, C, C++, Caml Light, Smalltalk, Prolog, Java, PL/SQL, x86 and MIPS Assembly

Additionally we also got lots of CS stuff like lambda calculus, relational algebra, language semantics and so on.

It always feels strange to me that there are universities out there focusing on single languages, or paradigms.


I'm not sure pointer concepts are very difficult; pointers might confuse people who didn't grow up with assembly language, but they really aren't very hard to understand, and they are also less and less relevant. Over the years the abstraction level increases; it's far better to teach people to carry computer science forward to the next, 21st-century level than to go over stuff from the 70's again that they will likely never use. The breakthroughs in CS to come very likely won't be written in C (or assembly, for that matter).


You cannot write effective code in Java, Python, or C++ without understanding the difference between references and values. You can certainly stick to higher levels of abstractions - most functional languages do a good job of that, especially ones like Haskell that emphasize immutability and value-only semantics - but pointer concepts are absolutely relevant to the industry-standard languages.


I kind of feel like as long as we have imperative programming languages, pointer concepts will be useful. I mean, that's basically what references are.


Add a Lisp or Scheme dialect to your list and I'm sold. Homoiconicity, macros, s-expressions and such are too powerful and novel not to cover in a college CS curriculum.


The problem is, people finish their CS study and do not have the level of knowledge that would let them start working right away. So Haskell might be nice, but it is hardly a sought-after skill. Sure, if students learned different stuff by themselves, they could pick up useful skills too, but most students are lazy and only learn what they have to.


So, what would you cut out of CS so they can learn things that make them employable right away, by a wide variety of companies, and not just yours?

Keep in mind it should be stuff that will have staying power, and not the latest fad, because you tend to go to school for at least 4 years.


The pursuit of knowledge shouldn't be dampened by silly things like getting a job. It's computer science after all.

Aside from that, learning Haskell at university has eventually made me a better programmer. It gives you a whole other world outlook and that's useful once you finally start figuring out how to build reasonable software.


"The pursuit of knowledge shouldn't be dampened by silly things like getting a job."

Here's how I just read this: "The pursuit of what makes me look good shouldn't be dampened by silly things like serving people other than myself".

And then, just to dampen that statement, you added "Haskell made me a better programmer", which clearly implies you got a job, serving the needs of others.

Would it be fair to say you are more concerned with looking good than serving others, based solely off what you said? Correct me if I'm wrong please.


I'm not the person you're addressing, but - yes, you're wrong. (And an asshole to boot, but that's another issue.)

"Getting a job" has absolutely nothing to do with "serving others," except maybe in the slavery sense of "serving," depending on your politics. The point is that universities are supposed to be dedicated to knowledge and understanding for their own sakes. It's not about "looking good," and I have absolutely no idea where you got that from. It's about learning, in whatever way best expands your understanding of the world, yourself, and the abstract.


That's exactly the spirit in which I was meaning it and there's not much more to add really.


You clearly have no idea what I'm talking about. Thanks for the compliments.


This is the worst attitude. University is not supposed to be vocational school.


It is de facto vocational school, though, since the demand for jobs generally outstrips the demand for labor and employers decided university credentials were indicative of what they were looking for.


No, the worst is when you get to talk with people fresh out of college, and you can't give them a job, because they are useless. Sober up people.


No, the worst is when you get to talk with people fresh out of college who "know" all the latest industry hype but still can't put basic programs together, lack basic problem solving skill and basic algorithmic thinking.

Sure, teach them some practical get-a-job stuff, but also teach them how to learn, how to solve problems, the different tools (functional, OO, logic, low-level etc) available and make sure they can solve real problems on their own.


> I never got the whole hype behind OOP.

There's really not much to it. Get value records right. Get method records right. Sadly, none of C, x86, or Haskell does this.


I love the analogy. I'll use it with my freshman students.


There's a link on the page to this essay by Dijkstra: http://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1012...

It's overall badly (at least foggily) written, but I find this quote really on point:

>The moral of the story is clear: real programmers don't reason about their programs, for reasoning isn't macho. They rather get their substitute for intellectual satisfaction from not quite understanding what they are doing in their daring irresponsibility and from the subsequent excitement of chasing the bugs they should not have introduced in the first place.

And that, ladies and gentlemen, is why I think people think C/C++ is an acceptable tool for anything besides drivers and kernels, for which they're only acceptable because there's no better alternative. Sure, pointers to pointers are a magnificent idea! I'll totally keep that under control!


You could actually argue the opposite.

People who can use C++ need to reason about their programs because otherwise they're going to shoot their foot off. You can be irresponsible in a language like python and get away with it because you're not going to cause the issues you would in C++.

I think that, to some extent, this is true. If you don't know what you're doing I find it's easier to code yourself into a corner in C++ than something like python.

(Disclaimer: I haven't used C++ in a decade)


I've never formally studied CS, but I can definitely see the benefit. I can divide my own (amateur) programming experience into pre-Haskell and post-Haskell. Before I started learning Haskell, I thought I knew how to use a good half a dozen or more programming languages. After I started learning Haskell, I realised I'd really only known how to use one all along, and that they were all fundamentally the same. I wish I'd had the experience sooner (plus, learning Haskell is just plain fun).


If you liked the "I'd really only known how to use one" type of experience, then I encourage you to keep learning languages after you are happy with where you are with Haskell. My experience is that languages stop feeling like cohesive entities and start feeling like a collection of features.

The nontraditional categories I've encountered are something like:

The macro facilities in Lisp, Scheme, Forth (or Factor), Template Haskell, and C++ templates.

Array languages like APL, K, or R.

Prototype inheritance with Self or Lua.

Avoiding side effects with Haskell, OCaml, or C#'s LINQ.

The "we're serious about type theory" languages like Haskell, Coq, Agda, and Idris.


As much as I love Haskell, one problem with the intro to CS courses is that many of the students are not CS majors, but students from other disciplines that need to learn a little bit of programming. It makes sense here to teach Java and Python, as imperative languages are more useful to real world programming than functional languages.

Perhaps it would be better to create a separate intro course for CS majors. On the other hand, most CS curricula demand that students learn about languages like Prolog and Haskell, so it's not like students will never be exposed to them.


At my university, pragmatic programming courses were part of the math department. If students want to learn to program, perhaps they should take programming courses rather than CS courses, instead of compromising the education of full-time CS students. The courses do have to be available though, and not all universities provide them unfortunately.


My alma mater is having this fight between the CS departments and engineering/physics/etc. The CS department switched the first two courses of the intro sequence from C/Java to Scala, moving C to later in the curriculum. Instead of C/Java/C++ in the first three semesters, it's now Scala/Scala/C++.

The department only has 7 professors and is struggling to even offer enough intro classes for their own majors, much less for students in other majors who only need a single CS credit. The engineering department is upset because they want their students to learn C. Physics wants them to know something else, the business department wants them exposed to a tiny bit of Java and so on.

When I started in 2008, there were about 30 freshmen taking CS1. Now it's >150, last I heard. They really had no choice but to (diplomatically) tell the other departments to go pound sand. They can't compromise their own majors in favor of those from another department.

My guess is they aren't the only CS department facing this problem.


To followup with a note from the article: "A very practical reason for preferring functional programming in a freshman course is that most students already have a certain familiarity with imperative programming. Facing them with the novelty of functional programming immediately drives home the message that there is more to programming than they thought."

It's no surprise that this attitude can discourage students without programming experience... A parallel intro track seems like a nice solution that has worked in a number of schools.


That actually did bother me in the letter.

He states:

- Most students taking CS courses are already familiar with an imperative language.

- We are all shaped by the tools we train ourselves to use, and in this respect programming languages have a devious influence: they shape our thinking habits.

This would imply that most CS students already come pre-ruined. At that point, does it matter what language is picked?

Another alternative would be to start functional languages in the data structures and algorithms courses (generally taken sophomore year, and among the first CS-specific classes taken), and use that to fix broken habits. I don't know if the professors, though, want to take time to teach a new language if students already know one.


This is the same guy who said

"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."

and

"The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense."

and

"FORTRAN, 'the infantile disorder', by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use."

Basically he's a mathematician who codes, and who wants code to be math.

Unfortunately in the real world code is also politics, business, history, sociology, and psychology. The math part is important, but it's not the whole story.

He's often funny, in a dry way, but it's useful to understand that he's actually not very insightful about the psychology of effective programming.

He praises mathematical intuition, and he wants to develop it, but he doesn't seem to have considered the possibility that simply throwing Haskell at people, or asking them to work through formal methods, may not be the best way to do that.

There's almost no useful research I know of that examines what mathematical intuition is, and even less about whether it's teachable. So Haskell and formal methods may help some people some of the time. But it's not in any way a given that they're the best of all possible teaching methods.

He also misses an entire class of UI/UX bugs where the code works exactly as it's designed to, and is provably formally correct, but an application is useless because the UI is incomprehensible and/or misleading and/or actively distracting to the task at hand.

Ironically, it's exactly that interface between fuzzy human expectations and implicit modelling tendencies and the rigidity of code that causes the most issues for practical projects.

This doesn't just apply to applications - it also applies to languages, operating systems, and development environments, some of which add unnecessary cognitive loads which make programmer mistakes more likely.

His solution - formal rigour - only works if you can formulate a problem precisely in the first place.

The set of problems where it's possible to do that doesn't come close to covering all the problem spaces touched by computing.


I've always dreamed of building a programmable calculator that runs a lightweight Python (or Haskell) interpreter to replace all that obsolete TI crap that most schools use. I would have loved that kind of thing as a kid.


FWIW, UC Berkeley's intro CS course sequence goes through the following languages in order: [Scratch (in an optional gentle-intro CS course)], Python (with an emphasis on using it for functional programming), a tiny bit of Scheme, a tiny bit of SQL (before they used a toy logic programming language), Java, C, MIPS assembly. I think this walks down the ladder of abstraction very nicely.


Odd to use Python as your intro to functional programming, since it lacks many functional programming features besides lambdas, which the language creator doesn't even like (http://www.artima.com/weblogs/viewpost.jsp?thread=98196)


Python is more object-oriented than anything else. It's not terribly functional. Most tasks that can be accomplished with functional programming in Python aren't.


Lambdas (in Python) are just syntax; it's usually better to achieve concise Python code using list comprehensions. The critical feature for functional programming is first-class (and higher-order) functions, which are supported and encouraged in Python. I think Python is better for an introductory functional programming course than C#, Java, C++ et al, which also support "functional programming features" but only in the context of a fairly rigid OO system. What do you think is missing from Python?


I'm actually somewhat sad that they added the two in front of Scheme. I took an intro to CS course there the summer before heading to college and it was taught in Scheme. The teacher literally taught us the language the first day and then, from there on out, it was learning different techniques through mostly computer labs. Prior to that, my only programming experience had been in TI-Basic, so it was an eye-opening experience to realize that programming was more than just an ordered set of instructions. If I hadn't had that experience, I think I might have learned the wrong lesson when my college CS courses dumped C on me.

I'm not sure about Haskell as a beginner's language, but I consider Scheme to be almost ideal for that purpose since it's so easily digestible.


I particularly like that they walk the ladder of abstraction downwards, not upwards. I'm not totally convinced, but I have a strong hunch that walking down that ladder is more pedagogical than walking up. At least in math I have a much easier time reading the proof if I know what it's trying to prove beforehand.


I was attending UT during this time period, but luckily it took until my senior year before the Java-fication of nearly every class was complete.

My intro class used Dr. Scheme, and I think it worked very well for introducing new concepts. Following classes were mostly C++ with some C and assembly mixed in. When they finally introduced Java, it took students a couple of days to learn it, since they all had decent C++ backgrounds from previous classes. I don't think that would work in reverse; a Java background is not going to let you pick up C++ in a couple of days. That is what the department did not understand: not knowing Java isn't that much of a detriment to getting an entry-level Java job. With a solid foundation, a junior programmer can pick up Java quite fast.


Indeed, my first CS class in 1999 at UT was in Haskell. Wow! What a change it was from high school AP Computer Science in Texas, which was (in those days) C++.

Seeing QuickSort in just one line was what drove it home.

I remember thinking at the time that this was really incredible, but that Haskell had no future. This was at the time of Hugs 98, although I remember hearing about GHC. I'm glad to be wrong about this.
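For readers who haven't seen it, the one-liner being recalled is presumably the classic list-comprehension version (it captures quicksort's divide-and-conquer recursion scheme, though, as discussed downthread, not its in-place behaviour):

```haskell
-- The classic Haskell "quicksort" one-liner (wrapped here for width):
-- partition the tail around the head, then recurse on each side.
qsort :: Ord a => [a] -> [a]
qsort []     = []
qsort (p:xs) = qsort [x | x <- xs, x < p] ++ [p] ++ qsort [x | x <- xs, x >= p]
```

For example, `qsort [3,1,4,1,5]` evaluates to `[1,1,3,4,5]`.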


Hold on: does Haskell allow "the" QuickSort on one line, or just a sort? Because from what I've read, if you want to make Haskell do all the efficient stuff in a QS, you have to tell it a lot more, bloating the program.

Edit: this is what I had in mind: http://augustss.blogspot.com/2007/08/quicksort-in-haskell-qu...

Btw, I took AP Compsci with C++ in Austin around that time, so we might know each other.


If you think quicksort is bloated, wait till you see a full implementation of depth-first search (I mean full: with all the back/cross/forward-edges and everything). It's a good exercise, but not very pretty.

Most of the bloat comes from the fact that when you implement it in Haskell, the resulting program doesn't really depend on being executed in the IO monad, and can be easily modified to run in many other monads too (like ST, or some transformed monad). The result is that the Haskell imperative version is far more general than a similar C implementation.


Like a lot of UT Computer Science, the implementation is a detail left for the reader. The semantics of the language make the one-liner O(n log n)--that's the takeaway.


But the one liner was the implementation, and doesn't do a QuickSort...


In tests and homework, we would implement the one-liner quicksort with paper and pencil, much like a math test. In a pencil implementation of that Haskell, it is n log(n). Each line on the paper represented one level of recursion.

Implementing it on silicon requires the trade-offs you mention. But silicon is just an implementation left to the reader...

EDIT: I guess where I'm coming from, Haskell was used in the context of the Theory of Computation, not real-world implementation details and the like.


I'm not disputing the scaling properties of the Haskell one-liner, but it doesn't give you the benefits of the true QuickSort as your comment suggests, e.g. it doesn't guarantee in-place sorting or efficient memory access.

The point is, I find it questionable to praise Haskell for how you can implement a one-line QuickSort that isn't really a QuickSort.


Each list comprehension is O(n). O(n)+O(n)=O(n). Each iteration will roughly bisect the list, resulting in roughly log(n) iterations. Thus the one-liner is O(n*log(n)).

Haskell says nothing about how lists are implemented, so with respect to "true"ness (I'm not really sure what this means), we cannot generalize. A sufficiently optimized Haskell compiler implemented in silicon would have the liberty of using the memory in-place.


>Each list comprehension is O(n). O(n)+O(n)=O(n). Each iteration will roughly bisect the list, resulting in roughly log(n) iterations. Thus the one-liner is O(n*log(n)).

"I'm not disputing the scaling properties of that sort" = this proof does not convince me of anything I didn't already agree with. Was I unclear about what I was objecting to?

> Haskell says nothing about how lists are implemented, so with respect to "true"ness (I'm not really sure what this means), we cannot generalize. A sufficiently optimized Haskell compiler implemented in silicon would have the liberty of using the memory in-place.

"Trueness" means "doing the real QuickSort". The one you posted doesn't. You can get the real QuickSort, but not with one line. A better compiler might identify useful invariants; you were not using such a compiler.

But if you had such a compiler, you wouldn't need to write it that way; you could just specify what conditions the final sort would obey, and that would be enough. You wouldn't need to tell it to filter explicitly.


What do you mean by the "real QuickSort"? The professor who taught me Haskell at UT used to work on Burroughs LISP machines and wasn't too concerned with caches, memory, or the like. Again, Haskell can be viewed as a language for computation. From this perspective, the one-liner is pure beauty.


Dijkstra's advocating a Sapir–Whorf hypothesis[1] for programming languages, which I've long suspected to be true but haven't been able to rigorously prove or disprove.

[1] http://en.wikipedia.org/wiki/Linguistic_relativity


My main problem with teaching CS courses in Java was that it produces folks who have no clue about pointers or memory management in general. This however applies to Python and Haskell as well. I personally had a strange mix of Java, x86 assembly and Ruby in my curriculum.


That's like saying "The problem with teaching people how to drive cars is that at the end, they don't know how to ride motorcycles".

There are plenty of other classes at the grad and undergrad level that teach you about lower-level concepts such as pointers and memory (systems, C, OS, ...). The Java classes teach different concepts.


I never got C or C++. The OS course, sure, but it was awfully theoretical and we didn't get to code all that much. Depending on the rest of the curriculum you should probably decide what to use to teach introductory CS.


Dijkstra isn't suggesting that the entire computer science curriculum be taught in Haskell. That would be impossible. He's saying only that the introductory course in programming paradigms be taught in Haskell. Haskell and Scheme are well-suited to such a course.


A classic curriculum in functional programming, which goes back as far as SICP and is continued in modern books such as PLAI, is to build an interpreter for a programming language. This forces issues like memory management to be discussed.
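A toy version of that exercise, sketched in Haskell (the names are mine): even this much forces you to decide how expressions and values are represented, and it grows naturally into questions of environments and storage.

```haskell
-- The seed of the SICP/PLAI-style interpreter exercise: a small
-- expression language and an evaluator over it.
data Expr = Lit Int
          | Add Expr Expr
          | Mul Expr Expr

eval :: Expr -> Int
eval (Lit n)   = n
eval (Add a b) = eval a + eval b
eval (Mul a b) = eval a * eval b
```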


I was lucky to get a mix of C, Java, and Haskell. Unfortunately (to my detriment) I totally ignored Haskell, and it didn't really click until I picked it up at the end of my degree out of curiosity- it really does help to clarify one's thinking.

I think teaching the motivation for learning it before beginning would have been better than just throwing it at freshmen, but I can't truly blame anyone other than myself.


I agree with him in the sense that these are CS courses, and in such a way Haskell ties together well with the math that one learns in a CS program. With that in mind, it makes sense to have taught Java only in courses specifically designed for developing software.


I am looking forward to the day the notion that functional languages are not suitable for developing software will die.

It is not only that functional languages are more succinct, easier to reason about and are more readable, they also often come with significantly better type systems. Some have type systems sophisticated enough to specify nearly all the legal states and guarantee these are the only states the program can stay in at compile time. One can look at languages with dependent types for that.
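Even without full dependent types, a phantom type parameter already lets the compiler reject illegal state transitions. A stock example (my own, not from the comment), assuming GHC with the DataKinds extension:

```haskell
{-# LANGUAGE DataKinds, KindSignatures #-}

data DoorState = Open | Closed

-- The door's state lives only in the type; opening an already-open
-- door simply does not type-check.
newtype Door (s :: DoorState) = Door { doorName :: String }

openDoor :: Door 'Closed -> Door 'Open
openDoor (Door n) = Door n

closeDoor :: Door 'Open -> Door 'Closed
closeDoor (Door n) = Door n
```

Dependently typed languages push this much further, but the idea of encoding legal states so illegal ones are unrepresentable is the same.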

On the more practical front, Erlang proved itself over three decades to be one of the best choices when it comes to fault-tolerant, highly concurrent systems. Jane Street is using OCaml for all of their trading software, Morgan Stanley has moved to Scala, and so on and so forth.


Neither Erlang nor OCaml brings with it the burdens of purely functional programming or the sophisticated type systems that you talk about. You also mentioned Scala, and that one too is extremely pragmatic and lets you mix and match both OOP and functional concepts instead of sticking to purely one paradigm. Erlang is dynamically typed, and OCaml does not mark side effects with types or force you into the monadic context for all such side-effecting computations.
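For contrast, a sketch (mine) of what "marking side effects with types" buys in Haskell: the effectful version is distinguishable from the pure one by its signature alone, and pure code cannot call it directly.

```haskell
-- Pure: the type promises no observable effects.
double :: Int -> Int
double x = 2 * x

-- Effectful: IO appears in the type, so the effect cannot be hidden.
doubleLoudly :: Int -> IO Int
doubleLoudly x = do
  putStrLn ("doubling " ++ show x)
  return (2 * x)
```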

Purely functional programming still remains a purely academic exercise because it fetishizes type systems to the detriment of all other concerns in software engineering. Although I do enjoy some of the things that come out of that kind of work, e.g. parser combinators.


There's nothing unfunctional about OOP. The two are not at odds. It's just frustrating when people confuse OOP with imperative programming and conclude that for sophisticated type systems to exist, all the lessons from OOP have to be thrown out. Imperative programming needs, to some degree, to be thrown out if you want programs that are easy for the compiler to reason about, but OOP doesn't.


Well there is something unfunctional about OOP, encapsulation of state and some implicit assumptions about mutability. Although I do agree that neither paradigm is at odds with the other. Scala demonstrates that there can be a fruitful interplay between both.

As for the existence of sophisticated type systems, I again don't disagree with you. I'm having a lot of fun playing with TypeScript and mixing and matching dynamically and statically typed portions of my code. I'm looking forward to seeing where that line of work leads, because it is an extremely pragmatic approach to type systems and helps me program instead of adding unnecessary cognitive overhead.


> Purely functional programming still remains a purely academic exercise

This is false.

Also, as far as I can see, only you mentioned pure functional programming. Sophisticated type systems do not mandate purity; see Scala for example.


That day will come, for many of us, when a large AAA game, web browser, or usable OS is written in a FP language.

And I realise that is an unfair target, but there is very little user-facing FP software. The only one I can think of that I have used is xmonad, which is both hard to use and fairly buggy.


You mean like Crash Bandicoot, Abuse, Jak and Daxter?

Or maybe Genera.

Or maybe the Remote Agent software used on NASA's Deep Space 1?

Or eventually the train control systems running on software from Siscog?


You can do functional programming in Lisp, but even the wiki page for GOAL says "GOAL encourages an imperative programming style". Not to mention most Lisp data structures being mutable if we're talking purely functional.


I have a problem with modern notions of purely functional.

My first functional programming language was Caml Light, back in 1996. Followed by Prolog and eventually Lisp.

All alongside traditional lambda calculus and logic proofs, with occasional references to a programming language called Miranda.

So for me, Lisp is functional programming and I don't buy into this modern notion that only Haskell is the poster child of FP.


It looks like alayne is using 'purely functional' to mean Pure + Functional, but you are using it to mean everything-is-an-expression, first-class functions, etc.: the functional paradigm.

Lisp is definitely a functional language. It's just not as pure as Haskell, which is the poster child for Maximally Pure FP, if not for the functional style.


Hence why I made the reference to Miranda.

We were already doing functional programming while Haskell was still in diapers.


Well, yes. Haskell is heavily inspired by Miranda. Haskell was just managed in a way that allowed it to become very popular.


Probably more like when Mozilla ships a browser written in Rust. Even though Rust handles low-level so well, it heavily uses FP concepts.

Meanwhile, this is good news for people creating startups or otherwise competing. Having advanced programming languages while your competition is mired in churning out 10x more code is certainly a benefit. Although not every (many?) business really comes down to technical ability.


> Probably more like when Mozilla ships a browser written in Rust.

Actually I would rather see this happening, than the continuous effort on Firefox OS.


> You mean like Crash Bandicoot, Abuse, Jak and Daxter?

Huh, TIL:

http://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp


While Crash Bandicoot used LISP (which I wouldn't call FP, although definitions vary), it was only for a small part of the code (the AI), nowhere near all the game was written in lisp.


Not there at all (slower than the Pascal equivalent, limited to fairly simple games, and uses some C), but nevertheless interesting is http://cleangl.sourceforge.net/thesis/CGL.pdf, which describes a game library in Concurrent Clean.

(reading http://clean.cs.ru.nl and looking at the archives of the mailing list may give the impression the project died at the end of 2011, but http://clean.cs.ru.nl/Download_Clean has binaries from November 2014)


Xmonad is buggy? I've been using it for 4 years and never had a problem. I actually also use it on multiple displays.


Since I use Xmonad as my window manager, I'm rather invested in you elaborating on Xmonad being "fairly buggy". Should I worry about my data?


I found that several applications would cause it to go crazy (lock up), including many Java applications and OpenOffice. Further, if you started adding many extensions, they would cause crashes (I'm sure it's because they were badly written, but I thought the whole point of Haskell was no crashes).

So, if you find nowadays Java apps and openoffice work fine (or, you don't use them), and you are careful about which extensions you add, then you will probably be fine.


Oh! I know what you are talking about and remember the issue. You had/have to add `setWMName "LG3D"' to the configuration, since versions older than Java 7 don't include Xmonad on their hardcoded list of supported window managers. Correct me if I'm wrong, but it doesn't seem like that one was Xmonad's fault.
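For reference, the workaround looks roughly like this in `xmonad.hs` (assuming the xmonad-contrib module `XMonad.Hooks.SetWMName` is installed; the exact config record, `def` vs. the older `defaultConfig`, varies by xmonad version):

```haskell
import XMonad
import XMonad.Hooks.SetWMName (setWMName)

-- Report the WM's name as "LG3D" so older JVMs, which match against a
-- hardcoded list of window managers, render their windows correctly.
main :: IO ()
main = xmonad def { startupHook = setWMName "LG3D" }
```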

Interesting, to be honest I don't really use any extensions. I do use Java apps and openoffice these days and they work fine.

Thanks for your response! I'll have to try adding some extensions and see what happens.


Yes, how statically-typed languages deal with dynamic data is one of the interesting features of type systems. In C for dynamic types you resort to void *, in Go it's Interface{}, but in OCaml you can construct a type that identifies exactly what you need.
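A Haskell analogue of that OCaml approach (my own toy example): a closed sum type names exactly the dynamic shapes you expect, so the compiler can check that every case is handled.

```haskell
-- Instead of an untyped escape hatch (void*, interface{}), enumerate
-- the possibilities; a forgotten case is a compiler warning, not a
-- runtime surprise.
data Value = VInt Int
           | VStr String
           | VList [Value]

render :: Value -> String
render (VInt n)   = show n
render (VStr s)   = s
render (VList vs) = "[" ++ unwords (map render vs) ++ "]"
```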


I suppose you are correct, although all the training I received for software engineering was in straight OOP Java. I really dislike java in general.

Now I write mostly in python and go, learning a bit of haskell.


My favorite quote:

"A fundamental reason for the preference is that functional programs are much more readily appreciated as mathematical objects than imperative ones, so that you can teach what rigorous reasoning about programs amounts to. The additional advantage of functional programming with “lazy evaluation” is that it provides an environment that discourages operational reasoning."


Although I did electrical engineering rather than CS, I would have loved to get some more exposure to functional programming. We got Scheme, which everybody hated because of the cumbersome bracket counting.

And of course we got C, Java, Matlab, and VHDL, besides a bunch of assembly. VHDL or Verilog would maybe also be a nice eye-opener for CS students. It's again another mindset.


Think about how that would have changed if they had a simple text editor with parenthesis highlighting.


Absolutely. Or parenthesis coloring, that would be even better.


While I wouldn't start with Java (it's not the simplest language around - I'd probably use C or Python), Dijkstra's anti-Java rant is over the top. It's still a huge language and that's NOT because of any corporate advertising campaign. Something that succeeds as well as Java has to have some actual value.


>Something that succeeds as well as Java has to have some actual value

Well yes, it has some value. However, just because a lot of people like/believe something doesn't make it good/true.

It's not like corporations have some vast conspiracy in place to encourage the use of Java; it's simply that Java, for a number of reasons, is attractive to middle-management types, even though it's not the best language to make good software. That's not necessarily a criticism of Java; from many perspectives, making good software is not the primary goal.

Dijkstra's big thing was prioritizing good software over cheap software. This may not be a realistic goal, but it certainly explains why he preferred Haskell to Java.

He wrote some really good essays on why you need languages like Haskell to raise the overall quality of software floating about. The more you can shift the burden of guaranteeing correctness away from humans and towards infallible mechanical systems (like Haskell's relatively powerful type system), the more likely you are to end up with good/correct software.
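A small concrete instance of that burden-shifting (a standard example, not Dijkstra's): making absence explicit in the type turns would-be null-pointer crashes into compile-time obligations.

```haskell
-- The caller must handle Nothing; the compiler, not code review,
-- enforces that the empty-list case is considered.
safeHead :: [a] -> Maybe a
safeHead []    = Nothing
safeHead (x:_) = Just x

describe :: [Int] -> String
describe xs = case safeHead xs of
  Nothing -> "empty"
  Just x  -> "starts with " ++ show x
```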


> Something that succeeds as well as Java has to have some actual value.

Like cigarettes?


"Something that succeeds as well as Java has to have some actual value"

Sounds an awful lot like the bandwagon fallacy.



