Imperative vs Functional Programming (apocalisp.wordpress.com)
81 points by gtani on May 31, 2011 | 66 comments



Quoted by the article: Computers function by executing a series of instructions, at the level of machine language, which perform calculations and manipulate memory and I/O.

Today, our computers increasingly have many cores, and many threads. On top of that, we often have to co-ordinate between dozens of separate servers, each with their own memory and I/O.

In this environment, manipulating memory becomes tricky, because you must co-ordinate among anything from 2 to 2,000 simultaneous processes. This raises a whole host of issues: (a) direct interaction using locks and semaphores is hard to get right, (b) the memory in question may be located on separate machines, and (c) if you're running at a billion instructions per second with a 0.1 second lag between machines, simply defining "simultaneous" can give you a headache. And with the rise of GPUs for blazingly fast general-purpose computation, even the hardware itself is becoming deeply functional.

We've come up with a whole host of strategies for tackling this problem: Worker queues, MapReduce, Erlang-style message passing, shared-nothing architectures backed by databases, and even eventually-consistent NoSQL systems. Each of these, in some fashion, represents a retreat from the old-school model of a single process manipulating global memory.

In this world, functional programming adds one more tool to our kit. It treats mutation and global memory as a special case, and gives us a new vocabulary for working in this world: map, reduce, transactional memory, monads, and local state (via Haskell's ST monad, for example). Even if you work in traditional OO languages, this functional vocabulary will give you new ways to tackle hard problems.
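A tiny sketch (in Python, purely illustrative) of the vocabulary shift the comment describes: the same computation expressed first as mutation of an accumulator, then with map and reduce.

```python
from functools import reduce

# Imperative style: mutate an accumulator in place.
def total_squares_imperative(xs):
    total = 0
    for x in xs:
        total += x * x
    return total

# Functional style: no mutation; map each element, then reduce.
def total_squares_functional(xs):
    return reduce(lambda acc, x: acc + x, map(lambda x: x * x, xs), 0)

assert total_squares_imperative([1, 2, 3]) == total_squares_functional([1, 2, 3]) == 14
```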


I actually like the machines/mathematics dichotomy in CS, though it often seems to come up as a one-versus-the-other holy war. I guess I'm actually okay with both of these statements, and am glad people are pushing both:

1. Computation is fundamentally a machine doing things; although of course these machines' operation is also analyzable via mathematics; and

2. Computation is fundamentally a mathematical phenomenon; although of course it is also implementable on machines.

I guess there's a philosophical debate about which is more fundamental (does mathematics 'exist' ontologically?), but pragmatically I'm okay with considering them both fundamental approaches to CS.



There's a fantastic paper called "Why Functional Programming Matters" by John Hughes [1]. Most of you here will know it, but it's so good that I wouldn't want any newcomers to miss it.

It's from 1984 and IMO still the best essay on the basic idea of FP. For Hughes, FP's fundamental strength is in program composition and modularity, and he shows this with relevant examples.

[1] http://www.cse.chalmers.se/~rjmh/Papers/whyfp.html


"programmers well versed in functional programming are extraordinarily productive in any language"

This could just be due to the fact that people who learn functional languages are interested enough in learning languages that they're productive in most of them. If, for example, functional languages were the norm, and some people decided to experiment with OO languages, then it might be that people who knew OO languages would be the ones that were productive in any language.


Except that FP basically dominates class-based OO[1]. If FP were the norm, nobody would study class-based OO. The remaining alternatives could be prototype-based OO, bare-bones programming (assembly, C, LLVM), stack languages, and domain-specific languages.

[1]: http://www.loup-vaillant.fr/articles/classes-suck


That's an opinion, and it may have merit and be valid, but it still could be the case that people might want to experiment with OO languages. It also doesn't change the fact that people who know FP might just be better in any language because of their interest in programming languages, not because of the merits of FP.


It might be true that functional programming is difficult to learn for programmers not well versed in mathematics. My response to that is: Toughen up, princess.

Pithy, but also tinged with sadness. It's a response to the same argument made against programming languages being "too powerful" or using "difficult to understand" patterns like parser combinators.


I'm a fan of FP, and I find most of the arguments being made here for FP over OO to be quite unconvincing. You don't need to understand mathematics any more in FP than in OO. If there is any mathematics in FP, it's of the logical variety anyhow, which programmers are already familiar with.


Many programmers have also become blind to all the conceptual baggage needed for conventional OOP programming. What the heck does "public static void main" mean?


Since I don't have any training in Category Theory, I don't know what I don't know. I do know that it seems from the outside of the Ivory Tower that the explanations for FP are far more difficult to understand than the FP itself.

But then again, perhaps if I had studied mathematics, I might have some additional insights into FP that go beyond knowing all of the jargon.


You don't need to understand the Y Combinator to understand recursion. You don't need to understand the theory behind the Lambda Calculus to write anonymous functions. You don't need to understand Gentzen Calculi to write interesting Prolog programs. You don't need to understand the Curry-Howard isomorphism to understand static typing.
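As the comment says, you don't need the Y combinator to understand recursion; for the curious, though, it fits in a few lines. This hypothetical Python sketch uses the strict-language variant (often called the Z combinator), since Python evaluates arguments eagerly:

```python
# Z combinator: builds recursion out of anonymous functions alone.
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Factorial without a single named recursive call.
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))

assert fact(5) == 120

# Ordinary recursion works just as well -- which is the comment's point.
def fact_plain(n):
    return 1 if n == 0 else n * fact_plain(n - 1)

assert fact_plain(5) == fact(5)
```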

These things certainly deepen your understanding, but not understanding them does not in any way limit your ability to write incredible programs.


To add to what you said, you certainly need not understand category theory in order to be effective as a functional programmer.

Category theory is a very powerful concept. Understanding elements of it will certainly open your mind to the fundamental nature of mathematics. But the essential aspects of it have long been mapped into other, more accessible theoretical frameworks. So you can usually get away with never knowing that subject :)


It might be true that functional programming is difficult to learn for programmers not well versed in mathematics. My response to that is: Toughen up, princess.

It's not that bad. To make an OO comparison: You didn't have to invent MVC (or any other design pattern) in order to use it. I don't have to know how to invent a monad to model the next 20 effects just to use the state monad today.

FP gets its design patterns from many brilliant mathematicians and logicians. The pipeline to the math/logic world is wide open in FP. And you can just have a sip if you want.
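To make the "just use the state monad today" point concrete, here is a hypothetical Python sketch of the underlying pattern: a state "action" is a function from a state to a (value, new state) pair, and `bind` chains actions together. All names here are illustrative, not from any library.

```python
# A state "action" is a function: state -> (value, new_state).
def get():
    return lambda s: (s, s)

def put(new_s):
    return lambda s: (None, new_s)

def bind(action, f):
    """Run `action`, feed its value to `f`, run the resulting action."""
    def run(s):
        value, s2 = action(s)
        return f(value)(s2)
    return run

# Increment a counter twice without any mutable variable.
tick = bind(get(), lambda n: put(n + 1))
program = bind(tick, lambda _: bind(tick, lambda _: get()))

assert program(0) == (2, 2)  # final value and final state
```

You can use this pattern without ever having invented it, which is exactly the MVC analogy above.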


"It might be true that functional programming is difficult to learn for programmers not well versed in mathematics."

This is only seen to be a valid complaint because mathematics education in the US sucks.

As such, future dominance in the functional programming field is likely to come from other countries.


He quotes the guy saying "Computers function by executing a series of instructions".

This is no longer true, and becoming less true all the time.

Using a massive number of GPUs and many CPU cores is no longer a simple series of instructions.

Then, the more abstract model of functional programming allows the implementation more freedom when distributing work on the many cores/GPUs, and can make use of this new architecture.

If we have even more radical changes in hardware architectures, it is almost certain that the more abstract models of programming will survive better than those tied to a particular hardware architecture (the von Neumann model).


Then, the more abstract model of functional programming allows the implementation more freedom when distributing work on the many cores/GPUs, and can make use of this new architecture

I have read this many times over the years and it sounds perfect. It would be great to have some form of programming that maps to parallel architectures without headaches and bug-prone extra work.

Still, for GPU programming we're mainly stuck with things like OpenCL/CUDA, which are as imperative as ever, and it's a lot of work to port non-trivial programs to them.

Also, even with Haskell/Erlang/etc AFAIK you still have to do special handling to get programs to be efficient with manycore parallelism. And that's not even talking about GPU.

Has this system ever been implemented, or does it just exist in theory?


The basic problem in porting to OpenCL/CUDA is identifying the mapping and reduction operations buried in imperative code. That these data-parallel operations are fundamentally functional is clearly reflected in the primitives that the Thrust and CUDPP libraries provide.
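A toy Python illustration of that "buried" pattern (assuming nothing beyond the standard library): the loop below interleaves a per-element transform (a map) with an accumulation (an associative reduction); separating the two is the first step toward handing each part to a data-parallel primitive like those in Thrust or CUDPP.

```python
from functools import reduce

data = [3, -1, 4, -1, 5]

# Buried: transform and reduction interleaved in one loop.
best = float("-inf")
for x in data:
    y = x * x          # the map: element-wise and independent
    if y > best:       # the reduction: an associative max
        best = y

# Unburied: the same computation as explicit map + reduce,
# the shape that data-parallel libraries expect.
assert reduce(max, map(lambda x: x * x, data)) == best == 25
```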

The real challenge emerges during program restructuring and performance tuning. This requires intimate architectural knowledge and frequent trade-offs between conflicting optimisations.


So we're already doing 'fundamentally functional' operations? That's an interesting way to look at it, but it's still driven from an imperative language. The grandparent post talked about functional programming in a functional programming language on a GPU.

Is Theano a step in this direction? ( http://deeplearning.net/software/theano/ )


>> Then, the more abstract model of functional programming allows the implementation more freedom when distributing work on the many cores/GPUs, and can make use of this new architecture

> It would be great to have some form of programming that maps to parallel architectures without headaches and bug-prone extra work.

One of the advantages of age is that you can remember instead of predict.

There are three problems - finding enough work that can be done in parallel, controlling the amount of parallel work that you find (yes, you can find too much), and managing the overhead.

Functional languages help superficially with the first but the latter two are important as well.

One reason why I write "superficially" is that writing in a functional language isn't enough - you have to write programs with the correct dependencies as well. Functional languages merely make it easier for the compiler/run-time to identify the dependencies in the program, which is a very different thing.


Functional programs encode their dependencies as arguments/return-values. In Haskell, using the par combinator is guaranteed to only affect performance, and not the actual result of the program.
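Python has no `par`, but the guarantee the comment describes can be illustrated by analogy: for a pure function, evaluating in parallel can only change timing, never the answer. A minimal sketch using the standard library's thread pool:

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):          # pure: the result depends only on the argument
    return x * x

xs = list(range(100))

sequential = list(map(square, xs))
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(square, xs))   # results come back in order

# Parallel evaluation of a pure function affects performance only,
# never the result -- the property Haskell's `par` guarantees by construction.
assert parallel == sequential
```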

This is already a huge boost.


> This is already a huge boost.

Like I said, I've been to this rodeo before. Heck, I even drank the Kool-aid one of the previous times that functional languages were going to enable parallel execution. Since there's nothing new this time, I think that it's safe to say that the result will be the same.

Easy-to-find and nicely separable dependencies may be necessary, but they aren't anywhere close to sufficient. (BTW, the relevant criterion is not a "huge boost" but a significant boost, and I don't mean distinguishable from random chance.)


What functional languages did you see before? Were they pure? Could they guarantee that parallelism did not affect the semantics of their programs like Haskell can?

Did you actually see the use of the "par" combinator?


> What functional languages did you see before?

If you're going to argue novelty, shouldn't you be the one describing past work?

> Were they pure? Could they guarantee that parallelism did not affect the semantics of their programs like Haskell can?

Yes. Why is this surprising to you?

> Did you actually see the use of the "par" combinator?

Do you really think that "do something before it is strictly needed in a lazy language" is new?

In other news, your generation didn't invent/discover any sex acts. (Neither did mine.)


It is surprising because pure functional languages have not been around for a long time, especially not ones with good optimizers.

> Do you really think that "do something before it is strictly needed in a lazy language" is new?

That's not what the "par" combinator does.

"par" along with "pseq" is already used, today, to achieve parallelism in Haskell code, and it already is easier to parallelize than in other environments.

Additionally, STM in Haskell is usable in practice, unlike other implementations of software transactional memory, and it also relates to purity (in this case, the ability to distinguish different types of impurity).


All of this work has already been done at the first massively parallel (64k processors in early 1980s) supercomputer startup: Danny Hillis' Thinking Machines. Look at StarLisp, Paralations, and NESL.


Adding lots of GPUs and CPUs doesn't make that not true. Those processing units still work by executing a series of instructions, they just do some fancy things with regards to the order of instructions and communication between units.

It may be more productive to think of things in terms of a more abstract, functional model of computing, but in the end, for anything to actually happen, that abstract model must be translated into some series of instructions. I think that's what the quote was trying to get at.


It would be quite possible to design a processor that is not designed around series of instructions at all, but something more akin to functional programming. In this case, everything will “actually happen” without any kind of series of instructions.


Maybe my imagination isn't strong enough, but I don't see how a processor could work without being reducible to a series of instructions. The lowest level of "code" you give it might be more "functional", but there's a direct mapping between that and a sequence of states the processor goes through to actually do anything.

In looking for such a thing, I've seen http://en.wikipedia.org/wiki/SECD_machine and http://en.wikipedia.org/wiki/Graph_reduction_machine but I'm not sure either of these fits your description. Do you have experience with them, or another machine you could share?


One thing to say is that OO doesn't mean non-functional and functional doesn't mean non-OO. For instance, Python's string type is functional since you can't alter it: 'test'.replace('t', 's') will always return something new and doesn't depend on anything but the string itself. I'm not saying Python by itself is purely functional (just as Scheme is not purely functional).

I'm saying that because I've recently had great success using immutable objects in languages that aren't, how to say, functional friendly.
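The string example above, plus one hedged way to get the same discipline for your own objects in Python, using the standard library's frozen dataclasses:

```python
from dataclasses import dataclass, replace

# The string example from the comment: the original is never altered.
s = 'test'
assert s.replace('t', 's') == 'sess' and s == 'test'

# The same discipline for your own objects in a language that isn't
# especially "functional friendly": frozen dataclasses reject mutation,
# and replace() returns a fresh copy instead of modifying in place.
@dataclass(frozen=True)
class Point:
    x: int
    y: int

p = Point(1, 2)
q = replace(p, x=10)     # new object; p is unchanged
assert (p, q) == (Point(1, 2), Point(10, 2))
```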


I always try to explain to people that it's not about FP vs. OO. You can have a very functional OO language (Dylan or Scala come to mind) or an imperative OO language (Java).

The really important fight is FP vs. imperative. Whether the language is OO or not doesn't really matter.


How is Java, as opposed to its occasionally questionably-designed API, "imperative OO" in ways that Scala is not?


It's hard to use functional paradigms without closures and first class functions. But, more importantly, I would say it's more about the mentality of API designers and java programmers.

I once used higher-order functions in PHP in a real production environment, and it caused so many political problems that I had to rewrite everything in an imperative style. Not that the code was badly written; it's just a different style that not everyone is used to.


I'd totally agree with the mentality bit, but it's not hard (although it can be unwieldy) to write functional-style code in Java if you're disciplined.

Imperative programmers will write imperative Scala, too.


By hard I just mean it's extremely verbose. (i.e. implementing interfaces for each anonymous function is annoying at best)


For OO you have to have state. Objects should have identities which distinguish them: two objects are distinct even if all their properties are equal.

This breaks referential transparency. Two runs of "new Integer(2)" will be different.

So you cannot have a proper functional language with OO, even if all your objects are immutable.
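The equality-versus-identity distinction at issue here, sketched in Python (an illustrative stand-in for the Java `new Integer(2)` example):

```python
class Box:
    """A trivial wrapper whose equality is defined by its contents."""
    def __init__(self, value):
        self.value = value
    def __eq__(self, other):
        return isinstance(other, Box) and self.value == other.value

a, b = Box(2), Box(2)
assert a == b          # equal in every observable property...
assert a is not b      # ...yet distinct identities, like two `new Integer(2)`s
```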


Doing stuff based on identities is bad. Two runs of "new Integer(2)" may be different, but I don't care about that as long as "equality" checks are based on, guess what, equality and not identity. What address on the heap the objects live at is an implementation detail.

If I remember correctly, in Dylan you have two operators, = and ==: the first checks for equality, the second for identity.


Yes you can. Check out my language Babel-17 (http://www.babel-17.com) .


>>So you cannot have proper functional language with OO.

>Yes you can. Check out my language Babel-17

I fail to see how purely functional structured programming addresses the need for identity for objects.

Could you elaborate a little more?


Noob question: what the hell are you needing identity for?


To differentiate objects one from another.

http://en.wikipedia.org/wiki/Identity_%28object-oriented_pro...

"identity is the basis for polymorphism in object-oriented programming."

In purely functional object-oriented programming you don't need identity. But then that practice is almost indistinguishable from the usual pure functional programming we have had for at least 20 years with Haskell.


Forgive my insistence, but what kind of problem does the ability to differentiate objects from one another solve? What kind of programs are made simpler by this ability?

Besides, I don't see how you would need identity in the sense of being structurally-equal-but-not-the-same for class-based polymorphism. You could perfectly well do this without any side effect, despite what the `final` keyword in Java might suggest.

By "identity" Wikipedia probably means something like "you think I'm a Foo, which is right, but underneath, I'm a special kind of Foo: a Bar. By the way, there are other special kinds of Foo: the Baz, the Fiz, the Buz…". In other words, subtyping is the basis for polymorphism in OOP.


>Forgive my insistence, but, what kind of problem does the ability to differentiate objects from one another solves? What kind of programs are made simpler by this ability?

Oh, I don't know. I am not an expert in OO* at all.

All I know is that for something to be true OO it has to have objects with identities. I was told so. ;)

All I know about OO is Liskov substitution principle. Where it is unnecessary I tend to use State monad and immutable objects.

>subtyping is the basis for polymorphism in OOP.

Which can be solved by several means, interfaces and inheritance being two of them.


> All I know is that for something to be true OO it has to have objects with identities. I was told so. ;)

Well, you were told wrong things :-)


Please, list object-oriented systems with objects with identities and object-oriented systems without those. Count both. Discuss. ;)

Alan Kay, who coined the term "object-oriented", described an object-oriented system as an assembly of objects communicating by message passing. Message passing requires identities (and state, except in severely reduced forms).


Message passing does require a sender, a receiver, and a message. You can have those in purely functional programming. No identity needed. But if you really want one, you can have it: two objects are identical if they react to all messages in the same way. You can also explicitly define equality ==, and then say that a and b are identical if a == b.
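One hedged sketch of the claim (in Python rather than Haskell or Clean, purely to keep it short): a "receiver" is just immutable state plus a pure handler, and no notion of identity ever arises. All names here are made up for illustration.

```python
# A "receiver" is immutable state plus a pure handler:
#   handle(state, message) -> (reply, new_state)
# No identity is involved: two receivers with equal state
# react to every message identically.
def handle(balance, message):
    kind, amount = message
    if kind == 'deposit':
        return ('ok', balance + amount)
    if kind == 'query':
        return (balance, balance)
    return ('unknown', balance)

reply, account = handle(100, ('deposit', 50))
reply2, account = handle(account, ('query', None))
assert (reply, reply2, account) == ('ok', 150, 150)

# "Identity" reduces to reacting the same way to all messages,
# which for this encoding is just equality of state.
assert handle(100, ('query', None)) == handle(100, ('query', None))
```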


>Message passing does require a sender, a receiver, and a message. You can have those in purely functional programming. No identity needed.

Would you mind showing me some code in a pure functional language?

I mean, in Haskell or Clean. Those are pure enough.


http://www.loup-vaillant.fr/articles/classes-as-syntactic-su...

My OCaml code is not pure, but we can easily remove any occurrence of `ref`, restricting ourselves to pure methods, which produce a fresh new object instead of modifying `this` in place.


Sorry, I am not very fluent in OCaml. Also, I didn't find the word "message" in the link above.

I am experiencing the limitations of pure objects and message passing almost right now. In my spare time I am currently developing a dynamic dataflow CPU which is based on immutable "objects" and message sending. It is quite an unusual style of programming, very unlike anything else, including Smalltalk, Erlang, and everything else.


It's like Haskell, with a slightly heavier syntax. And it's eager. And you can unsafePerformIO all over the place (but avoid doing so, even in OCaml).

    (* OCaml *)    -- Haskell
    let foo = 42   foo = 42    -- top-level declaration
    type bar =     data bar =  -- type declaration
Also, records in OCaml are a separate kind of type, more like C structs:

    type bar = { x:int;     (* definition *)
                 y:float;
               }
    bar.x                   (* "member" access *)

"Message passing" and "Method calling" are strictly equivalent, at least when everything is synchronous.


Smalltalk is a beautiful marriage between OO and functional programming.


The contrast to imperative programming is not FP, but rather declarative programming. The idea of referential transparency comes from DP, not FP.

Example: Prolog, a declarative language with many of the qualities of FP, yet not FP.


Prolog is not really declarative except for the pure core, though. Cuts, assert, retract, etc. break its referential transparency.


How is functional style rooted in the "purpose" of the software and the "nature of the thinking" that programmers do?

Well I would say that for now the nature of thought is still unknowable.

The process of thinking is something a programmer acquires by rote training. I don't believe that there is a "style" of programming that is universally applicable or models the very nature of thought itself. Such is the stuff programmers dream of.

Even language is an abstraction we put on reality.

Mathematics is just another language, and one many feel represents reality more succinctly and accurately.


The fact of reality that gives rise to the FP approach is the recognition that computation is essentially a process of inferring quantities. Its purpose is to aid in human thinking. And the conceptual faculty is a quantitative mechanism. This is laid out eloquently in David Harriman's book "The Logical Leap". See also "Concept-formation as a mathematical process" in OPAR.

The computer is in essence a quantifying device. Its invention was motivated by man’s need to quantify.

Anyone else tired of verbose writing?


What about dataflow programming? Shell scripting is one example. I think the same thing could be implemented using event driven programming, the event being the completion of some function, though that would be a little less clear.


What would be a good first functional language to learn coming from an imperative/OO background?


Depends on you: you can either jump in the deep end, or gradually move toward it.

Gradually moving is good if you want to stay productive (Scala, Common Lisp, Dylan come to mind).

Jumping in the deep end will totally go against everything you learned, but you will learn very fast (Haskell, ML, Clojure, and Scheme are probably best here).

Books: Land of Lisp / Practical Common Lisp for Common Lisp; not sure about Scala; Programming Dylan for Dylan.

Learn You a Haskell for Haskell; Programming Clojure / Practical Clojure for Clojure; Structure and Interpretation of Computer Programs / The Little Schemer for Scheme. (Don't know a good book for ML.)


I'd put ML closer to the gradual camp, though it depends strongly on where you learn it from, and partly on which flavor you choose. I see a whole lot of procedural-flavored OCaml, and it's fairly easy (like Lisp) to mix procedural and functional styles.


OK, I didn't know that. I have never seen a line of procedural code in ML. Thanks for the info.


Yes, the plan is to jump in the deep end, unwire my brain and apply the concepts I learn to my day to day Ruby/Javascript programming as much as possible. I think I'll go with Haskell.


I am currently learning Haskell. I started by not really knowing where to start. Now I follow this advice, and until now I find it very helpful:

http://stackoverflow.com/questions/1012573/how-to-learn-hask...


If you are interested in the "deepend" then you could always have a look at lambda calculus and combinators:

http://en.wikipedia.org/wiki/Lambda_calculus

http://en.wikipedia.org/wiki/Combinatory_logic

Then you get to see where the term Y Combinator came from :-)


OCaml. You can get "real work" done by dropping back to familiar imperative constructs when necessary, yet you still have an excellent functional language at your disposal. Eventually you will go more and more FP as you become more comfortable, e.g. you can use a for loop if you want, but pretty soon you'll be more comfortable with a map and it will come naturally. Or you can use an if.. else construct until you are happy with match...with.
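The "for loop gives way to map" transition the comment describes is about OCaml, but the habit is the same in any language with first-class functions; an illustrative Python analog:

```python
# Imperative habit: build a list with a for loop.
squares = []
for x in range(5):
    squares.append(x * x)

# The habit you grow into: express the same thing as a map.
assert list(map(lambda x: x * x, range(5))) == squares == [0, 1, 4, 9, 16]
```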

I personally treat OCaml as a type-safe Python or Tcl, you can work very productively with it.


Standard ML. It's simple, well-designed, stable, and consistent.

http://www.itu.dk/~tofte/publ/tips.pdf

http://homepages.inf.ed.ac.uk/stg/NOTES/notes.pdf

A full book. It's good, but don't start with it: http://www.cs.cmu.edu/~rwh/smlbook/


Probably my language Babel-17 (http://babel-17.com) if it just came down to the concepts.

The Babel-17 ecosystem (input, graphics) is not in place yet, though, so for now the best language would be Standard ML.



