Don't Distract New Programmers with OOP (dadgum.com)
192 points by ColinWright on Dec 15, 2013 | 218 comments



IMHO, the thing that new programmers really struggle with is learning how to break a problem down into the steps their language allows them to express. Lots of people have never had to logically think out an algorithm to do simple things, and it takes a lot of practice to learn how to do that.

For example, suppose I'm a new programmer given the notional task of writing a program to make me a sandwich. Can I just:

  makeSandwich(); 
and be done with it?

Or do I need to break it down to the n-th degree?

  //make bread
  bread=harvestwheat();
  bread=millwheat();
  bread=bread+water;
  //and so on
is that the right way to even make bread? Or do I already have bread I can work with someplace?

Do I?

  Sandwich=bread+jam+butter+bread;
Or

  Sandwich=Sandwich.add(bread).add(jam).add(butter).add(bread);
Or do I need to do something else? Maybe this particular programming environment doesn't need me to make bread, but I have to transform a milk object into butter for some reason that seems entirely random to the new programmer. Who knows? Lots of the things that we take for granted seem entirely arbitrary to the new programmer and learning to break a problem down in a way that maps to the environment is a very hard problem to learn to solve repeatedly.

Edit: in a way this is a common topic even on HN, with the frequent "do x in n lines" posts.


Here is a methodology that I recently started using:

Describe, at a high level, what needs to be done to make a sandwich, in your head. Then write code that looks as much as possible like what you said in your head. Don't worry if it turns out to be impossible to implement that way - you can always tweak it later until it becomes possible. Just try to make sure it reads more or less clearly.

Repeat with each of the steps, checking whether there already is a library that does that for you.

  function makeSandwich() {
    var breadSlices = getBreadSlices(2);
    var ham = getHam();
    var cheese = getCheese();
    var butter = getButter();
    var butteredSlice = spread(breadSlices.first(), butter);
    return foodPile(butteredSlice, ham,
                    cheese, breadSlices.last());
  }
Continue the same way with getBreadSlices, getHam, getCheese, getButter, spread and foodPile.

As you progress this way, you'll notice that you'll be able to implement some processes in terms of more generic functions (e.g. spread and foodPile).

You'll also notice that you need an environment (macro, such as a kitchen, or micro, such as a frying pan) to temporarily place the stuff that you're working with while you wait for some process to complete. That's when you start creating classes and objects that represent this environment.
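For example (a rough sketch; Python for brevity, all names invented):

    class FryingPan(object):
        """A micro-environment: temporarily holds whatever is being cooked."""
        def __init__(self):
            self.contents = []

        def place(self, item):
            self.contents.append(item)

        def take(self):
            return self.contents.pop()

    pan = FryingPan()
    pan.place('ham')
    fried_ham = pan.take()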

Real code example:

https://github.com/jlipps/async-showcase/tree/master/promise...


Gerald Sussman explained this in one of his SICP lectures. He called it wishful thinking.

One of the best ways I ever came across to break down a problem.


I know it's a bit silly, but the thing that jumped out about your comment and the parent comment is the butter. You butter a ham and cheese sandwich? Is this a cultural thing I'm not familiar with? zooms off to google to check


I moved from the US to Ireland a year ago, and while I love it here the constant vigilance against sandwiches slathered in mayonnaise and butter can be exhausting. I go to the sandwich shop and they helpfully ask "Butter or mayo?" - to which I respond "mustard", and immediately reveal myself as an outsider.

I thought garlic fries, like the sort at Giants games, were delicious! Some garlic, olive oil, parmesan, etc. I was shocked to discover that here and in the UK garlic chips consist of chips (fries) with garlic-accented mayonnaise spread all over them.

And don't call it Aioli. That's just dressing up a disgusting condiment with a fancy word.


I live in the UK, and the thing I thought summed up the British and Irish obsession with mayonnaise best was when I recently bought a hoisin duck wrap at the local Tesco and it had "No mayo" in big letters on the front, as if the lack of mayo was something highly unusual and/or edgy.

Since then I've looked at the ingredient list of every other product with great suspicion, expecting to find mayo everywhere...


I'm not an outsider, yet I always answer "no thanks".

I use butter and mayo for one of two things: flavour (e.g. I love Kerrygold on toast ;-) often nothing else) or moisture - but if I don't need the moisture, usually because I've got tomato, peppers, coleslaw, relish or something else providing that, I will leave butter and mayo out. Unless I want them for the flavour. (As an aside, I don't understand why people would ever want butter on a breakfast roll.)


"... usually because I've got tomato, peppers, coleslaw, relish or something else providing that..."

These are precisely the instances in which I would want something oily (butter, marg or mayo) as waterproofing for the bread. (Mustard also works, if it's appropriate for the sandwich.) "Everyone has the gout," as they (don't actually) say in French.


I feel I should give credit where it's due - while I'm no fan of butter on sandwiches, the butter here is fantastic, and makes for delicious toast, popcorn, etc.

I just don't need a peanut butter and butter and jelly sandwich.


Where were you in the US that didn't put butter and mayo on sandwiches? Both seemed to be standard on sandwiches I bought in the Bay Area.


Odd.. This would be Berkeley, Santa Monica, and San Luis Obispo.


I can't think of a sandwich I wouldn't butter, including peanut butter.


How could you eat it without butter? Isn't it far too dry to be edible? But this could be because of the difference in bread; just image-google "bread" vs. "Brot".


I can't think of any type of bread I've tried which is too dry to be edible without butter when it is fresh.


I grew up in the mid-Atlantic region of the U.S. and saw it both ways. Butter on sandwiches just seems like another way to make one; some people like mayonnaise (I don't) and some don't. I'd probably put it on a jam sandwich, or a ham and cheese, but not a PB&J (the peanut butter takes on the same role).

Of course I also grew up in very immigrant heavy areas so that may have been part of it.


UK, every sandwich buttered. Helps hold things in place.


Ah, here we'd only butter a sandwich if it were being fried (Grilled cheese or grilled ham & cheese)


Just to be clear: are you thinking of buttering the inside or the outside of a sandwich?

In the UK, standard practice is to butter sandwiches on the inside, ie on the surfaces of the bread which interface with the filling. Are you by any chance talking about buttering the outside?


I recently encountered this cultural difference (I grew up in Australia) with my wife (we live and she grew up in the US). My daughters vastly prefer my sandwiches to hers because I butter the bread (on the inside, the way God intended), usually using as little butter, or margarine, as possible. When asked why, I say it's mainly because it prevents the moist stuff in the sandwich from turning the bread to mush, but also because it tastes good.

I think a lot of it comes down to the fact that most sandwiches these days are prepared and then eaten right away, whereas I was brought up in a world where sandwiches tended to be prepared long in advance of being consumed.

That said, Americans will cheerfully slather "mayonnaise" on bread.


"You butter a ham and cheese sandwich?"

In Russia, where I grew up, this is the only way to make a sandwich.


In Germany it's common but regionally different. My Swabian friends thought it weird I'd do that but my family is from the North. After 25 years in the US I now think it's gross.


My dad does this. I don't get it.


This approach makes sense for a procedure you know exactly how to do. In the case of a procedure I do not know, I break it down into simpler and simpler pieces until I get to one I can do, then come back up the graph. Maybe akin to a depth-first traversal.

example story: as a user I would like a form page that takes an address I enter and renders a google map of it

http://i.imgur.com/38pXuJT.png


No, you don't have to know exactly what to do. The key idea is that you can always correct it later as your understanding of the problem grows, but you can start by writing the code the way you think it's going to work (sort of like a plan).

High level overview:

  var form = renderForm();

  form.onAddressEntered(function (address) {
    renderMap(address)
  });
How do I show a form? Ah, I do that part in HTML, so no need to renderForm()

  form.onAddressEntered(function (address) {
    renderMap(address)
  });
How do I access the form? Hm, its not an object. Maybe I should make it one, maybe not. I'll simplify this time, as I don't know what the future will bring:

  onAddressEntered(function (address) {
    renderMap(address)
  });
Okay, so how do I run something when an address is entered? (reading, experimental code in REPL). Right, so the best way is apparently by attaching to the submit handler of a form. Oh, and I also learned that the handler takes an event argument and I need to call e.preventDefault()

  function onFormSubmit(form, listener) {
    document.getElementById(form)
      .addEventListener('submit', function(e) {
        e.preventDefault();
        listener(e);
      });
  }
Let's modify the original code then:

  onFormSubmit('addressForm', function (address) {
    renderMap(address)
  });
Wait, how do I get the address?

Let's see if there is a pre-made function to serialize a form. (looks up) Ah, found it.

Let's modify onFormSubmit again. It should now add an event listener to the selected form which serializes the form and prevents the default submit event.

  function onFormSubmit(formId, listener) {
    var form = document.getElementById(formId);
    form.addEventListener('submit', function(e) {
      var data = serializeForm(form);
      e.preventDefault(); // don't actually submit the form
      listener(data);
    });
  }
Back to the original code

  onFormSubmit('addressForm', function (data) {
    // lets make sure we got that first part right
    console.log(data.address); 
    renderMap(data.address)
  });
In the process, a neat utility for forms came out - a function that attaches a submit listener that gets all the form data.

So how do I render a map? Etcetera.

I think it's important to always have a written-in-code high-level overview of the process, even if the current overview turns out to be wrong. You can always change it as your understanding (of both the process and the tools) grows. But at the end, that overview becomes your code, and the end result will unavoidably be as understandable as your current understanding of the entire thing.

The hardest part of this process is to stop worrying about getting all the details right (at the beginning) and only focus on whether the code you're writing is describing your understanding of the problem well.


Totally agree. A simple example: print, princ, echo, cout, ... For god's sake =8-/


In Python, OOP boils down to a style choice, which is exactly as it should be, IMO, and is very helpful to beginners.

In Java, you're forced to use OOP for every problem. No beginner understands why, to define "hello, world", you also need a class, and a 'static' method on that class, and to invoke a weird-looking special class called "System".

It isn't obvious why there are "String[]" types and "ArrayList" types, and why I should care about the difference. Why does "main()" take "String[]" yet I am encouraged to use "ArrayList" in my code? This is just a sampling of difficult-to-answer questions that a beginner encounters in using Java that don't have a grounding in computer science, but just in Java.

It just so happens that among design choices, the choice to implement your Python code using OOP is a trade-off. It always is, but Python makes the trade-off clear as day. The win is increased ability to implement interfaces, language protocols, inheritance, polymorphism, the ability to bundle state and behavior, and the ability to bundle related behavior together. The loss is reduced readability and reduced beauty. Since these latter things are counter to The Zen of Python (PEP 20), idiomatic Python programmers will tend to torture themselves over OOP usage, to the point where only data structures that truly demand OOP's features come into existence.

This means that beginners will more often see a useful_function than a UsefulClass in Python's stdlib and associated ecosystem, and this is A Good Thing, since functions are much more composable than classes are (not to mention simpler).

I think Python will therefore also teach beginners vigilance in the use of classes, which the entire software community could use a little more of!
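To make the composability point concrete, a toy sketch (invented names):

    # two useful_functions compose directly, no setup required:
    def strip_tags(text):
        return text.replace('<b>', '').replace('</b>', '')

    def truncate(text, limit=11):
        return text[:limit]

    print(truncate(strip_tags('<b>hello world</b>')))  # hello world

    # the UsefulClass version forces ceremony before you can compose anything:
    class TextCleaner(object):
        def __init__(self, text):
            self.text = text

        def cleaned(self):
            return strip_tags(self.text)

    print(truncate(TextCleaner('<b>hello world</b>').cleaned()))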


You're saying that in Java you're forced to use OOP for every problem and then you exemplify with "hello world". Well ...

1. function definitions being constrained to classes has nothing to do with OOP

2. static functions or members have absolutely nothing to do with OOP

I see a lot of criticism of OOP, but when I get to reading the details, it's all about Java's flavor of OOP.


I think you missed OP's point. Beginner programmers introduced to Java are forced to grapple with OO from the outset.

What is a class or static method? These are OO constructs and require additional mental overhead when learning.


They also struggle with functions, parameters, pointers, scope and so on.

You don't need to pass parameters to your car to start it. You don't need to force yourself to only focus on the recipe (scope) when making dinner.

None of these things map very well to the real world and could also be considered additional mental overhead when learning. You don't need to know how to code to learn the best known practices in building software.


Functions, parameters, and pointers are also optional concepts in python. None of them are needed in a hello world program, nor for a few weeks into an introductory course. Variables, expressions, conditionals, lists, strings, and loops can all exist on their own without any "extra stuff".

Learn things one at a time; "best practices" can wait til you can demonstrate why it is a best practice. Why should I make a function? Until you actually face the problems that abstractions are used to solve, learning them is confusing and difficult.


Optimizing for hello world is probably the least important thing when you have a 10+ year journey of learning ahead of you. In the end, what do you learn from Python's hello world anyway?

This also happens in math quite a bit. There is a knowledge gap when teaching concepts like derivatives for the first time. So we are told to initially ignore details while learning.


That's probably why I didn't understand derivatives until Calc 3. I barely passed Calc 1 and 2 because my professors just told us to ignore the details. My Calc 3 professor actually showed us how everything worked, from intuition to theorem to proof.

In my experience, it's best not to delay learning the details. You don't have a real understanding of something until you understand the details.


You are right, it is quite bad to ignore the details in the long run (it requires patience of course).

However that makes even less of a case for Python.


It's far easier to understand the concepts piecemeal. Methods are easier to understand if you already grok functions. Integrals are easier to understand if you already grok derivatives. When learning these things, you're often fed an over-simplified explanation so that the instructor can go on to cover something more advanced. The problem is that now you're trying to learn something more advanced before you've got a solid understanding of the primitives.


And you missed my point. Static methods are not an OOP construct, unless we would be talking about class methods on classes that are objects themselves, but in Java we don't.


The problem with Java has always been that it assumes the programmer is incompetent and requires an OO straitjacket around their code in order to produce "robustness".

You're right: static methods are not an OOP construct. They're a hack to allow the user to break out of the straitjacket at a specific moment, because it's impossible to write anything but a library otherwise.

The issue, thus, is that Java requires a beginner to learn a paradigm-breaking maneuver as a basic operation. "Java is a glorious OOP language. If you want to write anything that's pure Java, though, here's how you stop being OOP in order to do so."

(Also, for the record: I like OOP. I like Ruby specifically because it makes OOP so natural and honest.)


Static methods are not a hack, they are methods that don't depend on a state.

I like how Rubyists/Pythonists jump to criticize Java, but they don't know the first thing about OOP.

I don't even use Java but these critiques are retarded, seriously.


By "methods that don't depend on a state" I guess you mean methods for which "this" is not passed implicitly.

Technically, such methods are not methods. They are functions. And the distinction between a "static method" and a function is completely arbitrary, dictated by a completely arbitrary limitation of how bytecode is organized on top of the JVM (e.g. class-files instead of package files) and by somebody's flawed vision that all function declarations should happen inside a class, even if those functions are not methods.

And actually, what I've said in my first sentence is wrong. "this" may not be passed to static methods, as classes are not objects, however classes are some sort of pseudo-objects that are special for the JVM and "this" does exist in certain contexts, like when declaring a "static synchronized" method, which does use this pseudo-object for acquiring an intrinsic lock. Also this pseudo-object does have a constructor.

So you see, classes play the role of packages for static members and methods, except that in a language like Python, packages are objects themselves, whereas in Java they aren't. Plus, if you look at classes with static members as being packages, they are retarded packages, because you can't pass them around as values, you can't override imports and so on.
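To illustrate the contrast: in Python, a module is an ordinary value you can pass around like any other:

    import math
    import cmath

    # modules are objects: pass one in as an argument like anything else
    def hypot_with(mathlib, x, y):
        return mathlib.sqrt(x * x + y * y)

    print(hypot_with(math, 3, 4))   # 5.0
    print(hypot_with(cmath, 3, 4))  # (5+0j)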

I can keep going btw.


By "methods that don't depend on a state" I mean that you don't need an object. The point of objects is that they manage their own state, and you only need to worry about sending them messages.

I agree that they are just functions, but no one is trying to "hack" or to break encapsulation. They could have used packages instead, yes... But having them inside classes is pretty convenient as this allows us to have a better syntax.

Oh and not being able to pass classes around is a language-specific thing, so you can't really use that as an argument against static methods.


Your point is, well, beside the point. You're saying that "OOP does not imply static methods," which is correct, but irrelevant, because the converse is true: "Static methods (in the Java sense) imply OOP." Assuming they're learning Java, it's the latter that matters to beginners, not the former.

You say it's about "Java's flavor of OOP." That's 100% true. So, how do you explain Java's flavor of OOP — which is necessary to explain why Java does certain things that would otherwise seem like arbitrary invocations — without explaining OOP at least in part? As soon as the beginner sees the word "class" or "public" they'll be forced to either contextualize those concepts or take them as arbitrary strings one puts into their program because Magic™.

The OC's point was that Java forces people to grapple with certain OOP concepts, even if those concepts are particular to Java's flavor of OOP.

You're both just talking past each other.


Nonetheless, beginners will still have to be introduced to the concepts from the start.


Only a few of them. I'd say:

    1.  Instantiate Object.
    2.  Invoke Method.
    3.  Get or modify attribute.
    4.  Do some introspection (even if the term is not used)
Most importantly, learning how to define their own classes isn't really important for beginners.


Those are the concepts staring the beginner in the face, sure. Here are some of the questions a beginner will have, phrased ~10x more clearly than a beginner would ever be able to phrase them:

    1. What is an object?
    2. What does it mean to "instantiate" an object?
    3. What is a method?
    4. What does it mean to "invoke" a method?
    5. What is an attribute?
    6. What does it mean to "get" an attribute?
    7. What does it mean to "modify" an attribute?
    8. What is introspection?
And so on, until we run into a concept through which we can explain something. The idea of an "object" is a big hurdle for beginners, especially if they don't have experience writing code — it seems like weird boilerplate. Likewise, "introspection" and other meta-concepts are incredibly challenging for beginners because they have a hard time stepping "out of the code," so to speak.


The thing is, students don't need to be able to ask those questions to learn the concepts. It can be done as an interactive exercise and in the process, a student will actually see what objects are used for and gain intuitive understanding. You do not need to go into all the cumbersome detail about how they are defined in the language. Critically, you do not need to introduce concepts that require them to decide whether they should define their own classes or use built-in data types.

Here is an example that instantiates an object, introspects, gets an attribute, introspects again, and invokes a method. It's 11 commands.

    >>> import requests
    >>> def introspect(obj):
    ...     for x in dir(obj):
    ...         print x
    ...
    >>> request_object = requests.get('https://news.ycombinator.com/item?id=6919275')
    >>> introspect(request_object)
    __bool__
    __class__
    __delattr__
    __dict__
    [...]
    request
    status_code
    text
    url
    >>> request_object.status_code
    200
    >>> request_object.url
    u'https://news.ycombinator.com/item?id=6919275'
    >>> introspect(request_object.url)
    __add__
    __class__
    __contains__
    __delattr__
    [...]
    title
    translate
    upper
    zfill
    >>> request_object.url.upper
    <built-in method upper of unicode object at 0x29a3330>
    >>> request_object.url.upper()
    u'HTTPS://NEWS.YCOMBINATOR.COM/ITEM?ID=6919275'
After enough practice using objects from various python libraries, in the course of doing functional or imperative programming, they will have an intuitive understanding of objects and what they are for.

Some interested students might ask "What's an object, under the hood?" and "how can I create my own classes?" That's when you can introduce them to classes (http://docs.python.org/2/tutorial/classes.html). If they're fine with the intuitive understanding that comes with performing examples like the one above, leave them be. Don't inflict OOP principles they don't need to know yet.


Whether or not those questions come out of a student's mouth, they will be rattling around their head — most likely as much fuzzier, more confused versions of those questions. If you thought I was proposing that a teacher, say, give a lecture about those answers before one can even begin programming in an OOP language, I wasn't. Nor was I proposing that one go into the details of how these things are implemented in whatever language. I was only making explicit what goes through a student's head when you hand them a chunk of code which does the four things you outlined.

I think the example code you gave is fine, but does not actually answer those questions. The mind of a beginner is like a map of a country with unmarked or incorrectly-marked territories. They don't know where the relevant boundaries between concepts/territories are or even how to begin drawing them. They can't differentiate between important and unimportant details. Like a child who is just learning how to speak, they see the differences in everything and every difference matters. That is, to a beginner, if there's a difference it's important.

Here's a Ruby example. I've given many students this code:

    my_name = "Jesse"
    my_age  = 30

    puts "Hello!  My name is #{my_name} and I am #{my_age} years old."
and had them conclude without prompting, "Oh, you can name variables anything you want as long as they start with my_." They will not know they're forming this belief and will carry it with them until something — a teacher, another bit of code, etc. — contradicts that belief, except that when the contradiction arrives they may not realize consciously they had formed this particular incorrect belief.

When I look at your code, I see a few dozen opportunities for students to form exactly these kinds of beliefs, which is one of the challenges of writing effective curriculum. This is doubly difficult when there isn't a teacher nearby to catch these mistaken beliefs as they're formed or shortly thereafter. In most online or "single player" learning experiences students will form a web of incoherent, confusing beliefs, not even know they're doing it, and carry on for much longer than they would otherwise.

For example, I'm assuming the "u" prefix on those strings in Python is a way to indicate that they are unicode strings. If a student already knows that strings exist and sees that output, it's likely that a large minority of students will assume that u'HTTPS://NEWS.YCOMBINATOR.COM/ITEM?ID=6919275' does NOT represent a string but something else entirely. They will then carve out a hole in their schema of the world where "u-things" go and it will never occur to them that doing something like

    request_object.url == 'https://news.ycombinator.com/item?id=6919275'
is even possible.

Maybe we're saying the same thing, but I hope you get my point. This has to do with the particular way most beginners in any subject interact with new information.


> Maybe we're saying the same thing, but I hope you get my point. This has to do with the particular way most beginners in any subject interact with new information.

Probably, my example was off-the-cuff, and I think I may have misinterpreted the original intent of your post a bit. I'm still not sure which "side" of the discussion you're on, though. (Teach classes and OOP sooner vs. later). For my part I definitely favor "leave it for later"

In Zed Shaw's "Learn Python The Hard Way" book, after raving about dictionaries in section 39, he barely uses them for the remainder of the book, while spending a lot of time on classes and class hierarchies. It's a succinct (and very successful, I believe) introduction to OOP and Python's features so I can't vehemently criticize it, but I do wonder what the last few chapters would have looked like with a more flat function + dictionary approach.

I understand the issue of little-details stuff, for sure. I've never taught Python but I have taught bash. Gave you an upvote for the second paragraph; I found it to be an unusually eloquent way to describe the learning experience. I've never been able to express that idea so well.


function definitions being constrained to classes has nothing to do with OOP

It doesn't?


Yes. Although I don't think the GP rebuts what he intended to rebut, I agree with his particular point, viz., that the idea of OOP doesn't necessarily entail the idea of classes, although most OOP languages implement many principles of OOP through the class/instance mechanism.

JavaScript, for example, is an OOP language without classes per se. Indeed, many JavaScript programmers avoid what little built-in class-like behavior JavaScript has and implement their own class definition/inheritance/instantiation mechanisms via things like Object.extend.

And although it's rare, one can define "singleton" methods directly on objects in Ruby, too:

    animal = "Dog"

    # This is now defined only on the animal object
    # and no other String instance
    def animal.bark!
      "Woof woof!"
    end

    animal       # => "Dog"
    animal.class # => String
    animal.bark! # => "Woof woof!"
So, JavaScript is an example of an OOP language without classes at all (or at least one where you often pretend they don't exist) and Ruby is an example of an OOP language which has classes but where function definitions are not constrained to classes.

This is the point I understood the GP to be making.


> I see a lot of criticism of OOP, but when I get to reading the details, it's all about Java's flavor of OOP.

The OP actually did a very clear job of distinguishing between Python- and Java-style OOP. I think you're the one cherry-picking his Java example and then extrapolating that to refer to all of OOP.


I suspect your parent would agree with (1) and (2). Those criticisms were aimed at Java, not at OOP.


In Python you don't even have abstract classes; instead you have to use horrible hacks.


If you're trying to develop interfaces in Python, I would argue you're using the language from the wrong standpoint.

Every language has a design goal. Python doesn't care about interface contracts. The "one correct way" is the contract.


With these languages it always seems like first came the language, and then its users tried really hard to make sense of it. So now apparently enforcing contracts is not important anymore. "Let's just code!" - sigh


If you want to enforce contracts, pick a language that encourages it. There are many to choose from. Plenty of people are doing just fine in Python without this.

It's almost an entirely distinct problem domain. Besides, what would a contract be in a dynamically typed language?

I don't think these features are bad, but they're not appropriate everywhere.


> If you want to enforce contracts, pick a language that encourages it.

Yeah don't worry, I'm not gonna use Python.

> Besides, what would a contract be in a dynamically typed language?

Being dynamic has nothing to do with this. See: Clojure, PHP, etc. They have contracts.

Of course things might not be appropriate everywhere. But with Python/Ruby you don't have a choice, they just lack these features.


Maybe because you don't need abstract classes

I'd even say that abstract classes are a horrible hack around the braindead inflexibility of languages like Java

And 99% of the time you're doing them in Python, you're doing it wrong


Yeah I guess you don't need OOP either then. Let's just program in Assembly.


http://docs.python.org/2/library/abc.html

?

Even if that had flaws, PyQt carries over all the Qt abstract classes just fine.
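For reference, a minimal abc example (Python 2 syntax, toy names):

    import abc

    class Shape(object):
        __metaclass__ = abc.ABCMeta

        @abc.abstractmethod
        def area(self):
            pass

    class Square(Shape):
        def __init__(self, side):
            self.side = side

        def area(self):
            return self.side * self.side

    print(Square(3).area())  # 9
    Shape()  # raises TypeError: can't instantiate abstract class Shape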


I'm aware of that but that's what people call a hack, don't act like it's not. Python doesn't support abstract classes, period.


Um.. that's in the standard library. What definitions are you using for "support" and "period"?


In a world of circular time, not linear time, all syntactic sugar, or all fads in general, naturally rotate endlessly thru:

"You'd be crazy to use that"

"Um, OK"

"Often, I use it"

"Everyone must be forced to exclusively use it"

"Sometimes its sucks, but its the only sane thing to do"

And then right back to start. Endlessly. Forever.

So in 2014 OO is rapidly nearing the "you're nuts" and I see functional is rapidly nearing the "I use it" with, inevitably, "Everyone must be forced to exclusively use it" coming up soon. The relevant point is, have we started preparing for functional fundamentalism or whatever you want to call it? Also have we started scrubbing our resumes and githubs to remove the stain of OO programming?


Personally I like OOP and I think a lot of the criticism of it is flawed, since the OOP flavor most people are exposed to nowadays is Java's. And personally, whenever I see class names suffixed with "Executor", "Manager", or really anything that ends in "er" or "or", I feel like puking.

Besides, the state in which we are in is rather healthy. Back in the late nineties, early 2000, everybody was into class-based designs described with UML.

And I do wish people were forced to use functional programming, because that's the only way many people end up learning something new. OOP is not the best choice for every problem. Sometimes a function is better than an ExecutorManager. Sometimes a Map or a type-class are better than an inheritance chain. Sometimes a monad is better for manipulating data streams instead of iterators. And so on and so forth.

Also, the Gang of Four is seriously outdated.


> Besides, the state in which we are in is rather healthy. Back in the late nineties, early 2000, everybody was into class-based designs described with UML.

UML is great as collaboration mechanism at the enterprise level across sites.

> Also, the Gang of Four is seriously outdated.

Still largely unknown in the enterprise, sadly.


Ah, the mythical enterprise. I've yet to see an example in which UML Class Diagrams have been useful, but whatever, I guess that's the same enterprise in which you've got "architects" outlining stuff in UML, then passing that to the monkeys to fill in the blanks.


I find UML class diagrams useful as a one-look document describing the overall architecture of a piece of software. Where they are weak is that you really do need a couple of paragraphs of plain English describing each class as well, and there's not really anywhere to put that on a UML class diagram. The other really useful diagram in UML is the sequence diagram. If you write out the sequence diagrams for a few key processes in your software, you've gone a long way towards giving a roadmap to people who need to get in and modify the code. These are both documentation use-cases though. Using UML as a design tool (other than very informally on a whiteboard) seems wrong to me.


I see them in all multisite projects using offshoring.


Although that may be true, I didn't really read the article as saying that OOP is bad, just that it brings a lot of extra baggage that newcomers to programming needn't worry about. Certainly as they begin to write more advanced programs, more advanced paradigms become relevant.


But the term 'functional programming' is perhaps even less clear than 'object-oriented programming'. Is a language that can pass a function 'as data' a functional language? Does it need to support tail recursion elimination? Does it need to be pure? Does it need to support algebraic datatypes? Should everything be an expression?

I am also not sure if programming trends follow circular time. I haven't seen logic programming making a big return, and OO and FP are still in their first cycle. It just took FP ages to go from stage 1 to 3 (outside niches).


HOFs are the essence of functional programming; the rest just helps things along that path. I'm not sure why people get so hyper about classifying things. The map is not the territory. Where you draw lines has no bearing on how easy it is to attain functional style in a language.

FP is far more clear a concept than OOP. OOP has no real fundamentals apart from encapsulation, which really isn't specific to OOP. OOP is whatever haphazard setup a language designer feels like using. FP concepts (like the ones you mentioned) are pretty clear across languages.

FP is certainly in its first cycle; ML just keeps going on. Industry is only now catching on.


I think beginners should be confronted with OOP very early, but not in the typical way:

- DO teach encapsulation of state.

- DO NEVER ever try to teach about objects as representations of real-world things. And for the love of god, do not give out modelling assignments where "Human inherits mammal" is an answer.

- DO start with collections: Vector, List, Stack etc. are easy to implement and showcase the virtues of encapsulating.

- DO teach polymorphism, but only in terms of interface inheritance. The collections you have taught before are a good starting point: ICollection, IRandomAccessible etc. (see the sketch after this list)

- DO NEVER EVER talk about class inheritance. Do not even use the word inheritance. Call it an "is-a relation" or something like that.

- When the time comes to teach inheritance, emphasize the Liskov substitution principle.
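A minimal sketch of the interface-first approach (Python, with abc standing in for interfaces; invented names):

    import abc

    class ICollection(object):
        __metaclass__ = abc.ABCMeta

        @abc.abstractmethod
        def add(self, item):
            pass

        @abc.abstractmethod
        def items(self):
            pass

    class Stack(ICollection):
        def __init__(self):
            self._data = []

        def add(self, item):
            self._data.append(item)

        def items(self):
            return list(reversed(self._data))

    def print_all(collection):
        # polymorphic: works with any ICollection, no behavior inherited
        for item in collection.items():
            print(item)

    s = Stack()
    s.add(1)
    s.add(2)
    print_all(s)  # prints 2, then 1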


- DO NEVER ever try to teach about objects as representations of real-world things. And for the love of god, do not give out modelling assignments where "Human inherits mammal" is an answer

I have to disagree with this. I think this is exactly how to think about this.

I once explained the shift to OO to an older gent who was familiar, not just with ancient scientific programming, but also with building a house. I told him that the idea was that code was so large and complicated that it was divided up among teams with each team getting specific parameters that their code had to fit. He said, well of course.

Then I said it was much like constructing a house. You had to put the frame up, and know where the windows and doors would be, but the details of the windows and doors could be decided later. Every door had to be able to respond to an "open" command and a "close" command, some may have a "lock" command, and so on, but the builder of the door worried about getting that stuff right. All you had to specify were the dimensions and the direction of the door.
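In code, that contract might look something like this (a hypothetical sketch):

    class Door(object):
        """What the house-builder fixes up front."""
        def __init__(self, width, height, opens_inward):
            self.width = width
            self.height = height
            self.opens_inward = opens_inward

        # every door must respond to these; details left to the door-builder
        def open(self):
            raise NotImplementedError

        def close(self):
            raise NotImplementedError

    class SlidingDoor(Door):
        def open(self):
            print("slides open")

        def close(self):
            print("slides shut")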

Now this was a pretty technical person, and perhaps he doesn't qualify as a beginner, but I find, in general, real-world metaphors to be powerful for teaching.


Right. And monads are burritos. (http://byorgey.wordpress.com/2009/01/12/abstraction-intuitio...)

> The heart of the matter is that people begin with the concrete, and move to the abstract. Humans are very good at pattern recognition, so this is a natural progression. By examining concrete objects in detail, one begins to notice similarities and patterns, until one comes to understand on a more abstract, intuitive level. This is why it’s such good pedagogical practice to demonstrate examples of concepts you are trying to teach. It’s particularly important to note that this process doesn’t change even when one is presented with the abstraction up front! For example, when presented with a mathematical definition for the first time, most people (me included) don’t “get it” immediately: it is only after examining some specific instances of the definition, and working through the implications of the definition in detail, that one begins to appreciate the definition and gain an understanding of what it “really says.”

Inheritance is not taxonomy. If you want to teach inheritance, show a couple of realistic examples of code which, when you add inheritance, becomes clearly better. We can see the pattern; we can make our own intuitive, abstractive jumps.


"I told him that the idea was that code was so large and complicated that it was divided up among teams with each team getting specific parameters that their code had to fit."

IMO, this is the first mistake. Code should not be large and complicated; when it is, few things can save you, whether it be the language or paradigm you're programming in, or the structure of the team doing the programming. OOP is a solution to the wrong problem.


Depends what you include in "code"; your program might be a one-liner, but its language environment and operating system are the product of huge teams. It's a kind of modularisation that's so effective we don't think of it as a division of labour.

One person writing all the code for a system from the bare metal up is pretty much limited to embedded systems now.


But how do you avoid it in today's world? How do you actually build an app without a large code base?


You can start with realizing that your data (100k rows) does not require Oracle and can be stored in a flat file and loaded into memory, so you can ditch JPA. Then get rid of Spring and Jackson. Then you remove all the Abstract Singleton Factories and leave just the code that does what the user wants, not what the developer wants. Then drop the couple of messaging frameworks you're using when you realize that now all the code fits in one app. Next thing you'll see is that NanoHttpd will do just fine for your load level, and therefore you can ditch WebLogic. After all that you can remove maven (blasphemy!!!) because your project shrunk to a few dozen files that can be compiled and packaged with a 5-line batch script.

But that wouldn't be glamorous, would it? Besides, this approach will make you hopelessly unemployable in today's world.


The important things for writing modular code in a big codebase are defining clear interfaces and making "black box" modules. Abstract data types are important for this but, unlike what many people seem to say, they are not exclusive to OOP - you can also have abstract data types in procedural or functional languages.

The core concept of OOP is adding dynamic method-based dispatching to your abstract data types, but I would argue that while this is useful a lot of the time, it's not really fundamental. In fact, there are many cases where dynamic dispatching alone will not solve the problem: a good example is generics in Java and C# - parametric polymorphism helps enforce abstractions but isn't really object oriented.
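To the first point, here is an abstract data type with no classes at all (a closure-based sketch in Python):

    # a stack ADT: the representation is hidden, only the operations escape
    def make_stack():
        data = []

        def push(x):
            data.append(x)

        def pop():
            return data.pop()

        return push, pop

    push, pop = make_stack()
    push(1)
    push(2)
    print(pop())  # 2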


What is it about "today's world" that makes software require a larger code base? If anything, we have much higher-level languages which require far less code than in the olden days. Anyway, my point was more about the unix nature of building small programs that do one thing well, and using those programs in combination.


"What is it about "today's world" that makes software require a larger code base?"

The fact that we are still not using the high-level languages we have available to us. Vast amounts of new C, C++, and Java code are written each year. We are solving bigger problems, but we are using languages that are still catching up to the state-of-the-art of the 1970s.


*Anyway, my point was more about the unix nature of building small programs that do one thing well, and using those programs in combination.*

That just uses different words to describe the same net result and the same net problem. You still have to find a way to make all of those pieces work together to do something complex.


That's using a metaphor to explain OOP, not teaching OOP modeling with examples from the physical world.


No, you can teach it that way.


thomasz's point was that nothing in your example was "Human inherits Mammal".

As kazagistar succinctly puts it, "Inheritance is not taxonomy." Your house analogy isn't taxonomy; that's why it's correct.


"Never ever try to teach about objects as representations of real-world things"

I hear this from time to time—it seems bizarre. Representing real-world things as objects is a fine design approach for many problems and is well worth teaching. Where on earth did the idea that it isn't get started?

SICP, a classic of programming pedagogy if ever there was one, has this to say:

One powerful design strategy, which is particularly appropriate to the construction of programs for modeling physical systems, is to base the structure of our programs on the structure of the system being modeled. For each object in the system, we construct a corresponding computational object. For each system action, we define a symbolic operation in our computational model. (p. 217, 2nd ed.)

It's true that by "object" they mean something more general than OO. But that doesn't change the basic design approach.

It's also true that they are careful to teach more primitive concepts first, so what they do is compatible with the OP's point. But there's a reason why the above passage on real-world representations appears immediately after they mention modularity.


The key here is p. 217. Examples and Exercises that come before p. 217 are:

- Square Roots by Newton's Method
- Testing for Primality
- Arithmetic Operations for Rational Numbers
- Extended Exercise: Interval Arithmetic
- A Picture Language
- Symbolic Differentiation
- Representing Sets
- Huffman Encoding Trees
- Symbolic Algebra

SICP teaches algorithms and data structures before that quote, and goes on to teach more about data structures without doing any of those awful OO modeling exercises you get in a typical Java Programming 101.

>But there's a reason why the above passage on real-world representations appears immediately after they mention modularity.

After that quote, you return to data structures. Okay, after lists, queues and tables, there is a chapter about simulating digital circuits, which is both a fairly abstract and a well-defined thing. We can disagree on how abstract a digital circuit is, but at least we can agree that it's no fucking dog that is_a animal and has_a head and has_a leg array and methods like

    bark(int howManyTimes) {
        for (int i = 0; i < howManyTimes; i++) 
            println("woof"); 
    }
Using OOP to model real-world stuff wouldn't be so bad if it happened in the context of real problems, like simulations where you can show how to use the state pattern to model behavior. What I want gone is stuff like inheritance and composition being explained with examples taken from the physical world, completely detached from programming practice. If you use File, Directory and FileSystemEntry as examples for a hierarchy, you can demonstrate its usefulness with concrete examples. With Dog, Frog and Animal, not so much.
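For instance, a quick sketch of the file-system example (invented sizes):

    class FileSystemEntry(object):
        def __init__(self, name):
            self.name = name

        def total_size(self):
            raise NotImplementedError

    class File(FileSystemEntry):
        def __init__(self, name, size):
            FileSystemEntry.__init__(self, name)
            self.size = size

        def total_size(self):
            return self.size

    class Directory(FileSystemEntry):
        def __init__(self, name, entries):
            FileSystemEntry.__init__(self, name)
            self.entries = entries

        def total_size(self):
            # the same question works for files and directories alike:
            # this is where the hierarchy earns its keep
            return sum(e.total_size() for e in self.entries)

    root = Directory('/', [File('a.txt', 10),
                           Directory('docs', [File('b.txt', 5)])])
    print(root.total_size())  # 15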


I think I get why some people rage against inheritance, but I don't understand one of your remarks here.

Human isa mammal (let's say). Human is a Liskov substitute in any context requiring a mammal. What's the problem here? Is it just the fact that Mammal brings to mind a classification system which happens to be hierarchical, and you're afraid people will become too accustomed to hierarchies?


The problem is the design paradigm where you enforce an arbitrary fixed hierarchy on real-world data.

This doesn't work for several reasons:

* In the real world there are multiple overlapping hierarchies, which have different uses. Human 'isa' SentientBeing as well, which may be a much better hierarchy for modelling communications with aliens and robots. Most OOP languages either don't handle this at all, or handle it badly.

* Your hierarchy will almost certainly have to change (you'll be wrong, or new requirements will come along). Frequent major code breakage / refactoring is the likely result. If you also need to persist data across versions, you're in big trouble.

Property / prototype based methods are in general much more flexible for real world data.


> * In the real world there are multiple overlapping hierarchies, which have different uses. Human 'isa' SentientBeing as well, which may be a much better hierarchy for modelling communications with aliens and robots. Most OOP languages either don't handle this at all, or handle it badly.

There may well be overlapping hierarchies, and trying to capture taxonomy with an adequate analogy to explain the concepts of inheritance to somebody is perhaps not the best way. But to say that you must never enforce an arbitrary fixed hierarchy contradicts the very purpose of inheritance in a good design.

For your particular example the correct way is to have a separate interface called "ISentient" that captures whatever methods and fields are required for sentient communication.

Most OOP languages support that concept.

So you can have a Human that inherits from Mammal, but only Human introduces the ISentient interface. Your "Robot" can then inherit from "Machine" and also introduce ISentient.

A much better way is to use interfaces for most things: use IMammal, then override "procreate" so you can account for species edge cases like the duck-billed platypus, a mammal that lays eggs.
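Sketching that idea in Python (using abc to stand in for interfaces; invented names):

    import abc

    class ISentient(object):
        __metaclass__ = abc.ABCMeta

        @abc.abstractmethod
        def communicate(self, message):
            pass

    class Mammal(object):
        pass

    class Machine(object):
        pass

    class Human(Mammal, ISentient):
        def communicate(self, message):
            print("says: " + message)

    class Robot(Machine, ISentient):
        def communicate(self, message):
            print("beeps: " + message)

    # both satisfy the same interface despite unrelated hierarchies
    for being in [Human(), Robot()]:
        being.communicate("hello")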


Could you elaborate on how prototype based OOP solves overlapping hierarchy and hierarchy modification? It is not clear to me, and it seems actually like multiple inheritance, mixins, interfaces, and/or type-classes would address multiple hierarchies reasonably well, while I am not sure how prototype inheritance does so at all.


Exactly. It fucks everything up: As soon as you teach OO with real world stuff as examples, all they see are inheritance relationships. It is really hard to make 'em stop creating extremely complicated hierarchies in order to solve simple problems.


That is absolutely NOT my experience. People look at the real world and see concrete types, not hierarchies of abstractions.

New programmers will only try to build deep inheritance trees if they are told that's what they're supposed to do.


DO teach encapsulation of state.

I agree, with the caveat that there should be a good discussion about when it's important to encapsulate state, rather than the blanket "always encapsulate data and methods" which is how it's often presented.


Writing complex classes which require inheritance is never a necessity for a beginner. He only needs to know that objects encapsulate values and methods, and that's all.

When he needs inheritance he will know about it.


Why should you never try to teach objects as representations of real world things? Just curious.


Not the grandparent, but in my view it distracts from what the code actually needs to do. OOP is a tool to structure code and data. It gives us some useful ways to deal with responsibilities, namespaces, some kinds of polymorphism, etc. We should use it in a pragmatic way, and making a class hierarchy reflect the real world often puts aesthetics above practical considerations.


Isn't it tied together? If a new programmer can't structure their code they can't write it.


You'd be surprised what kids can achieve with little more than global variables, gotos and if statements. Structure speeds things up and prevents stupid mistakes once you understand the basics, but before that point it can be overwhelming. Far more important, in my experience, is the "programming mindset".


I agree with that. I was just remembering the first OO stuff I was taught. I think it was a person object which we then had to use to create "Bill", "Jane", etc. Constructor set stuff like eye colour, height, etc.

Then we went on to create a dating app where it would retrieve things like "all the people with brown eyes"


Let's put it this way: you are asked to write a program that simulates the behavior of an LCR circuit. Do you create objects to represent the components of the circuit?


That example isn't specified enough to show the problem. It's either too simple or too hard. Since we're talking about a beginner, I'll assume you meant it at too simple. But even then you're assigning it to a person with a reasonably technical background. In which case, the LCR circuit is the object and you are asking for a method.


"the LCR circuit is the object and you are asking for a method"

I think that you still have the wrong abstraction here. If you are looking for a general solution, it should be one that can handle LCR circuits, mass-spring systems, etc., though that is probably too much for an introductory course. My real point is that there is nothing to be gained by modeling circuit components or even circuits as objects -- you do not improve maintainability, you do not encourage code reuse, and you do increase the amount of code that does nothing to solve the problem.


"My real point is that there is nothing to be gained by modeling circuit components or even circuits as objects"

That directly contradicts how SICP teaches it:

Our computational model of a circuit will be composed of objects that correspond to the elementary components from which the circuit is constructed. (p. 273, 2nd ed.)

Elsewhere they say:

If we have been successful in our system organization, then to add a new feature or debug an old one we will have to work on only a localized part of the system. (p. 217)

... which sounds a lot like the benefits you deny. Do you really think SICP is wrong to teach this way? I don't.

I quoted a more general passage where they introduce object modeling here: https://news.ycombinator.com/item?id=6910292. But it seems striking that you would have brought up circuit simulation as when not to do object modeling, as it is the very domain they use to illustrate the strength of the approach.


My real point is that there is nothing to be gained by modeling circuit components or even circuits as objects

Then I still find your point unclear. Why would you choose them as an exercise to introduce classes, which is what we were talking about? I agree, it's odd. Mammals and dogs, however, are a concept people can understand. As are windows and doors.


We were? I thought the question was, "Why shouldn't we teach that objects model real-world things?" The answer is that in many cases, the real-world things should not be represented as objects; I gave an example of such a case. If a student is being told that objects model real-world things and has internalized that, then when they are presented with the problem of simulating an LCR circuit, it is very likely that they will do the wrong thing and create classes for circuit components, and a class for circuits as a whole.

Further, "dogs and mammals" or "windows and doors" are extremely contrived examples. The fact that you have to invoke an artificial problem just to illustrate this point should be an indication that you are doing something wrong. On the other hand, simulating an LCR circuit is a real-world problem and one that students will probably be asked to solve as a homework assignment (e.g. in a differential equations course).


Just because some things should not be represented as objects doesn't mean that nothing should.

And mammals and dogs as a concrete example - a metaphor, if you will - are a powerful way to teach abstract ideas, not an indication that something is wrong.

What I would find enlightening is how the mammal-dog relationship fails, not how there is something out there for which the model may not fit.


The problem with 'dog is a mammal' is that there's basically no obvious scenario in which you'd actually be modeling things this way unless you're trying to build some kind of ecosystem simulation or whatever. If you want to teach people, not only what inheritance is but why it's useful, you need to use a situation where inheritance makes things easier.


Here's an example that's visual: you are putting interaction widgets on the screen - they display some text, perhaps some other visual indicator, and have a button or text area or slider or something on them (there are some of each). Your base class has the setup: the window/panel/whatever-your-language-calls-it, the place where the text goes, and the place where the variable widget goes. All of these widgets inherit the base class, and only differ in the user-response portion. This makes it very easy to change the interaction style.
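Something like this (a hypothetical sketch, names invented):

    class LabeledWidget(object):
        """Base class: owns the panel and the text; subclasses
        supply only the interaction part."""
        def __init__(self, text):
            self.text = text

        def render(self):
            print("[panel] " + self.text)
            self.render_control()   # the part that varies

        def render_control(self):
            raise NotImplementedError

    class ButtonWidget(LabeledWidget):
        def render_control(self):
            print("[button]")

    class SliderWidget(LabeledWidget):
        def render_control(self):
            print("[slider]")

    # swapping interaction styles is now trivial:
    for widget in [ButtonWidget("OK?"), SliderWidget("Volume")]:
        widget.render()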


I don't know. Do you?


No, because it clutters the code. An LCR circuit is a linear system and should be solved as such; the only objects that would be relevant are those that are related to solving linear systems, and at the beginner level we probably should not be asking people to write such a general purpose solution anyway. If you start representing circuit components as objects you will just create a pile of unnecessary code that does not improve maintainability at all.
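For instance, once you treat a series LCR circuit as the second-order linear ODE it is, the whole simulation is a few lines (a sketch with scipy; component values invented):

    import numpy as np
    from scipy.integrate import odeint

    R, L, C = 1.0, 1e-3, 1e-6   # ohms, henries, farads (made up)
    V = 5.0                     # step input, volts

    # L*q'' + R*q' + q/C = V, rewritten as a first-order system
    def deriv(y, t):
        q, i = y                # capacitor charge, loop current
        return [i, (V - R * i - q / C) / L]

    t = np.linspace(0.0, 1e-3, 1000)
    q, i = odeint(deriv, [0.0, 0.0], t).T
    # no Resistor, Capacitor, or Inductor objects anywhere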


Unlike many HN'ers, I'm a big fan of OOP (yes, even in the Java sense). Still, I think the author is absolutely right.

OOP is useful because it helps tackle some common non-functional concerns: mostly modularity, and as a result extensibility, reusability and, to some extent, maintainability. These are architecture concerns. They're vital to take into account when you're writing any piece of non-trivially large software (and OOP is one good approach to them, though not the only one). They're entirely unimportant in small little programs, which includes nearly anything a beginning programmer makes when learning.

This also means that languages that religiously enforce OOP, like Java and C#, might not be the best starting points indeed.


This may be the wrong timing, but I'll dare say it anyway: OOP (and OOP-like namespaces) is the best known approach for creating modularity. Other approaches are so far behind that they are not even in the same game.

Yet, there is an entire generation of programmers that learned OOP at school, and are hitting that hammer on everything, hoping for it to be a nail. That's all that's wrong with OOP: it does not solve all the problems in the world.


I definitely agree that namespaces are a good idea, but if by modularity you mean isolating pieces of functionality, I'd suggest OOP is one of the worst approaches to take, because it results in a lot of implicit connections. Only global state is worse.


So is OOP one of the best or one of the worst approaches to modularity?


One of the worst, I think, because it encourages developers to make connections, rather than enforce isolation.


Yes.


With respect, my experience is the diametric opposite. One of the most talented and brilliant programmers I know learned object-thinking first, starting at about age 10 or 11. He spent the first several years doing stuff like

    window.open();
    Line aLine = new Line(...);
    window.add( aLine );
I will never forget the day he came to me - after about 3 years programming in Java - to ask, "Where are the methods for ints documented?" We had to have a Birds and Bees conversation to the effect that not /everything/ was an object. He vanished for several days after that conversation to process this oddity. I began to wish we'd chosen Smalltalk rather than Java. (Actually I still wish that as a matter of general principle.)

No. I believe that we simply approach the problem wrongly. I think that objects are (and were deliberately intended to be) closer to the way we naturally think about the world. It's the baroque mental twists and turns we have to learn to make in procedural programming that are the problem. If you don't learn that odd way of thinking first, you're just fine with learning OO first - streets ahead, in fact.


TBH, your example is not too different from

   openwindow(window);
   Line aLine = make_line();
   addlinetowindow(window, aLine);
A small pet peeve of mine is people insisting that encapsulation and abstraction are exclusive to object orientation, even though abstract data types are also a core concept in procedural and functional programming.

To make your example "truly OO" you would need to be using dynamic dispatch and polymorphism in those method calls. But are we really going to need multiple Window and Line classes, and even if we do, is this really going to be the best way to architect our code?


To take that example even further down the "procedural with OO-ish conventions":

  Window_open(window);
  Line aLine;
  Line_make(&aLine);
  Window_add(window, aLine);


You're right - I didn't attend to developing a really good example. But then I wasn't developing material for a programming course. ;) It was pretty off the cuff.


    open(window);
    add(window, make_line(...));


There's a difference between learning to use object-oriented systems and learning to design and implement them.

The code snippet posted does not implement a new class. In a language like Python, you can code for years relying on objects without ever implementing a single class of your own. As a programmer, deciding when to switch from writing procedures and functions to writing classes and methods can be a tricky design call. The OP's argument is focused on this issue, not the simple matter of using OO APIs or frameworks.


" It's the baroque mental twists and turns we have to learn to make on procedural programming that are the problem."

What? For me the "modern OO" is as baroque as it gets.

First of all - and your example reminded me of this - because it hides what's happening. What are a window and a line? How am I supposed to know what's happening if one has the .add method? Docs, sure, but it's not intuitive. Granted, today there's so much complexity you can't hold it all in your head anyway.

But especially because procedural is how the computer works. Assembly. A list of instructions that are read in sequence. That I can understand. Now, OO makes this flow very convoluted, and the more "OO" someone programs, the shorter the procedures get, relying instead on inheritance and types to do the computation - which is good in one sense, but breaks the train of thought.


Some people will always prefer assembly code because they eschew all abstraction, but most of us need abstraction to deal with complexity. Pick your poison: do you like your abstractions to be objects or functions? Once the FP programmer starts flinging around higher-order functions, FP can be just as convoluted as OOP.


Actually, some authors define objects as functions with an internal state. In that sense, objects derive from functions.

For instance, in OCaml, the following "counter" function is an object with one method that increments and returns the value of a counter:

  let counter = let x = ref 0 in fun _ -> x := !x + 1; !x ;;
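
A rough Python equivalent of that closure-as-object idea (a sketch, not a translation anyone is obliged to use):

  def make_counter():
      # The closed-over variable x plays the role of the object's state.
      x = 0
      def counter():
          nonlocal x
          x += 1
          return x
      return counter

  count = make_counter()
  count()  # 1
  count()  # 2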

In my opinion, OOP is intrinsically more complicated than FP because you can't really avoid complex patterns even to do simple things.

An example that comes to mind is the "Visitor" pattern. To me, it requires more mental gymnastics than what is found in most functional programs. It amounts to mixing higher-order functions with state, which is something you try to avoid in FP.
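
For reference, a minimal Python sketch of the Visitor pattern (names hypothetical):

  class Num:
      def __init__(self, value):
          self.value = value
      def accept(self, visitor):
          return visitor.visit_num(self)

  class Add:
      def __init__(self, left, right):
          self.left, self.right = left, right
      def accept(self, visitor):
          return visitor.visit_add(self)

  class Evaluator:
      # One visit_* method per node type, standing in for the pattern
      # matching a functional language would give you directly.
      def visit_num(self, node):
          return node.value
      def visit_add(self, node):
          return node.left.accept(self) + node.right.accept(self)

  print(Add(Num(1), Num(2)).accept(Evaluator()))  # 3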

But I admit that I'm not well-versed in OOP. I've always found it complicated. Not Objects per se, but design of big programs using classes with complex relations, UML, design patterns and so on...


I haven't used the visitor pattern in OOP in over 10 years and I write lots of code. There are many other better solutions to the expression problem than visitors. I don't bother with design patterns much or UML at all. I do mix my style up with FP when appropriate, but OOP is my dominant organizing principle.

OOP really is just about object thinking, just like FP really is about thinking in terms of functions. All the other baggage in either paradigm is quite optional.


The article is arguing that abstractions like OOP are unnecessary for a beginner. It is only after you understand procedural code that OOP "makes sense".


Another reason I approve of Python as a first language is Turtle[1]. It allows new programmers to approach problems in a modular way, i.e., if they have to draw a pyramid, they'll easily learn why a draw_square() function is important. You can extend this way of thinking to many other exercises using Turtle.

Additionally, it makes students have fun programming, since it gives them some graphical power and making a computer draw things is much more exciting than making a computer print things on a terminal (especially for newcomers). Plus, even though it's very simple, you can write some very interesting stuff with Turtle[2].
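
For instance, a minimal sketch of the draw_square() idea (a pyramid of squares, stacked row by row):

  import turtle

  def draw_square(side):
      # Four equal sides, turning 90 degrees after each one.
      for _ in range(4):
          turtle.forward(side)
          turtle.left(90)

  def draw_pyramid(levels, side):
      for row in range(levels):
          for col in range(levels - row):
              turtle.penup()
              turtle.goto(col * side + row * side / 2, row * side)
              turtle.pendown()
              draw_square(side)

  draw_pyramid(4, 30)
  turtle.done()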

[1]: http://docs.python.org/3.0/library/turtle.html

[2]: https://github.com/FranciscoMSM/IPRP/blob/master/4linha.py


Thanks for those links! Regarding making programming fun, I've observed that certain kids (well OK, my kids) get a kick out of programming that interacts with the physical world.

Kids seem to prefer writing programs that "do" something, where their definition of "do" might be different than ours.


This is a good article I think. I've helped mentor a couple of people learning programming and helped get them to understand the web development environment.

With one person, we started out with some Java since he was taking a Java class, and I quickly found that he was very confused by Java. I shifted gears and had him start Python from scratch. His understanding skyrocketed, since he was getting to understand how control structures worked, defining functions, and using them.

There is something to be said about the importance of understanding a compiled language though. While it depends on the individual, I think most people would benefit from learning with a language like Python first since some of the most basic concepts of programming become more accessible.


"The shift from procedural to OO brings with it a shift from thinking about problems and solutions to thinking about architecture."

Interesting thinking, never thought of it this way.


If you view OO as a set of organisational facilities above procedural code, then it makes a lot of sense.


Yes, that is the main point of OO. Basically it boils down to extensible modules.

I think the people who have problems grasping OO are the hacker types who code away without thinking much about what they are trying to accomplish.

OOP requires putting the keyboard aside and thinking about the architecture of what you are trying to accomplish.


Python might be the best programming language for an introductory pedagogy. But it is hard for me to take such assertions seriously when they are not backed by a comparison to the languages created by the PLT group and their research.

To put this in perspective, the PLT group of languages starts off eschewing not only OOP but mutation as well. Even more relevant, these are not options in the introductory languages.

This means the practice of not using OOP (or mutation) does not rely on the beginner exercising a discipline that even seasoned professionals might struggle with; i.e., a large class of kludgey habits cannot be developed when the PLT languages are used. A side effect of the PLT sequence is that it suggests the idea that switching languages isn't a big deal - rather useful, since different languages are suited to different problems.


Programming is hard. There's no magic bullet. OOP is fine. Functional is fine. Procedural is fine. Every generation of programmers started with something different than the previous and turns out fine.

At the end of the day, every abstraction leaks, so the student has to have a good working mental model of computing. And your first programming language does not dictate your career. You're expected to keep yourself up to date and continue honing your craft.


I agree with the author for the most part, however here is my experience:

I have taught three people how to program (not a large sample size, I know...), and with each one I chose a different approach. It honestly depends more on the individual than on the language.

Individual 1: He was interested in history, video games, and politics, and wanted to learn programming. He seemed most interested in the history of programming, so I chose to help him learn BASIC as his first language simply because of the history associated with it (I could have chosen FORTRAN or COBOL). He enjoyed it more because he could connect; he thrived off imagining the early programmers writing in BASIC. Further, he felt as if he was building and learning in a similar fashion to most programmers, and that made him want to learn more languages, advancing in a similar fashion.

Individual 2: For her, NetLogo was the simple choice. She understood nothing of programming and honestly did not want to at the time, but knew she needed programming knowledge to help her obtain a job in biology in the future. NetLogo offered a way to teach her the basics of programming (objects, loops, etc.) while being pretty fun; we made an ecosystem. This actually excited her about programming to the extent that she is now learning Python and has joined the ACM.

Individual 3: For him, Ruby was probably the best introduction to programming; he was interested in learning how to develop web applications. Because he had no programming experience and was interested in web applications, it seemed clear (to me) that Ruby was going to be the easiest way for him to learn to program. He loved it, and has since moved on to learning C, since it turns out he loves the control programming offers him.

The point is that each programming language has a community, documentation, and specific uses; determining which language to learn first depends on personality and one's goals. Since I have always been interested in improving both myself and others, I have read a few books on the subject; my favorite is The Talent Code [1]. The book explains that each person has a different personality and different experiences, and in turn "learns best" in different ways, which is essentially what I employed here (without explicitly meaning to).

I agree with the author that Python is (in many cases) the best language to have a would-be programmer learn. It offers simplicity, logic, standard libraries, community, and tutorials in abundance; however, it may not excite a person or have applications in a person's interests (which can reduce enthusiasm and in turn learning).

[1] http://thetalentcode.com/


re: "early programmers writing in BASIC"

thanks. I feel oooold :)


Hahaha I started with BASIC myself and I'm only 22, so...


Python advocacy aside, I agree that OOP is best taught when it "comes naturally" and is NOT the first thing programmers should be learning. That is, the ideal sequence of concepts, I think, would be (along with possibly contrived but relevant examples):

1. Very basic, sequential computation. Add two numbers.

2. Conditional computation. Find the absolute value of a number.

3. Computations over homogenous data: arrays and loops. Find the average of the absolute values of a set of numbers.

4. Computations over inhomogenous data: structures. Find the area of a set of rectangles.

5. Computations over "even more inhomogenous" data: objects. Find the area of a set of shapes of different types, and find their centroid (see the sketch below).
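
As a minimal sketch of where step 5 lands (hypothetical names):

  import math

  class Rectangle:
      def __init__(self, w, h):
          self.w, self.h = w, h
      def area(self):
          return self.w * self.h

  class Circle:
      def __init__(self, r):
          self.r = r
      def area(self):
          return math.pi * self.r ** 2

  # The same loop works over shapes of different types.
  shapes = [Rectangle(2, 3), Circle(1)]
  total_area = sum(s.area() for s in shapes)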

That's where you ask "how would you do this with what you've learned so far", and the learners should start to build different structures with a lot of similar aspects and duplication of code (I realize I didn't explicitly talk about the value of functions/procedures above, I guess it'd go somewhere after 3), and then you show them how much more straightforward it is to do with OOP. Now they know the value of OOP and what "doing it without" would entail, and you end up with a bunch of more knowledgeable programmers who won't needlessly create baroque class hierarchies and write more code than they need to, but will use those abstractions when they have value. There is a huge difference IMHO between e.g. being taught dogmatically "you MUST use functions because abstraction is good" like it is something you should take at face value, and being taught by being given a problem in which you do not know about functions at this point and thus write lots of duplicated code and then being shown the value of eliminating that duplication via functions.

I've seen way too many learners create half a dozen classes with a ton of methods in them when given a problem that could be solved in a single function of a few lines, then have trouble figuring out what to put in the method bodies... it's completely backwards! No matter how much "architecture" you manage to create with deeply nested objects, in the end all the functionality of your code is essentially based on a series of statements executed in sequence, much like they would be in a purely procedural language. The fact that many otherwise highly-regarded educational institutions are churning out programmers who can create dozens of classes with dozens of methods in them, and then not know what goes in those methods, i.e. the actual "meat" of the computation, is something I find highly disturbing.


Why start with computation as a sequence of operations? It might be equally useful (maybe even more useful for some students) to start with computation as a series of reductions e.g. with combinator logic.


Because that's how a computer actually works, and the sequential execution concept is something that is intuitive to anyone who has followed a schedule, a recipe, been taught how to tie their shoelaces, ... , in essence, done any sort of living.


"Because that's how a computer actually works"

So what? That is kind of like saying that people need to understand how an airplane works in order to ship a package overnight. Sure there is value in understanding how a computer works, and that will probably be taught later in the curriculum; but teaching students how to think abstractly is far more valuable.

"the sequential execution concept is something that is intuitive"

So is the concept of reducing one expression to another; we teach algebra as exactly that long before we teach anyone how to program.


Sequential instruction is learned before expression reduction in real life...

So what? That is kind of like saying that people need to understand how an airplane works in order to ship a package overnight.

But people DO need to understand a bit about the process of shipping a package, if they want to estimate when it will arrive, or if they want to be able to read the online tracker, or even know that they must bring the package to a particular place before fedex will ship it. Or if the package does not arrive, what would you have to do to find it?

Generally speaking, understanding how a computer works is important to programming because the essence of programming is mapping a real problem onto the available computational hardware.

Not that I want to argue against learning about expression reduction - I would agree that should be done sooner rather than later - but it's not an argument against learning sequential algorithms.


> maybe even more useful for some students

Maybe. But I've never met those students in real life (OK, I have a very small sample set). The imperative paradigm is easier to learn.


I TA'd a CS101 course in grad school. I saw a number of students try to do something like this:

  func(if(x == 5) foo else bar)
In other words, treating an "if...else" statement as something that can be reduced to a value. This frustrated the students because it did not compile. We had to train them to think in terms of sequential execution, despite the fact that they did not become any better at programming in the process (worse, they had to un-learn a perfectly valid way to think about programming -- only to have to re-learn it later if they took the programming languages course).
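
For what it's worth, the students' intuition works fine in languages with conditional expressions; in Python, for example:

  # A sketch with made-up names, mirroring the snippet above.
  def func(value):
      return value.upper()

  x, foo, bar = 5, "foo", "bar"
  print(func(foo if x == 5 else bar))  # prints FOO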


What language was your class using? There are so many ways to do what the students wanted, be it the ternary operator or a lambda or even defining a method/function like this (Ruby):

  def foowiz(x, threshold = 5)
    if x == threshold
      foo
    else
      bar
    end
  end
And then a call to func(foowiz(x)) would work - even in an OOP language like Ruby.

What did the students need to unlearn?

In the intro CS class I took an age ago, we had to implement OOP as a final assignment in DrScheme, a derivative of Lisp that is most assuredly a functional language rather than OOP. That did not mean I had to unlearn how lambdas and tail recursion worked.


Why did they have to "un-learn" anything? What they should learn from that experience is that the particular tool they were using did not have a feature they expected. Any reasonable teacher should explain that. It's important because often, extra features come at a cost and while some of those costs have been paid permanently by researchers and software developers of the past, others still require tradeoffs.

Thinking in abstract terms is all well and good until you actually have to run code in a real environment to solve a real problem with the tools you have available.


Well, in C you can definitely do that:

  return (x == 5) ? foo : bar;


I don't understand why inheritance is always taught before composition. I almost never reach for inheritance these days; composition winds up modeling the domain more accurately. I'll define a bunch of classes modeling the actual behavior of domain objects, then get some code working using those classes. Then I'll start cleaning up, figuring out what goes together and creating composed classes out of the individual pieces of behavior. As I'm doing this, a lot of the time I'll pull functionality out of one class and put it where it seems to belong.
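
A rough sketch of what I mean, with hypothetical domain names:

  class Engine:
      def start(self):
          return "engine started"

  class Radio:
      def play(self):
          return "playing music"

  class Car:
      # Composition: a Car *has* an Engine and a Radio,
      # rather than inheriting from either.
      def __init__(self):
          self.engine = Engine()
          self.radio = Radio()

      def start(self):
          return self.engine.start()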


I keep my eye on which programming languages and techniques are recommended for kids (plus kids-of-all-ages such as hobbyists, scientists, etc.).

Then I use those languages myself. ;-)


I'm not a computer science major and got functional programming naturally, but when I first encountered OOP I found it immensely confusing. I couldn't understand WHY you would want to do any of this; thinking about manipulating data was natural to me, but mapping classes onto so-called 'real world' ideas seemed like an immensely leaky abstraction. I eventually got a mental map: while functional programming resembled maths, OOP resembled biology - considering an object as a cell, a compartmentalised abstraction which could change state and provide an interface. Probably not the most useful mental model for everybody, but it helped me.

What I learnt was: forget about all this dog-inherits-from-Mammal bullshit (it doesn't work that way, it's just wrong!). OOP is good for a few things: 1. sharing state between a collection of functions (call them methods if you want to use two names for the same thing); 2. code reuse (inheritance).

I'm rather amused that test-driven development forces you to simplify OOP as much as is humanly possible, thereby going against the advantage of shared state, which is one of the reasons you would use an object in the first place.


Python, seriously? How do you explain to 5-year-olds what the following line does?

  if __name__ == "__main__":
Python is there for when you move beyond beginner and you start to need libraries.

For everyone else, there's javascript: open up your browser console and you have a full IDE with breakpoints and inspection, and an interactive REPL. Object orientation is there if you want it, but what 5-year-old does? Functional paradigms are there when you're ready for them (map and foreach on arrays, for example). A smooth learning curve that gives you power when you reach for it, and a fast, flexible GUI environment. Free source code to study comes with every web site.

Best of all, its flexibility fosters many differing programming styles: consider jQuery and Angular. Very few programming ecosystems could give rise to such unique styles. I wish my first programming language was javascript, instead of C64 BASIC, with its 38911 basic bytes free.


Javascript is absurdly complex, awkward, dishonest (in trying to appear to be OOP), and often a completely unintelligible mass of callbacks. I not only don't want beginner programmers to learn in javascript, I don't even want them to see js until they have experience in another functional language (although that ship has sailed.)

If you honestly think

  if __name__ == "__main__":
is more complicated than js variable scoping rules and closures, I'm intrigued by your mind.


JS, as with all languages, has its flaws, but I found JS the most accessible language for me. It solved the problems I was interested in, and so putting up with its issues was fine for me.


When teaching Python to a 5-year-old, you wouldn't use that line.

You'd just have statements in global scope and no main() function.


Can't tell if you're trolling. Ignoring the fact that you don't actually need the if __name__ == "__main__" thing in Python...

    <script type="text/javascript">
        $(document).ready(function() {});
    </script>

    int main() {
        return 0; // may or may not be needed depending on version of C/C++
    }

    class Main {
        public static void main(String[] args) {
        }
    }
Perhaps you should try Haskell, since you seem to care so much about how to write main...

    main = print "blah"


I disagree. OOP is not a difficult concept and not hard to teach. For an example of what happens when "OOP must not be taught to beginners", see PHP.


First, I disagree with your premise that PHP's flaws can be attributed primarily to lack of OO design. But regardless:

For all its (many) flaws, PHP was (and is) an extraordinarily successful project relative to the vast majority of programming projects attempted. This is in fact a confirmation of the original post's thesis. The theory is that teaching OOP to beginners bogs them down worrying about the architecture of their code before they've had any experience solving problems at the scale where the architecture matters. It's easy for a beginning programmer to waste hours on pointless refactoring that doesn't actually improve their program in any meaningful way, simply in order to appeal to some professor, teacher, or other religious figure's concept of "good code". PHP, on the other hand, said "screw you religious types, we're just going to design a product people will use."

The reason that OOP should not be taught to beginners is not because it's a difficult concept. The reason OOP should not be taught to beginners is that it is an advanced concept. There are many other important fundamentals that can come first.


>First, I disagree with your premise that PHP's flaws can be attributed primarily to lack of OO design.

That's not my premise. I just point to PHP (and its community) because it offers some examples of this.

>The reason OOP should not be taught to beginners is that it is an advanced concept.

I've never quite got this. Spend any time writing code that's not using classes and you end up reinventing OOP anyway.


It's true that if you make programming more difficult to learn then you will self-select only programmers who are especially dedicated or intelligent.

I've never quite got this. Spend any time writing code that's not using classes and you end up reinventing OOP anyway.

I think that letting beginners rediscover this for themselves and then exposing language constructs that make this easier probably allows a smoother path to enlightenment.

It's often not obvious to beginners why copy-pasting code a bunch of times is bad compared to using loops and functions for example. It certainly wasn't for me, until I tried to build a program that was not highly trivial.


But OOP doesn't make programming more difficult. It just makes structuring your code easier.


It makes it more difficult for a beginner because it is yet another concept to learn at a stage when you are not yet comfortable with variables and loops. It's also a pretty vague concept, where different languages and different developers have different ideas about how objects should work.

I find that it only really makes sense from a code structure point of view once you have got to programs that are significantly bigger than the average beginner is likely to be writing. Even as somebody who understands OO I don't tend to use it for programs that are less than at least a few thousand LOC because I find that it just gets in the way.


HM...

  class Hello {
    public static void main(String args[]) {
      System.out.println("Hello, world!");
    }
  }
Versus:

  (print "Hello, world!")
Yes, this is a deliberately extreme example, but the point is that object-oriented code does not make it easier to structure CS101-level programs. All it does is introduce a bunch of keywords that students have trouble understanding until they are sitting in a much higher-level course.


I think it can help once you've dealt with dictionaries and functions. Classes are handy for storing a bunch of variables and functions together, and that's how I'd teach it.


The problem there is you have to know which variables and functions belong together. You have to commit to coupling them early on, meaning you have to make a design decision that a beginning programmer might not be prepared to make. This means they may struggle figuring out the best way to organize their code when solving problems when such organizational concerns are really not that important.

Here's a sample programming problem a beginner might encounter: CIDR data is available as a "127.0.0.1/8" style string and you need to access the IP and Netmask separately. In python, the simplest, most straightforward path to a solution that still results in encapsulated, re-usable code is to write a standalone function that accepts a string and returns a tuple. You could also create a CIDR class with a "to_tuple" method, or that stores the values in 'ip' and 'netmask' attributes, but this is simply extra scaffolding and without knowing more about the larger context of the program it's impossible to know whether such a design will be beneficial.
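
A minimal sketch of that standalone-function approach:

  def parse_cidr(cidr):
      # "127.0.0.1/8" -> ("127.0.0.1", "8")
      ip, netmask = cidr.split("/")
      return ip, netmask

  ip, netmask = parse_cidr("127.0.0.1/8")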

The problem with beginners is that they'll see the straightforward solution, but because they're not just trying to solve a problem but also demonstrate "Object Oriented Programming" skills to their teacher, they're going to worry about doing a whole bunch of extra work that may not be appropriate. I believe this is wasted energy that could be spent gaining real experience solving problems and learning about how computers and software work.


I've never quite got this. Spend any time writing code that's not using classes and you end up reinventing OOP anyway.

"Spend any time" implies substantially more experience than what most beginners possess, hence the term "advanced." "Intermediate" might be a better term in an absolute sense. OO design is something to learn after you understand the basic tools and techniques.

Also, a lot can be accomplished without OOP (for most values of OOP) and even if you're writing de facto OO code, it's not always clear whether using explicit OO language features is a net win or not.


Well, I think it's better to teach OOP early on to avoid people ending up reinventing it.


Obviously, I strongly disagree. There are many more important things to learn first. OOP can be introduced little by little as it gradually becomes relevant.


Well, I'd introduce classes after dictionaries and functions, because that's all they are, really.
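
A quick sketch of that equivalence, using a made-up counter example:

  # A "class" by hand: a dictionary of data plus free functions.
  def make_counter():
      return {"count": 0}

  def increment(counter):
      counter["count"] += 1
      return counter["count"]

  # The class just bundles the same pieces together.
  class Counter:
      def __init__(self):
          self.count = 0
      def increment(self):
          self.count += 1
          return self.count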


> I've never quite got this. Spend any time writing code that's not using classes and you end up reinventing OOP anyway.

Only if you start with C. I find that if provided with reasonable alternatives (like function composition) or consistently restricted problem domains (vector manipulations) you very well might never create anything resembling objects.


OOP "for beginners" always seems to be taught as some godawful real world analogy.

It's probably easier to talk about it if you've done a little imperative programming and can grasp OOP as a solution to actual problems, rather than just some abstract nonsense.


> some godawful real world analogy.

Yes, I think this is a real distraction. This Cat/Dog/Animal trope leads to the view that you should use classes to model your domain objects. In some cases, you should. But this can cause as many problems as it solves, and should not (IMO) be the default approach.

In getting away from this view, I've found it helpful to think of most programming as designing DSLs. To that end, you can use OO to build the infrastructure necessary to support this. But I worked in OO languages for many years before grokking that.


It isn't. I took a course in my last year of high school, and real-world analogies made OOP seem a breeze. Almost no one in the course had problems grasping OOP basics and applying them. Of course, the OOP lectures came after 3 months of intensive study and solving basic algorithmic tasks.


PHP for the longest time wasn't OOP at all and most of it really isn't. An awful lot of what you do with PHP has nothing to do with OOP at all.


I am aware. I use it as an example because plenty of good API designs that could have made it into PHP did not, because of "OOP shouldn't be used by beginners". The PHP community is unreasonably scared of OOP.


Why do you put avoidance of OOP down to fear? Maybe it's down to an understanding that OOP is often overused and should not be the paradigm-by-default. Disclaimer: I don't actually know what the cause is in this case, nor do I think that OOP should always be avoided.


Trying to scare us with the PHP bogeyman? Shame on you!


Ah, but what is php popular for other than "it's easy to get started in"? The focus on procedural programming is why people flock to it, because those objects really do make things more complicated.


But they don't. In practice, the difference is between:

  $x = new Object;
  $x->bar($foo);
vs.

  $x = object_create();
  object_bar($x, $foo);
See, for example, the "procedural" version of MySQLi.


Well, PHP didn't start out as an OO language. Many of the OO "features" were added later. Just saying (not really disagreeing).


OOP is a bundle of concepts, and I find it hard to reconcile them all and explain them to people. Individual principles like information hiding and classes are good to teach, but the monolithic 'OOP' is tough going for beginners. And non-beginners.


My main problem with OOP is that objects don't bound complexity.

If I call a function with some arguments, that function is composed of smaller functions which can only ever operate on the arguments I passed the parent function. As I follow the function down to primitives, each piece of code is operating on less data.

If I call a method, that method can operate on its arguments, properties of the object, and properties of the superclass. I often find myself ping-ponging around the code base as a class' method relies on a method of an abstract superclass which itself relies on methods of the original class, each step introducing more properties into the calculation, many of which are objects themselves!
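
A small sketch of the kind of ping-pong I mean (hypothetical names):

  class AbstractReport:
      def render(self):
          # The superclass method calls down into the subclass...
          return self.header() + self.body()

      def header(self):
          return "=== report ===\n"

  class SalesReport(AbstractReport):
      def body(self):
          # ...which calls back up and around again, pulling in
          # more state at every hop.
          return "sales: " + self.format_total()

      def format_total(self):
          return "100"

  print(SalesReport().render())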


There is no right or wrong way to teach programming. You should just do whatever you can to keep whoever you're teaching motivated.

When you teach it's not about you or what you think would have been best for you, it's about understanding other people's needs.

If you think bringing up OOP should wait for some time in the future, then wait to bring up OOP. If you think that OOP should be the basis from which programming is taught, then start with OOP. Just be a good teacher and keep them motivated. The whole goal is to get them past it all anyway.

Also, programming IS messy. Extremely messy, and teaching people as though it's gonna be all greased tracks the whole way isn't doing them any service.


Would understanding structures be useful before trying to learn classes in an OOP language? If you understand how structures are stored in memory, does that provide the foundation for understanding class instantiation? I feel the answer is yes, and the optimal language before OOP is C. Other aspects of C help prepare for OOP as well - pointers: how they are stored and referenced; pointers to functions (so you will understand object references); native data types and the memory they consume. A foundation in these aspects prepares one for any OO language.


Definitely. Structures are data grouped together; classes are that, plus you can define functions that implicitly operate on their contents (also known as methods), extend them, etc. Then it becomes even easier to move on to C++.


In the same vein (linked at the bottom of the article): http://prog21.dadgum.com/156.html


These days, I think Javascript is a superior beginner language:

- Nothing to install: you most likely already have a full-blown development environment (e.g. Chrome or Firefox).

- The language is straightforward, functions everywhere, easy control structures, very little to be distracted by.

- The reward loop is very tight for beginners: load a page, tell them to write a Javascript one-liner, and see their face light up when the HTML page gets instantly modified.


Odd scoping, equality, type coercion - those seem rather significant distractions. Having to explain that == doesn't really work and === is probably what they want seems beyond bizarre. And "function" as a way to introduce an anonymous function is just verbosity for verbosity's sake.

Unfortunately, the two other points you mention probably outweigh the actual language design.


Yep, there was an epic debate about "objects-first" vs. "objects-late" CS1 curricula last decade. The objects-first folks seem to have lost, though. Here's a paper:

http://homes.cs.washington.edu/~reges/sigcse/basics.pdf


From the article:

"The shift from procedural to OO brings with it a shift from thinking about problems and solutions to thinking about architecture"

Couldn't agree more... I have been doing OOP for a while in various programming languages, and I have come to similar conclusions... It doesn't really help you focus on the core way to solve problems: focusing on algorithms.

Another read I would recommend is from the author of the C++ STL:

http://www.stlport.org/resources/StepanovUSA.html

See the answer to the question:

"I think STL and Generic Programming mark a definite departure from the common C++ programming style, which I find is almost completely derived from SmallTalk. Do you agree? "

Note: I'm not trying to endorse C++ templates here either... I have my own gripes about them, but they did bring attention back to algorithms and data structures.


After my brief time in Java, I instinctively agree with the OP...but I wonder if there's a happy middle ground here, such as just using the out-of-the-box parts of OOP?

The easiest example I can think of is: using Ruby without getting into how to define/redefine classes or methods. Just stick to instantiating simple data types and invoking their methods:

     my_string = "Hello world"
     puts my_string.upcase
     # => "HELLO WORLD"
     puts my_string
     # => "Hello world"
Among the first roadblocks beginners seem to hit is grokking that `my_string` isn't modified... but isn't that an early point of confusion for functional programming too, to some degree?


I believe that if you want to teach programming as a commercial (9-5) profession, it is easier to get the ideas of imperative programming (OOP or not) across, and it will bring more benefits to the "pupil" ;)

... and to teach declarative programming (functional or not) to the pupils interested in the background of computer science and targeting some innovation/invention trajectory in their "career".

In other words (and very exaggerating):

imperative programming : translating more easily "real world" into code

declarative programming : translating more easily "platonic world" into code

ps: no, i don't believe this to be absolute truth, yes it is quite a hyperbole ;)


> and teach declarative programming (Functional or not) to the pupils interested in the background of computer science and targeting some innovation/invention trajectory in their "career".

OK, but only after you teach them imperative programming. That's because it's also easier for them to learn the imperative paradigm, and you want to minimize the amount of "pollution" they have to learn alongside "what is a computer and what does it do".

That's also why avoiding OOP at this stage is a great advice. Architecture is "pollution" too.


Yup, I too believe that agent-based imperative thinking is very deeply anchored in us and certainly helps a lot in explaining a computer at a conceptual level:

agent: computer

imperative commands: instructions you give it

... but later on, once the understanding of the machine/tool is established, I see how much more powerful the declarative (resp. transformational) "paradigm" gets for "computers" applied to the "platonic world".

I believe that the transition towards the declarative paradigm, and an understanding of it, is essential to grasp how "computers" and computation differ from the other physical machines mankind has invented.


re: --- When you're trying to help someone learn how to go from a problem statement to working code, the last thing you want is to get them sidetracked by faux-engineering busywork. ---

I like this for new programmers, as they are often trying to solve a problem or make a game. My stepson made a nice chunk of change by building a website for someone - the last thing I would have tried to teach him was OOP!

OTOH - there is a point where learning OOP and other forms of modularization is a good idea to expand one's knowledge of the art - just like everyone should read TCP/IP Illustrated and Hackers and Painters (#pandering on the latter).


OOP is fun, and it makes some sense for beginners; doing something "The OOP Way" always felt like an achievement to me when I was a child. It motivated me to learn more.


Well, no shit sherlock. You also don't start teaching people to drive a manual car by telling them to use all 5 gears and drive on the highway.

I would have thought this was self-evident.


Then I guess you haven't suffered through introductory programming texts like "Objects First With Java".


I would agree - I'm currently in a college CS program (I have ~6 years of programming experience already, so the introductory stuff is a breeze). The very first programming class we took was "Intro to Object-Oriented Programming in Java". The irony is that a good three-quarters of the final projects people turned in were just a big mess of procedural static methods and global variables, all shoved into the single 'required' class that holds main(). There were some who, after going through the class, understood OOP a bit more than they used to, but those were really just people who already had some programming experience. Those who came into the program with no experience were still trying to figure out the basics of procedural programming ('this is how a for loop works', 'this is an int, it's not an object', 'i = i + 1 is a valid statement', while() vs. do while(), etc.) - and are still trying to figure that stuff out even now, a year later (though now with somewhat more complex issues), since it was never really taught that well.

I think the biggest thing that stuck out to me as utterly wrong with teaching Java and OOP first is that you're forced to talk about references/pointers extremely early on, when most students don't yet have enough grasp of the core language to understand them conceptually. (And really, who understands a reference/pointer the first time they hear about it? Let alone two months after starting programming.)

I honestly think starting off with C would have been better, and I don't even view C as a good beginner's language. But C has a pretty simple core language, and writing small procedural programs in it isn't very hard to conceptualize, so it requires a minimal amount of hand-waving to get simple programs running. (Hello, 'public static void main(String args[])'.) The biggest hurdle would probably be string handling, since a string is an array of chars. If you're not concerned with buffer overflows, though, simple stack-allocated arrays work fine for tons of introductory programs, and <string.h> provides enough to get by without messing with any pointers.


Starting with C would be good, but only if the teacher constantly supervises and reviews the code the students write. Beginners will inevitably make mistakes as they learn, and the leaky abstraction between C and the underlying architecture punishes those mistakes mercilessly. Writing correct C code is one thing; asking "why is this language broken?" because you returned a pointer to a local variable is another.


At my Uni, we learned C first, then OOP in C++, then Java.


Wait, what? I think I was driving on the highway the second or third time I was behind the wheel of a car (I didn't drive an automatic until well after I'd received my license).

Bad choice of analogy aside, I (somewhat) agree with you. Python is a very beginner-friendly language.


Really?! Then you are not a European, I guess.


As a contrast, Steven Ragnarok has an interesting talk about teaching Ruby with objects [1].

Whether it's appropriate to start with an OO language is, I guess, another question, but I think that, done the right way, starting with OO isn't as hard as people think.

[1] http://www.youtube.com/watch?v=SNbBC2pSiVw


> Python is good for a wide range of simple and interesting problems that would be too much effort in C. (Seriously, a basic spellchecker can be implemented in a few lines of Python.)

But, but, someone somewhere in a galaxy far away already made a lib so you can use it in C with only a few lines of code.


I don't think Python is a good first language. It's far too abstracted. I've always felt that learning lower level languages first then moving up the ladder is the better approach.


I disagree; abstractions are good, and it is more important for students to learn to think abstractly than to learn how a computer works "under the hood." The vast majority of computing tasks involve abstract problems with abstract solutions -- computing account balances, querying databases, cryptography, UIs, networking, etc. Even basic things in a CS curriculum like data structures and algorithms require abstract approaches and gain very little from knowing low-level details.

There is a place for the low-level details: computer architecture, compilers, and operating systems. Students should take these courses to be well-rounded and to see how theory is translated into practice, but elsewhere things should be done at a high level.


Even basic things in a CS curriculum like data structures and algorithms require abstract approaches and gain very little from knowing low-level details.

I'd disagree somewhat. Data structures, from a CS perspective, often involve learning about low-level implementation details (why is accessing a hash O(1)? Why do array elements have to be the same size? What's the difference between representing a graph as a linked list versus a 2-dimensional matrix?). It's similar with algorithms.

Even though you CAN use completely abstract computational models to learn algorithms, many of the most important algorithms are designed around tradeoffs between real-world resources, and being able to map abstract algorithms onto a low-level language like C is a very useful skill for those who have it.


I agree in principle but still think Python is best to start with, for two reasons.

First, many people need a simple introduction to the entire concept of programming-- codifying precise instructions to a computer to get specific results.

Second, Python is a great tool for exploring other low-level systems, e.g. the socket, posix, and os libraries.


I have seen some truly awful code that has been written in Python. It wouldn't have been possible in Java due to the nature of the language.


One can write an O(n^2) algorithm that should be O(n) in Java just as easily as in Python.

Bad code is bad code. On the scale of encouraging good practices and discouraging bad ones, Python does pretty well given the very low overhead to get started-- something Java does not have.


You're right: we don't teach mathematics by starting with advanced concepts; we start with addition and subtraction.

Back in high school in the 70s, our first language was a variant machine language, CECIL, which we had to run by sending off coding sheets to a bureau.


I agree, and I love Python. Python is better suited to be the language you learn the architecture parts with. Basics should probably be in C.


I agree with the article and think Python is great for a first language.

I asked my daughter's AP comp sci teacher: Why Java? A: because that's what the test uses.


Don't start with learning how to use a bicycle. Start with Rocket Science.

The Anti-OOP propaganda gets stronger.

Didn't Alan Kay of Smalltalk fame work with and for children?


Tangent: isn't the tricky part of a spellchecker the edit/Levenshtein distance? I.e., to still recognize a word even when misspelled - otherwise you can only look up words that are already spelled correctly.

(I checked the comments on the last three submissions of his spellchecker article, but no one seemed to raise this point...)
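
For the curious, a sketch of the edit distance in question (classic dynamic programming, row by row):

  def levenshtein(a, b):
      prev = list(range(len(b) + 1))
      for i, ca in enumerate(a, 1):
          cur = [i]
          for j, cb in enumerate(b, 1):
              cur.append(min(prev[j] + 1,                 # deletion
                             cur[j - 1] + 1,              # insertion
                             prev[j - 1] + (ca != cb)))   # substitution
          prev = cur
      return prev[-1]

  levenshtein("speling", "spelling")  # -> 1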


How about a more provocative idea (yes, a straw man, but I'm just thinking aloud):

Don't distract new programmers with computer science. ;-)


That is not such a bad idea. There is one essential skill: being able to get around and make any technology do your bidding.

People learn stuff well when it is rooted in reality. Have them make a few simple programs that could be done with Excel.

And right now, with the insanely rich libraries on the market, you could spend half your career without knowing what O notation is and have no problems.


It is good to learn that OO capabilities are almost entirely a set of conventions, and how these could be implemented as a DSL, as has been done many times with Common Lisps and Schemes. CLOS is just a DSL.

Another case study is the Scala language, which is functional in the first place and uses OO as a convenient way to structure code - which, together with message passing, was Alan Kay's original concept.

OO should be a nice optional add-on, not a forced only-way-to-do-everything, as it is in the case of Java or Ruby.



