- Never, ever try to teach about objects as representations of real-world things. And for the love of god, do not give out modelling assignments where "Human inherits mammal" is an answer
I have to disagree with this. I think this is exactly how to think about this.
I once explained the shift to OO to an older gent who was familiar, not just with ancient scientific programming, but also with building a house. I told him that the idea was that code was so large and complicated that it was divided up among teams with each team getting specific parameters that their code had to fit. He said, well of course.
Then I said it was much like constructing a house. You had to put the frame up, and know where the windows and doors would be, but the details of the windows and doors could be decided later. Every door had to be able to respond to an "open" command and a "close" command, some may have a "lock" command, and so on, but the builder of the door worried about getting that stuff right. All you had to specify were the dimensions and the direction of the door.
Now, this was a pretty technical person, and perhaps doesn't qualify as a beginner, but I find that, in general, real-world metaphors are powerful for teaching.
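The door metaphor maps almost directly onto interfaces in code. Here is a minimal Java sketch of that contract; the type names (Door, Lockable, FrontDoor) are invented for illustration, not taken from the discussion above:

    // The contract every door must satisfy: respond to "open" and "close".
    interface Door {
        void open();
        void close();
    }

    // Only some doors support locking, so that goes in a separate, optional contract.
    interface Lockable {
        void lock();
        void unlock();
    }

    // The "door builder" worries about the details; callers rely only on the contract.
    class FrontDoor implements Door, Lockable {
        private boolean locked;

        public void open() {
            if (locked) {
                throw new IllegalStateException("door is locked");
            }
            // ...swing the door open
        }

        public void close() {
            // ...swing the door shut
        }

        public void lock()   { locked = true; }
        public void unlock() { locked = false; }
    }

The caller only ever sees Door and Lockable; which concrete door turns up later is somebody else's problem, which is exactly the division of labour in the house analogy.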
> The heart of the matter is that people begin with the concrete, and move to the abstract. Humans are very good at pattern recognition, so this is a natural progression. By examining concrete objects in detail, one begins to notice similarities and patterns, until one comes to understand on a more abstract, intuitive level. This is why it’s such good pedagogical practice to demonstrate examples of concepts you are trying to teach. It’s particularly important to note that this process doesn’t change even when one is presented with the abstraction up front! For example, when presented with a mathematical definition for the first time, most people (me included) don’t “get it” immediately: it is only after examining some specific instances of the definition, and working through the implications of the definition in detail, that one begins to appreciate the definition and gain an understanding of what it “really says.”
Inheritance is not taxonomy. If you want to teach inheritance, show a couple of realistic examples of code that, when you add inheritance, become clearly better. We can see the pattern; we can make our own intuitive, abstractive jumps.
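For what it's worth, here is the kind of before/after example I mean, as a hedged Java sketch (the report-exporter domain and all the class names are invented): two exporters that would otherwise duplicate the same loop share it through a small abstract base class, and each subclass supplies only what actually differs.

    import java.util.List;

    // Shared mechanics live in one place; subclasses fill in the varying parts.
    abstract class ReportExporter {
        public final String export(List<String> rows) {
            StringBuilder out = new StringBuilder();
            out.append(header());
            for (String row : rows) {
                out.append(formatRow(row)).append('\n');
            }
            out.append(footer());
            return out.toString();
        }

        protected abstract String header();
        protected abstract String formatRow(String row);
        protected String footer() { return ""; }   // optional hook
    }

    class CsvExporter extends ReportExporter {
        protected String header()            { return "value\n"; }
        protected String formatRow(String r) { return "\"" + r + "\""; }
    }

    class HtmlExporter extends ReportExporter {
        protected String header()            { return "<ul>\n"; }
        protected String formatRow(String r) { return "  <li>" + r + "</li>"; }
        protected String footer()            { return "</ul>\n"; }
    }

Seen next to the duplicated version it replaces, the abstraction sells itself; no animals required.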
"I told him that the idea was that code was so large and complicated that it was divided up among teams with each team getting specific parameters that their code had to fit."
IMO, this is the first mistake. Code should not be large and complicated; when it is, few things can save you, whether it be the language or paradigm you're programming in, or the structure of the team doing the programming. OOP is a solution to the wrong problem.
Depends what you include in "code"; your program might be a one-liner, but its language environment and operating system are the product of huge teams. It's a kind of modularisation that's so effective we don't think of it as a division of labour.
One person writing all the code for a system from the bare metal up is pretty much limited to embedded systems now.
You can start by realizing that your data (100k rows) does not require Oracle and can be stored in a flat file and loaded into memory, so you can ditch JPA. Then get rid of Spring and Jackson. Then you remove all the Abstract Singleton Factories and leave just the code that does what the user wants, not what the developer wants. Then drop the couple of messaging frameworks you're using when you realize that now all the code fits in one app. Next you will see that NanoHttpd will do just fine for your load level, and therefore you can ditch WebLogic. After all that, you can remove maven (blasphemy!!!) because your project has shrunk to a few dozen files that can be compiled and packaged with a 5-line batch script.
But that wouldn't be glamorous, would it? Besides, this approach will make you hopelessly unemployable in today's world.
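The first step in the comment above is less exotic than it sounds. Here is a hedged sketch of "100k rows in a flat file, loaded into memory" in plain Java; the file layout and the names are invented for illustration:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Map;
    import java.util.stream.Collectors;

    // Stands in for the Oracle/JPA layer when the data is small: read once, keep it all in memory.
    class FlatFileStore {
        private final Map<String, String[]> rowsById;

        FlatFileStore(Path file) throws IOException {
            // Assumes a simple comma-separated layout with a unique id in the first column.
            rowsById = Files.readAllLines(file).stream()
                    .map(line -> line.split(","))
                    .collect(Collectors.toMap(cols -> cols[0], cols -> cols));
        }

        String[] find(String id) {
            return rowsById.get(id);
        }
    }

A hundred thousand short rows is a few megabytes; the JVM will not notice.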
The important things for writing modular code in a big codebase are defining clear interfaces and making "black box" modules. An important tool for this is abstract data types but, unlike what many people seem to say, those are not exclusive to OOP - you can also have abstract data types in procedural or functional languages.
The core concept of OOP is adding dynamic, method-based dispatch to your abstract data types, but I would argue that while this is useful a lot of the time, it's not really fundamental. In fact, there are many cases where dynamic dispatch will not solve the problem on its own: a good example is generics in Java and C# - parametric polymorphism helps enforce abstractions but isn't really object oriented.
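A tiny Java illustration of that last point (class and method names invented): the type parameter enforces the abstraction at compile time, and no dynamic dispatch on the element type is involved.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Parametric polymorphism: the compiler guarantees you get out what you put in.
    class TypedStack<T> {
        private final Deque<T> items = new ArrayDeque<>();

        void push(T item) { items.push(item); }
        T pop()           { return items.pop(); }
    }

    class Demo {
        public static void main(String[] args) {
            TypedStack<String> names = new TypedStack<>();
            names.push("alice");
            String s = names.pop();   // no cast needed, checked statically
            // names.push(42);        // would not compile: the abstraction holds
            System.out.println(s);
        }
    }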
What is it about "today's world" that makes software require a larger code base? If anything, we have much higher-level languages which require far less code than in the olden days. Anyway, my point was more about the unix nature of building small programs, that do one thing well, and using those programs in combination.
"What is it about "today's world" that makes software require a larger code base?"
The fact that we are still not using the high-level languages we have available to us. Vast amounts of new C, C++, and Java code are written each year. We are solving bigger problems, but we are using languages that are still catching up to the state-of-the-art of the 1970s.
*Anyway, my point was more about the unix nature of building small programs, that do one thing well, and using those programs in combination.*
That just uses different words to describe the same net result and the same net problem. You still have to find a way to make all of those pieces work together to do something complex.