If you don't teach a seven year old calculus, are you really teaching them math?
I understand you can't teach everything upfront; that's not my point. I remember my 5th grade class, where several people said they wanted to be mathematicians when they grew up. Reality usually sets in by college: all the 'math' we learned in the classroom was just a tiny, tiny corner of a vast field of study. Similarly, I thought the 'programming' I was learning as a beginner was the real deal. When you're starting out, you never see the invisible: the unknown unknowns, the hidden complexity. You only see what's in front of you, and it has to make sense all on its own.
So there's this huge, intangible mess of complexity that is 'math' or 'programming', and you can't explain it all at once. A beginner won't have any of the prerequisite knowledge, and you can't waste time covering all of it. So you have to choose 1 concept out of the 7000, and you have to try to teach it.
There could be C programmers that know nothing about monads, so it stands to reason you could teach lots-of-C before you teach anything about monads. Is the opposite true? Are there Haskell programmers that know nothing about pointer arithmetic?
Good point regarding the math analogy; I feel it was a rather poor one now.
But I feel the difference is between knowing and understanding. You can know how to get the browser to display a rectangle with the right copy/pasted HTML.
But without discrete math there really is no foundation to understand the underlying algorithms. Without understanding it is impossible to know how to extrapolate those examples to solve different but similar problems, probably the most important skill a programmer can have.
Teaching C before discrete math is like teaching English without a vocabulary. Sure, you can understand the syntax, but without algorithms or data structures you are going to be lost very quickly. You might not need monads for C, since it is such a primitive language, but try writing C without iteration or recursion.
You're right. What I was trying to say by "Haskell programmer" was more like an industry veteran: 10+ years of experience in Haskell or the like, zero knowledge of pointer arithmetic.
There seems to be this ineffable force tilting the scales to create more C veterans that don't know monads (I could find you one) than there are Haskell veterans that don't know pointer arithmetic. Age, popularity, libraries, etc all tie into this, but I still think there's significance amidst the noise.