The mathematics generation gap (worthwhile.typepad.com)
90 points by ColinWright on Aug 25, 2011 | 90 comments



This article has an economics spin to it, which is interesting as only insiders seem to understand how much mathematics is required in that field. Greg Mankiw lays out a similar defense of mathematics in what we might assume to be a more logic- or intuition-driven field. (http://gregmankiw.blogspot.com/2006/09/why-aspiring-economis...) His advice for aspiring PhD students in economics suggests as much mathematics as many undergrad programs require (http://gregmankiw.blogspot.com/2006/05/which-math-courses.ht...).

What does this mean about calculators and learning the basics? There are concepts that you can't understand if your mind can't map the basics of math. I see it in real-world examples (people unable to make quantitative business judgements on the fly) as well as in abstract reasoning. One of the critiques of Google is similar: by reducing what we need to remember, we reduce our ability to process certain ideas on the fly.

Like many things, it's ultimately a trade-off. Are the things we mentally miss from utilizing calculators and Google more or less important than what we gain in added productivity? (Is it worth the health hit of driving 5 miles to work rather than walking? Can you mitigate it by spending half the saved time in the gym?)


People who can do basic arithmetic on paper have an advantage with back-of-the-envelope calculations and ballpark estimates, and will end up using such simple and effective tools more often. The mental cost of pulling out a calculator and concentrating to enter all digits in sequence is just too high sometimes.

There is also an often overlooked educational advantage in that pupils get early exposure to working with simple algorithms. Becoming used to the idea of following a sequence of unchanging steps, for varying inputs, comes in handy for understanding more complicated algorithms later on (such as those involved in algebra, calculus, and computer programming).
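As a concrete illustration (a sketch of my own, not from the article): the grade-school column-addition procedure is already an algorithm in the programmer's sense, a fixed sequence of steps applied to varying inputs.

    # Grade-school column addition as an explicit algorithm:
    # add digits right to left, carrying whenever a column sum exceeds 9.
    def column_add(a: str, b: str) -> str:
        a, b = a.zfill(len(b)), b.zfill(len(a))  # pad to equal length
        carry, digits = 0, []
        for da, db in zip(reversed(a), reversed(b)):
            total = int(da) + int(db) + carry
            digits.append(str(total % 10))
            carry = total // 10
        if carry:
            digits.append(str(carry))
        return "".join(reversed(digits))

    print(column_add("478", "396"))  # 874 -- the same steps a pupil performs on paper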


Having an understanding of the meaning of a mathematical concept does not require continuous, ongoing, manual exercise of its mathematical calculation.

For example, I'm willing to bet that virtually all of you reading this fully grok the concept of "square root". Your understanding of this is probably strong enough that you can wield it in day-to-day usage, successfully employing it to build back-of-the-envelope estimates, etc. Yet how many of you actually know how to calculate a square root by hand?

I've been shown how to do it, but have done it myself only once or twice. I also know how to estimate it via calculus and interpolation. But I couldn't do it now, and I've certainly not done it enough times to have gotten an understanding of the workings of "square root" by way of repeated manual application of the mathematics.
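(For the curious, the estimate-and-refine approach I'm alluding to fits in a few lines. A sketch of Newton's method for square roots, one standard hand technique, not necessarily the one I was shown:)

    # Newton's method for sqrt(a): repeatedly average the guess with a/guess.
    def newton_sqrt(a: float, tol: float = 1e-10) -> float:
        x = max(a, 1.0)  # any positive starting guess converges
        while abs(x * x - a) > tol:
            x = (x + a / x) / 2  # x_{n+1} = (x_n + a/x_n) / 2
        return x

    print(newton_sqrt(2))  # ~1.41421356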

So, how to explain the fact that I (and presumably you) are so comfortable in employing the concept?


Knowing how to do a square root efficiently by hand is probably a bit overkill, although it can still be useful sometimes. I was rather arguing about addition, division etc., which come up much more often in back-of-the-envelope calculations; and which also provide a much easier and earlier introduction to the kind of algorithmic thinking that is required in algebra, programming etc.


Highly recommended read: Street-Fighting Mathematics http://mitpress.mit.edu/books/full_pdfs/Street-Fighting_Math...


Calculus. Not arithmetic, calculus.

Economics, even the basic stuff when taught at the college level, involves a couple of integrals. If you have a good, solid, intuitive grasp of what an integral is and what it represents, all this stuff is obvious. Obvious to the point that when the prof says "this is an integral" that will explain the concept better than the English definition. On the other hand, if integrals to you are voodoo witchcraft that puts numbers on a physics exam, then when you reach the same point in the lecture you'll be dealing with something extra you have to learn, rather than something that helps you learn. You'll be worse at economics because you can't switch back and forth between the concepts and the models. If someone gives you an equation, you won't be able to interpret it, and if someone defines a new concept, you won't be able to calculate it.
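To make that concrete (a standard textbook example, my addition rather than anything from the lecture): consumer surplus is literally an integral, the area between the demand curve and the market price,

    CS = integral from 0 to q* of (D(q) - p*) dq

where D(q) is the inverse demand curve and (q*, p*) is the market equilibrium. If an integral is "accumulated area" to you, the definition is the picture; if not, it's one more opaque formula.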


I missed calc in hs/undergrad but needed it much more later on (and felt pretty innumerate without it), so I picked up the classic "Calculus Made Easy". I was immediately stunned by its simple description of the integral:

"Now any fool can see that if x is considered as made up of a lot of little bits, each of which is called dx, if you add them all up together you get the sum of all the dx's (which is the same as the whole of x). The word 'integral' simply means 'the whole'"

After reading this and more, I asked a few people who I knew knew calculus how they would explain an integral (figuring that they would give a similar definition and would delight in finding a book which explained it so clearly). I was shocked to discover how many could calculate one without really understanding what it was, or at least without being able to describe it in clear terms.


I think of everything in the form of signals, so I define the integral as "the area under the curve".


>simple description of integral:

dual space to a linear vector space :)


I vehemently disagree. The gap we should be worried about is the reason gap.

While I enjoy math tremendously, part of my deep dissatisfaction with economics as a field is its incredible over-reliance on math as a tool for analysis. I'm speaking as someone who dreamed of being an economist all through high school and early college. Classically, economics had very little to do with (numeric) math, and much to do with reasoning about how people behave.

However, as modern statistics (and math as a whole) began to develop rapidly in the early 20th century, and as logical positivism became a dominant philosophy (http://en.wikipedia.org/wiki/Logical_positivism), economists took note and began applying these tools liberally to their field. They started collecting and compiling tons of data on anything they could measure. Data is compelling: numbers give a sense of precision and clarity that mere reasoning does not. But this appeal is also what makes numbers dangerous. Though rigorous empirical testing of hypotheses in science is clearly one of the greatest advancements of the last 200 years, it has often been misapplied to other fields where the same controls are hard to apply. And experiments without controls can produce essentially meaningless data. Economic data is particularly complex, and there is still much debate as to how to calculate even very basic oft-quoted economic figures like inflation, unemployment, and GDP.

Though there was significant debate about the usefulness of these new tools, they became enshrined by the two dominant mainstream schools of the 20th century: Keynesianism and Neo-Classicalism. This bastardization of the field has made economics into a cargo cult science, where researchers regularly base their knowledge on data that is only slightly more controlled and scientific than corporate accounting.

This is not a trifling academic concern. So much of our lives is affected by what economists do and say. The bigger concern I have for young economics students is that their lack of mental math skills will make them more inclined toward the kind of overly precise large number manipulation that computers and calculators make so easy. I hope, for all of our sakes, that these less mathematically-inclined students will instead be wary of the numbers and critically apply reasoning to the models and assumptions they have been taught.


It sounds like you're calling for the application of reasoning without measurement. How would that work?


It sounds like he's calling for a philosophical approach to economics.


How do you measure human behaviour? How do you predict human behaviour based on that?


What I'm calling for is for economists to drop the pretense and misconception that using empirical methods for studying macroeconomics makes it scientific. And I'm suggesting that being good at math is not useful unless you're using useful data. In the study of logic, arguments can be considered valid if they are formally correct, but still unsound if their premises are false. Much in the same way, one can perform any number of valid mathematical transformations on data but still be left with unsound conclusions if those data were gathered incorrectly.

I am not saying that empiricism is inherently flawed, or that we should stop collecting economic data. And I do not intend to advocate any particular school of economic thought here. All I'm advocating is that students be taught how to think critically about what they are being taught. So much of a modern economics education consists of looking at the changes in figures over time that very little time is spent on a more general kind of reasoning.

The kind of reasoning I'm calling for is not easy to define. This is one of the tremendous advantages numbers have over argument in most minds. This kind of reasoning takes into account the notion that most of the information we obtain is not perfect or complete, and that many of our determinations are really judgment calls on what is more likely to be true. If empiricism is reasoning with your eyes, this is reasoning with your nose. It is a trained skill that allows you to recognize dubious premises and unspoken assumptions. When refined, it allows you to distill the essence of arguments down to a set of axioms that you can use to build a coherent model of the situation at hand. It is this theoretical side that allows you to understand how to construct experiments that test hypotheses, or whether that is even possible in each case.

To demonstrate the importance of gaining an understanding of the theory and rules behind something before testing it, I offer a parable:

The commissioner of the NFL once decided that teams were punting too much and he hired an econometrician (economic statistician) to study the situation and provide a solution to this problem. The econometrician applied his skills to the task at hand, aggregating data from several seasons to find correlations. He noted that there is an incredibly strong correlation between fourth downs and punting, and he recommended that the commissioner ban fourth downs. In the next season, offenses were only given three downs. To the econometrician's surprise and the commissioner's chagrin, teams actually punted more frequently, as the fewer number of downs dramatically limited offensive opportunities.

The econometrician's misunderstanding was based on something rather obvious (if you understand American Football): a failure to separate correlation and causation due to an ignorance of the rules of the game. And compared to a global economy, football is a very simple game, with very simple rules. Applying reasoning to the example is very straightforward, but applying the same thing to a world of dynamic human behavior is much more subtle. Which is why students ought to be trained to question assumptions and sense where logic and math have separated themselves from the reality they are supposed to help us describe.

People will disagree about when things correlate to reality, and about what things make sense in parables. But almost anyone can learn to recognize when a number seems too specific, just like most decent coders learn to recognize "code smell." Just the other day, someone told me confidently that 65% of communication is non-verbal. Now, while I almost agree intuitively, I immediately asked where they heard that, and how someone could have arrived at that figure, which seemed oddly specific for something (communication) that I don't think is frequently quantified. Every student of a soft science needs to have this skill strongly developed, or they will begin to take these kinds of things at face value.


"it has often been misapplied to other fields where the same controls are not hard to apply"

Do you mean "are hard to apply" ?


Yes, thank you.


"students can now solve problems that were previously too time-consuming to attempt, and can focus on underlying concepts."

The question should be whether the more time-consuming problems are helping to understand the underlying concepts.

In my experience this is not always the case. My high school maths teacher made us do extensive graph plotting by hand and it was slow and tedious.

At university, my physics course used software to visualise vector fields. We could tweak the inputs and see WHAT it did immediately. We could examine much more complicated scenarios and it seemed we were gaining an intuitive feel for the subject, but we didn't really learn HOW the inputs led to the outputs, so we couldn't apply the knowledge to other scenarios (even simpler ones) without the software.

2 decades on, and I wouldn't know where to start if I had to tackle even a trivial vector field but I can still picture reasonably complex graphs in my head just from looking at a formula which I find surprisingly useful in everyday life.

Perhaps mine was just not a very good course (in an otherwise excellent program) but care should be taken to ensure the technology really does aid the understanding of the underlying concepts and not distract from them.


Reminds me of the classic story of Richard Feynman vs the Abacus: http://www.ee.ryerson.ca/~elf/abacus/feynman.html

I never really memorized basic math tables in school and it annoyed me at a low level for years. So a few weeks ago I added addition (0+0 ~ 50+50) and multiplication (0x0 ~ 12x12) tables to my flashcard app to try to remedy that gap. It's been depressingly and frustratingly difficult so far.
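(For anyone who wants to replicate this, the card set is trivial to generate; a minimal sketch, using the ranges above:)

    # Generate the drill cards described above: sums up to 50+50,
    # products up to 12x12, as (prompt, answer) pairs.
    cards = [(f"{a} + {b}", a + b) for a in range(51) for b in range(51)]
    cards += [(f"{a} x {b}", a * b) for a in range(13) for b in range(13)]
    print(len(cards))  # 2601 addition cards + 169 multiplication cards = 2770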


Out of all the comments this one made me the most depressed. 11x12 is something you need to think about?

At my high school we had these expensive graphing calculators that were supposed to last for four years. I of course lost mine in early grade 10. Best thing by far that happened to me.

I was always at the top of my class for mathematics, so I was usually pretty bored with the homework, but all of a sudden I thought of a new challenge: Do all the homework without any calculator whatsoever. Yes, even things like 3^4 + 5^(1/4). At first of course it was painful, but at least I was doing something. I had a much better understanding of actual graphing than anyone else as well, because I would boil the functions down to their key points and play with them until I saw what they did at infinity.

I ended up getting my calculator back in late grade 11 (funny story, I had to sub in for a math teacher at the last minute because there was a wave of sickness at the school, so I was teaching people one year younger than me) and I saw my trusty calculator on the desk in front of me (I had engraved my name with a knife). I picked it up and said "hey thanks, I've been looking for this."

Since then I've always been able to do math in my head and it is unbelievably useful. When people say "oh so that is log(800), what is that" and I respond with "roughly 2.9"* they are flabbergasted; but I know how the curves actually work. I haven't outsourced that knowledge to a machine.

Programming is great and we should have graphing calculators in some part of the math curriculum, but students should be able to do this type of math by hand/in their head.

*I checked this after answering it in my head and was mildly amused. Here was my thought process: log(1000) = 3, log(100) = 2, and a log graph has a consistently degrading slope, but one that will still take you to infinity, so the last 200 units in the 100-1000 range will be worth less than the first 200 units. And since halfway there is roughly log(300), and we assume a kinda-semi-linear path from there (rounding down because 800 is higher on the scale), it comes out to roughly 2.9.
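(For comparison, there is also an exact mental route via log 2 ≈ 0.301, my addition rather than the method above:

    log 800 = log(8 * 100) = 2 + 3 * log 2 ≈ 2 + 0.903 = 2.903

which agrees with the rough 2.9.)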


Out of all the comments this one made me the most depressed. 11x12 is something you need to think about?

I don't find that especially depressing. If you have to sit down with a pencil, yes, you need to play catch-up a bit, but is it so terrible not to have it memorized? I personally always get by with things like "12x12 = 144, so 144 - 12" or "12x10 = 120, so 120 + 12". I just don't have a use for memorizing the complete table, so while I knew it 2 decades ago, I have lost pieces of it.


To multiply 11 by 12 you just "split" the 12 and put a zero between the digits: 1_0_2. Then you add the digits in the 12: 1+2 = 3. Then you add the result of the 2nd operation to the zero in the first split: 1_0+3_2 = 1_3_2 = 132.

This technique works for any 2 digit number multiplied by 11.
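A sketch of the trick as code (my illustration), including the carry case where the digit sum exceeds 9:

    # Times-11 trick: for a two-digit number with tens digit t and units
    # digit u, the product is t, (t+u), u -- carrying into t when t+u > 9.
    def times_eleven(n: int) -> int:
        assert 10 <= n <= 99
        t, u = divmod(n, 10)
        s = t + u
        return (t + s // 10) * 100 + (s % 10) * 10 + u

    print(times_eleven(12))  # 132
    print(times_eleven(57))  # 627 (5+7=12: write the 2, carry the 1)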

What IS especially depressing is that the slowest and least efficient arithmetic techniques keep being passed on for generations. Mainly because it's easier to teach the dumb way and teachers are too lazy to learn how to teach anything new. The result of this is that most students end up hating arithmetic and calculation, and subsequently any further math, because they are never taught the fast and fun way to do things.

If you ever have a job interview at a hedge fund or trading firm, you will not get past the phone screen if you can't do this sort of arithmetic in your head.


Well, to each his own. I always hated the kind of math where you have to remember a basketful of little tricks, like your method of computing products of 11. I have always much preferred "the-method-I-invented-on-the-spot", my approximate method using nearby known numbers being one of them.

The most exciting math tests for me were always the ones where I couldn't remember the 'trick' for half the problems, and would re-invent the solution. It didn't always go well, but the GREAT SCOTT!!! moments were some of the best in all of college.


Approximate estimation is what you need to multiply other two-digit numbers quickly. It just happens that there's a trick for multiples of 11. There are actually probably fewer than 20 other "tricks" that help with rapid calculation.


I don't understand why this trick is necessary. For 11x<number> you can just do 10x<number> + <number> which is easy until you get to 3 digit numbers, but even then continues to work.


Speed. The "trick" breaks down the problem so there's always a simpler summation to perform. In half of the possible products, one only has to add the sum of the digits to zero. For the rest, just slot the 2nd digit of the sum where the zero goes, and add 1 to the first digit. Most people can do either of those two simple additions faster than they can do sums like 210+21 or 390+39. If one is already quick with the latter kind of addition, then the trick is unnecessary.


That would make a fun blog post! Please write one.


  11x12 is something you need to think about?
No, I don't want to think about it, that's the whole point.

I believe that basic math is a fundamental tool of programming and it really bothers me that I'm not as competent as I'd like to be.


If you're working on it because you don't want to have to think about it, that implies you currently have to think about it.


Is there any evidence that being unable to do mental arithmetic has any correlation with understanding complicated mathematics?

All too often with articles like this I can't help but think of grouchy old people with cognitive dissonance who are now convinced that (a) young people are stupid and (b) things were better in their day and (c) since they had to learn mental arithmetic the young ones should learn it too.


Understanding advanced mathematics comes with a love for math. People who love math are usually pretty good when it comes to mental arithmetic.

In accordance with your statement about grouchy old people: I was amazed to find out that they allow calculators on the SAT. Now get off my lawn.


So you're saying that people who are good at maths are good at mental arithmetic? If that is true, then it doesn't follow that teaching people mental arithmetic will make them good at maths. In fact since mental arithmetic is hard and time consuming to learn, and a decreasingly useful skill, teaching mental arithmetic might turn people off maths!


Math is about seeing patterns; arithmetic offers plentiful opportunities to hone your skill of seeing patterns.


Is there any evidence that being unable to do mental arithmetic has any correlation with understanding complicated mathematics?

That's a very good question. This question has not been studied as rigorously as it should have been. Here are some suggestive observations. Professor W. Stephen Wilson surveyed many colleagues (other research mathematicians) once to ask if they thought advanced mathematics could be learned without a basic understanding of arithmetic. The responses he received

http://www.math.jhu.edu/~wsw/ED/list

included comments such as "I am shocked that there is any issue here" and "That it is even slightly in doubt is strong evidence of very distorted curriculum decisions" and "One of my favorite attacks is that we are _helping_ the students by insisting that they do things by hand because otherwise they can waste a lot of time when the calculator would fail them." One especially thoughtful comment, by a mathematician who has long thought deeply about teaching mathematics, was "It might be argued that we do not really require students to fiercely add, subtract, multiply and divide in our university courses - which is true. But we do require an automatic understanding of these operations and why they work because WE BUILD FROM THERE."

The longer story about understanding arithmetic--REALLY understanding it--and how that relates to learning beyond arithmetic can be found in the book Knowing and Teaching Elementary Mathematics by Liping Ma

http://www.amazon.com/Knowing-Teaching-Elementary-Mathematic...

(well reviewed by two mathematicians who study mathematics education)

http://www.ams.org/notices/199908/rev-howe.pdf

http://www.aft.org/pdfs/americaneducator/fall1999/amed1.pdf

A classic article on the subject is "Basic skills versus conceptual understanding: A bogus dichotomy in mathematics education"

http://www.aft.org/pdfs/americaneducator/fall1999/wu.pdf

A recent effort to embody strong conceptual understanding of basic skills into a mathematics textbook is Prealgebra by Richard Rusczyk, David Patrick, and Ravi Boppana,

https://www.artofproblemsolving.com/Store/viewitem.php?item=...

which points to what young students should be able to do WITHOUT a calculator if they really understand mathematics well.

One example I know, related to me by an economics professor, is teaching a lecture on economics in which the professor (the one who told me this story) said, as part of a calculation, "20 percent," and then was interrupted by a student who asked, "You just said '20 percent,' but you wrote '.2' on the blackboard. Why did you do that?"


What it really all boils down to from my point of view is that the question is "Can we really teach mathematics (qua mathematics, as opposed to raw computation) without at some point creating a firm intellectual foundation?"... and there are some people arguing yes, which just boggles the mind.

Yes, you may never in life have to subtract 1/4 from 3/4, but even so, in the grand scheme of mathematics manipulating fractions is the easy part, and still useful if only as a really good place to practice and pick up useful manipulations on relatively concrete items, before you move on to more abstract versions of the same operations, which are used everywhere in mathematics.

(I'd also point out I phrased it as "at some point", not "at the beginning"; in general we can't solve the chicken-and-egg problem and just start Kindergartners on pure set and number theory. But at some point you need a decent foundation laid down or you will never build a strong structure.)


The argument isn't that we can teach mathematics without a firm intellectual foundation. The argument is that arithmetic is not that foundation.

No one would argue that you should memorize the turn-by-turn directions to drive from the White House to the Liberty Bell and use that to generalize how to read a map. We don't force kids to memorize Harry Potter and then later expect them to generalize that out into literacy. History courses don't start with Supreme Court cases and wait until students generalize out the Constitution.

By starting with the abstraction and moving toward the concrete, we give the information context and allow the students a good mental framework to build upon as they learn the details. Starting with the concrete and moving towards the abstract is needlessly confusing and robs the students of the firm intellectual foundation that they need to understand the subject.


I'm not arguing arithmetic is that foundation; I'm arguing it's an unremovable part of the foundation. There's more to it than that, but for all the well-documented foibles of true mathematicians when it comes to simple arithmetic, how much math can someone really be learning when they are staring blankly at 3 times 5?

There's a lot more to programming than creating and calling functions, but I have yet to see an interviewee who struggles with how functions work in their putative "best" language but is otherwise brilliant. You can't be progressing very far if you're burning that much mental effort on the very, very basics of the task at hand.

By starting with abstraction and moving towards the concrete, you're completely fighting everything about how we experience life and learn. Nothing else works that way. The only abstractions you can teach that way are series of words that students can mouth but have no true comprehension of... gee, that sounds a bit familiar.


I'll completely agree with you that a student who can't multiply three by five doesn't understand mathematics. However, I'd argue that I'd rather have a student spend ten hours figuring out that 3*5 = 5+5+5 = 1+1+1+1+1+1+1+1+1+1+1+1+1+1+1 = 15 than have a trained parrot that simply squawks the answer. The first student was slower, but it's the parrot that's mouthing a series of words without understanding.

As for the rest, I disagree that I'm fighting that standard learning pattern. In nearly every subject, we start with a generality and then move into specifics. We learn about the Roman Empire before we learn about Cicero. We learn about teeth before we learn about incisors. We learn about protons before we learn about quarks. We learn about function calls before we learn the standard library. We learn about China before we learn about Beijing. Almost all learning is performed through iteratively refining a central abstraction with increasing detail.


If you're really worried about creating the foundation first, shouldn't you be teaching kids that SS0+SS0=SSSS0?


Did I not just spend an entire paragraph explaining how I'm not worried about teaching children the true foundations first, and consider it impractical?


It was in parentheses, so I didn't see it due to Lisp-related emotional trauma.

Now that I know you're not in favor of it, though, I'm perversely inclined to wonder if a sufficiently skilled teacher might not introduce young tykes to ZFC; and what sort of mathematicians that might produce in another few decades.


"Here are some suggestive observations." With all due respect, those observations are all just non sequiturs. It means nothing. Kudos for the other references though.


People are born with a sense of numbers. Psychologists have confirmed this. Arithmetic is therefore our gateway into mathematics, in the sense that we build on the intuition we're born with and obtain greater intuition.

This is why it wouldn't surprise me if mental (or at least pencil-and-paper) arithmetic is correlated with understanding complicated mathematics. For one thing: to get to the complicated mathematics, you need algebra. A major part of algebra is arithmetic on polynomials. If I didn't understand arithmetic on integers, what hope have I on a more complicated entity?

Okay, forget polynomials. Maybe they're not important. How about logarithms? A good intuition of logarithms is that the log of a number tells you how many digits it has. Paired with a good intuition of multiplication (multiplying a 3-digit by a 2-digit results in at least a 4-digit) makes it feel obvious that log (MN) = log (M) + log (N).
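A quick sanity check of that digit intuition (my sketch):

    import math

    def digits(k: int) -> int:
        return math.floor(math.log10(k)) + 1

    # digits(m*n) is always digits(m) + digits(n), or one less;
    # log(MN) = log(M) + log(N) is the exact version of this observation.
    for m, n in [(123, 45), (999, 99), (7, 8)]:
        print(m, n, digits(m), digits(n), digits(m * n))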

Finally, how on earth can you understand calculus at all without realizing that division by a big number gives you a small number?

Using calculators certainly is one way to be "good at arithmetic" in the sense of always getting the correct answer without too much thought, but it does not enable arithmetic to be used as a building block for gaining intuition about more complicated things. And arithmetic is one of the best building blocks we have precisely because we're born with the beginnings of it.


> People are born with a sense of numbers. Psychologists have confirmed this.

https://secure.wikimedia.org/wikipedia/en/wiki/Subitizing


it wouldn't surprise me

At the heart of the modern empirical scientific revolution is the idea that you should seek confirmation from the real world for everything, because the world is much, much stranger than our intuitions lead us to believe. After all, for a long time it would have surprised people to hear that the earth goes around the Sun, that mountains actually move, that there are tiny germs you can't see that can kill you, etc.


Alexander Grothendieck is one of the most famous and influential mathematicians of the 20th century, having largely single-handedly invented modern algebraic geometry.

He was fond of working in generality rather than in specific examples. Once during a talk, he said "let p be a prime" and so on until a member of the audience asked him to pick a particular prime for the purposes of illustration.

"Okay, let p be 57."

Someone in the audience had to point out that 57 = 3 * 19.


Grothendieck "died" in the sense meant by Paul Erdos (stopped doing new mathematical research) at a surprisingly young age, while Israel Gelfand kept going into his nineties, not "dying" (ceasing to be a mathematician) until he "left" (died in the usual sense of the term).

http://www.math.rutgers.edu/~zeilberg/Opinion62.html

Gelfand was a mathematician who also cared deeply about mathematics pedagogy, and his textbooks are delightful.

http://www.amazon.com/Algebra-Israel-M-Gelfand/dp/0817636773

http://www.amazon.com/Functions-Graphs-Dover-Books-Mathemati...

http://www.amazon.com/Method-Coordinates-I-M-Gelfand/dp/0817...

http://www.amazon.com/Trigonometry-I-M-Gelfand/dp/0817639144...

Gelfand poses delightful problems that give students a workout in arithmetic (and CAN'T be done with a calculator) and that build conceptual understanding and interest in higher mathematics.


One of the most famous mathematicians of the 20th century? What about Einstein, Turing, Feynman, Russell, Erdos, von Neumann, Gödel?


It's just another skill; it can help and make certain things go much faster if you are good at mental math, but in the end the only thing that matters is your understanding of the operations you are doing, whether in your head or in your calculator.

I am unsure how the traditional calculus course goes, but I took a slightly altered one at a technical institute, and we spent quite a bit of time learning the methods computers use to calculate integrals. I think it is safe to say that as long as I understand these methods and remember them, I am at no disadvantage using a computer rather than solving integrals on paper.


Many of my friends doing PhDs in math are terrible at mental arithmetic. Higher-level mathematics does not have much to do with how fast you can calculate a tip in your head.


It's well-acknowledged that those who don't understand the fundamentals of a subject are hampered.

Whether it's computer use, programming, or mathematics.

Of course, the prof would do well to become fully familiar with a TI-89 or an HP-50 graphing calculator. It's ignorant to bash what you don't understand.

Since having a sense of math is required for being a mature citizen in modern life - think about all the numbers we are flooded with regarding public policy - it is no surprise that as numeracy goes down (helped by our good friend Calculator), mass voting wisdom goes down with it.

Speaking as a US citizen.


I don't think literacy tests should be required before voting. But I would support a required test in probability theory and cognitive bias recognition.


The fundamentals of mathematics are logical deduction, reason and abstraction. Not being able to do 134 * 23.


Unless you are a savant of sorts, being able to do 134 * 23 in your head requires all three.

For example, you note that 134 * 23 is really 134 * (20 + 3), which is just 2680 + (100 + 34) * 3, leading to 2680 + 300 + 102 = 3082. And that is just one way of doing it.

I do agree that arithmetic ability isn't all that great. But there is more logic and reasoning involved than you would think at first sight.


Funny, I feel the same way about IDEs as the author does about calculators. They have their place, but they can also be a crutch that contributes to mental weakness. I know devs who can't write code without Visual Studio prompting them every step of the way.


I once knew someone who held pretty much the same view about screen editors vs. line editors.

[Edit: I'm rather fond of crutches for my mental weaknesses].


You should at least explain why that is bad.


There is a contention in the subject article that "deep understanding of mathematical concepts is related to basic number sense" and that e.g. students who must use a calculator to compute 3 X 5 = 15 are lacking that.

I was implying my feeling that there is something similar going on in programming... if you are helpless without an IDE, then you are lacking some "basic sense" about your craft. Though I can cite no studies to back that up.


I think someone could understand object-oriented concepts and event-driven programming concepts quite well, either from reading about them or from being familiar with them in one language. That person could then start trying to program in Java/Eclipse or C#/VS and, with the aid of the IDE, effectively implement the program idea they have in their mind. Being able to program without an IDE is usually a function of being very familiar with the language's syntax, but syntax and programming concepts are different things.


Possibly a counterpoint, but I have no study to back this up. In the whole "deliberate practice" paradigm, they often say that immediacy of feedback strongly correlates with continued improvement. But maybe that presupposes you've developed the "basic sense" of your craft already.

For me: if you are helpless without an IDE, that probably says more about where you are as a programmer than it does about where you are going as a programmer.


Someone who can't develop outside of Visual Studio probably is missing some "basic sense" - but will that really do them any harm?


Absolutely. They'll never be able to experiment with a platform that Visual Studio doesn't support. Or with the many fine languages that don't have a fancy IDE.

They will artificially limit their career and opportunities based on a tool.


They will artificially limit their career

What languages provide more jobs opportunities (i.e. career options) than C/C++[1], C#, and Java?

[1] no, I'm not saying C and C++ are the same thing, but I haven't seen a C++ IDE which isn't also a C IDE.


Using only Windows is very limiting.


Eclipse is cross-platform. As is Mono, come to think of it.


I haven't memorized Taylor series for all known functions, so I occasionally have to reach for a calculator. I guess I am mentally weak and the sine of a broken education system.

Oh wait. We've arbitrarily chosen multiplication tables and fractions as "the thing" to memorize because that's what was done before calculators were ubiquitous. Thank God. Now I can go back to being smug and imposing my value system on people I've never met about when calculator use is appropriate. :-)


> the sine of a broken education system.

:D


Graphing the production function F(x)=ln(x) by entering the function into a graphics calculator and copying down the result just seems like cheating.

Well, yeah. But math courses that require a graphing calculator don't ask, "write down the graph for f(x) = ln(x)".

The problem here is two-way. This professor does not understand what the graphing calculators are capable of (which he admits), nor does he understand what the previous math courses were like that relied on them.


I've been in a few math courses that did not require a graphing calculator; we were required to memorize basic functions to help us solve problems more quickly, and to understand more of the intuition behind the math and what a graph is actually telling us.


My intro to calculus class forced us to memorize the standard values for sin and cos for the quarter-values of pi for the same reason. For those quizzes, we could not use a calculator, of course. But for the rest of the class we used a calculator.


I may be old school, but I believe that technology can only very mildly help with mathematical understanding. Math is built of bricks, one on top of another. And to understand advanced concepts, you need a deep and solid understanding of the lower layers. I think you only get this understanding by thinking hard, looking at examples and building your own.

Calculators are fine if you are dealing with numbers, but if you work in a quantitative field you often don’t deal with numbers directly: you deal with expressions involving variables. You manipulate the expressions, and at the very end you plug in numbers. A calculator is of no help, and I hope you did not use one when you learned how to deal with numbers. You better be able to manipulate fractions, and know the distributive & associative laws, and know when to complete the square, and the exp(ln(x)) trick, etc… What, you say I could use Mathematica? Of course, and I do -- when I know exactly what to compute. But generally I do not: I have these relations and I try to make sense of them. Moreover, when the final result is nice, concise and elegant, it means that I do not fully understand the problem. Examining the computation will help me understand what is happening: what part of the equations cancel with what other part, and do I understand why? Good luck doing that with Mathematica.

Another example: continuity. Nothing is simpler: you plot a few graphs, and the functions are discontinuous where there are jumps. Who needs this epsilon/delta gibberish?

- Functions that are discontinuous everywhere? Ok, I can wrap my head around that. Still no need for epsilon deltas.

- A function that is discontinuous on the rationals and continuous on the irrationals [1]? Good luck understanding that with your graphing calculator.

- And the topological definition of continuity [2]? This is a beautiful definition that unifies all the definitions of continuity you have seen, for all these functions of THIS space into THAT space. Well, thinking and well-chosen examples are going to help you understand the definition, not technology.

[1] http://en.wikipedia.org/wiki/Thomae%27s_function

[2] http://en.wikipedia.org/wiki/Continuous_function#Continuous_...
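(For reference, the function in [1] is Thomae's function:

    f(x) = 1/q  if x = p/q in lowest terms, q > 0
    f(x) = 0    if x is irrational

It is continuous at every irrational and discontinuous at every rational; no plot window will show you that, but the epsilon/delta definition explains it.)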


I must sincerely and strongly disagree with you. Technology can help elucidate mathematics beyond plug and chug. Teaching is one of the best ways of learning; a rough paraphrase of something I once read: an author said they write a new book every time they wish to learn something new.

Writing a computer program which embodies a mathematical concept is teaching the most retarded entity that is capable of more than just arithmetic. Certainly anything which is non-constructive falls outside of this, but in laying a motivation and providing a foundation that reduces the number of problems you need to do by, say, an order of magnitude? Technology is unmatched. Although the restriction to constructive* maths seems limiting, you would be surprised that both your [1] and [2] levels of abstraction can be tackled with such tools. http://www.cs.bham.ac.uk/~mhe/papers/entcs87.pdf, http://haskellformaths.blogspot.com/, http://blog.sigfpe.com/2006/08/algebraic-topology-in-haskell...

Do it this way, vary enough examples, and try to anticipate results, and you will develop the number sense that is required to be comfortable with maths. It worked for me.

* I am skeptical of the reality of arbitrary real numbers because I am skeptical of the reality of hypercomputation.


  Writing a computer program which embodies a mathematical
  concept is teaching the most retarded entity that is
  capable of more than just arithmetic.
I have to agree with this. Solving a problem - this isn't even limited to just math - with a computer program often means to generalize it. Generalization requires understanding. Thus, if you manage to generalize something, you have understood it.


I haven't been able to make up my mind about the role of technology in education (especially computers). On the one hand, given that he's a passionate scientist and educator, I take Clifford Stoll's opinion seriously when he says that he believes computers don't belong in classrooms. I think that the effect of large amounts of time with computers on developing brains is undeniable at this point. And, as an observer, I do not see that the greater and greater application of computers in classrooms is resulting in more intelligent students. (More informed, possibly.)

On the other hand, a very long time ago, I had a cousin who was visiting and was having trouble with some physics homework about the Doppler Effect. Being a proper nerd, I quickly whipped up a simple animation of an aircraft at varying speeds across the screen, with concentric circles at regular intervals. I thought I understood the Doppler Effect and sonic booms and all that; when I was done, and I saw it in action, I realized how poorly I'd actually understood what was going on.

So ... in the lower grades at least, I think there's a chance that computers could help with mathematical understanding. Younger students seem to have a lot of trouble with things like associativity and commutativity, which seems to lead to a longtime discomfort with things like the mental math described in the article. Maybe some kind of multimedia demonstration early on would help that.


Economists could do students a huge favor by switching supply and demand graphs so that the independent variable is on the X axis, like every other graph in the world.


> And because I've heard these calculators are programmable,...

I'm happy to suggest that a student get himself a programmable calculator and use it like this: http://www.reddit.com/r/learnmath/comments/jl9gz/evaluate_th...

Notice though that I'm assuming that the student is completely on top of place value and arithmetic. I assume that they can look at 0.17157, 0.16713, 0.16671, 0.16667, and recognise 1/6 emerging from the murk.
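Incidentally, the final "recognise the fraction" step can itself be automated; a sketch using Python's standard library (not the calculator in question):

    from fractions import Fraction

    # Recover a simple fraction "emerging from the murk" of decimals.
    print(Fraction(0.16671).limit_denominator(100))  # 1/6
    print(Fraction(0.16667).limit_denominator(100))  # 1/6

Though of course the pedagogical point is that the student should see it coming before the machine does.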


Forget about "Recent research", look in the history books!

"However, this switching from counting rods to abacus to gain speed in calculation was at a high cost, causing the stagnation and decline of Chinese mathematics....In Ming dynasty, mathematicians were fascinated with perfecting algorithms for abacus, many mathematical works devoted to abacus mathematics appeared in this period, at the expense of new ideas creation."

http://en.wikipedia.org/wiki/Chinese_mathematics

Whee, it's repeating!


I'm going to speak from the role of a recent graduate in software engineering, with a strong focus on math...

1) Visualization of mathematical concepts is a skill that many students simply lack. When someone asks me to crunch out 1.5 * 3.67 mentally, I don't see much of an issue because it's simple for me to break it down, go 3/2 * 11/3, and work from there. Many people don't make that connection, and the root of that issue exists beyond the whole calculator discussion, in my honest opinion.

2) The first reply on that page is a very important one: students with a strong mathematical foundation are found in the natural sciences. It is rare to find students practicing strong math skills outside of these faculties. It's the same argument we saw with engineers lacking English language skills; the students who excel in a field are drawn that way, so it's only natural to have some deficiencies.

3) Profs from a different generation are part of the problem, in that they're slow to adopt a technology-centric system. My honest opinion is that formula sheets should be allowed in all exams with the criteria that: i) they're hand-written, ii) they're of limited length (1 standard 8.5x11 sheet of paper, single-sided), and iii) no photocopies. This forces students to look at the information they've been given and make rational decisions about which formulas are important and which ones can be memorized, and it enforces studying. My electrostatics course forced memorization on us and I spent more time shoving formulas into my noggin than applying concepts to solve actual problems. As soon as the final was complete those formulas leaked from my brain onto the floor and are now forgotten. I honestly wish I had been allowed more time to solve problems rather than attempt memorization.

Basically, what I mean when I say that "profs are slow to adapt" is that we live in a technology-centered world now. Information such as formulas or constants is accessible almost instantly from Google and the like, so instead of using valuable time to have students memorize these things, time would be better spent allowing "cheat-sheets" and requiring a greater amount of work or knowledge per semester. I'd gladly trade the time spent memorizing different formulas for magnetic flux in favor of another two or three chapters from the book, going beyond the "ideal scenario" cases we were given.


Economics demands a strong background in math and any undergrad program worth its salt will require at least multivariate calculus (if not analysis), linear algebra and a fairly robust course in statistics. Beyond this, a proper understanding of metric spaces, fixed point theorems and rigorous stats are all useful and almost essential tools. All of these courses should be proof based. I've said it before and I'll say it again: If it's not proof based, it's not mathematics. It's computation (and I don't mean computer science).

Unfortunately, even programs at many "top" universities sometimes skip these requirements. There are very few undergrad economics degrees worth the paper that they are written on.

As to the theory that arithmetic is any determinant of success in real math, I have only anecdotal evidence that this is false. Having gone to an undergrad university with a top 5 math department, I can assure you that people who are some of the most significant contributors to their fields sometimes struggle at the chalk board when it comes to "basic math." I think it is not clear to most people that arithmetic is not math. It is not the "base" of math, and skill in this area does not provide a "strong foundation" for math. Math is the study of quantity, structure, space and change (to borrow a leaf from wikipedia) among other things. It is the most fundamental of the sciences in a metaphysical sense (it may be argued that philosophy is the real root, but I would say that at their most fundamental, there is significant convergence between math and philosophy). Arithmetic is not related in any form to this.


I used to work part time as IT support for a public school system back in college ('07), and it shocked me to see every single student in a 6th grade class pull out TI-89s (or better!) to handle logs and exponentials. I'm not talking about solving them to get a decimal number; no, this was to do things like solve for x in: e^3 * e^4 = e^x. Yes, in the real world we all rely on calculators to some extent, but if you can't piece together basic math you'll struggle to solve unfamiliar problems.


I think that learning to do mental computations vs. relying on technology is a trade-off. As hermannj314 pointed out [1], there's a question of whether memorizing multiplication tables/trigonometric tables is a matter of fundamentals, or just how we used to have to live before calculators in order to compute anything. Well, I think it's both, in some sense.

The human brain is good at some things (pattern matching, quick look-up, aka intuition), and terribly bad at others (computation, explicitly running algorithms). I do believe that we should augment our mental skills whenever possible. We're already thinking using our environment [2], so unless one rejects pen and paper, or even looking at the things around oneself, one is using the environment to increase one's mental powers. So why stop at handwriting? Why not use slide rules, calculators and Google?

There's the other side of the trade-off, though. Learning tables of numbers, or multiplication tricks, or doing lots of exercises with plotting functions by hand is the way we leverage the power of our brains. Our minds seem to be good at caching stuff and quickly looking things up, so the more such mental tools we incorporate, the more power we can get out of our thought process.

In a very simplifying way:

Our cognitive powers = internal 'skills' in thinking * external tools we have.

The best strategy is to properly invest on both sides of the multiplication. Boosting only one side is suboptimal.

[1] - http://news.ycombinator.com/item?id=2924937

[2] - http://consc.net/papers/extended.html


Mathematics is a system of learning and reasoning; arithmetic is the process of actually computing an answer. While it might be nice to be able to compute compound interest in your head, no one is seriously proposing that anyone do multivariate linear regressions with pen and paper; we simply have much better tools for computing the answer.

In a modern society with large amounts of non-human computational power it's far more economical for humans to focus more on math and less on arithmetic.


I am baffled by the idea that today's students can't do basic math problems without resorting to calculators. Big tedious problems, I can understand.

I am 20 years old.


Just curious: where does 'basic math' end for you? 13 * 7? 43 * 76? 123 * 42? sin(PI/3)? 12534 + 3 * 26327?

Which ones do you need paper for? Which ones would you use paper for if it were readily available?


I'm sure this will vary by individual, but for me:

70+21

(40 * 25) * 3 + 40 + 3 * 75 + 3

(123 * 40) + 123 * 2

paper and/or calculator for the last

sin(PI/3) is an interesting case. I don't deal with trigonometry very often. If the problem is in the context of a project where I will be using trigonometry frequently, I'll go refresh my memory about the easier-to-retain points in addition to what I already remember: sin(N * PI) = 0, sin((2N - 1/2) * PI) = -1 and sin((2N + 1/2) * PI) = 1 for integer N; given a unit circle and an angle between the line segment [(0,0), (1,0)] and another line segment reaching the circumference of that circle, sin(angle) = y and cos(angle) = x of that point (x,y) on the circle (this latter used as a mental-vision sanity check on any answer proposed). If it's truly a one-off, I'll just use the calculator.


For me basic maths ends with

    (3.1^2 - 3^2)/0.1 = (9.61-9)/0.1 = 0.61/0.1 = 6.1

    (3.01^2 - 3^2)/0.01 = (9.0601-9)/0.01 = 0.0601/0.01 = 6.01

    (3.001^2 - 3^2)/0.001 = (9.006001-9)/0.001 = 0.006001/0.001 = 6.001
When you get to that point you are ready to start calculus and see that d/dx x^2 = 2x and that the derivative of x squared at 3 is 6.

There is a smooth transition from arithmetic to calculus: the laws of calculus have to be what they are, or else approximations such as delta-x = 0.001 would not approximate them.
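The same progression in a few lines of code (my sketch of the arithmetic above):

    # Difference quotients for f(x) = x^2 at x = 3 shrink toward the
    # derivative 2*3 = 6, exactly as in the hand calculations above.
    def f(x):
        return x * x

    for h in [0.1, 0.01, 0.001]:
        print(h, (f(3 + h) - f(3)) / h)  # 6.1, 6.01, 6.001 (up to float noise)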


I would need paper for all but the first, and could not solve the 4th without a calculator. Not sure what the norm is, here's my data point.


More out of curiosity: are you saying you're incapable of solving the last one without paper, or just that it's really not worth the time? Any of those example questions I could do without paper (I'm 23, for reference), but I certainly wouldn't attempt the last one without it unless someone was challenging me, since it'd take more time than it's worth.


With the problem in front of me, I would not think doing the last one on paper would speed up things. On paper, it would feel like

    Load X
    Triple
    Store Temp
    Load Temp
    Load Y
    Add
I would optimize that store/load away. If, on the other hand, someone told me the problem, I probably would not remember Y by the time I needed it (seven plus or minus two is about two, apparently)


pi/3 is 60°, and the sine of that is √3 / 2. But yes, that's not something one calculates in one's head; that's memorization. And working out √3 / 2 into decimals also takes a calculator, though I can approximate by knowing the first two significant digits of √3: 1.7 / 2 = 0.85.

One can memorize trig tables like multiplication tables, and I do find that useful from time to time. I remember sin, cos, and tan for 0°, 30°, 45°, 60°, and 90°, and how to flip the signs to get all four quadrants from 0° to 360°.
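(A side note: that table has a pattern that makes it nearly free to memorize: sin = √0/2, √1/2, √2/2, √3/2, √4/2 at 0°, 30°, 45°, 60°, 90°, with cos running the same sequence in reverse and tan = sin/cos.)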


"Calculators make fractions obsolete."

This is just nonsense. The digits after the dot are just fractions with powers of 10 in the denominator, and they are much less useful if you actually need to do something in real life with your result, like cutting a pie, to use a classic fractions example.



