Calculus is designed as a gateway for all of our quantitative students, our future engineers and physicists as well as our computer scientists, and so it's already forced to wear many hats. Changing it isn't something we can do lightly, since it's integral to everything from standardized testing and college admissions (for high school students) to nearly every quantitative track (for college students).
What good does splicing Big-Oh into the process do, when computer science enrollment is in steady decline, and understanding Big-Oh is pointless without first understanding the costs of polynomial algorithms? It would be nice if our calculus students knew Lebesgue integration too, but I would offer that the calculus sequence is not the right time. If the kids are in college, they can take Intro. to Comp. Sci., and if the kids are in high school, they probably shouldn't worry about Big-Oh at all.
The biggest revision high school math needs is (1) a course which instills students with basic statistical fluency, so they can read the newspaper and endure advertisements with a critical mind, and (2) a course for advanced students which teaches discrete math and basic non-geometrical proofs, for our increasingly discrete world.
In my experience, Calculus in the university is solely used as weeder material, designed to bludgeon students with the notion that mathematics is dull, tedious, and a necessary evil. Nothing did as much to destroy the joy I derived from math as did my college Calculus courses.
The current pedagogy of Calculus is atrocious, and any method which helps impart the beauty of it without battering the souls of the students is worth pursuing. Feynman's great strength was his disregard for orthodoxy in the search for greater truth. Big-O notation is for more than just profiling the costs of computational algorithms. It is for reasoning about inexact quantities and seeing the greater patterns in the relationships between numbers and functions. Is using Big-O as a method of teaching Calculus better than the current process? Honestly, almost anything would be an improvement.
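For what it's worth, Big-O is already native to calculus: in a Taylor expansion, the O(...) term is exactly that kind of "inexact but controlled" quantity. A quick sketch with sympy (my own example, not anything from the article):

    from sympy import symbols, sin, series

    x = symbols('x')
    # The O(x**5) term is shorthand for "everything left over shrinks at least as fast as x^5 near 0".
    print(series(sin(x), x, 0, 5))   # x - x**3/6 + O(x**5)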
Math and Calculus are both beautiful and practical. But the current way we teach them completely loses those notions in favor of rote memorization and techniques without context.
I couldn't agree more. People forget that the calculus of Newton and Leibniz was based on infinitesimals, not on "rigorous" epsilon-delta proofs and the concept of a limit. Calculus was discovered by intuition and experimentation (think of Archimedes' method of exhaustion for approximating pi), not by following the implications of "what happens if I arbitrarily define a limit?"
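For fun, the exhaustion idea fits in a few lines of Python (my own sketch, obviously not how Archimedes wrote it): keep doubling the sides of an inscribed polygon, and the half-perimeter squeezes in on pi.

    import math

    def pi_by_exhaustion(doublings=10):
        n = 6                    # start with an inscribed hexagon; in a unit circle its side length is 1
        side = 1.0
        for _ in range(doublings):
            # side length of the 2n-gon from the n-gon (half-angle / chord-halving identity)
            side = math.sqrt(2 - 2 * math.sqrt(1 - (side / 2) ** 2))
            n *= 2
        return n * side / 2      # half the perimeter approximates pi

    print(pi_by_exhaustion())    # ~3.14159...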
Current calculus education is like teaching kids about color theory, photons, how the eye works without just letting them fingerpaint. Those details will come, but the vast majority of people just remember calculus as a painful memorization exercise.
As an aside, e is the same way. Most people are taught it as an abstract limit, lim (n -> inf) of (1 + 1/n)^n, without realizing it's actually about growth (and that's how it was discovered).
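To make the growth reading concrete, here's a tiny sketch (my own numbers, not from the article): (1 + 1/n)^n is just 100% interest compounded n times, and it creeps up on e as the compounding gets finer.

    import math

    for n in (1, 12, 365, 10_000, 1_000_000):
        print(n, (1 + 1/n) ** n)   # 2.0, 2.613..., 2.7145..., 2.7181..., 2.7182...
    print("e =", math.e)           # 2.718281828...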
Excellent article. I noticed recursively growing series all seemed to have the same ratio, but I didn't realize this was related to e. However, I did see e show up all over the place, which I didn't understand. Anyways, it's obvious I'll need to read more of your articles. I'd much prefer to understand the intuition behind mathematics than just learn the formulae so I can "get things done."
Regarding calc, I had a strange experience where I kept insisting calculus only logically works if it is based on infinitesimals, which I thought was obvious, but the people I was talking with insisted that it was based on limits. They couldn't understand that an infinitely small value has to be more than zero if an infinity of them is to add up to anything more than zero.
In general, my math education seems to have been particularly bad at dealing with infinity.
If d is an infinitesimal value, I'm not sure what d * infinity would be. If d is defined as 1 / infinity, then d * infinity would equal 1, but that doesn't seem right.
This textbook teaches calculus using the infinitesimal approach. To do rigorous infinitesimal calculus, you have to define the hyperreal number system.
The tricky thing is that infinity isn't a number as we normally think of numbers. There are different kinds of infinity, so infinity / infinity does not necessarily equal 1. Depending on the infinities involved, it could come out to zero, any finite number, or infinity.
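To put some numbers on the "it depends" (my own sketch, in the limit picture rather than the hyperreal one): whether an infinitesimal times an infinity comes out to zero, a finite number, or infinity depends on how fast each one shrinks or grows.

    from sympy import symbols, limit, oo

    n = symbols('n', positive=True)

    print(limit((1/n) * n,    n, oo))   # 1  -- they exactly cancel
    print(limit((1/n) * n**2, n, oo))   # oo -- the "infinity" wins
    print(limit((1/n**2) * n, n, oo))   # 0  -- the "infinitesimal" wins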
To get this intuition, think of the integers. There are obviously an infinite number of them. Now think of the rationals. There are now an infinite number of numbers between 1 and 2, so there are more rational numbers than there are integers; in fact infinitely more.
I believe set theory is the only area of mathematics where different orders of infinity actually mean anything.
So, in set theory, cardinality is a measure of the size of a set. The cardinality of the infinite set of natural numbers is aleph-0, and an infinite set has cardinality aleph-0 if it can be put in one-to-one correspondence with the naturals (http://en.wikipedia.org/wiki/Bijection). Counterintuitively, this includes the rational numbers as well (http://en.wikipedia.org/wiki/Cantor%27s_diagonal_argument).
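To make the counterintuitive part concrete, here's a toy enumeration (my own sketch, not from the linked pages): walk the grid of numerator/denominator pairs diagonal by diagonal, and every positive rational gets a natural-number index, which is exactly the one-to-one correspondence.

    from fractions import Fraction

    def rationals():
        seen = set()
        s = 2                       # s = numerator + denominator; one diagonal at a time
        while True:
            for p in range(1, s):
                q = s - p
                f = Fraction(p, q)
                if f not in seen:   # skip 2/4 once 1/2 has already appeared, etc.
                    seen.add(f)
                    yield f
            s += 1

    gen = rationals()
    print([next(gen) for _ in range(10)])
    # [Fraction(1, 1), Fraction(1, 2), Fraction(2, 1), Fraction(1, 3), Fraction(3, 1), ...]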
The real numbers can't be put into one-to-one correspondence with the rational numbers. So the infinity of the reals truly is bigger than the infinity of the rationals.
So you don't think mathematics is unified, i.e. you think something true in one area may not be true in another? I don't think many mathematicians believe that, and it isn't clear to me that "infinity" refers to two different things in the two different realms (the other possibility).
I couldn't disagree more. Just because a method "works" does not mean a rigorous approach isn't required. It seems you are also forgetting that most science started as "intuition," but at some point a rigorous approach was required to actually turn it into science.
From the field of mathematics, set theory comes to mind.
You forget that you are not teaching it to kids who are supposed to have fun with it - you are teaching it to people who will presumably use that info in a few years, and most won't have another chance to learn it again.
I also don't know exactly how it is taught in the US, but here (Israel) there are calculus courses which deal with the practical side for physics students and such, and there are courses which deal with rigorous theory and proofs for math students and the like.
I have to add I'm taking my third calculus course this semester, and so far the second one was my favorite subject (Math major).
I think we have a false dichotomy here. What we call 'intuition' and 'rigor' are not necessarily opposed. I think the problem with math classes is not any lack or excess of rigor, but that they focus too much on details instead of the general principles from which those details are derived. Complicated formulas and procedures are presented for students to memorize without any understanding of their origin. It's mostly 'how,' a little 'what,' and never 'why.'
Oh, on that part I agree in principle. Maybe these things are taught differently here, but there is a strong enough focus on the "why" part for me to feel comfortable with it...
Quite right. I've been teaching calculus for several years, and for the most part it is a soul-sucking experience. Big chunks of it are utterly pointless. The main goal of the class is attrition.
Seriously, integrating 1/(1+x^2)? Surfaces of revolution? Trig/hyperbolic trig substitution? Calc is full of stupid tricks that buy you closed form solutions.
Adding Big-O notation to calc might help a little, especially when you get to series (with Big-O you can actually say how fast a series converges or diverges). It's certainly a useful topic, and I can think of a dozen topics from calc I'd love to replace with it.
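For example (my own toy numbers, not anything from the course): the harmonic series diverges, and Big-O tells you exactly how slowly, since the partial sums are ln(n) + O(1).

    import math

    def harmonic(n):
        return sum(1.0 / k for k in range(1, n + 1))

    for n in (10, 1_000, 100_000):
        h = harmonic(n)
        # h - ln(n) settles near 0.5772 (the Euler-Mascheroni constant), so H_n = ln(n) + O(1)
        print(n, round(h, 4), round(h - math.log(n), 4))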
Eliminating calculus and designing a basic math class from scratch would probably help a lot more.
I agree with this. The same goes for physics. Both physics and calculus, as taught, look like a humongous legal code created by a bureaucracy, intended to solve 18th-century problems. Creating new versions of these from scratch is a good idea. Why not eliminate calculus altogether and teach computer science instead, including programming?
Calculus is a fundamental topic, even in computer science. Trig substitution is the bathwater. Curvature, tangent, and normal are the baby, and they can be handy when you need them.
They teach the material well, but on assignments and exams they ask questions that force you to apply what you learned with some sort of twist. I always thought I knew the material inside out, and then got crushed on exams compared to how I thought I had done.
edit: Actually, 137/138 (which you will take first and second term) have recently been nerfed since they eliminated OAC and stopped teaching much calculus in high school.
I will agree with you on the point that mathematics is beautiful. I had always appreciated mathematics for its logic: a proof is either right (it proves) or wrong (it does not). However, it was not until I was introduced to double and triple integration that I realized the raw power and beauty of calculus.
I start working towards my computer science and math degrees this summer, and my registered courses include discrete mathematics and probability. I'm confident I'll be learning some simple ideas that can do very powerful things.
It's a requirement for the CS portion of my degree. I already hold a finance degree from the University of Florida (2004), but the credit crunch sidelined me. Fortunately, I had the option of pursuing another two majors while still finishing in less than two years.
I really wish that a lot of the non-computer-science people I have to work with knew at least a smattering of what it means for something to be O(n^2) versus O(n log(n)). In fact, I sometimes wish the programmers I've had to work with understood it!
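A toy example of the kind of thing I mean (my own sketch): "does this list contain a duplicate?" done the O(n^2) way versus the O(n log n) way. The second finishes in a blink where the first visibly chews.

    import random, time

    def has_dup_quadratic(xs):          # O(n^2): compare every pair
        for i in range(len(xs)):
            for j in range(i + 1, len(xs)):
                if xs[i] == xs[j]:
                    return True
        return False

    def has_dup_sorted(xs):             # O(n log n): sort, then scan neighbours
        ys = sorted(xs)
        return any(a == b for a, b in zip(ys, ys[1:]))

    xs = random.sample(range(10**9), 5_000)   # 5k distinct values, i.e. the worst case for both
    for f in (has_dup_sorted, has_dup_quadratic):
        start = time.perf_counter()
        f(xs)
        print(f.__name__, round(time.perf_counter() - start, 4), "s")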
I was "interviewing" with the head developer at a media startup who thought he knew his stuff and after handing him his hat on a number of things, he tried one last "zinger" question on O(). Turns out that neither he nor the head of engineering knew the difference between O(n^2) and O(n log n) but they were so confident that they did. Unbelievable.