Why I Love Computer Science (caltech.edu)
65 points by Jebdm on Jan 10, 2009 | 25 comments



I think this guy has mathematics totally wrong. Maths just doesn't work today the way it did in Newton's time, and even back then people weren't satisfied with Newton's proofs, but they lacked an alternative so they had to use them anyway. If the author had his way, we would have refused to accept Newtonian physics for two centuries! Could you imagine the damage that would have done?

> [A]ny realistic mathematical proof will leave out a great many steps, which are considered to be the "required background knowledge"

Computer science papers are different how? Computer science != programs!

> [T]he inference rules are not always specified accurately, or are not believable if they are. This is why you will sometimes read a mathematical proof and say "I understand the reasoning, but I just don't buy it"

I think this guy just isn't too hot at mathematics. Omitting a trivial step (or domain specific knowledge) is not a lack of rigour, but a courtesy to the reader. The details can always be filled in cleanly. If you ever see a modern mathematical proof that is accepted by all mathematicians, but you don't "buy it", then I can assure you that it's you that's at fault, not the proof.

Oh, and computer science papers never leave out trivial steps or assume domain knowledge? Not the papers I've read. Once again, CS != programming!

> This is reminiscent of Whitehead and Russell's Principia Mathematica where hundreds of pages pass until the authors can prove that 1 + 1 = 2.

Surely Principia is a reductio ad absurdum of the argument that everyone should always spell out all the steps!


> I think this guy just isn't too hot at mathematics. Omitting a trivial step (or domain specific knowledge) is not a lack of rigour, but a courtesy to the reader. The details can always be filled in cleanly.

A very astute comment. In principle, mathematical proofs are supposed to be every bit as completely described as computer programs, but perhaps not as explicitly expressed. Edsger Dijkstra used to refer to mathematical proofs as a model for computer programming.

http://en.wikipedia.org/wiki/Program_derivation
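For anyone unfamiliar with the idea, here is a minimal sketch of derivation in that style (a toy example of my own, not from the link): the loop is constructed from its invariant, rather than written first and verified afterwards.

    # Toy program derivation, Dijkstra-style (illustrative sketch).
    # Postcondition: s == sum(a[:n]) where n == len(a)
    # Invariant:     s == sum(a[:i]) and 0 <= i <= n
    def summation(a):
        i, s = 0, 0                 # establishes the invariant (sum of empty prefix)
        while i < len(a):           # guard: invariant holds, postcondition not yet
            s, i = s + a[i], i + 1  # maintains the invariant while making progress
        return s                    # i == len(a), so the invariant gives the postcondition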


I'm not a mathematician, but I can imagine some sort of "library of derivations" where, in order to prove something based on arithmetic, say, you include (the language directive, not the verb) the specification of arithmetic derived in Principia Mathematica. Then the proving system follows your logic to its logic, and then to the logic it relied on, and so on, until everything is completely explicit. Basically, formal execution of bibliographies.
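Modern proof assistants do more or less exactly this. A minimal sketch in Lean 4 (assuming a setup with the Mathlib library installed): the import plays the role of including Principia's hundreds of pages.

    -- Importing Mathlib is the "include" step: it brings a formally
    -- checked development of arithmetic into scope.
    import Mathlib.Tactic

    -- With that library available, the checker can trace this one-line
    -- claim all the way down to the axioms, completely explicitly.
    example : 1 + 1 = 2 := by norm_num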


Dijkstra also used to talk about how computing science was a particularly difficult branch of mathematics, basically for the same reason that Vanier is asserting; he recommended that only particularly good mathematicians should switch to computing science, leaving the mediocre mathematicians to what they were already doing.


A mathematical proof is a series of steps, each of which the listener is confident they could prove to be correct. The more difficult or surprising the result, the more steps you will have to add to convince the listener.

Similarly, when you are writing a program you don't know the exact inner workings of every function you call. You should, however, be confident that they work and that you could understand them if you needed to. The source for a computer program omits plenty of contextual information that's needed to make it meaningful. The only completely unambiguous interpretation of the program is the machine code, which is analogous to the proofs in the Principia Mathematica. It's nice to know how it works in principle, but you don't want to work with it unless you really have to.
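To make the analogy concrete, a toy illustration in Python (my example, not from the comment above):

    # We call sorted() confident that it works, without ever reading its
    # source, just as a proof cites a known lemma without reproving it.
    data = [3, 1, 2]
    print(sorted(data))  # [1, 2, 3] -- Timsort's inner workings are the
                         # omitted "background knowledge"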


> Oh, and computer science papers never leave out trivial steps or assume domain knowledge?

That's irrelevant. The guy's argument is that computer programs don't leave out information: "If all this information isn't available in some form, the program simply will not work, as the interpreter/compiler will not know what to do with the program. This forces a certain intellectual honesty on the process of executing a program; nothing can be left unspecified."

Clearly, computer science papers are written by humans and will typically lack rigour, which has both advantages and disadvantages. Human communication can contain bullshit, but you can't bullshit a computer.
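The quoted point is easy to demonstrate with a hypothetical Python snippet (names invented for illustration):

    # Nothing can be left unspecified: a reader of a proof might silently
    # supply a missing definition, but the interpreter will not.
    def area(radius):
        return pi * radius ** 2  # 'pi' is never defined or imported

    area(2.0)  # raises NameError: name 'pi' is not defined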


His conclusion is that computer science is more rigorous than mathematics. His argument is that programs written for computers are more explicit than proofs written for humans. The conclusion and the argument don't match up at all. Computer science, as a subject, is no more rigorous than mathematics.


Quick note: many of you are saying that this guy is totally out of touch with how math works, and that formal derivation of proofs is just fantasy. It's true that that's an extreme view, but it is in fact one that is held by some top mathematicians, even if a small minority. There are people who have built up research programs out of trying to make it a reality. See, for instance, Doron Zeilberger, who apparently describes himself as an "ultrafinitist":

http://en.wikipedia.org/wiki/Doron_Zeilberger

http://www.math.rutgers.edu/~zeilberg/OPINIONS.html

I'm not saying I agree with that, just that it's not lunacy. Carry on.


I agree. The original post is echoing many of the views expressed at more length by Sussman and Wisdom in their book "Structure and Interpretation of Classical Mechanics." There they showed that a computational expression of classical mechanics was much more explicit and rigorous than the standard mathematical or physical treatments. They also have a similar long paper on differential geometry from a computational perspective: http://groups.csail.mit.edu/mac/users/wisdom/AIM-2005-003.pd...


Their point in SICM was that traditional notation was not as ambiguous as their computational notation. That's because traditional notation is designed to be convenient rather than explicit. They are right, but this does not mean that physics lacks rigour. More explicit notations were always available and physicists, being more intelligent than computers, were capable of resolving the ambiguities to reveal the fully rigorous structure underneath.

There's a difference between using a convenient notation and lacking rigour. If you are always capable of discerning the true formal equations behind a convenient notation then everything is OK. If you can't, or people can't agree on what the formal meaning should be, then there is indeed a problem. The fact that the authors could derive a computational notation that no physicist would disagree with is proof that a lack of rigour never existed in the first place.
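For readers who haven't seen the book, the kind of ambiguity at issue already shows up in the Euler-Lagrange equations (a standard example written here in LaTeX; the second form is SICM-style, not quoted verbatim from the book):

    % Traditional form: \dot{q} silently plays two roles, as an independent
    % argument slot of L and as the time derivative of the path being varied.
    \frac{d}{dt} \frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0

    % SICM-style functional form: \partial_1, \partial_2 name argument slots
    % of L, and \Gamma[q](t) = (t, q(t), Dq(t)) lifts the path q.
    D(\partial_2 L \circ \Gamma[q]) - \partial_1 L \circ \Gamma[q] = 0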


You write that "More explicit notations were always available and physicists, being more intelligent than computers, were capable of resolving the ambiguities to reveal the fully rigorous structure underneath."

I doubt that's true. For example, here's what Piet Hut, now a professor of physics at the Institute for Advanced Study in Princeton, writes about his experiences with classical mechanics as an undergraduate in his review of SICM, available at http://www.ids.ias.edu/~piet/publ/other/sicm.html :

"Soon I went through the library in search of books on the variational principle in classical mechanics. I found several heavy tomes, borrowed them all, and started on the one that looked most attractive. Alas, it didn't take long for me to realize that there was quite a bit of hand-waving involved. There was no clear definition of the procedure used for computing path integrals, let alone for the operations of differentiating them in various ways, by using partial derivatives and/or using an ordinary derivative along a particular path. And when and why the end points of the various paths had to be considered fixed or open to variation also was unclear, contributing to the overall confusion.

Working through the canned exercises was not very difficult, and from an instrumental point of view, my book was quite clear, as long as the reader would stick to simple examples. But the ambiguity of the presentation frustrated me, and I started scanning through other, even more detailed books. Alas, nowhere did I find the clarity that I desired, and after a few months I simply gave up. Like generations of students before me, I reluctantly accepted the dictum that 'you should not try to understand quantum mechanics, since that will lead you astray for doing physics', and going even further, I also gave up trying to really understand classical mechanics! Psychological defense mechanisms turned my bitter sense of disappointment into a dull sense of disenchantment."


I actually agree with Sussman and Wisdom (and the author of this article) that physics is taught in a very sloppy manner, and that the computational notation has huge advantages. But that's not the point I (or they) were making. Lagrangian and Hamiltonian mechanics were perfectly rigorous before Sussman and Wisdom came along, even if they weren't taught very well. If mechanics had really been deep-down sloppy, then Sussman and Wisdom would now be heralded alongside Newton and Einstein for their incredible contribution to science! Rather than just creating a new notation, they would have advanced science incalculably, turning vague and untestable theories into hard empirical science for the first time!

But that's obviously not what happened. Physicists were always capable of making exact calculations and predictions from Lagrangian mechanics. SICM contains no new theories, theorems or proofs, just a more explicit way of representing old ones. Nothing new was discovered and no old notions were clarified. Instead, they just found a better way of teaching mechanics, one that didn't rely on the implicit knowledge that masters of the subject already possessed. They were only capable of doing this because Lagrangian and Hamiltonian mechanics were well-defined in the first place. If they hadn't been then they would have had to advance a new theory of mechanics to replace them, rather than just re-presenting an old one.


Do you mean, "was not as unambiguous"?


Indeed I do. Thanks.


It's not the claim that mathematics sometimes lacks rigour that's bothering anyone. It's the claim that computer science, as a subject, is somehow more rigorous than mathematics. It's the false equivalence of programs with computer science that's wrong.


This disdain for fields without rigor reminds me of the general attitude towards people using Excel macros or VB. I understand it's fun, and I used to feel the same way. But at some point you just have to stop playing with your tools and get some real work done.


A compiled computer program is certainly a rigorous description of something. God knows what; certainly the programmer doesn't.

This is just funny:

"I think computer science has a tremendous amount to offer the fields of logic and mathematics. Specifically, I think that requiring all formulas to be executable by a finite, deterministic system (a computer program) could lead to a great increase in the level of rigor of these fields, would make it easier for students to learn existing results, would make it easier for practitioners to develop new results, and might possibly suggest whole new approaches...*

Also, he keeps referring to computers as finite. I can only assume he means in a physical sense: that there is a non-infinite number of atoms making up his CPU.

I suppose I should critique his argument seriously, but it's just too far away from any actual reality.


   > Also, he keeps referring to computers as finite. I can only
   > assume he means in a physical sense: that there is a
   > non-infinite number of atoms making up his CPU.
No, this is meant as 'having a finite number of states' (dictated by finite memory, a finite number of registers, and so on), as opposed to, for example, a Turing machine, whose unbounded tape gives it an infinite number of possible configurations.
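To put a number on it (a standard back-of-the-envelope calculation, not from the thread): a machine with b bits of storage has at most

    \text{states} \le 2^{b}

so even a computer with 1 GB of RAM (b = 2^33 bits) has at most 2^(2^33) configurations: astronomically many, but still finite, which is exactly what separates it from a Turing machine's unbounded tape.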


What with his comment about being able to eventually understand sociology using physics, I think this essay is easily 3 Cuil.


What does that mean?


The dichotomy he puts forward - algorithms (CS) vs. proofs (math) - is totally made up.


The dichotomy of CS vs Math is tenuous as well.



So much fail.


"I really loathe vagueness in science (or, for that matter, in philosophy and argumentation in general)."

Is he a computer: true or false, no other answer? I think he's fooling himself. What happens if quantum computing becomes common and a result can be 1, 0, or 'maybe'?



