Hacker News

I'm inclined to agree that Russell was overrated. His most influential work was a mistake.

> he is still very much ignored by the tradition he created

Really? When I studied philosophy[0], reading at least some Frege was required for Philosophy of Logic and Philosophy of Language. Frege wasn't treated as obscure or unimportant; quite the reverse. Have things changed?

[0] University of London, early '80s. Philosophical tastes vary rather widely from one institution to another.




> His most influential work was a mistake.

Why? I think you're referring to Principia, which is a predecessor to Coq and Lean. While Coq and Lean don't use Principia's formalism, they do try to tackle the same problem. What did Principia fail at?

[edit]

Or are you referring to the programme of using formal logic to understand language and reliable everyday reasoning?


Principia was destroyed by Gödel, within about two years of its publication. That's what I'm referring to.


No, it wasn't. I'm sometimes not sure if philosophers understand Goedel's incompleteness theorem. What do you base your claim on?

GIT is more a statement about the limitations of the proof-driven approach to mathematics than it is about any formalisation of mathematics like Principia's. It shows that whatever assumptions proofs are based on cannot be shown consistent except by assuming something even stronger. Principia is affected to exactly the same degree as all proof-driven mathematics, and no more.

I'm not sure whether there has ever been a satisfactory answer to the 2nd GIT among the mathematical establishment. I guess mathematicians just assume that everything is consistent, and go about proving theorems as normal. It seems that almost no mathematician understands - let alone cares about - these foundational concerns.

Principia is the predecessor to modern software like Lean, Coq and Agda. Nothing got destroyed.


> I'm sometimes not sure if philosophers understand Goedel's incompleteness theorem

I'm usually not sure if I understand it (conversely: I'm usually sure that I don't understand it). Also: I'm not a philosopher - I'm an old man who did a philosophy BA 40 years ago. I can't unpack this stuff.

I see a consonance between Gödel's theorems, Heisenberg's principle, Turing's work on computability, and some of the stuff in Frege and Wittgenstein. Maybe even Cantor. I wish I could say this clearly enough for someone to knock it down. But I can't do the maths; I'm entirely dependent on the explanations of others.


[I wrote this yesterday, but didn't post it]

Let's assume that mathematics is an activity governed by the following rules. You use two languages: Predicate Logic and Natural Deduction. You use Predicate Logic to write claims like "There are infinitely many prime numbers", and you use Natural Deduction to construct proofs of such claims. Both languages are formal, and you can even use a computer to verify that you're using them correctly.
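As a sketch of what this looks like in practice, here's the infinitude-of-primes claim written in Lean 4 with Mathlib; `Nat.exists_infinite_primes` is a real Mathlib lemma, though the exact import path may differ between Mathlib versions.

```lean
-- The claim "there are infinitely many primes", phrased in predicate
-- logic: for every natural number n there is a prime p with n ≤ p.
import Mathlib.Data.Nat.Prime.Basic

theorem infinitely_many_primes (n : ℕ) : ∃ p, n ≤ p ∧ p.Prime :=
  Nat.exists_infinite_primes n
```

The type checker plays the role of the referee: if the file compiles, the proof term really does establish the stated claim.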

To prove things, you need to assume that some things are true. These "assumptions" are called axioms, and are expressed using Predicate Logic. Philosophically speaking, the rules of Natural Deduction can be thought of as axioms for how to correctly construct proofs, but they're usually called "inference rules" and not axioms for minor technical reasons.
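For instance, modus ponens, the rule that lets you pass from P → Q and P to Q, shows up in a proof assistant like Lean 4 not as an axiom but as plain function application; a minimal sketch (no Mathlib needed):

```lean
-- Modus ponens: given a proof hpq of P → Q and a proof hp of P,
-- applying hpq to hp yields a proof of Q.
example (P Q : Prop) (hpq : P → Q) (hp : P) : Q := hpq hp
```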

The methodology we've described arises inevitably from a proof-driven approach to mathematics, because mathematicians need a way to agree on which methods of argumentation they consider correct, and also to prevent people from simply making things up. Admittedly, this is only true for those mathematicians who have looked into these foundational matters.

Some axioms can lead to an accident where you can prove a proposition and its negation simultaneously. (An example in set-theoretic foundations is Unrestricted Comprehension.) By a simple result called the principle of explosion, in such a situation you can prove every proposition and its negation. So such axiom sets are useless. We call them inconsistent.
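The principle of explosion itself is a one-line theorem in Lean 4; `absurd` is the core-library combinator that derives anything from a proof and its refutation.

```lean
-- Ex falso: from a proof of P and a proof of ¬P, any proposition Q follows.
theorem explosion (P Q : Prop) (hp : P) (hnp : ¬P) : Q :=
  absurd hp hnp
```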

A question arises: Given a set of axioms which is sufficient to serve as a foundation of mathematics - like ZFC set theory for instance, or less obviously Peano arithmetic - is it possible to construct a proof of their own consistency from those same axioms alone? The answer is no, and this is Goedel's 2nd Incompleteness Theorem.
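In the usual notation (assuming T is consistent, recursively axiomatized, and strong enough to interpret Peano arithmetic), the second theorem reads:

```latex
% Con(T) is the arithmetized sentence asserting that T proves no contradiction.
T \nvdash \mathrm{Con}(T)
```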

You can - in a way - prove that Peano arithmetic (for instance) is consistent, but you need to assume axioms which are strictly more powerful than Peano arithmetic. And to show the consistency of those axioms, you need to assume yet stronger ones. This is quite a disturbing situation.

Ultimately, the theorem is a statement about the methodology for doing mathematics which we described. But there is no escaping it if mathematics is about stating claims and proving them.


Thanks - that's a succinct and interesting summary of something; but I'm not sure how it's a reply to my comment :-)


Well, I'm out of my depth. I didn't study Principia or Gödel at school, and I've never really looked into Principia. My understanding is that Principia was an effort to prove (a) that mathematics was logic, and (b) that mathematics was self-consistent. And that Gödel proved that this goal couldn't be achieved.

Is that wrong? If I've got that wrong, I have a big attitude adjustment coming down the track towards me.


I studied philosophy 10 years ago and we did several weeks on Sense and Reference in one of the modules. He was also discussed in other classes in relation to Kripke and other more modern philosophers.


Yeah, he is discussed in seminars as a kind of proto analytical philosopher.


We studied him too; he kicks off most of the major anthologies. Seems he's almost universally considered an overlooked founder of analytic philosophy. (There's the famous story about how Wittgenstein showed up on his doorstep to move in and study, and Frege said, "No, just go to Russell"; he even underrated himself.)

So he is partly famous for not being famous, a paradox he would hopefully appreciate.


I read The Foundations of Arithmetic and On Sense and Reference for my analytic philosophy course. This was in the U.S. fifteen years ago. I agree that he was not treated as obscure at all. We wouldn't have Carnap (the analytic philosopher) without Frege.


I meant analytic philosophy, which hardly ever includes Frege.



