In the sci-fi space I'd argue that Ursula K. Le Guin is another must-read. She was heavily influenced by Taoism (and Eastern philosophy more broadly). When you approach her work with that in mind, it adds a whole new layer of depth to everything.
I recall presenting a problem to my students a few years ago based on the 'Library Sort' algorithm. I still clearly remember the title of the original paper: 'Insertion Sort is O(n log n)'.
Hi, my two cents: you claim "Although mathematicians believe their proof is correct, it is too complex to verify without computer assistance", but I'm not sure "believe" is the correct verb, since the proof has been formally verified (see for instance https://github.com/coq-community/fourcolor for a formal verification in Coq).
I understand that you want to emphasize the fact that no human can survey the proof in full, but I wonder whether the current sentence might make people think mathematicians are not entirely sure of the proof.
I would consider a proof to be a "repeatable argument": one you could show to someone and expect them to be convinced by it. I think it is a defensible viewpoint that a proof in Coq is not 100% convincing. If you think otherwise, then how can you reconcile the existence of falso (a Coq-verified proof of "false")?
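To make that concrete, here is a minimal, hypothetical Coq sketch (not falso itself, and the axiom name is made up) of why a Qed from the checker is only as trustworthy as the axioms and kernel behind it: slip in an inconsistent axiom and the kernel will happily accept a proof of False, because all it certifies is that the proof term is well-typed relative to what you asked it to assume.

    (* Hypothetical, obviously inconsistent axiom. *)
    Axiom anything_goes : forall P : Prop, P.

    (* The kernel accepts this: it only checks that the proof term
       follows from the axioms it was handed. *)
    Theorem oops : False.
    Proof. exact (anything_goes False). Qed.

    (* 'Print Assumptions oops.' would at least list the axiom used,
       which is why reviewers of formal proofs check the assumptions. *)

So even a machine-checked proof ultimately asks you to believe something: the consistency of the axioms, the correctness of the kernel, and that the statement proved is the statement you cared about.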
Proof and belief, I think, are pretty strongly intertwined, but I'm not going to pretend to have a particularly rigorous philosophy on the matter. Similarly, when the proof of Fermat's Last Theorem was published, I don't know whether I should consider that a proof, because it is well beyond my comprehension. I have no reason to question it, but should I consider it a proof? I know that people smarter than me (e.g. Wiles) thought the original version of it was a proof, but it had a subtle error that required a fix. While I haven't looked at the proof and its revision, I would be surprised if I could look at the two versions and tell which one is the correct one.
Nah, actually I agree with you. What counts as belief and what as fact is rather arbitrary. Is 2+2=4 a fact? Is global warming a fact? What about man-made global warming? Ask 100 people whether something is a fact or a belief.
To top that off, it's a fact that there have been "proofs" that were wrong, even ones that stood for a long time (or maybe that's just my belief? :^]).
Hence, I think we can say that there are 4 options for a theorem:
1) Some mathematician believes the theorem is correct (but can't prove it)
2) Some mathematician believes the theorem is incorrect (but can't prove it)
3) Some mathematician believes the proof of a theorem is correct
4) Some mathematician believes the proof of a theorem is incorrect
Proving that a proof is correct is kind of meaningless. At that point it's all belief anyway.
Well, there is a middle ground: a very complex but somehow convincing argument that no one can reasonably check. There was one of these cases in number theory some years ago; I can't remember the details. Proofs can only be true or false, but accepting proofs is, in the end, a social process.
A convincing argument that cannot be checked is not a proof. If you want to extend the definition of proof, you're welcome to do that, but in academic mathematics the notion of proof doesn't admit a middle ground.
That Coq proof is not "without computer assistance". No Coq proof is, as Coq is literally an "assistant" that runs on a computer.
And those many jobXtoY.v and taskXtoY.v files sure look like they do the same thing as the Appel and Haken proof did: enumerate lots and lots of cases that are then machine-checked. So I don't think the computerized Coq proof is really qualitatively different from other computerized proofs that enumerate so many cases that a manual check would be impractical.
Thanks for posting this. I really enjoyed reading it. It'd be worth a top-level post (hint) - there's lots in it to discuss without side-tracking this thread.