Errol Morris Refutes It Thus (slate.com)
57 points by tptacek on June 5, 2018 | 35 comments



"The 18th-century Irish philosopher Bishop George Berkeley concluded that, since all we know of the universe is what our senses convey to us, things in the world exist only to the extent that we perceive them."

This is a mischaracterization of Berkeley's position.

Berkeley did not admit even the existence of the senses, much less that they convey anything to us, nor the unstated assumption of the above description that there's something outside our senses to be conveyed to us through the senses.

To Berkeley, everything that exists is either a perception in the mind or the mind itself.

Berkeley's writing is very accessible and clear, entertaining even. To anyone interested, I strongly recommend reading his "Three Dialogues Between Hylas and Philonous"[1] where he leads you through his thinking step by step. I wish more philosophers wrote as well as he.

[1] - http://www.earlymoderntexts.com/assets/pdfs/berkeley1713.pdf


Before anyone jumps into Berkeleyan idealism like a conspiracy theorist into YouTube, note that this whole line of thinking is fundamentally based on a genetic fallacy embedded in an antiquated notion of causation.

Historically it was thought that the cause of something (the origin) must share properties with the event caused (the product). This is just a false assertion. Today we understand "emergence" in which properties arise in aggregate products that are not present in their constituent origins.

Without that, every argument of the form "experience is a closed system" fails: the causal origin of experience does not "need" to be experience.

Light strikes a surface and bounces into your eye; your nervous system enters a state known as "visual perception", and that state has properties not present in any prior step (i.e., it feels like something). These properties emerge out of the whole interaction of the system, and are not present within any mere piece of it.

Idealism is very much an epistemological virus, a metaphysical conspiracy theory that will quickly spread like wildfire through your whole belief system with its plausible soundbites. "From experience, only experience, surely?!" No.

Berkeley's Master Argument is the beginning of a line of "argument by amazement at how cool it all sounds" that ends up in Heidegger. It is justified by the constant refrain that experience is an epistemically closed system, a genetic fallacy whose plausibility is only ever achieved through rhetorically pleasing soundbites.


In what may be a case of Baader-Meinhof, I just came across the term "Whiggish" for the first time, reading Steven Weinberg's _To Explain the World_ (pretty good so far). He also has a dig at Kuhn, recounting a time when they met briefly and Kuhn gave a defense of Aristotle that he found incoherent.

I'm no expert in the field of scientific thought, but my sympathies fall with Morris and Weinberg and the like. I agree with Weinberg that Aristotle understood physics worse than many schoolchildren do today. Which is not to say that Aristotle was stupid, but just that your knowledge is a function of the times you live in and the volume of past human thought you've had the privilege to learn from.

There are legitimate social and ethical concerns for everyone to keep in mind with the advancement of scientific knowledge. But this shouldn't be conflated with the idea that we don't understand the natural world better than we used to. We do.


I reviewed Weinberg's book and took a somewhat different stance on the issue here:

https://www.chronicle.com/article/VialError/234826

Ironically, given that Morris and Weinberg both prize clarity of thought, I think they are talking at cross purposes when they critique historians of science without properly defining their terms.

Take this quote from the OP for instance:

"While studying at Princeton, Morris soon learned that Kuhn held in particular contempt any view of science as a triumphal procession toward a more accurate description of the universe and how it works, a view called 'Whiggishness,' from British politics. The ultimate mouthpiece of Kuhn’s anti-Whiggish position in The Ashtray is an unnamed Harvard graduate student who insists that a new paradigm is not necessarily better than the old one, 'just different.'"

But Butterfield and other critics of the so-called "Whig" school of history are not contesting the proposition that we now understand the natural world better than a cave man (or a Hellenistic Greek) did. (They're also not relativists, incidentally). Instead, they're critiquing the idea that the specific path of progress is inevitable.

In other words, there is an underlying physical reality that sets constraints on what we can know and how we know it. But within those constraints there are innumerable potential branching paths. A Whiggish take on history of science might say, for instance, that it was inevitable that the first moon landing would be achieved by a nation with Enlightenment values. A non-Whig interpretation is not saying that the moon doesn't exist, or that we shouldn't care about the Apollo program. It's pointing out that the specific course taken by our timeline of history of science is not an inevitability. It could have been otherwise.


Your review is a very interesting take. It reminds me of someone's observation (I forgot whose) that mathematics is often presented as a set of successful proofs and derivations, without an explanation of the motivation behind them or how they were discovered. So proof tactics may seem somewhat magical, even though they might in fact be a result of a mathematician's tinkering and blundering around, including alternative approaches that didn't work.

An interesting example that someone gave from elementary mathematics is the derivation of the quadratic formula.

ax²+bx+c=0 (a≠0)

4a(ax²+bx+c)=4a(0)=0

4a²x²+4abx+4ac=0

4a²x²+4abx+4ac+b²=b²

4a²x²+4abx+b²=b²-4ac

(2ax+b)(2ax+b)=b²-4ac

2ax+b=±√(b²-4ac)

2ax=-b±√(b²-4ac)

x=(-b±√(b²-4ac))/2a

Someone discussing this pointed out that this derivation is easy to follow, but extraordinarily mysterious in terms of how we knew to do various things at various steps, such as mysteriously multiplying both sides by 4a or mysteriously adding b² to each side at some point. How did we know to do that?
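(None of which explains the motivation, of course, but for what it's worth, the steps themselves are easy to check mechanically. A quick sketch in sympy, just one way to do it:)

    from sympy import symbols, sqrt, expand, simplify

    a, b, c, x = symbols('a b c x')

    # The two "mysterious" moves: multiplying by 4a and then adding b**2
    # turn the left side into the perfect square (2ax + b)**2.
    assert expand(4*a*(a*x**2 + b*x + c) + b**2) == expand((2*a*x + b)**2 + 4*a*c)

    # And the resulting roots really do satisfy the original equation.
    for sign in (+1, -1):
        root = (-b + sign*sqrt(b**2 - 4*a*c)) / (2*a)
        assert simplify(a*root**2 + b*root + c) == 0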

Of course there are different explanations of the underlying motivations and the history of how people discovered this proof, but it's easy to be given the proof without any of that context, and that kind of thing is in fact the rule rather than the exception in many parts of math study.

Again, I think this was someone else's observation but I don't remember where I came across it.


>An interesting example that someone gave from elementary mathematics is the derivation of the quadratic formula...

A tangential point: this is indeed a horrible way to derive the formula, for the reasons you mentioned.

If anyone is curious, here's a better way to think about it. The graph of ax² + bx + c is just the graph of ax² translated. Keeping that in mind, let's investigate.

First, consider a very easy problem: find roots of ax² = 0. The graph intersects the x-axis at x=0, done.

Now, let's shift the whole graph down by Q, and solve the problem again. The equation for that graph is ax² - Q, and it intersects the x-axis at ±√(Q/a). Still easy.

Now, let's shift the whole graph again to the right by R. The equation for that new graph is a(x-R)² - Q.

What of the roots? Oh, we don't need to do much work here! The places where the graph intersects the x axis simply shifted to the right by R. So the roots are R±√(Q/a).

So, to recap: the roots of a(x-R)² - Q = 0 are R±√(Q/a).

What if our equation is written in the form ax² + bx + c? Well, now is the time for algebra. Open up the parentheses:

a(x-R)² - Q

=a(x² - 2xR + R²) - Q

= ax² + (-2aR)x + (aR² - Q)

= ax² + bx + c

Solve the following system for Q and R:

-2aR = b

aR² - Q = c

Obtain:

R = -b/2a

Q = b²/4a - c

Now plug these Q and R into the formula we already have, R±√(Q/a), to obtain the all-familiar result

x=(-b±√(b²-4ac))/2a

What the formula is hiding is the simple idea that the roots of a parabola are easily found if you know where the vertex is. So assume you do, and work backwards from there.

A deeper idea is solving an easier version of the problem, and then changing the problem back to the more general original question, refining the solution on each step.

And this is, in fact, how mathematics is often done.
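(If it helps, here's the same coefficient-matching done in sympy, a small sketch using the Q and R above:)

    from sympy import symbols, solve, expand, Eq

    a, b, c, x, Q, R = symbols('a b c x Q R')

    # Match a(x - R)**2 - Q against a*x**2 + b*x + c, coefficient by coefficient.
    sol = solve([Eq(-2*a*R, b), Eq(a*R**2 - Q, c)], [Q, R], dict=True)[0]
    print(sol)  # {Q: b**2/(4*a) - c, R: -b/(2*a)}

    # With those values the two forms agree as polynomials in x.
    assert expand(a*(x - sol[R])**2 - sol[Q] - (a*x**2 + b*x + c)) == 0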

>and that kind of thing is in fact the rule rather than the exception in many parts of math study.

There's work being done to change this[1]. Note that in the argument above, I could have left out all the "work", leaving only the questions, and many people would still be able to do the work. And with the right preparation, the student would be led to ask the same questions.

[1] https://en.wikipedia.org/wiki/Inquiry-based_learning


And that's just a special case of Completing the Square (but with a very handy visualization).

https://en.m.wikipedia.org/wiki/Completing_the_square


Not quite. "Completing the square" is an algebraic step; the argument is geometric (translation of the parabola). One can learn how to complete the square (symbolic manipulation) and be completely unaware of the geometry.

If you never learned the algebraic trick, would you invent it when solving this problem? It's not immediately clear from the algebra that a quadratic polynomial ax^2 + bx + c can even be rewritten in the form a(x-R)^2 - Q.

Conversely, while this approach involves algebra that amounts to completing the square, it exhibits a more general technique.

If you want to say "..and that's just a special case of..", a better candidate would be homotopy continuation methods[1][2], because the idea here is transforming the simpler polynomial x^2 - C into a more complicated one and seeing what happens to the roots.

[1] https://en.wikipedia.org/wiki/Numerical_continuation

[2] https://en.wikipedia.org/wiki/Numerical_algebraic_geometry#H...
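To see the idea in motion, here's a toy continuation sketch in plain Python (illustrative only, nothing like a real solver, which would use predictor-corrector stepping and adaptive step sizes): deform x^2 - 1 into the target quadratic and follow both roots.

    import cmath, random

    def track_roots(a, b, c, steps=400, newton_iters=6):
        # Random complex "gamma trick": rotating the start system keeps the
        # two root paths from colliding along the way (generically).
        gamma = cmath.exp(2j * cmath.pi * random.random())
        roots = [1 + 0j, -1 + 0j]  # roots of x^2 - 1 at t = 0
        for k in range(1, steps + 1):
            t = k / steps
            for i, x in enumerate(roots):
                # Newton corrections on H(x,t) = (1-t)*gamma*(x^2 - 1) + t*(a*x^2 + b*x + c)
                for _ in range(newton_iters):
                    h = (1 - t) * gamma * (x*x - 1) + t * (a*x*x + b*x + c)
                    dh = (1 - t) * gamma * 2*x + t * (2*a*x + b)
                    x = x - h / dh
                roots[i] = x
        return roots

    print(track_roots(1, -3, 2))  # approximately [1, 2] (in some order)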


Personally I prefer to break things into three steps:

First put the equation into the form

x² = 2ax + b

Now complete the square:

(x - a)² = a² + b

Finally,

x = a ± √(a² + b)
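(A quick mechanical check in sympy, for anyone who wants one:)

    from sympy import symbols, sqrt, expand

    a, b = symbols('a b')
    for root in (a + sqrt(a**2 + b), a - sqrt(a**2 + b)):
        assert expand(root**2 - (2*a*root + b)) == 0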


Sure, it's short. But it doesn't explain why you would take these steps if you don't already know they lead to a solution. Completing the square is a non-obvious step to make.

The argument in my previous comment attempts to provide motivation for such a step, starting from simpler questions. It uses the geometry of the problem, and builds up from solving a simpler problem first.

That approach also uses the notion of transformation and invariance (seeing what happens to the roots when we move the graph around, and noticing that the distance between the roots doesn't change when we shift horizontally).

Again, the important part here is that you could lead someone to ask the same questions and have them answer them themselves.

"Let's solve this equation. Looks complicated. Can we solve a simpler problem first? What would a simpler problem be? Now how can we make it a little more complicated, and how does it affect the answer?".

And that's how math is done.

After all is said and done, one can extract "completing the square" as a shortcut technique. But that's what it is - a shortcut through the woods. Learning a shortcut won't teach you how to walk in the forest on your own.


Thanks! It's quite possible that my example came from Paul Lockhart or a proponent of somewhat similar ideas.


Indeed, and I hope these ideas gain more ground. Mathematics as a magic trick needs to end.


This comment is at least as useful as the article. Thanks!


> the idea that the specific path of progress is inevitable.

Seems like a straw man.


It isn’t if we’re talking about histories of science and medicine written between, say, 1850 and 1950. Nowadays I agree, few would argue this directly. But then again, that’s because of Butterfield and the turn away from Whig histories among hist sci types - which, itself, was far from inevitable. And if you read Weinberg, I think you do still see that point of view in play.


Is it?

The quoted proposition is really two rolled into one:

1. Progress is inevitable.

2. Conditional on progress occurring at all, it has to take something reasonably close to a specific path.

The first view seems to be quite widely held. I think it's false, but the second one is true. Is the version being critiqued here the first one, the second, or both?


It's also one of those typical philosopher's tricks: you can get a scientist on record agreeing that such-and-such is reasonable, but they're thinking about something like "maybe the Hall effect could have been discovered before or after the completion of Maxwell's equations," while every reader is imagining natural science but based on drum beats.


> I agree with Weinberg that Aristotle understood physics worse than many schoolchildren do today. Which is not to say that Aristotle was stupid, but just that your knowledge is a function of the times you live in and the volume of past human thought you've had the privilege to learn from.

Would physics as a discipline exist in its current form without thinkers like Aristotle? That’s the more important question for me.


The course of science is shaped by the natural world at least as much as by people. It needs people with the curiosity of Aristotle, but how it proceeds depends also on what nature reveals to them, and on what they can pry from nature by using what they have learned so far. My guess is that something like our physics has emerged (or will, relatively shortly) on every world populated with intelligent beings.


> My guess is that something like our physics has emerged (or will, relatively shortly) on every world populated with intelligent beings.

The anthropic principle states as much. I like that you used “intelligent beings” instead of humans, because I could imagine a scenario where synthetic or artificially intelligent beings would arrive at the same conclusions, upon making the same observations of their natural world.


I prefer the longer but more detailed writeup that LARB published: https://lareviewofbooks.org/article/the-ashtray-has-landed-t...

The Janet Malcolm dispute is instructive in this regard. Malcolm's book on the MacDonald case is a classic, assigned reading for many aspiring journalists; it is about the difficulties of the relationship between journalist and subject, and the impossibility of perfect objectivity. Morris's documentaries, sometimes downright ham-fisted in their edits (he is cruel, for example, to his subjects in Fast, Cheap, and Out of Control), could be taken as the opposite extreme. Objectivity is hardly even intended.


Off-topic, but their GDPR popup just redirects me to the same popup.


Yep, tried again the next day on a desktop, going to Slate's site and finding the article. Guess I'm not reading this one...


Same, and I'm not even in the EU so poor targeting to begin with.


Idem, using Firefox 60.0.1


Come on man, 'idem' doesn't even have fewer characters than 'same' :)

For other readers: 'idem' = 'the same'


Whether or not Kuhn was a radical relativist, a great many intellectuals today are.

Radical relativism rests on Cartesian dualism. This is a philosophical doctrine that, as anyone who has studied philosophy knows, is highly problematic. In the last century or so it has been rejected by a long string of major philosophers, including Husserl, Heidegger, Merleau-Ponty, Ricoeur, Whitehead, the later Wittgenstein, Strawson, Putnam, Dewey, Peirce, and James. Kuhn, alas, apparently was unaware of this.

Also, Kuhn believed that James was right when he said that the mind of the infant is ‘a bloomin’ buzzin’ confusion’. However, this claim was made before there had been any scientific investigation of infant perception. In recent decades there has been a great deal of such research, and it has been determined that infants do perceive the world, and do so according to universal principles.


The relativist position in modern philosophy of science has little or nothing to do with Cartesian dualism; it has more in common with the incompleteness theorem in certain respects.

The short version is that any theory of observations (i.e., scientific theory) necessarily defines the observational process and therefore the observations themselves, so that it is impossible to separate an explanation from its explanandum. What's relativist about this is that the very thing you are explaining by the theory is implicit in the theory, so that what you have is a series of paradigms or theories that replace one another.

This isn't to say that we don't predict things better, or even that there is no reality separate from the theory, only that it doesn't matter at some level.

All of this, too, is somewhat distinct from what Kuhn is remembered for, namely the social nature of science, which is all too real.

Scientists are often in denial about these social factors, which is naive at best and dangerous at worst. Humans are not conduits of God; they are imperfect machines that are part of the system they are studying.

My sense is that Morris (who I respect immensely for his work) is kind of fighting a strawman argument whose depths he doesn't entirely understand, because of some personal conflict he hasn't come to terms with (ironically, given the nature of what he is arguing against).


>The short version is that any theory of observations (i.e., scientific theory) necessarily defines the observational process and therefore the observations themselves, so that it is impossible to separate an explanation from its explanandum.

What I am saying is that all the different scientific theories are ultimately based on a universal understanding of reality that is in turn due to the universal nature of the human organism and its engagements with the real world. That's why I included the last paragraph in my comment.


But this is ignoring what is probably the most harmful influence of Kuhn -- the whole idea of "paradigm shifts". For a while nearly every new finding was hyped as a "new paradigm" thanks to Kuhn's SSR -- in the field of statistical genetics, Joe Felsenstein used to joke that he owed his success to the fact that he was the only one doing boring "normal science" in the 1970s rather than trying to shift paradigms.


I see contemporary radical relativism as being inevitable.


Recent HN discussion of a different review of this book: https://news.ycombinator.com/item?id=17217444


I am reminded of the Max Planck quotation, "science progresses one funeral at a time," which sort of makes both Kuhn and Morris correct.


I remember some form of this came out on Opinionator. A thoroughly enjoyable read!

https://opinionator.blogs.nytimes.com/2011/03/06/the-ashtray...


Morris comes across as pulling a Neil deGrasse Tyson stunt.



