It saddens me to think that a polite exchange in which all parties are incorrect, come to a better conclusion, and leave amicably is so rare as to be remembered across a span of time greater than my entire lifetime.
Good conversations are so rare. It has been a long time, for me.
This exchange would be non-notable if it weren't for Feynman's stature and the fact that he was basically called out by a musician. His familiarity with academia shines through in his agreement that the answer deserved no marks, and his candor in his statement that both of them were wrong.
(I enjoy abstract conversations about optimistic directions for technology -- although it often involves invoking and talking about -- joking about, if possible -- all of the possible dystopias to avoid along the way. We seem to be dipping our toes into almost all of the latter nowadays, but I tend to believe that doing so also helps expose the nature of the problems more clearly for a wider audience. I also really enjoy bicycle adventures, despite rarely being that well prepared for them. Recently I spent an hour on a hill "modifying" a front rim to accept a Schrader valve rather than a Presta valve after suffering a flat and bringing the wrong kind of spare inner tube with me; typical.)
You sound awesome! I have my 2012 Specialized Allez flipped over in the living room. My wife will mention it any day now lol I had to replace a gear shifting cable, which was a nightmare. When I went to fill my tires, I realized the "auto" Serfas bike pump has some sort of weird logic where the Presta pump won't work if there's no pressure in the tire at all. Thankfully the little valve-stem-cap-cut-in-half trick actually worked!
Speaking of nightmares, my current pet dystopia is solar flare collapse. I'm generally highly, highly optimistic about what science and technology can accomplish, but this is the one thing that makes me doubt the future. It's led me to seriously consider taking up an organized survivalist/preparatory hobby, not only for the slight chance of societal collapse, but also because it sounds fun to systematize that sort of primitive-style safety net.
And you remind me of a friend of mine, and a funny story about them - apparently they once threw someone's food out of a window, apparently unprovoked. A bit jealous, irritable or aggrieved about something, I'd expect - those happen to all of us. Ah, the joys of youth, I suppose.
What's the model of Serfas pump you mention? That sounds like a good problem to be aware of.
Related question: I am reading a book by an author whom I know socially. The book is very good, but I have noticed a few minor typos. I also noticed that one example is explained backwards — that is, it mixes up the causes and effects for two phenomena. I have not independently researched the phenomena, but based on the earlier description (which is logical and I believe to be correct), the conclusion of a section is clearly mistaken.
Is it appropriate or helpful to point out any of these issues? The book came out a couple years ago, and I imagine other people who know the author may have pointed them out already. I don't want to make him feel bad (no one likes the bearer of bad news), but if it were me I would want to be told so that future print runs and digital editions could be corrected. (The book is popular and will likely remain relevant for years or even decades.)
Perhaps I should shoot him an email from an anonymous email address?
They'd probably be delighted to hear that you're reading the book (and it wouldn't hurt to mention that you think it's good!)
You could mention that, and ask about the causality problem, why it seems backwards compared to the earlier description, and how you've not researched it yourself and would appreciate clarification. You can also mention that you've noticed some minor typos and ask whether they'd appreciate them for a future printing. I think it's safe to say authors would rather have mistakes not continue to be reprinted.
Second this. A few years back I wrote a chapter for a book, and then the editor screwed up. Another author and I took over the editing and fixed things up.
As I had seen so many versions of it, my chapter was the one I could do the least editing on -- I was absolutely tired of it. Lo and behold, the book gets published, I get a physical copy in my hands, and I start seeing typos and small mistakes throughout. Nothing too serious, but things I would've fixed.
I know those mistakes are there. Only a handful of people have pointed them out. Those people have not only read the book, they followed it and questioned it. It's nice to be reached out to, even if I already know what's wrong.
Depending on how well you know this person, you could frame your "correction" as more of a question. "I was having trouble understanding what you meant in the conclusion, because it felt like it contradicted what you say earlier. I'm probably just confused or missing something, but could you explain it to me?" They might notice their error based on that and tell you they messed up. Or they might not, in which case you can "ask" if the way you're thinking about it makes more sense.
Don't use an anonymous email. Just email them from an account you use and see what turns up; if it is such a frequent question, the author in this day and age could post errata to their website. They might even have a reworked bit of prose for a second edition or some related next project. The only people I wouldn't email are the people who clearly don't want to be found from a simple search.
It's just an email; don't overthink it. Don't write a rebuttal essay, just express what you are reading and why it doesn't line up for you and send it along. Maybe there's something you're missing, or maybe this is a bit of manuscript that was revised a few times and now doesn't line up perfectly. It's no different than going to a friend and asking if they can make heads or tails of it. They wrote a book to share something they found interesting; people asking questions and seeking clarifications is, I would think, one of the few true rewards.
I've been in similar situations (for much less well-known books, granted) and took the approach of writing a personal and praise-filled email with my list of errata as an attachment. It makes reading the errata a little more opt-in.
>Is it appropriate or helpful to point out any of these issues?
Only you can answer that question, as none of us here know them! Some people really appreciate it when you point out their mistakes (privately), others will hold a grudge against you for life.
I once had a similar disagreement with someone on a public internet forum. Their statement wasn't wrong per se, because the factors we were arguing about were based on a well-defined mathematical relationship. I just thought they had the cause and effect relationship backwards. I put forth my best arguments but was unable to convince them. The same fate might befall you if you try to push this issue, although you have nothing to lose with a private communication.
Thanks for all the feedback. I do have lots of other questions that are positive in nature, and show that I've closely read the book. I'll compile all of those and it will dwarf the few nits that I'll include at the end.
"Arguments from authority carry little weight – authorities have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts."
Yes, that's the implication of Sagan's quote! Nobody can be trusted to always be right; there are only people with different probabilities of being right on various subjects. Their words are Bayesian evidence of finite strength.
Take-home test in a survey course for nonmajors. And she referenced a book with a mistake.
In my thirties, wanting to test out of college algebra, I bought some math software to do drills and refresh my rusty high school math and the software had errors. It was fine for my purposes because I recognized the mistakes, but I was also homeschooling and I told my kids it probably wasn't a good resource for them because it has too many errors, so it wasn't something they should use for math practice. They didn't know enough math to recognize the errors and would have learned wrong.
People make mistakes. This kind of scenario is always a possibility for an outsider who lacks the background knowledge to go "Wait... that doesn't seem right," and who instead just goes "But (famous expert) said it, so it must be true."
> She said she never had any trouble initiating contact with authority figures when she had some business to discuss. “We’re all just people,” she says.
A professor of mine once told me about when he was in school he studied a book written by a famous expert in the field and found a bunch of errors. So he wrote back to the author with the error and, after some back and forth correspondence, the author offered him a job after graduation. So that was how he first got his foot in the door in the field.
I've found the same. If you contact an authority with serious questions about the subject they are an authority on, they will generally respond in a positive way. Keep in mind, this is very different than kids asking a math PhD for help with algebra homework - algebra is not their specialty and there is no reason to contact them specifically about such things.
I have also had similar experiences cold-emailing domain authorities. I spent days writing 10s of words, double-checking my facts, doing my homework, and have gotten a response every time. They are excited that someone is reading their work, thinking about it, and then having the gumption to contact them. I think it is actually quite rare.
I contacted a guy about a paper he'd written on image de-blurring. Just some high level questions because the paper was behind a paywall. He asked for my snail-mail address and sent a full color printout of the paper in a large envelope! I still have it.
I don't understand. Of course an electric charge inside an ideal, closed conductor cannot produce or affect a field outside the conductor, just as Feynman originally stated (and just as my hardbound 1989 printing of Feynman/Leighton/Sands still says, on page 5-9.)
Otherwise, as he goes on to point out, electrostatic shielding wouldn't be possible. What am I missing?
"I must have assumed it was grounded" doesn't make much sense either, because a charge inside such a conductor has no return path to ground. Either a current will flow outside the enclosed conductor -- in which case where's the return path? -- or it won't.
Electrostatic shielding is actually about shielding the inside of a closed (hollow) conductor from the outside (the standard derivation from Gauss's law).
If the conductor is non-grounded, then a charge outside the conductor is affected by the charges inside it, but not by their individual positions: by Gauss's law, the flux of the field through a closed surface depends only on the net enclosed charge, so the only thing that matters is how far from the closed surface we are, and (assuming all charges are enclosed within it) we can represent them as a net charge at the center of the closed conductor.
If the conductor is grounded, you are then basically changing the flux with which we can calculate the electrostatic potential at a point outside the sphere (and therefore changing how an external charge will be affected by an internal charge).
If you put a charge inside a spherical ungrounded conductor, charges will move to the inner surface of the conductor to make the electric field inside it zero. The excess charges go to the outer surface of the conductor, uniformly.
What happens differently when the conductor is grounded, though? From the point of view of a charge inside the conductor, grounding the outside just increases the available exterior surface area. The outer surface is now the whole planet, conceptually speaking. If field lines existed before, they still do, they're just spread out more.
A thought exercise: consider a van de Graaff generator. In operation, it moves charge from the 'ground' at the base of the belt to the top sphere. It can do this because there's a hole in the top sphere for the belt to pass through. If I enclose the entire generator in an even larger metal sphere, seamless and unbroken, and run it from batteries, can I draw sparks from that sphere? My intuition says 'no,' so it's interesting if that's not the case.
When the conductor is grounded, the charge on the conductor is no longer constant. It can change until the total enclosed charge in any sphere surrounding the conductor is zero.
Now take the charged sphere and connect it to ground with a wire. At the moment you connect it, what happens to the unbound charges that are on the surface of the sphere?
what happens to the unbound charges that are on the surface of the sphere?
That's what I'm asking. Are you saying that yes, I can draw a spark from the exterior of a perfect, unbroken, hollow metal sphere containing a van de Graaff generator and some batteries, or otherwise detect that there's any electrostatic activity inside the sphere whatsoever?
(Actually I don't even see how a net charge can be produced inside such an unbroken conductor, thinking about it further. The VDG can only move charge around, it can't create a net imbalance inside the sphere. So the question becomes, what could? I have a feeling that while the corrected version of Feynman's statement is right in principle, it presumes a condition that can't actually exist in nature, like a magnetic monopole.)
> Are you saying that yes, I can draw a spark from the exterior of a perfect, unbroken, hollow metal sphere containing a van de Graaff generator and some batteries, or otherwise detect that there's any electrostatic activity inside the sphere whatsoever?
I prefer to reason about the simplest cases but since you asked again: if the total charge inside an unbroken ungrounded conducting sphere is zero the field outside it will always be zero, no matter how many VdGs are inside it.
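The grounded-vs-ungrounded distinction being debated above can be sketched numerically. This is a minimal illustration of my own (the function name, units, and structure are not from the thread), applying Gauss's law E = k·q_enclosed/r² to a point charge held inside a spherical conducting shell:

```python
# Minimal sketch (my own illustration, not from the thread): the external
# field of a spherical conducting shell with a point charge q held inside,
# via Gauss's law E = k * q_enclosed / r**2 (SI units).

K = 8.9875517923e9  # Coulomb constant, N*m^2/C^2


def field_outside(q_inside, r, grounded):
    """Field magnitude at radius r outside the shell.

    Ungrounded: induction puts -q on the inner surface and +q on the outer
    surface, so a Gaussian sphere drawn outside still encloses q.
    Grounded: the outer-surface charge drains to ground, the total enclosed
    charge sums to zero, and the external field vanishes.
    """
    q_enclosed = 0.0 if grounded else q_inside
    return K * q_enclosed / r**2


q = 1e-9  # 1 nC inside the shell
print(field_outside(q, 1.0, grounded=False))  # ~8.99 N/C
print(field_outside(q, 1.0, grounded=True))   # 0.0
```

This also matches the point about the van de Graaff: if the generator only separates charge (net zero inside), even the ungrounded case gives zero external field.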
Not accepting authority is not the lesson I would take from this. That good scientists accept their mistakes and correct themselves is what I see as the lesson there. There is no way for a student who is still learning physics to know or trust her instincts. She was right in this instance, but the majority of the time she would be wrong.
Because in lessonable stories it aids understanding to have only one lesson, to reduce confusion. But it's hardly a hard rule and is often broken.
Ah, William & Mary, the Harvard of the South. Feynman rightly should have said, "transfer to Yale. It's a lot easier to graduate and far less pretentious."
This is a great story, but not about Feynman or feminism.
It’s about class.
The reason she felt comfortable was that she was from the 'upper' class that belonged, and could question anyone, even Nobel Laureates.
The story is about argument and evidence being compelling and authority not so at all.
> she felt comfortable... ...could question anyone, even Nobel Laureates.
The idea that this is explained entirely by class does a disservice to people from more humble backgrounds who are capable of being confident. For example, Feynman himself has a large number of anecdotes about suddenly not caring who he was arguing with because thinking about the problem took over -- Los Alamos, with all the greatest physicists since Newton present, for example.
Putting class at the centre of everything really isn't vastly better than pretending it has no relevance and does not exist. The thing that should interest us more is where the modest-background geniuses are coming from now. Is class distinction getting more important and becoming a stronger barrier now than in the 30s and 40s, or, from this story, the 70s? US tuition fees and student debt would suggest it is worth looking at.
> Your instructor was right not to give you any points, for your answer was wrong, as he demonstrated using Gauss’ law. You should, in science, believe logic and arguments, carefully drawn, and not authorities. You also read the book correctly and understood it. I made a mistake, so the book is wrong. I probably was thinking of a grounded conducting sphere, or else of the fact that moving the charges around in different places inside does not affect things on the outside. I am not sure how I did it, but I goofed. And you goofed, too, for believing me.
I've heard of "non-apologies," which are sneaky ways of sounding like one is taking responsibility and apologizing without actually apologizing or taking responsibility.
But this is the opposite-- Feynman takes zero responsibility for having led the student astray, and in fact chastises the reader for appealing to his own authority. At the same time, he gives evidence for why he should not be trusted as an authority-- he goofed and doesn't even know why!
It's like a variation of an old one-liner comedy insult, something like: "I got news for you, we could both do better!"
> Feynman takes zero responsibility for having led the student astray
Sure he did. He said he goofed.
> and in fact chastises the reader for appealing to his own authority
And he's right. In science, there is no such thing as appeal to authority. If the student thought her answer was right, she should have produced an argument for why it was right--and of course she couldn't because her answer was wrong. She should not have appealed to an authority.
> No one has the resources to verify the whole of science through first-hand experience
Courses in science commonly include actual experiments, either done by the professor while students watch, or done by the students themselves in labs, precisely to give the students first-hand experience in the scientific phenomena being studied in the course.
> at some point you have no choice but to trust someone
You may have to trust other people for first-hand observations of things you didn't observe yourself. But that isn't what's involved here. Here the student had a theoretical law whose consequences she was perfectly capable of working out for herself. She did not have to trust anyone for that.
In this particular case, the student even saw the problem with Feynman's statement: her letter says, in reference to the statement in Feynman's book that turned out to be wrong: "This was confusing, as it seemed to contradict all your previous statements." So why did she base her exam answer on the statement she found "confusing"? She should have thought it through for herself.
Even with theory it’s not reasonable to expect someone to verify things “all the way down”. For example, it’s reasonable for a beginner in physics to trust that calculus works without fully understanding its foundations.
> it’s reasonable for a beginner in physics to trust that calculus works without fully understanding its foundations.
Even the beginner doesn't have to "trust" that calculus works. He can verify for himself that using calculus to manipulate equations in the theory yields predictions which are confirmed by experiment.
The main area where I see that "trust" would be required in science is reporting of raw data directly obtained from experiments that other people run. Yes, everyone else has to trust that the person who is reporting that data actually ran the experiment they claim to have run and recorded that exact data from that experiment in its entirety--that they didn't make up the data, or massage it, or cherry pick only certain runs, etc. That is why, when scientists are found to have violated this trust, the penalties are typically severe.
Other than that, though, you don't have to "trust" anything in science blindly. Whether a particular set of data is consistent with a particular set of theoretical predictions is something that can be verified independently. And since theoretical predictions are just mathematical derivations from certain stated axioms, those can also be verified independently. So no one ever has to just take someone else's word about those things.
In the UK in the 70s, "top stream" math students started studying the basics of calculus in what the US would call 8th grade. We did the foundations, and by the time I was in college occasionally using calculus, I didn't have to trust it blindly - we had studied it enough for me to understand how it works (to a point). The same was true of basic electricity, basic mechanics and lots more.
The US educational system seems to regard some of the foundational elements of STEM with enormous trepidation (I've known US high school students headed for STEM majors at quality universities who hadn't studied calculus for more than a year, typically just covering differentiation, not integration). Combined with its greater breadth of educational goals, both in grade school and even for a bachelor's, it's not really surprising that there will be students (in the US at least) who begin college-level physics without fully understanding the tools (or history) of the subject.
I am trying to use an 80s sophomore level UK linear algebra text in a course for predominantly engineers and nonmajors. Imagine my surprise when I discovered some of the material is covered in US beginning graduate (in the math dept) algebra!
I’m also British, for what it’s worth. Of course anyone can study the foundations of calculus. The question is whether it’s reasonable for someone to choose not to and trust that it works. I think it surely is reasonable for someone to do so, unless they have a particular need for a deep understanding of calculus.
You always have to stop somewhere. Any given one of your assumptions may be verifiable in principle, but you won’t have the time and wherewithal to verify all of them. Science as it’s actually practiced is a huge pyramid of trust.
My entire undergraduate degree in Physics was derivation from first principles.
The verification issue comes about when you need to run the experiments yourself to prove that your equations don't just match with fudged data.
You can pretty much prove that calculus works from a purely logical basis (I might be using incorrect terminology here but you get the point, pure math is self-proving.).
> You can pretty much prove that calculus works from a purely logical basis (I might be using incorrect terminology here but you get the point, pure math is self-proving.).
Right, but it’s pretty clear from your use of terminology here that you personally haven’t studied the foundations of calculus deeply. Which is 100% fine. You understand roughly why it works and you trust that specialists have checked it out more thoroughly.
A lot of people on this thread keep pointing out that various things can be verified. But they are saying this precisely because they themselves have not verified the things in question and yet trust that it is possible to do so. The point is that this is perfectly reasonable in most contexts. You do not have to take on the burden of proving for yourself every mathematical theorem that you make use of.
There are shades between full formal verification of literally everything and "trust the elders". You can know a little bit about a lot and use that as a sort of statistical verification, because you're basically running a Monte Carlo simulation to find holes in the logic; if you find no holes, then you can reasonably assume that the rest is true, and that's not based on trust.
It's a bit like having a 500-dimensional jigsaw puzzle: once you show that most of the pieces fit together, the other ones have nowhere to go any more except to fit.
I don't think people in this thread have realized this, but the issue here is about an application of Gauss's law, not the law itself. If you write in a math test "d^2/dx^2 sin(x) = sin(x)" and complain that you have not been awarded points because a famous book on calculus has this equation in it, then you are appealing to authority instead of making an argument.
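(For the record, the correct identity is d²/dx² sin(x) = -sin(x). A quick finite-difference check, my own illustration rather than anything from the thread:)

```python
# Quick numerical check (my own illustration) that the hypothetical test
# answer above is wrong: the second derivative of sin is -sin, not sin.
import math


def second_derivative(f, x, h=1e-4):
    # Central finite-difference approximation of f''(x).
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2


x = 0.7
print(second_derivative(math.sin, x))  # ~ -0.644, i.e. -sin(0.7)
print(math.sin(x))                     # ~ +0.644: opposite sign
```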
That paragraph is wrong. So it's pointless to wonder why Feynman wrote what he wrote (unless you are interested in the man himself). She specifically had a PS in the letter where she said she had "a devious motive in writing to [Feynman] because on the exam [she] answered with the explanation that [Feynman's] book gave". I'm sure Feynman would've had his share of experience with grade grubbers and probably (a bit unkindly) assumed that this woman was one. She also clearly refused to believe her professor when he explained the problem to her. Yes, Feynman could have been kinder and more charitable with his time, but if you have worked in academia, then you'll also know that the vast majority of professors would completely ignore correspondence of this sort from an undergraduate.
The student wasn't sure if the paragraph was wrong or if she'd misunderstood what Feynman was saying in it. So she asked him.
>She also clearly refused to believe her professor when he explained the problem to her.
There's nothing in the letter to support this conclusion. But in any case, you can't coherently criticize the student both for believing Feynman and for not believing her professor! I thought the point was that she wasn't supposed to 'believe' anyone.
And yes, of course the student was hoping that Feynman might have turned out to be right after all so that she could get some extra points on the test. So what?
> The student wasn't sure if the paragraph was wrong or if she'd misunderstood what Feynman was saying in it. So she asked him.
And he answered that question.
> I thought the point was that she wasn't supposed to 'believe' anyone.
I am presuming that her professor actually showed her why she was wrong, instead of asking her to take his word for it.
> And yes, of course the student was hoping that Feynman might have turned out to be right after all so that she could get some extra points on the test. So what?
You don't get it. Gauss's law is a very very elementary law (usually taught in high school). Her professor would have most definitely explained why he took off points from her exam. There are two explanations now as to why she included that PS: i) she didn't understand her professor's explanation, and hence also did not understand Gauss's law properly, ii) she was grade grubbing. I cannot sympathize with her for (i) since she most definitely did not make an effort to understand her professor's simple argument, yet she found the time to write a letter to Feynman. Also, W&M is a large research university with multiple physics professors and graduate students and it's unlikely that no one would have been able to help her with this. So she clearly didn't try hard enough to understand Gauss's law and that's not Feynman's fault. And I really cannot sympathize with her if it's case (ii).
a) Makes no difference to the argument. I'm not criticizing the student for writing to Feynman. I'm just saying that Feynman was absolutely right in his criticism, considering how elementary the subject matter is.
b) Also doesn't make any difference. I'm familiar enough with W&M to know that (i) it has a full-fledged graduate program and (ii) professors there collaborate extensively with JLab in both theory and experiment, though I admit that I don't know if that was the case when this student wrote the letter.
A student would be optimally correct in trusting authority as far as believing the axioms of electromagnetic theory goes. However, as one builds more and more theorems and results on those axioms, the student should stop believing in authority and start verifying that all claimed results are consistent with the axioms.
> But all this begs the question. Why should we believe Gauss's law?
This particular incident had nothing to do with the validity of Gauss's law, which is nothing but a restatement of Coulomb's law in electrostatics, and something that has been extensively verified in experiments. Feynman's presentation of Gauss's law is crystal clear. The issue in question was an application of Gauss's law. Feynman likely goofed up (like most of us do) because he didn't think twice and went by his intuition. And he was absolutely right to criticize the student -- saying that something is written in famous book X written by famous author Y is completely irrelevant in science if you cannot make a good argument for why it is right. This is different from many social "sciences" where people often make such claims.
This isn't really a fair reading of the student's letter. It's clear that she was puzzled by the apparent (and in fact actual) contradiction between the erroneous paragraph and the rest of the book. She just asked Feynman for an explanation of what the paragraph meant. It does seem reasonable to give Feynman the benefit of the doubt and assume that there might possibly be a non-erroneous interpretation of the paragraph in question.
I think Feynman was getting on a favorite hobby horse about not trusting authority and reading the letter a little uncharitably.
That is the point of the scientific method. You don't believe it because you read it once, you believe it when multiple people construct repeatable experiments that yield similar results. It is the essence of science.
The article quotes Feynman’s letter as saying “I am not sure how I did it, but I goofed.” What he meant is that he doesn’t know how he made such a simple error, not that he doesn’t know what the error was.
Like if I wrote in an article or book chapter that 2 + 3 = 7. I’d know what the error was when it was pointed out to me. But I wouldn’t be able to imagine how in the world I made such an elementary goof.
> Like if I wrote in an article or book chapter that 2 + 3 = 7. I’d know what the error was when it was pointed out to me. But I wouldn’t be able to imagine how in the world I made such an elementary goof.
And why would that make you less of an authority? I was replying to a comment that suggested that Feynman ought to be not trusted as an authority because he goofed. Please try to follow the discussion in the thread.
Sort of surprised to read this. I thought the verdict was pretty clear. Feynman was canceled. W&M putting out a press release like this is inviting Feynman to give a talk or a commencement address.
The commenter is probably referring to the various attempts [1] (some successful) to tarnish Feynman's image since his death. I agree that the comment is completely irrelevant to this particular discussion though.