Hacker News
Why is Maxwell's theory so hard to understand? (2007) [pdf] (cam.ac.uk)
292 points by sytelus on April 8, 2020 | 144 comments



I have actually delved quite a bit into "really" understanding Maxwell's equations. I've bought the original treatise, books of commentary on it, and plain old "for idiots" sorts of books. His original treatise is super dense and unapproachable. Today we can wear Maxwell's equations on a t-shirt, but their original form was forbidding. Even with the modern form you "really" need to grasp concepts from differential geometry if you want to go beyond the abstract. There are tons of hand-wavy explanations of div and curl out there, but almost all can be broken by crafting a clever question like "ok, so what do you think the curl of that would look like?". I don't think even today I can claim I really get these concepts.
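For what it's worth, intuition about curl can at least be sanity-checked numerically. A minimal sketch (assuming NumPy; the field and grid are my own toy choices): the field F = (-y, x, 0) is a rigid rotation about the z-axis, so its curl should be (0, 0, 2) everywhere.

```python
import numpy as np

# Toy field F = (-y, x, 0): a rigid rotation about z. Its curl is (0, 0, 2) everywhere.
n = 41
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
Fx, Fy = -Y, X
dx = x[1] - x[0]

# z-component of the curl: dFy/dx - dFx/dy, via central finite differences
dFy_dx = np.gradient(Fy, dx, axis=0)
dFx_dy = np.gradient(Fx, dx, axis=1)
curl_z = dFy_dx - dFx_dy

print(curl_z[n // 2, n // 2])  # ≈ 2.0
```

A good "clever question" to ask yourself is exactly the one above: before running it, would you have predicted a constant curl for a field whose arrows grow with radius?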

In any case, some of my biggest takeaways were these:

1. There is no such thing as a "proof" of Maxwell's equations. Just like Einstein's field equations, Newton's laws and many other things in physics, Maxwell's equations are simply laid out as "let's assume these." A vast majority of the "greatness" in physics is simply assuming something without needing to fully understand it, and then crossing your fingers to see if some good predictions come out of it.

2. The major achievement of Maxwell's equations is that you can predict the velocity of light using other physical constants that seemingly have nothing to do with light. A consequence we only realized later was that this velocity is literally a constant, not relative to whoever is measuring it! This is easily one of the most non-obvious achievements in physics.
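The calculation behind point 2 fits in a few lines. A sketch (the constants are the standard CODATA/SI values, typed in by hand):

```python
import math

# Two constants from tabletop electric and magnetic measurements...
mu0 = 4 * math.pi * 1e-7       # vacuum permeability, H/m (classical defined value)
eps0 = 8.8541878128e-12        # vacuum permittivity, F/m

# ...combine into the speed of electromagnetic waves predicted by Maxwell:
c = 1.0 / math.sqrt(mu0 * eps0)
print(c)  # ≈ 2.998e8 m/s, the measured speed of light
```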


Regarding "fully" understanding physics, I like to refer people to Feynman's answer to a question about why magnets repel each other[0].

"Assuming something" is one integral part of the scientific method. You formulate a question, you build a hypothesis from prior knowledge, you make predictions, you test them, you analyze your results which might lead you to change your hypothesis or not. "Greatness" is finding simple hypotheses (like "the laws of physics are the same as viewed from any inertial frame and the speed of light is the same for every observer") that predict/"explain" effects that were inconsistent with prior hypotheses. Of course you can never really prove a model of the world is correct - another model that predicts differences that are too small for us to measure might be the "truth", but, until we can measure it, there isn't much use in pondering ...

[0] https://www.youtube.com/watch?v=36GT2zI8lVA


I've never really been that impressed with Feynman's responses in that interview. He seems to be uncomfortable with just saying that physicists don't really know on a fundamental level why magnets repel each other. Obviously, explanations can be more or less fundamental.


He's trying to get the point across that "X don't really know on a fundamental level why Y" for any group of people X and observation Y, if by "on a fundamental level" you mean "is not based on something assumed to be true". He's not uncomfortable with it; anyone with a scientific or engineering background knows that (or at least should). That doesn't stop predictions obtained from well-tested models from being useful.


Well, maybe that's what he's trying to get across, but it seems largely irrelevant to the interviewer's question. He could just have given a physics textbook level explanation of how magnets work. I'm not convinced that there's some kind of profound misunderstanding evident in the interviewer's question that needs addressing in long and rambling terms.


There is actually a subtle point here: the claim "physicists don't really know on a fundamental level why magnets repel each other" is ultimately nonsense, which doesn't stop physics from producing models that are useful or interesting in other ways.

Those who think that a textbook explanation could have been the answer are missing the point: it is not about magnets—it is about what it means to know anything in terms of anything else.

There is no need for useful models to form a nested hierarchy converging to a single "reality" i.e., it is not necessary for a model to be more fundamental than another model even if they relate to what we observe as the same phenomenon.


>it is not about magnets

But he was asked a question about magnets...


He might have been asked "why is a kilogram green?"

The right answer is that the question is nonsense, not that we can get a kilogram of cucumbers (we can, but that is not the point).


I don't think you can seriously be comparing "Why do magnets repel each other?" to "Why is a kilogram green?".


Both questions can have trivially true answers (a textbook one for magnets, cucumbers for the green kilogram), and both are nonsense if you dig deeper.


What physics textbooks say about magnetic attraction is very far from being "trivially" true. It's a wealth of utterly non-obvious information resulting from centuries of scientific research.

Also, "cucumbers" is not an answer to the question you posed.


So you are telling me you know the answer to my own question better than its author :)

I'm curious: why do you think magnets repel? What terms would you use to describe it? How are those terms defined? What terms are used in those definitions, and how are they defined in turn? etc.


I love this interview; it's typical of Feynman. "What do you mean by how a magnet works?" is a fair question, but it's also a multiform one. We don't really know; we just know up to a certain level. He had a lecture about how the Mayans predicted astronomical events with beans: were the beans the answer to why? I've learned a lot from this interview about how to look at science. "Why" is indeed a profound concept.


> He could just have given a physics textbook level explanation of how magnets work.

But "how" was not a starting question of the interviewer. The questions were different, and I've marked them with numbers here:

https://www.lesswrong.com/posts/W9rJv26sxs4g2B9bL/transcript...

"Interviewer: If you get hold of two magnets, and you push them, you can feel this pushing between them. Turn them around the other way, and they slam together. <q1> Now, what is it, the feeling between those two magnets? </q1>

Feynman: What do you mean, "What's the feeling between the two magnets?"

Interviewer: <q2> There's something there, isn't there? </q2> The sensation is that there's something there when you push these two magnets together.

Feynman: Listen to my question. What is the meaning when you say that there's a feeling? Of course you feel it. Now what do you want to know?

Interviewer: What I want to know is <q3>what's going on between these two bits of metal </q3>?

Feynman: They repel each other.

Interviewer: <q4> What does that mean, or why are they doing that, or how are they doing that? </q4> I think that's a perfectly reasonable question.

Feynman: Of course, it's an excellent question. But the problem, you see, when you ask why something happens, how does a person answer why something happens? For example, Aunt Minnie is in the hospital. Why? Because she went out, slipped on the ice, and broke her hip. That satisfies people. It satisfies, but it wouldn't satisfy someone who came from another planet and who knew nothing about why when you break your hip do you go to the hospital. How do you get to the hospital when the hip is broken? Well, because her husband, seeing that her hip was broken, called the hospital up and sent somebody to get her. All that is understood by people. And when you explain a why, you have to be in some framework that you allow something to be true. Otherwise, you're perpetually asking why."

I think Feynman properly responded to the questions asked -- people do think that they have to "find a meaning" and "why" and talk about "the feeling."

Feynman properly answers there "of course you feel it!"

Follow very carefully his whole response (I link the transcript) -- it's deeply thought through and applicable to much more than just "feeling -- meaning -- why -- magnets." It's about the "why questions" and "meaning" questions in general, from the view of physics.


>But "how" was not a starting question of the interviewer

As your transcript shows, the interviewer asks "why are they doing that, or how are they doing that?".

That seems like it would have been a good point to respond with an explanation of why magnets repel each other.

I don't buy all this stuff about 'how' vs. 'why' questions anyway. Lots of 'why' questions are perfectly sensible scientific questions. E.g., 'Why don't magnets stick to aluminum?'


Exactly, and that "why" is answered thoroughly, see the transcript.

The part of "how" is also there:

"If you're somebody who doesn't know anything at all about it, all I can say is the magnetic force makes them repel, and that you're feeling that force.

You say, "That's very strange, because I don't feel kind of force like that in other circumstances." When you turn them the other way, they attract. There's a very analogous force, electrical force, which is the same kind of a question, that's also very weird. But you're not at all disturbed by the fact that when you put your hand on a chair, it pushes you back. But we found out by looking at it that that's the same force, as a matter of fact (an electrical force, not magnetic exactly, in that case). But it's the same electric repulsions that are involved in keeping your finger away from the chair because it's electrical forces in minor and microscopic details. There's other forces involved, connected to electrical forces. It turns out that the magnetic and electrical force with which I wish to explain this repulsion in the first place is what ultimately is the deeper thing that we have to start with to explain many other things that everybody would just accept. You know you can't put your hand through the chair; that's taken for granted. But that you can't put your hand through the chair, when looked at more closely, why, involves the same repulsive forces that appear in magnets. The situation you then have to explain is why, in magnets, it goes over a bigger distance than ordinarily. There it has to do with the fact that in iron all the electrons are spinning in the same direction, they all get lined up, and they magnify the effect of the force 'til it's large enough, at a distance, that you can feel it. But it's a force which is present all the time and very common and is a basic force of almost - I mean, I could go a little further back if I went more technical - but on an early level I've just got to tell you that's going to be one of the things you'll just have to take as an element of the world: the existence of magnetic repulsion, or electrical attraction, magnetic attraction.

I can't explain that attraction in terms of anything else that's familiar to you. For example, if we said the magnets attract like if rubber bands, I would be cheating you. Because they're not connected by rubber bands. I'd soon be in trouble. And secondly, if you were curious enough, you'd ask me why rubber bands tend to pull back together again, and I would end up explaining that in terms of electrical forces, which are the very things that I'm trying to use the rubber bands to explain. So I have cheated very badly, you see. So I am not going to be able to give you an answer to why magnets attract each other except to tell you that they do. And to tell you that that's one of the elements in the world - there are electrical forces, magnetic forces, gravitational forces, and others, and those are some of the parts. If you were a student, I could go further. I could tell you that the magnetic forces are related to the electrical forces very intimately, that the relationship between the gravity forces and electrical forces remains unknown, and so on. But I really can't do a good job, any job, of explaining magnetic force in terms of something else you're more familiar with, because I don't understand it in terms of anything else that you're more familiar with."


Yeah, he answers the question. I just think people are giving Feynman too much credit here. He seems to have been in a grumpy gramps mood that day and taken some time to actually get round to answering the question.


> He seems to have been in a grumpy gramps

No -- he used the question to demonstrate the basic premises of physics: that the "whys" can never end as long as somebody is not "satisfied" with the answer, and that even understanding "how" needs some preconditions to be useful in any way to the one who asked; otherwise it's just "cheating", practically giving somebody a false sense that he'll know something, because the analogies popularly used are just wrong.

Like he said, the bigger marvel is that the same electromagnetic forces are what keep us from falling through the floor. Or what keeps the apple hanging on the tree.

Or, specific to human uses, how the movement of water is transformed into power that is delivered to distant electrical machines.

But nobody asks about that, because they don't "feel" it to be unusual. The magnets are just a small manifestation of the same forces, one that does "feel" unusual to people.

His answers are to "philosophers" who score points by asking "whys" which, from his point of view, are too ill-posed to ask, carrying a false context.


Everyone understands that you can keep asking "why?". Four year old kids understand this.

What the interviewer obviously wanted was an explanation of a particular physical phenomenon targeted at the level of someone without any background in physics. Everyone, Feynman included, has been in that position.

It's quite wrong to suggest that physicists don't, or shouldn't, ask "why" questions. They do it all the time: https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=%22w...


> What the interviewer obviously wanted was an explanation of a particular physical phenomenon targeted at the level of someone without any background in physics.

How do you know that? I would claim that it's what you expected, and even though you received it (as quoted above!), you double down on expressing dissatisfaction with what preceded that explanation, namely Feynman explaining that whether an answer "satisfies" depends on the existing knowledge of the person who asks.

But the answer was completely honest: someone "without any background in physics" has no intuitions about electromagnetic fields that would allow a decent (non-cheating) answer.


I actually think his explanation is right on. Physicists are practical. They make models to quantitatively describe their experiments. Understanding fundamental truth is kind of a by-product.

They generally follow a reductionist approach, so a simpler model that can explain a wider set of experiments is a better model. But there's no guarantee that the simpler explanation is the "truth". All we can say is that this approach "makes sense".


> Obviously, explanations can be more or less fundamental.

That's not obvious, in fact, it's not even true. Feynman's point is that people bring context to a question. People will accept different things as 'fundamental'; different assumptions and axioms. Can you sometimes explain things using fewer axioms? Sometimes; but is it more fundamental? Is it better to build a theory on fewer axioms, even if they can't be directly validated? Would you prefer a theory based on 4 axioms, none of which can be directly validated, over one with 6, but you can measure all six directly? What if the four can't be re-derived from the 6, but all observable statements can? What if either set can be re-derived from the other?

Feynman isn't saying that "physicists don't really know on a fundamental level why magnets repel each other". He's alluding to the idea that that question isn't well defined. He's encouraging the questioner to figure out what sort of answer would satisfy them, and why.


I think it's obviously true that some explanations are more fundamental than others. We have, e.g. a more fundamental explanation of tides than anyone did in 500AD. That is not to deny that it is difficult to make precise what exactly it is that makes one explanation more 'fundamental' than another. As always, individual cases are clear; the general principle is elusive.

>He's alluding to the idea that that question isn't well defined.

It's well-defined enough to answer. Hundreds of millions of school children learn a perfectly sensible answer to the question every year. (Would this answer satisfy someone with a PhD in physics? Obviously not. But that's not the point.)

It's in any case bizarre to insist that a layperson ask a question that's well-defined according to the standards of a particular field.


The odd thing is, we physicists actually have a pretty solid understanding of why magnets repel each other. It's a fun story, one that I try to include every time I teach the subject, and I'm pretty sure that Feynman knew it! So it's curious to me to see Feynman basically dodging this question. (His point that "I can't explain it in terms of anything else you're already familiar with" is entirely valid! But I kinda feel like that's what questions are for: learning new things, that we may not already be familiar with.)

With that in mind, I probably would have answered in a different way, though the questioner might get bored and regret asking if I didn't find a way to be unusually efficient about it: this answer relies on a lot of knowledge that seems at best tangentially related. :) I'd try to give some rough description of the way that moving currents generate magnetic fields (especially the field generated by a current loop), and the way that moving currents feel a force due to magnetic fields (especially the force on a current loop). That's enough to argue that current loops will interact with each other in just the same ways that magnets do. And then I could tell at least a sketch of the story of how regions of aligned spins in magnetic materials act on average like current loops. (But that's quite a long story to answer a simple question, so again, I can sympathize with Feynman for saying, "There isn't a straightforward answer that a non-expert would understand.")
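To make the first step of that story concrete, here's a rough sketch of the textbook on-axis field of a circular current loop (the standard Biot-Savart result; the function name and the numbers are my own):

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def loop_field_on_axis(current, radius, z):
    """On-axis magnetic field of a circular current loop (Biot-Savart):
    B(z) = mu0 * I * R^2 / (2 * (R^2 + z^2)^(3/2))"""
    return MU0 * current * radius**2 / (2.0 * (radius**2 + z**2) ** 1.5)

# At the loop's center (z = 0) this reduces to the familiar B = mu0*I/(2R):
B_center = loop_field_on_axis(current=1.0, radius=0.05, z=0.0)
print(B_center)  # ≈ 1.26e-5 T for 1 A through a 5 cm loop

# Far away (z >> R) the field falls off like 1/z^3, i.e. like a magnetic dipole,
# which is why aligned atomic current loops in iron add up to act like a magnet.
```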


I don't think you understand the issue here. He's not dodging the question. On the contrary, he's answering on the most fundamental level.


Didn’t his answer boil down to “The math is the physics. That’s the explanation. That’s how it works. You can’t just make up an analogy because the math is the analogy”


That seems a bit extreme and not really consistent with the attitude he took in his public lectures and in Surely You're Joking. It amounts to giving up and saying that you can't explain basic physical phenomena to laypeople on any level.


I'd love to have other high-grade researchers pitch in about the art of good theory. Often it's more a matter of disregarding your own perceptions and aiming for the out-of-the-box stupid.


You should try to understand differential forms; once you have those, Maxwell's equations in vacuum really aren't strange any more.

But differential forms aren't that strange really. They are the mathematical objects that allow you to integrate along surfaces and curves. Of course, their theory hadn't been developed when Maxwell wrote his equations. And Maxwell was very much concerned with EM in matter, which mixes the properties of the EM fields and of materials, and which can thus be expected to get a bit messy.

But I don't see what you are striving for when you say "really" understand them. I think this is a psychological category, rather than a hard criterion. Can you apply the formalism to calculate consequences? That's the main issue. Maybe you can have a better or worse intuition about the consequences, but that is often mainly due to practice. You can't expect to correctly intuit all possible consequences of a system as rich as EM.

> Young man, in mathematics you don't understand things. You just get used to them. -- John von Neumann
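For the curious, the payoff of that machinery, stated without derivation (standard conventions; details vary with sign conventions): with a potential 1-form A, field-strength 2-form F, Hodge star ⋆, and current 3-form J, the four vector-calculus equations collapse to two:

```latex
F = dA, \qquad
\underbrace{dF = 0}_{\text{Gauss (B) \,+\, Faraday}}, \qquad
\underbrace{d{\star}F = \mu_0 J}_{\text{Gauss (E) \,+\, Amp\`ere--Maxwell}}
```

and dF = 0 is automatic once F = dA, since d² = 0.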


I don't think the equations of electrodynamics themselves are difficult to understand and accept at an intuitive level, as one can easily describe them in plain words.

https://en.wikipedia.org/wiki/Maxwell's_equations#Conceptual...


And a bit more deeply than just differential forms and the exterior derivative: connections on vector bundles and their curvature. Deeper still: connections on a principal bundle and the induced connections on associated bundles.


Could you recommend a good (accessible) reference on differential forms?


I'm late to the party and not the person you asked, but I've been trying to work through https://smile.amazon.com/gp/product/0817683038 It's good so far.


Differential forms aren't necessary for this though.


Similarly, if you look at Newton's first papers on differential calculus, they are very hard to understand, and he explains them in a very confusing way.

Whoever discovers these things must do so without the concepts and formulations that people who came later developed to simplify and understand them. The amount of refinement that happens between inventing a concept and teaching it to undergraduates is huge.


In the case of differential calculus, Leibniz (a German) came up with it at roughly the same time, and his notation and approach are what actually stuck. In the case of Maxwell's equations, it was Heaviside who came up with the vector formulation as four equations, and it took Grassmann, Cartan and Hodge to arrive at the modern two-equation formulation in terms of differential forms.


Also, calculus was much messier on the theoretical side until Weierstrass, Riemann etc. "fixed" it.


Speaking of Leibniz, this paper on the relation between the logarithm and the chainette is interesting: https://www.maa.org/sites/default/files/pdf/awards/college.m...

(via https://fermatslibrary.com/s/how-to-find-the-logarithm-of-an... )



Thank you


One of the things that really struck me reading The Structure of Scientific Revolutions was its observation that you could almost always distinguish scientific fields from others by whether they taught students from the original works of those who made important discoveries, or re-wrote them into easier-to-understand textbooks. If you can't separate the truth of what someone said from the way they said it, you might be doing something useful, but you don't have the tools to be making actual progress.


Try looking at his Chemistry notebooks though...


I would add that, unlike Newton:

3. Maxwell's equations are difficult to view as a whole, since they are 6 or 7 separate things that should each be understood individually. This is true of Newton's 3 laws too, but they are easy enough to bring together that you can teach them to high school students. While Newton's laws are about different things too, I personally find it more coherent to call them "Newton's Laws". Maxwell's equations are to me more like "Maxwell's List of Equations". Maxwell also served a role similar to Euclid's: he did a lot of curation of contemporary results.

Edit: Regarding point 2, I think, as you say, that Einstein was in fact not the first to make that assumption. Maxwell and his contemporaries already knew about it, but Einstein was successful in taking the assumption (that the speed of light is constant) further.


> Maxwell's equations are difficult to view as a whole since they are 6 or 7 separate things that should each be understood individually

How so? You can't understand electricity and magnetism separately because they both affect each other; the clearest triumph of Maxwell's equations is that they describe electromagnetic waves, but you need the complete system of equations to do that.
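Concretely (a standard derivation, sketched): take the curl of Faraday's law in vacuum and substitute the Ampère-Maxwell law,

```latex
\nabla \times (\nabla \times \mathbf{E})
  = -\,\frac{\partial}{\partial t}\,(\nabla \times \mathbf{B})
  = -\,\mu_0 \varepsilon_0\, \frac{\partial^2 \mathbf{E}}{\partial t^2},
```

and since ∇×(∇×E) = ∇(∇·E) − ∇²E = −∇²E when ∇·E = 0, this gives the wave equation ∇²E = μ₀ε₀ ∂²E/∂t², with wave speed 1/√(μ₀ε₀). Note that three of the four equations went into it.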


I guess this description is subjective in terms of wholes and parts, but to me each of the equations makes sense in isolation too. They each handle separate parts of electricity and of magnetism. Personally, I find it difficult to visualise them all at the same time. I have to look at each one carefully and then relate it to the whole, which would be what we now call electromagnetism.


If someone taught you that way, they failed as a teacher. I am sorry to say that, but they are a whole (with the exception of the displacement current, which needs the Faraday tensor (from which you can also get relativistic invariants) to understand).


Maybe one can approach this from the other side. Knowing electromagnetism today, what is the most complete and simple framework for modelling it?

The set of equations from Maxwell's book are to me a heterogeneous presentation. If there is a category of electromagnetic objects or perhaps some other pure mathematics framework that synthesises everything together, then I would call that thing the whole.


Sure. You can start from relativity and show that for example the antisymmetric Faraday tensor (P in the link [1] below) can get you a lot of phenomena (not all the way, accelerating charges are hairy, even for Feynman, see the link [2])

To note, the reference to finding the relativistic invariant from the tensor in the first link goes back to the first edition of Landau and Lifshitz. The problem was removed from later editions because only a masochist wants to find invariants of 4x4 matrices by hand.

[1] https://www.mathpages.com/home/kmath647/kmath647.htm [2] https://www.mathpages.com/home/kmath528/kmath528.htm


IIRC, the modern formulation of Maxwell's equations is due to Oliver Heaviside. He formulated the laws concisely using vector calculus. We probably need to look at Heaviside as well to get the full historical picture.


Heaviside is well worth looking at anyway. I don't know much about him personally, aside from the fact that the unit step function is named after him (in a classical case of "let's reduce a profound body of work to one incidental concept to which we attach the name"; see also the Kronecker delta), but I do know that his work was deep and, like much work destined to be of importance in applied mathematics, initially offended mathematicians by its emphasis on practicalities rather than mathematical purity. https://en.wikipedia.org/wiki/Oliver_Heaviside


I've been reading a book about Heaviside (Forgotten Genius). Something I had not appreciated was that he was an engineer. I had imagined a nerdy dude in his mother's basement trying to reformulate Maxwell's work. But in reality he was a guy who spent his working life trying to do things like finding the location of breaks in subsea cables, figuring out how to transmit higher data rates through long cables, and so on. It was in trying to solve these engineering problems that he became interested in the mathematics of transmission-line propagation and hence Maxwell's work.


I agree - there is a great book by Paul Nahin. I bought it since I wanted to understand how someone got the idea of using complex numbers in electrical engineering. But I have found the book to be very hard to skim. I really have to work along with it to understand the technicalities.


Agreed as well. Nahin is comprehensive and authoritative, but a good biography of Heaviside for people who aren't already physicists or engineers has yet to be written (or at least yet to be discovered by me.)


You may be interested to know that both Maxwell's and the Einstein Field Equations can be derived by minimizing an action, cf.

https://en.wikipedia.org/wiki/Einstein%E2%80%93Hilbert_actio...

https://en.wikipedia.org/wiki/Electromagnetic_tensor#Lagrang...

Enjoy!
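For reference, the actions in question (schematically, in standard notation):

```latex
S_{\text{EH}} = \frac{1}{2\kappa} \int R \,\sqrt{-g}\; d^4x,
\qquad
S_{\text{EM}} = -\frac{1}{4\mu_0} \int F_{\mu\nu} F^{\mu\nu} \,\sqrt{-g}\; d^4x .
```

Varying the first with respect to the metric yields the vacuum Einstein field equations; varying the second with respect to the potential A yields the inhomogeneous Maxwell equations (the homogeneous pair is automatic from F = dA).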


Well, sure, but that only pushes the question back one step further: who told you which action to pick? Why pick R and not R^2, for GR, for example?

[ Just to avoid someone spending too much time on an explanation IAAPhysicist; yes I understand effective field theory and the preference for Lagrangians built from relevant/marginal operators. ]


Can't you get there by assuming electrical effects propagate at the speed of light as well? And magnetism falls out as a result?


Most things in physics are not really provable at a mathematical level, and mathematical certainty is fundamentally not the goal whatsoever. The idea behind physics is to work as a detective, and use experimentation to establish if some mathematical model holds true or not. It is the experimentation that shows "proof" of a physical statement. Even if the mathematics behind some model shows a supposed proof of something, it has no meaning unless that "proved math" is tested against experiments.



>The major achievement of Maxwell's equations is that you can predict velocity of light

This is true, but the other major achievement of Maxwell's equations was his formulation of the displacement current:

https://en.wikipedia.org/wiki/Displacement_current

which underlies the wave equation and which "symmetrizes" the loop laws for electric and magnetic fields. This was the "missing link" between the previously known laws of electromagnetism (Ampere's law, Gauss's law, and Faraday's law) and the full theory given in Maxwell's equations.
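For reference, Maxwell's correction is the second term on the right of the Ampère-Maxwell law:

```latex
\nabla \times \mathbf{B} \;=\; \mu_0 \mathbf{J} \;+\; \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t} .
```

Without it, taking the divergence of both sides would force ∇·J = 0 everywhere, contradicting charge conservation whenever the charge density changes (e.g. a charging capacitor).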


One more point you may want to internalize:

If you integrate the Faraday / Ampere equations, the divergence of the magnetic field will never change, and the divergence of the electric field ("charge") will obey a continuity equation with the current.

So the two Gauss laws don't really have any dynamic content; they're just requirements on the initial conditions of the fields.
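Spelled out (a standard two-line check): taking the divergence of Faraday's law and of the Ampère-Maxwell law gives

```latex
\frac{\partial}{\partial t}(\nabla \cdot \mathbf{B})
  = -\,\nabla \cdot (\nabla \times \mathbf{E}) = 0,
\qquad
\frac{\partial}{\partial t}(\nabla \cdot \mathbf{E})
  = \frac{1}{\mu_0 \varepsilon_0}\,\nabla \cdot (\nabla \times \mathbf{B})
    - \frac{1}{\varepsilon_0}\,\nabla \cdot \mathbf{J}
  = -\frac{1}{\varepsilon_0}\,\nabla \cdot \mathbf{J},
```

so if the Gauss laws hold at t = 0 (with charge conserved), the evolution equations keep them holding forever.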


> and many other things in Physics, Maxwell equations are also simply laid out as lets assume these.

I'm not sure I understand what you mean by this. These are all experimentally verifiable.


That's the point - they're concise descriptions of observable behaviour in terms of a specific set of mathematical metaphors.

They have no independent mathematical derivation from some underlying set of first principles.


That sounds like a definition of 'just assumed' by which most things are just assumed. That would be broad to the point of uninterestingness and I wonder if that's what the poster really meant.


>most things are just assumed.

Ah! But that's the problem: most assumed things are wrong. It is non-trivial to generate a set of assumptions that matches observations with current technology and with all future technology, with no base from which to derive your new laws.


> most assumed things are wrong.

That sounds suspiciously like an assumed thing. Compared to Maxwell's equations, at a minimum.


One of my best memories of college was going through wedge products, differential forms and connections, and their application in the derivation of Maxwell's equations. When going through it, I always assumed someone came along and wrote those generalized forms (for students of differential forms: no pun intended ;) of his equations 50-100 years later. It's crazy to think that's how he approached E&M from a preliminary perspective.


> When going through it, I always assumed someone came in and wrote those generalized forms (for students of differential forms: no pun intended ;) of his equations 50-100 years later.

That's true though? Someone else here said the modern formulation is due to Heaviside; Maxwell's original version had dozens of equations because we didn't have all that vector calculus notation at the time.


Deriving the Maxwell equations from the corresponding Lagrangian does not require knowing differential geometry. But some understanding of it is useful to realize that the Maxwell equations sort of follow from the constancy of the speed of light, which in turn reflects the underlying geometry of the model of space-time.

And the speed of light is constant only insofar as that model matches reality.


> There is no such thing as "proof" of Maxwell's equations.

Is there proof of any "physics"?

Physics is the science that studies "physical models", which are just that, "models". Whether a model is useful (or not), or how accurate a model is, is determined by using the model to make a prediction, and then doing an experiment to check how bad the prediction is. Measuring a prediction is hard, and typically requires repeating the same experiment many many times, which at best produces a probability density distribution of the prediction the model should produce.

This process is called "model validation", but it does not prove the model correct. For example, we can validate Newton's laws to predict the weight of many objects on earth relatively accurately. But this does not mean that Newton's laws are correct, that they would produce correct outcomes if you were moving at the speed of light, etc.

When people talk about "proving physical models correct", I honestly have no idea what it is that they want, or how they expect this to be done. If God were real and would answer us, I guess we could ask God whether a particular model is correct. But that's the only way I can think of to deliver one of these "proofs".


> Physics is the science that studies "physical models"

This is a completely wrong thing to say. Physics studies (some aspects of) Nature by creating models and testing their predictions.

(You as a student might well be studying "physical models", but it would be a funny thing to say that for example "biology is a science where the teacher is yelling at students while trying to attract their attention to some nasty-looking posters.")


I'm not sure that God understands and can explain physics.


> Is there proof of any "physics"?

Arguably there is `disproof`.


> There is no such thing as "proof" of Maxwell's equations.

Yes. But still there is the problem of self-consistency which can be difficult especially if boundary conditions come into play. This is of course a purely mathematical affair.


No such proof... except for the T-shirt, right?


A lot of the early failure of Maxwell's theory to catch on was that his notation was insane. As others note below, Heaviside's notation[1] made it a lot easier to follow and calculate things. Notation is underrated as a way of increasing human power: I'm not sure General Relativity would have been possible without Christoffel notation. FWIW, fun stuff to read: Whittaker's A History of the Theories of Aether and Electricity gives the detailed history of all this. Cheap from Dover books. Mind-blowing in how theoretical physics used to work.

[1] https://books.google.com/books?id=nRJbAAAAYAAJ&pg=PA109#v=on...


Einstein notation is similar in this regard. It makes the 'mechanics' of tensor mathematics much easier to deal with. Kinda like how Arabic numerals are a lot easier to manipulate than Roman numerals.
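To make the compression concrete: a contracted expression like g_{mu nu} x^mu x^nu hides an explicit double sum over the repeated indices. A minimal sketch in plain Python (the metric and the 4-vector here are illustrative values I've made up, not from the thread):

```python
# Minkowski metric g_{mu nu}, signature (+,-,-,-)
g = [[1, 0, 0, 0],
     [0, -1, 0, 0],
     [0, 0, -1, 0],
     [0, 0, 0, -1]]

x = [2.0, 1.0, 0.0, 0.0]  # a sample 4-vector x^mu

# What the summation convention writes as g_{mu nu} x^mu x^nu is,
# spelled out, a double sum over the repeated indices mu and nu:
s = sum(g[mu][nu] * x[mu] * x[nu] for mu in range(4) for nu in range(4))
```

The notation's whole trick is that the two `for` loops disappear from the page.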

I'm not sure, but I believe that Helen Dukas, Einstein's secretary, is the one who came up with Einstein notation originally. She was just trying to get through his notes faster when writing things up, and that method of notation later became accepted. I can't find that citation though, so calling it Dukas notation is for the moment erroneous.


To hell with it. So much great work from lowly support staff, frequently women, has been robbed of its rightful place. Call it Dukas notation until proved otherwise. Crick and Watson still have their ill-gotten Nobel prizes and most are unaware they are little more than thieves. (Wildly racist and sexist, sure, hard to deny that nowadays, so we don't talk about it as loudly as we should.)

tl;dr

"Dukas notation" just sounds good.


I believe that you have misunderstood. I do not know if Helen Dukas in fact is responsible for this.


I don't think so, fwiw.

"Call it Dukas notation until proved otherwise."


The 1st edition of his "History of the Theories of Aether and Electricity" is available for free here: https://archive.org/details/historyoftheorie00whitrich/page/...


Yes! Even the Pythagorean Theorem is a mouthful if you express it in Latin prose rather than algebra. And I assume universal gravitation is the same way.


Maxwell's theory is hard to understand because it's based on an almost-but-not-quite-appropriate algebra. It's like stringly-typed programming. It works, but it's messy and hard to understand.

Maxwell's set of coupled differential vector electromagnetic equations simplify to a hilariously short single equation in Geometric Algebra.

A random Google turned up this paper comparing classic EM and the GA formulation. It's not even the simplest possible representation, because that uses natural units and a 4D GA to basically condense the entire set of EM theory into about 4 characters worth of equation that is fully relativistic for free: https://www.researchgate.net/publication/47524066_A_simplifi...
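For concreteness, the one-line form being alluded to is (in Hestenes' spacetime-algebra formulation with natural units; my rendering, not taken from the linked paper):

```latex
\nabla F = J
```

where $F = \mathbf{E} + I\mathbf{B}$ is the electromagnetic field bivector, $J$ is the spacetime current, and $\nabla$ is the spacetime vector derivative; the four Heaviside equations fall out as the separate grade components of this single equation.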

It's almost a joke. To me it's reminiscent of looking at beginner programmers. You see them do crazy things like calculate a date "next month" by taking apart the pretty-printed date string, parsing to find the month, realising that sometimes the day part is one digit and sometimes it's two, then having to worry about m-d-y or d-m-y formats, building little tables of "days per month" and leap years... and so on.

They can write pages and pages of error free code and it's still Wrong because the correct way is to just call "thedatevar.AddMonth(1);" and be done with it.
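A hedged sketch of that point, in Python: `thedatevar.AddMonth(1)` above is the comment's hypothetical API, and Python's stdlib doesn't ship an exact equivalent, but even a *correct* hand-rolled version shows how much bookkeeping (month lengths, leap years, day clamping) the one library call would hide:

```python
import calendar
from datetime import date

def add_month(d: date) -> date:
    """Advance a date by one month, clamping the day (e.g. Jan 31 -> Feb 28/29)."""
    year = d.year + (d.month == 12)      # roll the year over after December
    month = d.month % 12 + 1             # next month, wrapping 12 -> 1
    # clamp the day to the last valid day of the target month
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)
```

Note this operates on real `date` objects throughout; none of the string-parsing contortions from the anecdote are needed.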

PS: 3D game engine vector algebra libraries have all of this in common with the physics maths. Things like the cross product being bizarre, having to pick a basis, not being able to interpolate rotations, gymbal lock, rounding error, having different maths for 2D and 3D, a bunch of special cases to worry about, and so forth...

Watch enkimute's Siggraph 2019 presentation on Geometric Algebra. It's mind-blowing how many stupid little quirks of vector algebra just evaporate if you're prepared to step outside of your comfort zone: https://youtu.be/tX4H_ctggYo


The speaker in the video runs a community for people interested in geometric algebra.

https://bivector.net/

Check out the demo https://observablehq.com/@enkimute/animated-orbits

Join the discord https://discord.gg/vGY6pPk


GA seems to be getting more popular, but I still see vector algebra used in areas such as robotics where GA is clearly superior. In that application the small performance difference between GA and vector algebra is irrelevant, but the advantages of GA are huge.

In game engines the vector algebra method is slightly faster, so the elegant programming model is often sacrificed in the name of performance.

That, I can understand.

Why physicists use at least 4 separate formulations of the EM equations I can't understand, especially considering the vector version is the worst yet the most popular.


Reducing Maxwell's equations from the 4-equation Heaviside form to a single equation doesn't seem like necessarily a good thing. Each of the four has a clear physical meaning in the vector calculus formulation, which may be lost when moving to a single equation.



It's all about inertia. Everybody either already knows vector algebra or will have to learn it anyway, to understand the vast majority of existing literature.

Geometric Algebra may be superior, but it's "one more thing" to learn/teach. It might catch on, but that might take another century of preaching.


How can it be faster considering that they are equivalent?


Computers don't care about mathematical equivalence. For instance, dividing by a number is generally slower than multiplying with the inverse of that number. Also, memory access patterns can make a difference of night and day.

I'm not sure though where exactly the idea that GA is slower comes from. It's all down to the application and implementation. Perhaps it's because GA generalizes so well to higher dimensions that many libraries are overly generalized and thus slower than vector algebra libraries that need special cases anyway.


The 3D case is often handled using a conformal projective geometric algebra, which is 5D and requires 2^5 = 32 elements for a general multivector. This is twice the size of a 4x4 matrix, and eight times the size of a 4-element homogeneous vector as typically used in most 3D engines.

Of course, there are all sorts of optimisations. Most GA libraries are actually based on code-generation and support various "subsets" of the full GA to efficiently represent things like vectors only. From what I've seen, even a well-tuned library has approximately a 25% overhead for hot paths and 50-100% is typical.

This is a bit like representing a simple rotation, like the hand of a clock, with either a single angle or a two-element unit vector pointing in the direction of the rotation. The one-element angle has a bunch of special cases, like having to check whether it has gone past 360 degrees and reset it back into the 0..360 range. Representing it with two numbers just requires multiplication by a matrix, which involves no conditionals or modulo arithmetic.

This is analogous to vector vs geometric algebra. Typically, GA has no special cases and "just works", but it does so by "uncompressing" the compact vector representation. It's going to be slower.
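A minimal sketch of the clock-hand point above (my own toy example, not from the thread): composing rotations as 2x2 matrices has no wrap-past-360 special case, whereas raw angles would need a mod-360 reduction.

```python
import math

def rot(theta):
    """2x2 rotation matrix for angle theta (radians)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matmul(a, b):
    """Multiply two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# 350 degrees followed by 20 degrees: with raw angles you'd have to
# reduce 370 mod 360; the matrix product lands at 10 degrees with no
# conditionals or modulo arithmetic.
m = matmul(rot(math.radians(20)), rot(math.radians(350)))
```

The price, as the comment says, is storage: four numbers instead of one angle.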


> How can it be faster considering that they are equivalent?

Computing `ln(e^x)` and `x` are equivalent, but the latter is faster.


Well, to be fair, these are equal, not just equivalent.


There's a good laugh about 40 minutes into the YouTube video, about how people in the old days did not have negative numbers or zero. Also, the surprising way of doing algebra by geometry is fun. I'm studying the pre-Socratics at the moment and there is speculation about whether the "unbounded" guy was also a geometry guy.


Do you know if they've done work on formulating the Vlasov equation (coupled evolution of EM fields and phase-space density of populations of charged particles) in this way?


I know that Glasser and Qin at Princeton are working on a geometrical view of plasma physics. They aim to get a view of particle-in-cell methods (basically a way to discretize the Vlasov equation using sparse sampling) from a geometric viewpoint expressed using discrete exterior calculus. But I am not sure how much of that is ready or published.


I did research on PIC in grad school.

In the process I gained ... a healthy skepticism of the Qin group's papers. The man himself is sharp, though. Interested to see what they come up with.


Just for reference re that "Siggraph 2019 presentation":

https://en.wikipedia.org/wiki/Geometric_algebra

"The geometric product was first briefly mentioned by Hermann Grassmann" (1809 – 1877) "In 1878, William Kingdon Clifford greatly expanded on Grassmann's work to form what are now usually called Clifford algebras in his honor (although Clifford himself chose to call them "geometric algebras"). For several decades, geometric algebras went somewhat ignored, greatly eclipsed by the vector calculus." "The term "geometric algebra" was repopularized in the 1960s by Hestenes."

The background image on:

https://bivector.net/

is William Kingdon Clifford.


>It's almost a joke. To me it's reminiscent of looking at beginner programmers. You see them do crazy things like calculate a date "next month" by taking apart the pretty-printed date string, parsing to find the month, realising that sometimes the day part is one digit and sometimes it's two, then having to worry about m-d-y or d-m-y formats, building little tables of "days per month" and leap years... and so on.

This illustrates the entire field of programming as we know it. It actually happens among all programmers but for less obvious things. Especially with design patterns. I've seen programmers use so much structure for the sake of a design pattern on something that could otherwise be 10x simpler.


That's awesome


It appears no one here has actually read the essay. He's not talking about mathematical complexity of Maxwell's equations (and no, you can use differential forms, Clifford algebras, quaternions, vector calculus, but no matter how you write it, it's still the same thing, and at the end of the day, you will end up solving exactly the same set of coupled partial differential equations, to the letter --there's no magical mathematical notation that makes this go away).

The difficulty he's referring to is in the physics (and not mathematics) associated with the idea that fields are real, fundamental physical entities, and cannot be reduced to mechanical models with "gears and wheels" permeating the space (which was a popular idea back then).


You prompted me to read the actual essay. It was considerably more interesting than the comments here would have suggested; which is not a surprise, given it was penned by Freeman Dyson.

His insight that Maxwell's contemporaries lacked even the language to fully describe what a transformative idea he had is acutely interesting. Once again language both shapes and traps thinking.

He makes the same point with quantum mechanics as well - that we're constrained by not having the proper language to describe the most fundamental behaviour. I think it's fair to say that point still stands today.


Modern notation has contributed greatly to making Maxwell's equations more understandable. I dare you to have a look at Maxwell's original paper (https://royalsocietypublishing.org/doi/10.1098/rstl.1865.000...) and try to understand it, I'd say even for a seasoned physicist it's not trivial. Compare that to the modern differential form of the equations (e.g. https://en.wikipedia.org/wiki/Maxwell%27s_equations), which (IMHO) are really easy to understand even for second-year Physics students (given that you understand the underlying vector operators).

Personally, I found the "flow analogy" always the most intuitive, and there are some books that teach electrodynamics in that way. Typically one would start with electrostatic problems and work one's way to the more complicated stuff like magnetic fields.

I guess to derive a "complete" understanding of classical electrodynamics you need to understand the concept of relativity as well. Magnetism is a consequence of relativity and the finite speed of light, so if you accept that it becomes easier to understand (IMHO). Of course you then have to "understand" relativity, which just moves the problem to a different area. But then again, it's always like that in Physics :)


> Magnetism is a consequence of relativity and the finite speed of light

That doesn't make any sense.


Here's a video explainer: https://www.youtube.com/watch?v=1TKSfAkWWN0

Magnetism affects moving charges, but not stationary charges. A magnetic field in your reference frame is really just an electric field in the moving charge's reference frame.


With a lot of theories I find the breakthrough in discovery is separate from the breakthrough in explanation.

How quickly was it that the familiar four vector equations were published? When you come across them now in an undergrad class, it seems so simple because a lot of work went into distilling the insight into the bits that matter. When the theory was still being fleshed out, there were probably a load of intermediate steps acting like a scaffolding.

Likewise with relativity, you can pick up a book and read about it because a lot of people have looked at it over the years and figured out which explanations actually work.


I think it was Oliver Heaviside who made Maxwell’s equations accessible, maybe even in something like the vector calc forms frequently used today.


Explanations that work for whom? I can't speak for electromagnetism or physics. What I can say is that for years I felt like I didn't have the least bit of clarity on color modeling until I referred to Maxwell's primary materials, which are cited by just about no one sadly.


Could you explain this a bit? (Excuse me if I'm asking too much, I don't really know what you mean by color modeling, and how it connects to Maxwell - or maybe you are talking about wavelengths?)



One of Modest Maxwell's hobbies was color photography. I wouldn't want to rob you of the joys of discovering the rest.


This is a wonderful essay, and I wish I had come across it as a freshman. For example:

"Just as in the Maxwell theory, the abstract quality of the first-order quantities is revealed in the units in which they are expressed. For example, the Schrödinger wave-function is expressed in a unit which is the square root of an inverse cubic meter. This fact alone makes it clear that the wave-function is an abstraction, for ever hidden from our view. No-one will ever measure directly the square root of a cubic meter."

That, of course, is an observation about quantum mechanics, and it is only on rereading the essay today that I noticed its structure: Dyson begins with an anecdote about Maxwell giving an address that spent most of its time discussing some features of Helmholtz's "splendid hydro-dynamical theorems" (with a nod to Kelvin, who I guess was present), before briefly drawing analogies to his electrodynamical theory. Dyson brings this up within an essay in which he uses Maxwell's theory to make a point about coming to understand quantum mechanics, though he spends rather more time on his ultimate topic than Maxwell did in his talk.


Maxwell's equations are an artifact of their mathematical formulation, and they can perhaps be written in an easier way. If one describes them in Geometric Algebra terms, it is just one equation [0], and one can understand it in a natural way.

During the nineteenth-century discussions of different formal ways to describe vector fields, Hamilton's quaternion system prevailed. Maxwell had reservations and presented his Treatise on Electricity and Magnetism as "the introduction of the ideas, as distinguished from the operations and methods of Quaternions" [1], hence the formulation we know.

[0] Mathematical descriptions of the electromagnetic field https://en.wikipedia.org/wiki/Mathematical_descriptions_of_t...

[1] The vector algebra war a historical perspective https://arxiv.org/pdf/1509.00501.pdf


Currently with Maxwell's equations you can point to each one and couple it with an experiment.

If you simplify the equations any further, it will be a result of mathematical elegance rather than fundamental undergraduate-level physics.


> and couple it with an experiment.

Only a handful of special cases that have obvious "paradoxes" that cannot be explained with Maxwell's equations.

It's not even a fully general set of equations classically; it can only handle certain constrained motions at low velocities, short distances, and generally without accelerations.

I mean sure, a map is simple to understand and is fine for navigating a city, but let's not pretend the Earth is flat and then only introduce its spherical geometry in 2nd year studies. That's not the right pedagogical approach.


> Only a handful of special cases that have obvious "paradoxes" that cannot be explained with Maxwell's equations.

What are you talking about?

> It's not even a fully general set of equations classically, it can only handle a certain constrained motions at low velocities, short distances, and generally without accelerations.

Huh? Maxwell's Equations are relativistically invariant and cover all classical electrodynamic phenomena. The only thing they don't cover is quantum mechanics (although in quantum field theory Maxwell's Equations are still the field equations of the quantum electromagnetic field, so even there they play a role).


Maxwell's equations only apply to static states, including steadily circulating currents. It's not generally applicable without extensions to more complex scenarios such as handling time delays and arbitrary movement.

https://en.wikipedia.org/wiki/Li%C3%A9nard%E2%80%93Wiechert_...

https://en.wikipedia.org/wiki/Jefimenko%27s_equations

The apparent "paradoxes" are literally the reason Einstein started on his journey to develop Special Relativity.

https://en.wikipedia.org/wiki/Moving_magnet_and_conductor_pr...

https://en.wikipedia.org/wiki/Relativistic_electromagnetism


> Maxwell's equations only apply to static states

Nonsense. Maxwell's equations are the equations of classical electrodynamics.

> It's not generally applicable without extensions to more complex scenarios such as handling time delays and arbitrary movement.

The equations for the Lienard-Wiechert potentials are mathematically equivalent to Maxwell's Equations (when you put those equations in potential form instead of field form and make an appropriate choice of gauge).

Jefimenko's equations are also mathematically equivalent to Maxwell's Equations; their originator believed that the causality properties of those equations would be clearer when put in his preferred form. Whether or not he was right is a matter of considerable debate.

> The apparent "paradoxes"

Are a result of lack of understanding on the part of the people claiming and promoting them.

> are literally the reason Einstein started on his journey to develop Special Relativity

Nonsense. The problem Einstein had when he developed SR was not Maxwell's Equations; it was Newton's equations. He realized that Maxwell's Equations and Newton's equations were inconsistent. Every other physicist at the time who realized that (and there were many) believed that the way to fix that problem was to modify Maxwell's Equations and leave Newton's equations the same. Einstein, however, realized that the way to fix the problem was to modify Newton's equations and leave Maxwell's Equations the same. The result was SR, and the rest, as they say, is history.


Thank you very much for 1. As someone who thinks mathematically and spatially, the Clifford algebra discussed therein is a vital connection.


This is a lovely essay, but I think it goes a little too far in claiming that the electromagnetic field is intangible or immeasurable, in slightly forced analogy with quantum mechanics.

> We now take it for granted that electric and magnetic fields are abstractions not reducible to mechanical models. To see that this is true, we need only look at the units in which the electric and magnetic fields are supposed to be measured. The conventional unit of electric field-strength is the square-root of a joule per cubic meter... This does not mean that an electric field-strength can be measured with the square-root of a calorimeter. It means that an electric field-strength is an abstract quantity, incommensurable with any quantities that we can measure directly.

A more conventional way to think of the dimensions of the electric field is [force]/[charge] (e.g. units of Newtons/Coulomb), and you can observe the electric field by observing the force it exerts on a charged particle through

f=qE

for example by observing the trajectory of an electron in a cloud chamber.
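A hedged numerical illustration of that last point (all numbers below are illustrative, not measured): given an assumed uniform field, f = qE fixes the electron's acceleration and hence its deflection, so observing the deflection lets you run the inference in reverse to get E in N/C.

```python
# Physical constants (CODATA values, rounded)
q = -1.602e-19   # electron charge, coulombs
m = 9.109e-31    # electron mass, kg

E = 1.0e3        # assumed uniform field strength, N/C (equivalently V/m)

a = q * E / m          # acceleration, from f = qE = m*a
t = 1.0e-9             # assumed observation time, seconds
x = 0.5 * a * t**2     # transverse deflection starting from rest, meters
```

Note the units compose as plain [force]/[charge] throughout; no square root of a calorimeter required.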

Dyson says that the square of the field is a measurable energy density, but it’s arguably harder to measure an energy density than it is to measure a force.

Dyson’s point is much more true for quantum mechanics, where the only measurable things seem to be quadratic combinations of the wave function, and I do like the analogy he points out with electromagnetism, but I think he oversimplifies a bit to make his point.


> not reducible to mechanical models [...] is much more true for quantum mechanics

Yes, but... a caveat.

How big is an atom? "Unimaginably small" is an oft repeated phrase. What is an atom? "Definitions [...] models [...] skill at switching between models". Electron behavior? Quantum... "unintuitive... the equation is understanding".

So how well is "small" taught? Horribly, even by the lackluster baseline of current science education research. Asking first-tier medical school graduate students how big cells are is not a happy thing. But hey, maybe cells are "unimaginable" too.

So how well are atoms taught? One challenge in teaching high-school stoichiometry, is students not thinking of atoms as real, as physical objects. But hey, maybe that's a failure to "switch models".

So how well is electron behavior taught? Well, when students use the many realistic interactives emphasizing molecular electron density... oh wait. Well, when students view the many molecular dynamics videos showing electron density... oh wait. They do exist... now find them without using google scholar and sending people email. :/ But hey, if students ever do see them some year, maybe no understanding will result, given how unintuitive it all is.

Punchline? Teaching things badly seems associated with failure attribution errors. As with education research that's "we taught atoms really badly... surprisingly that didn't work... so we draw the obvious conclusion... students of this age aren't developmentally able to understand atoms".

And physics side... there often seems a blurred vision of objectives and their properties. There are a great many plausible learning objectives between "atoms are real" and "i∂_{t}ψ=Hψ". And the usefulness of "mechanical" models varies greatly among them. So "the equation is understanding" gets repeated, in contexts where it's inappropriate, and where it distracts from a broad long-term societal failure to improve wretched science education content.


This paper captured what I encountered as a TA in the 90's: students could solve the integral forms, but had trouble with the differential forms of the fields. I think this is because some students have a very difficult time going from the mental image of fields as 2D arrows to div, curl and grad in practice. There's a big step function there in manipulating the equations. I wasn't a very good TA because I didn't have much luck explaining it to the struggling students (I still feel bad about that; sorry, folks, I was getting my master's degree and had no choice but to be a TA). And students who didn't understand this in second-year physics had even more trouble when their EE classes went into fields-and-waves classes (conductors, antennae, etc.).

TL;DR: I think this is more a problem with teaching advanced differential calculus concepts than with Maxwell.


Once you understand the vector calculus operators (grad, div and curl) and the associated integrals, Maxwell's equations are actually really intuitive and quite beautiful. I was lucky that the maths course I was on covered those a few weeks before we started electrodynamics. Plus I had Mary Boas' Mathematical Methods in the Physical Sciences, which teaches this material (and Green's theorem etc.) very well.


Theory of electromagnetic field (of which Maxwell's is a part) was one of my favorite topics in the university. The other being digital signal processing. I don't have very good memory, so I have to rely on things having a lot of consistency and internal structure so they could be efficiently "compressed" and "recovered". Both of these fields have that - the math is consistent, beautiful, and somewhat more intuitive than in other fields. Theory of electromagnetic field does assume that your integral and differential calculus is pretty good though.


> Mathematics is the language that nature speaks.

Nature seems to speak in forces and had been doing so long before we arrived and started honing our math skills.

Nature made an imprint on our math. Newton came up with new math to make more accurate physical predictions.

There is a good reason to be an adherent of the mathematical descriptions. We might want the photon to go through only one slit, to help make nature more classical and understandable, but that's not the nature we know and observe. So we keep to the math and let it speak to us, lest our workaday understandings of the world lead us astray.


I've seen the "another theory which I prefer" quote mentioned elsewhere (possibly with reference to this paper) as being a form of modesty. I would say that it looks to me like typical British understatement, which is often misunderstood (particularly by Americans) and may have been simply an attempt at humour on his part.

https://en.wikipedia.org/wiki/English_understatement


The author of this piece, though, is British.


So not so much hard to understand (at least for trained physicists at the time), rather too much modesty from Maxwell himself to shout about the importance of his discovery.


Excerpt:

"Maxwell explained how the ancient theory that matter is composed of atoms ran into a logical paradox.

On the one hand, atoms were supposed to be hard, impenetrable and indestructible.

On the other hand, the evidence of spectroscopy and chemistry showed that atoms have internal structure and are influenced by outside forces.

This paradox had for many years blocked progress in the understanding of the nature of matter. Now finally the vortex theory of molecules resolved the paradox. Vortices in the aether are soft and have internal structure, and nevertheless, according to Helmholtz, they are individual and indestructible.

The only remaining task was to deduce the facts of spectroscopy and chemistry from the laws of interaction of the vortices predicted by the hydrodynamics of a perfect fluid."


This is a lovely essay. Wonderful layers-of-reality analogy to quantum mechanics.


Yes. That's the reason why I saved it. I don't know much about quantum mechanics, but I could feel a vague intuition developing after reading this essay.


We've all gotten used to the idea that electric and magnetic fields are real, but then we learn about the Aharonov-Bohm effect and suddenly maybe it's the electric potential and magnetic vector potential that are 'real'. But then there's the whole issue of gauges.

I think that the best way of teaching electromagnetism might still be in the future.


This is fundamentally a sociological (or philosophy of science) essay rather than about why Maxwell's equations are still so hard to understand for the student. In that it's quite interesting but I found the jump to a parallelism between the model of Maxwell's and quantum mechanical models a bit of a stretch.


Very interesting article. I think there are a lot of parallels there to business and startups. I'm a technical person who has always struggled to promote, and it seems scientists/researchers face the same issues. The ideal is to be a showman and technically brilliant.


Previous interesting HN discussion on the same paper: https://news.ycombinator.com/item?id=18837677


" We, with the advantage of hindsight, can see clearly that Maxwell's paper was the most important event of the nineteenth century in the history of the physical sciences. If we include the biological sciences as well as the physical sciences, Maxwell's paper was second only to Darwin's ``Origin of Species''."

By what metric? Has Darwin's ``Origin of Species`` been even half as prolific as Maxwell's work? Maxwell's work revolutionized our understanding of the physical world and has led to the creation of thousands upon thousands of technological advances. It has literally forged the modern world.


One answer to your question:

https://en.wikipedia.org/wiki/Nothing_in_Biology_Makes_Sense...

Darwin:Maxwell :: biology:physics is a very apt comparison.


I think Darwin gets special note for the fact that while there are many incredible works of theory in math and physics, there's really only one in biology: On the Origin of Species by Charles Darwin.


Certainly not saying that it's not noteworthy, but if I had to make a top-5 list he wouldn't be on it. Apparently my metric and the author's metric are different.


Does the professor's claim that the "Origin of Species" is more important than Maxwell's equations have any real takers? Or is it just his personal opinion?


What happened to Maxwell's original quaternion-based equations? The changes made against his will by Lorentz and others to simplify them after his death were an affront to science and have set back humanity a hundred years.


Ah, yes. It's not the ignorant physicists' fault that Maxwell's theory didn't get the attention it deserved. It's Maxwell's modesty that set back the science of physics two decades. What a ridiculous notion.


Try SU(3) Yang Mills!


Why not SUSY then? Once you understand the formalism you can derive any field theory to your liking.


> Modesty is not always a virtue

Seriously... where and when in the last 50 years in modern society was modesty ever a virtue? I certainly wouldn't expect anything like that at Princeton in the US.

Otherwise an interesting essay nevertheless, but I certainly don't agree with all the points.

> Mendel's modesty set back the progress of biology by fifty years.

Hilarious conjecture.



