
This is one of those links where just seeing the title sets you off, thinking about the implications.

I'm going to have to spend more time digesting the article, but one thing that jumps out at me, and maybe it's answered in the article and I don't understand it, is the role of time. Generally in physics, you're talking about a quantity being conserved over time, and I'm not sure what plays the role of time when you're talking about conserved quantities in machine learning -- is it conserved over training iterations or over inference layers, or what?

edit: now that I've read it again, I see that they describe this in the second paragraph.

I'm now wondering if in something like Sora that can do a kind of physical modeling, if there's some conserved quantity in the neural network that is _directly analogous_ to conserved quantities in physics -- if there is, for example, something that represents momentum, that operates exactly as momentum as it progresses through the layers.




In physics, the conserved quantity isn't always time. Invariance over time translation is specifically conservation of energy. Invariance over spatial translation is conservation of momentum, invariance over spatial rotation is conservation of angular momentum, invariance of the electromagnetic field is conservation of current, and invariance of wave function phase is conservation of charge.

I think the analogue in machine learning is conservation over changes in the training data. After all, the point of machine learning is to find general models that describe the training data given, and minimize the loss function. Assuming that a useful model can be trained, the whole point is that it generalizes to new, unseen instances with minimal losses, i.e. the model remains invariant under shifts in the instances seen.

The more interesting part to me is what this says about philosophy of physics. Noether's Theorem can be restated as "The laws of physics are invariant under X transformation", where X is the gauge symmetry associated with the conservation law. But maybe this is simply a consequence of how we do physics. After all, the point of science is to produce generalized laws from empirical observations. It's trivially easy to find a real-world situation where conservation of energy does not hold (any system with friction, which is basically all of them), but the math gets very messy if you try to actually model the real data, so we rely on approximations that are close enough most of the time. And if many people take empirical measurements at many different points in space, and time, and orientations, you get generalized laws that hold regardless of where/when/who takes the measurement.

Machine learning could be viewed as doing science on empirically measurable social quantities. It won't always be accurate, as individual machine-learning fails show. But it's accurate enough that it can provide useful models for civilization-scale quantities.


> In physics, the conserved quantity isn't always time. Invariance over time translation is specifically conservation of energy.

That's not what I meant.

When you talk about "conservation of angular momentum", the symmetry is invariance over rotation, but the angular momentum is conserved _over time_.


> It's trivially easy to find a real-world situation where conservation of energy does not hold (any system with friction, which is basically all of them)

Conservation of energy absolutely still holds, but entropy is not conserved so the process is irreversible. If your model doesn't include heat, then discrete energy won't be conserved in a process that produces heat, but that's your modeling choice, not a statement about physics. It is common to model such processes using a dissipation potential.
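For instance, with a Rayleigh-type dissipation function (a standard textbook construction, sketched here from memory rather than from the article), the Euler-Lagrange equations pick up a friction term:

    \frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = -\frac{\partial D}{\partial \dot{q}}, \qquad D(\dot{q}) = \tfrac{1}{2}\,c\,\dot{q}^{2}

The right-hand side gives the familiar linear drag force -c q̇, and the energy it removes shows up as heat in the fuller model.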


Right, but I'm saying that it's all modeling choices, all the way down. Extend the model to include thermal energy and most of the time it holds again - but then it falls down if you also have static electricity that generates a visible spark (say, a wool sweater on a slide) or magnetic drag (say, regenerative braking on a car). Then you can include models for those too, but you're introducing new concepts with each, and the math gets much hairier. We call the unified model where we abstract away all the different forms of energy "conservation of energy", but there are a good many practical systems where making tangible predictions using conservation of energy gives wrong answers.

Basically this is a restatement of Box's Aphorism ("All models are wrong, but some are useful") or the ideas in Thomas Kuhn's "The Structure of Scientific Revolutions". The goal of science is to go from concrete observations to abstract principles which will ideally predict the value of future concrete observations accurately. In many cases, you can do this. But not all. There is always messy data that doesn't fit into neat, simple, general laws. Usually the messy data is just ignored, because it can't be predicted and is assumed to average out or generally be irrelevant in the end. But sometimes the messy outliers bite you, or someone comes up with a new way to handle them elegantly, and then you get a paradigm shift.

And this has implications for understanding what machine learning is and why it's important. Few people would think that a model linking background color to likelihood of clicking on ads is a fundamental physical quantity, but Google had one 15+ years ago, and it was pretty accurate, and made them a bunch of money. Or similarly, most people wouldn't think of a model of the English language as a fundamental physical quantity, but that's exactly what an LLM is, and they're pretty useful too.


It's been a long time since I have cracked a physics book, but your mention of interesting "fundamental physical quantities" triggered the recollection of there being a conservation of information result in quantum mechanics where you can come up with an action whose equations of motion are Schrödinger's equation and the conserved quantity is a probability current. So I wonder to what extent (if any) it might make sense to try to approach these things in terms of the really fundamental quantity of information itself?


Approaching physics from a pure information-flow perspective is definitely a live research area. I suspect we see less popsci treatment of it because almost nobody understands information theory, and trying to apply it to physics that also almost nobody understands is probably at least three or four bridges too far for a popsci treatment. But it's a current and active topic.


This might be insultingly simplistic, but I always thought the phrase "conservation of information" just meant that the time-evolution operator in quantum mechanics was unitary. Unitary mappings are always bijective functions - so it makes intuitive sense to say that all information is preserved. However, it does not follow that this information is useful to actually quantify, like energy or momentum. There is certainly a kind of applied mathematics called "information theory", but I doubt there's any relevance to the term "conservation of information" as it's used in fundamental physics.
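In symbols (a standard statement, for concreteness):

    U^{\dagger}U = I \quad\Longrightarrow\quad \langle U\psi \,|\, U\phi \rangle = \langle \psi \,|\, \phi \rangle

so inner products, and hence the distinguishability of states, are preserved, which is the precise sense in which "no information is lost."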

The links below lend credibility to my interpretation.

https://en.wikipedia.org/wiki/Time_evolution#In_quantum_mech...

https://en.wikipedia.org/wiki/Bijection

https://en.wikipedia.org/wiki/Black_hole_information_paradox


Is there any way to deduce which invariance gives which conservation? I mean for example: how can you tell that time invariance is the one paired with conservation of energy? Why is e.g. time invariance not paired with momentum, current, or anything else, but specifically energy?

I know I can remember that momentum is paired with translation simply because there are both angular and non-angular momentum, and in space you have translation and rotation, so energy is the only one left over for time. But I'm not looking for a trick to remember it; I'm looking for the fundamental reason, as well as how to tell what will be paired with some invariance when looking at some other new invariance.


The conserved quantity is derived from Noether's theorem itself. One thing that is a bit hairy is that Noether's theorem only applies to continuous, smooth symmetries (physical -> there is some wiggle room here).

When deriving the conservation of energy from Noether's theorem you basically say that your Lagrangian (a function that describes a physical system) is invariant over time. When you do that you automatically get that energy is conserved. Each invariance produces a conserved quantity, as explained in the parent comment, when you apply a specific transformation that is supposed to not change the system (i.e. leave it invariant).

Now in doing this you're also invoking the principle of least action (by using Lagrangians to describe the state of a physical system) but that is a separate topic.
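To sketch the energy case (the standard textbook derivation, condensed): if L(q, q̇) has no explicit time dependence, define

    E = \dot{q}\,\frac{\partial L}{\partial \dot{q}} - L, \qquad \frac{dE}{dt} = \dot{q}\left(\frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q}\right) - \frac{\partial L}{\partial t} = 0

The parenthesis vanishes by the Euler-Lagrange equation, and ∂L/∂t = 0 is exactly the time-translation invariance, so E (the energy) is conserved.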


The key point is that energy, momentum, and angular momentum are additive constants of the motion, and this additivity is a very important property that ultimately derives from the geometry of the space-time in which the motion takes place.

> Is there any way to deduce which invariance gives which conservation?

Yes. See Landau vol 1 chapter 2 [1].

> I'm looking for the fundamental reason, as well as how to tell what will be paired with some invariance when looking at some other new invariance

I'm not sure there is such a "fundamental reason", since energy, momentum, and angular momentum are by definition the names we give to the conserved quantities associated with time translation, spatial translation, and rotation.

You are asking "how to tell what will be paired with some invariance" but this is not at all obvious in the case of conservation of charge, which is related to the fact that the results of measurements do not change when all the wavefunctions are shifted by a global phase factor (which in general can depend on position).

I am not aware of any way to guess or understand which invariance is tied to which conserved quantity other than just calculating it out, at least not in a way that is intuitive to me.

[1] https://ia803206.us.archive.org/4/items/landau-and-lifshitz-...


But momentum is also conserved over time, as far as I know 'conservation' of all of these things always means over time.

"In a closed system (one that does not exchange any matter with its surroundings and is not acted on by external forces) the total momentum remains constant."

That means it's conserved over time, right? So why is energy the one associated with time and not momentum?


Conservation normally means things don't change over time simply because, in mechanics, time is the go-to external parameter for studying the evolution of a system; but it's not the only one, nor the most convenient in some cases.

In Hamiltonian mechanics there is a 1:1 correspondence between any function of the phase space (coordinates and momenta) and one-parameter continuous transformations (flows). If you give me a function f(q,p) I can construct some transformation φ_s(q,p) of the coordinates that conserves f, meaning d/ds f(φ_s(q, p)) = 0. (Keeping it very simple, the transformation consists in shifting the coordinates along lines on which f stays constant, i.e. along its level sets.)

If f(q,p) is the Hamiltonian H(q,p) itself, φ_s turns out to be the normal flow of time, meaning φ_s(q₀,p₀) = (q(s), p(s)), i.e. s is time and dH/dt = 0 says energy is conserved, but in general f(q,p) can be almost anything.
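In Poisson-bracket form the same statement is very compact:

    \frac{dg}{ds} = \{g, f\} \quad\Longrightarrow\quad \frac{df}{ds} = \{f, f\} = 0

so any f is conserved along its own flow, and taking f = H recovers dH/dt = 0.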

For example, take geometric optics (rays, refraction and such things): it's possible to write a Hamiltonian formulation of optics in which the equations of motion give the path taken by light rays (instead of particle trajectories). In this setting time is still a valid parameter, but it is usually replaced by the optical path length or by the wave phase, because we are interested in steady conditions (say, laser turned on, beam has gone through some lenses and reached a screen). Conservation now means that quantities are constant along the ray; an example is the frequency/color, which doesn't change even when passing between different media.


My understanding is that conservation of momentum does not mean momentum is conserved as time passes. It means that if you have a (closed) system in a certain configuration (not in an external field) and compute the total momentum, the result is independent of the configuration of the system.


It certainly means that momentum is conserved as time passes. The variation of the total momentum of a system is equal to the impulse, which is zero if there are no external fields.
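In symbols (standard Newtonian mechanics):

    \frac{d\vec{P}}{dt} = \vec{F}_{\text{ext}} \quad\Longrightarrow\quad \vec{F}_{\text{ext}} = \vec{0} \;\Rightarrow\; \vec{P}(t) = \text{const}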


In retrospect: the earliest recognition of a conserved quantity was Kepler's law of areas. Isaac Newton later showed that Kepler's law of areas is a specific instance of a property that obtains for any central force, not just the (inverse square) law of gravity.

About symmetry under change of orientation: for a given (spherically symmetric) source of gravitational interaction the amount of gravitational force is the same in any orientation.

Orbital motion takes place in a plane, so for orbital motion the relevant symmetry is cylindrical symmetry with respect to the plane of the orbit.

The very first derivation that is presented in Newton's Principia is a derivation that shows that for any central force we have: in equal intervals of time equal amounts of area are swept out.

(The rate at which area is swept out is proportional to the angular momentum of the orbiting object. That is, the area law anticipated the principle of conservation of angular momentum.)

A discussion of Newton's derivation, illustrated with diagrams, is available on my website: http://cleonis.nl/physics/phys256/angular_momentum.php

The thrust of the derivation is that if the force that the motion is subject to is a central force (cylindrical symmetry), then angular momentum is conserved.

So: In retrospect we see that Newton's demonstration of the area law is an instance of symmetry-and-conserved-quantity-relation being used. Symmetry of a force under change of orientation has as corresponding conserved quantity of the resulting (orbiting) motion: conservation of angular momentum.
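In modern vector notation (a standard sketch, not Newton's geometric argument):

    \frac{d\vec{L}}{dt} = \frac{d}{dt}\left(\vec{r} \times m\vec{v}\right) = \vec{v} \times m\vec{v} + \vec{r} \times \vec{F} = \vec{r} \times \vec{F} = \vec{0} \quad \text{for central } \vec{F} \parallel \vec{r}

and the areal velocity is dA/dt = |\vec{L}|/(2m), which is exactly the area law.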

About conservation laws:

The law of conservation of angular momentum and the law of conservation of momentum are about quantities that are associated with specific spatial characteristics, and the conserved quantity is conserved over time.

I'm actually not sure about the reason(s) for the classification of conservation of energy. My own view: kinetic energy is not associated with any form of keeping track of orientation; the velocity vector is squared, and that squaring operation discards directional information. More generally, energy is not associated with any spatial characteristic. Arguably energy conservation is categorized as associated with symmetry under time translation because of this absence of association with any spatial characteristic.


I'm a bit skeptical of giving up conservation of energy in a system with friction. Isn't it more accurate to say that if we were to calculate every specific interaction, we'd still end up with conservation of energy? Whether or not we're dealing with a closed system etc. becomes important, but if we were truly able to model the entire physical system with friction, we'd still adhere to our conservation laws.

So they are not approximations, but are just terribly difficult calculations, no?

Maybe I'm misunderstanding your point, but this should be true regardless of our philosophy of physics correct?


It is an analogy: dissipative systems do not have a Lagrangian, and Noether's work applies to Lagrangian systems.

Conservation laws in particular are about measurable properties of an isolated physical system that do not change as the system evolves over time.

It is important to remember that Physics is about finding useful models that make useful predictions about a system. So it is important to not confuse the map for the territory.

Gibbs free energy and Helmholtz free energy are not conserved.

As thermodynamics, energy, and entropy are difficult topics due to didactic half-truths, here is a paper that shows that the n-body problem runs into a similar issue (in a contrived fashion) and may be undecidable:

http://philsci-archive.pitt.edu/13175/

Noether's principle not only lets you spot simplifications in an equation; often it turns 'terribly difficult calculations' into computationally feasible ones.


A nice way to formulate (most) data augmentations is: a family of functions A = {a} such that our optimized neural network f obeys f(x) ~= f(a(x)).

So in this case, we're explicitly defining the set of desired invariances.
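A minimal sketch of making that explicit during training, as an invariance penalty (PyTorch; `model` and the particular augmentations are illustrative assumptions, not anything from the article):

    import torch

    def invariance_penalty(model, x, augmentations):
        # Penalize deviations from f(x) ~= f(a(x)) for each a in A.
        y = model(x)
        return sum(((model(a(x)) - y) ** 2).mean() for a in augmentations)

    # Example family A: a horizontal flip and a small translation.
    augmentations = [
        lambda x: torch.flip(x, dims=[-1]),
        lambda x: torch.roll(x, shifts=2, dims=-1),
    ]

Added to the task loss, a term like this pushes the optimizer toward the invariances in A instead of hoping they emerge from the data alone.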


I think the most profound insight I've come across while studying this particular topic is that information theory ended up being the answer to rescuing the 2nd law from Maxwell's demon thought experiment. Not to put too fine a point on it, but essentially the knowledge organized in the mind of the demon, about the particles in its system, was calculated to offset the creation of the energy gradient.
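(Quantitatively this is Landauer's principle: erasing one bit of the demon's memory dissipates at least

    E \ge k_B T \ln 2

of heat, which pays back the work the demon appears to extract.)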

I found the thinking of William Sidis to be a particularly thought-provoking perspective on Noether's landmark work. In his paper The Animate and the Inanimate he posits--at a high level--that life is a "reversal of the second law of thermodynamics"; not that the 2nd law is a physical symmetry, but a mental one, in an existence where energy reversibly flows between positive and negative states.

Indeed, when considering machine learning, I think it's quite interesting to consider how the organizing of information/knowledge done during training in some real way mirrors the energy-creating information interred in the mind of Maxwell's demon.

When taking into account the possible transitive benefits of knowledge organized via machine learning, and its attendant oracle through application, it's easy to see a world where this results in a net entropy loss, the creation of a previously non-existent energy gradient.

In my mind this has interesting implications for Fermi's paradox, as it seems to imply the inevitability of the organization of information. Taken further into my own personal dogma, I think it's inevitable that we create--what we would consider--a sentient being, as I believe this is the cycle of our own origin in the larger evolutionary timeline.


I like the connection to Fermi. It seems to me eventually there has to be a concrete answer to the following question: Given the laws of physics, and the "initial conditions" (ie the state of the Universe at the moment of the Big Bang), what is the statistical likelihood of advanced (ie technological) civilizations occurring over time, and what is the likelihood that they go extinct (or revert to less technologically savvy conditions)? ISTM there are intrinsic numbers for this calculation, though it is probably impossible for us to derive them from first principles.


>at a high level--that life is a "reversal of the second law of thermodynamics";

Life temporarily displaces entropy, locally.

Life wins battles, chaos wins the war.

>Indeed, when considering machine learning, I think it's quite interesting to consider how the organizing of information/knowledge done during training in some real way mirrors the energy-creating information interred in the mind of Maxwell's demon.

This is our human bias favoring the common myth that ever-expanding complexity is an "inevitable" result of the passage of time; refer to Stephen Jay Gould's "Full House: The Spread of Excellence from Plato to Darwin"[0] for the only palatable refutation modern evolutionists can offer.

>When taking into account the possible transitive benefits of knowledge organized via machine learning, and its attendant oracle through application, it's easy to see a world where this results in a net entropy loss, the creation of a previously non-existent energy gradient.

Because it is. Randomness combined with a sieve, like a generator and a discriminator, like the primordial protein soup and our own existence as a selector, like chaos and order themselves, MAY - but DOES NOT have to - lead to temporary, localized areas of complexity, that we call 'life'.

This "energy gradient" you speak of is literally gravity pulling baryonic matter foward thru space time. All work requires a temperature gradient - Hawking's musings on the second law of thermodynamics and your own intuition can reason why.

>In my mind this has interesting implications for Fermi's paradox, as it seems to imply the inevitability of the organization of information. Taken further into my own personal dogma, I think it's inevitable that we create--what we would consider--a sentient being, as I believe this is the cycle of our own origin in the larger evolutionary timeline.

Over cosmological time spans, it is a near-mathematical certainty that we either reach the universe's Omega point[1] of "our" own accord, or perish by our own hands, by our own creation's, or by our own sons'.

[0]: https://www.amazon.com/Full-House-Spread-Excellence-Darwin/d...

[1]: https://www.youtube.com/watch?v=eOxHRFN4rs0


A convolutional neural network ought to have translational symmetry, which should lead to a generalized version of momentum. If I understood the article correctly the conserved quantity would be <gx, dx>, where dx is the finite difference gradient of x.

This gives a vector with dimensions equal to however many directions you can translate a layer in and which is conserved over all (convolutional) layers.
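If I'm reading that right, a rough way to measure the quantity at a given layer might look like this (hypothetical PyTorch, my interpretation rather than the article's code):

    import torch

    def translation_charge(loss, x):
        # <g_x, dx>: inner product of the loss gradient at activation x
        # with the finite-difference spatial gradient of x, one value per
        # axis you can translate along (H and W of an NCHW tensor).
        g = torch.autograd.grad(loss, x, retain_graph=True)[0]
        return [
            (g * (torch.roll(x, shifts=-1, dims=d) - x)).sum()
            for d in (-2, -1)
        ]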


Exactly right! In fact, because that symmetry does not include an action on the parameters of the layer, your conserved quantity <gx, dx> should hold whether or not the network is stationary for a loss. This means that it'll be stationary on every single data point. (In an image classification model, these values are just telling you whether or not the loss would be improved if the input image were translated.)


Everything in the paper is talking about global symmetries; is there also the possibility of gauge symmetries?


Yeah, I've been thinking about similar concepts in a different context. Fascinating.

Regarding the role of time, the idea of a purely conserved quantity is that it is conserved under the conditions of the system (that's why the article frequently references Newton's First Law), so they're generally held "for all time that these symmetries exist in the system".

Specifically on time: the invariant for systems that exhibit continuous time symmetries (i.e. you move a little bit forward or backward in time and the system looks exactly the same) is energy.


Here's my ELI5 attempt of the time/energy relation:

imagine a spring at rest (not moving)

strike the spring, it's now oscillating

the system now contains energy like a battery

what is energy? it's stored work potential

the battery is storing the energy, which can then be taken out at some future time

the spring is transporting the energy through time

in fact how do we measure time? with clocks. What's a clock? It's an oscillator. The energized spring is the clock. When system energy is zero, what is time even? There's no baseline against which to measure change when nothing is changing


Symmetry exists abstractly, apart from time.

There are many machine learning problems which should have symmetries: a picture of a cow rotated 135 degrees is still a picture of a cow, the meaning of spoken words shouldn't change with the audio level, etc. If they were doing machine learning on tracks from the LHC the system ought to take account of relativistic momentum and energy.

Can a model learn a symmetry? Or should a symmetry just be built into the model from the beginning?


Equivariant machine learning is a thing that people have tried... Tends to be expensive and slow, though, and imposes invariances that our model (a universal function approximator, recall) should just learn anyway: If you don't have enough pictures of upside down cows, just train a normal model with augmentations.


Ha, my previous comment was before your new edit mentioning Sora. There is a good reason why the accompanying research report to the Sora demo isn't titled "Awesome Generative Video," but references world models. The interesting feature is how many apparently (approximations to) physical properties emerge (object permanence, linear motion, partially elastic collisions, as well as many of the elements of grammar of film), and which do not (notably material properties of solid and fluids, creation of objects from nothing, etc.)


Time is not special regarding symmetries and conserved quantities. In general you can consider any family of continuous transformations parametrised by some real variable s: be it translations by a distance x, rotations by an angle φ, etc. These are technically one-parameter subgroups of a Lie group.

Then, if your dynamical system is symmetrical under these transformations you can construct a quantity whose derivative wrt s is zero.
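Concretely, for a Lagrangian invariant under q → φ_s(q), the conserved quantity is (standard statement, sketched):

    Q = \frac{\partial L}{\partial \dot{q}} \cdot \left.\frac{d\varphi_s(q)}{ds}\right|_{s=0}

Plugging in translations gives momentum, rotations give angular momentum, and time translation (with a slight generalization of the recipe) gives energy.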


> now wondering...if there's some conserved quantity in the neural network that is _directly analogous_ to conserved quantities in physics

Isn't the model attempting to conserve information during training? And isn't information a physical quantity?


"I'm now wondering if in something like Sora that can do a kind of physical modeling, if there's some conserved quantity in the neural network that is _directly analogous_ to conserved quantities in physics"

My first thought on reading that was: if there is, it would be interesting to see whether it ties into the concept of us living in a simulation, i.e. that we're all living in a complex ML-network simulation.



