> A lot of physics seems unnecessarily expensive to compute.
When you make a simple simulation of rigid bodies with classical physics, you often get numerically unstable results - bodies jittering against each other, slowly sinking through one another, etc. One common way to solve this is to introduce a "frozen" (sleeping) state: when objects are close enough to rest, with balanced forces, you mark them as frozen and stop computing them every frame to save computing power. You only unfreeze them when some other unfrozen object interacts with them.
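A minimal sketch of such a sleeping scheme (the threshold names and values here are made up purely for illustration; real engines use tuned linear/angular velocity thresholds):

```python
SLEEP_SPEED = 0.01   # hypothetical speed below which a body is a sleep candidate
SLEEP_FRAMES = 30    # hypothetical number of consecutive slow frames before freezing

class Body:
    def __init__(self, x, vx):
        self.x, self.vx = x, vx
        self.frozen = False
        self.slow_frames = 0

def step(bodies, dt):
    for b in bodies:
        if b.frozen:
            continue  # frozen bodies are skipped entirely; this is the saving
        b.x += b.vx * dt
        if abs(b.vx) < SLEEP_SPEED:
            b.slow_frames += 1
            if b.slow_frames >= SLEEP_FRAMES:
                b.frozen = True  # slow for long enough: freeze it
        else:
            b.slow_frames = 0

def wake(b):
    # called when an unfrozen body comes into contact with a frozen one
    b.frozen = False
    b.slow_frames = 0
```

Once a body freezes, its state simply stops evolving until something external wakes it.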
Additionally, hierarchical spatial indexing structures are often used to avoid the O(n^2) comparisons needed to work out what interacts with what. These structures often rely on heuristics and hash functions with collisions to subdivide the problem, which can result in objects becoming unfrozen without actually touching each other.
From inside the simulation, the result would look weird, nonlinear, and nonlocal, a little like wave function collapse: if particle A, whose coordinates hash to the same value as particle B's, happens to unfreeze, then particle B unfreezes as well, despite never interacting with A in any way. And this behavior would probably be considered "hard to compute" compared to the simple equations the system's developer wanted to simulate.
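A toy version of that nonlocal wake-up. The mixing constants below are the ones commonly seen in spatial-hashing write-ups; the bucket count is deliberately tiny so that far-apart cells are guaranteed to collide:

```python
from collections import defaultdict

def cell_hash(ix, iy, n_buckets=8):
    # coarse spatial hash over grid cells; with this few buckets,
    # distant cells inevitably land in the same bucket
    return ((ix * 73856093) ^ (iy * 19349663)) % n_buckets

# particle A sits in cell (0, 0); find a distant cell whose hash collides with it
h_a = cell_hash(0, 0)
far_cell = next((ix, iy) for ix in range(50, 100) for iy in range(50, 100)
                if cell_hash(ix, iy) == h_a)

buckets = defaultdict(set)
buckets[h_a].add("A")
buckets[cell_hash(*far_cell)].add("B")  # B is nowhere near A, same bucket

frozen = {"A", "B"}

def wake_bucket(h):
    # unfreezing anything in a bucket unfreezes the whole bucket, touching or not
    frozen.difference_update(buckets[h])

wake_bucket(h_a)  # something hits A's cell...
# ...and B wakes up too, despite never interacting with A
```

From outside, this is an obvious performance shortcut; from inside, it looks like spooky action at a distance keyed on an arbitrary function of the coordinates.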
An example that might be more relatable for scientists: it is much easier and computationally cheaper to run a numerical simulation of the three-body problem than to work with an analytic treatment of it. But describing the behavior of that numerical simulation in terms of physical equations requires a much more complex model than the equations you wanted to compute in the first place. You have to include implementation details like floating-point accuracy, overflow, etc. And if you go far enough, you have to add the possibility of a cosmic ray flipping a memory cell in the computer that runs your simulation.
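The same point in miniature, using a two-body orbit for brevity (units chosen so GM = 1, which is an assumption for illustration): forward-Euler integration is cheap, but its energy drifts, so an exact physical description of what the program actually computes would have to model the integrator and its rounding, not just Newton's equations.

```python
import math

def energy(x, y, vx, vy):
    # total energy of a unit mass orbiting a fixed center with GM = 1
    return 0.5 * (vx * vx + vy * vy) - 1.0 / math.hypot(x, y)

# circular orbit: radius 1, speed 1
x, y, vx, vy = 1.0, 0.0, 0.0, 1.0
e0 = energy(x, y, vx, vy)

dt = 1e-3
for _ in range(10_000):
    r3 = math.hypot(x, y) ** 3
    ax, ay = -x / r3, -y / r3        # Newtonian gravity
    x, y = x + vx * dt, y + vy * dt  # forward Euler: cheap but inexact
    vx, vy = vx + ax * dt, vy + ay * dt

drift = abs(energy(x, y, vx, vy) - e0)
# drift is nonzero: an artifact of the implementation, not of the equations
```

The equations being simulated conserve energy exactly; the simulation does not, and that discrepancy belongs entirely to the implementation.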
I'm not saying this is the reason QM is weird - I don't understand QM well enough to form valid hypotheses ;) - but I am saying we might be confusing the intention of The Developer with the compromises (s)he made to get there. If you take any imperfect implementation of a simple model and treat it as perfect, the model becomes much more complex.