> There are numerous arguments along the lines of Turing's halting problem that restrict what that kind of thing can do. Since it uses a finite amount of time, it can't do anything that could require unbounded time to complete or that might never terminate.
I have used a similar argument to argue that the simulation hypothesis is wrong. If any algorithm used to simulate the world takes more than linear time, then the most efficient possible computer for the job is the universe itself, which computes everything in O(n) time, where n is elapsed time. In other words, you never get "lag" in reality, no matter how complex the scene you're looking at is. Worse than that, some simulation algorithms have exponential time complexity!
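A minimal sketch of that "lag" point in Python, with made-up per-tick costs and an assumed host speed; it only shows how anything super-linear falls further behind real time as N grows:

    # Sketch (hypothetical numbers): host work needed to advance one tick of
    # simulated time for simulators of different time complexity in N, the
    # number of simulated particles. Only ~O(N) per tick can stay "lag-free".
    def host_ops_per_tick(n_particles, complexity):
        if complexity == "linear":        # O(N) per simulated tick
            return n_particles
        if complexity == "quadratic":     # e.g. naive pairwise interactions
            return n_particles ** 2
        if complexity == "exponential":   # e.g. tracking a full quantum state
            return 2 ** n_particles
        raise ValueError(complexity)

    HOST_FLOPS = 10 ** 18  # assumed host machine speed, ops per host-second

    for n in (10, 40, 80):
        for c in ("linear", "quadratic", "exponential"):
            seconds = host_ops_per_tick(n, c) / HOST_FLOPS
            print(f"N={n:3d} {c:11s} -> {seconds:.3e} host-seconds per tick")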
That doesn't prove or disprove anything. What we experience as time would itself be part of the simulation, were such a hypothesis true. As such, the way in which we experience it is fully independent of whatever it might cost to compute.
So you're saying that an exponential-time algorithm, with N equal to the number of atoms in the universe, will complete before the heat death of the other universe the simulation is running in? Sorry, not plausible.
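A back-of-envelope check of that point, with assumed numbers: a ~10^100-year heat-death timescale (a commonly quoted order of magnitude, borrowed here for the host) and the 10^150 flops suggested further down the thread:

    from math import log10

    LOG10_N_ATOMS      = 80                   # ~10^80 atoms in our universe
    LOG10_HOST_FLOPS   = 150                  # the 10^150 flops claimed below
    LOG10_HEAT_DEATH_S = 100 + log10(3.15e7)  # assumed ~10^100 years, in seconds

    # Work with log10 of the operation counts; the raw numbers fit nowhere.
    log10_ops_needed    = (10 ** LOG10_N_ATOMS) * log10(2)  # log10(2^(10^80))
    log10_ops_available = LOG10_HOST_FLOPS + LOG10_HEAT_DEATH_S

    print(f"ops needed    ~ 10^({log10_ops_needed:.2e})")   # about 10^(3e79)
    print(f"ops available ~ 10^{log10_ops_available:.1f}")  # about 10^257.5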
Our laws of physics are space-partitioned, so the algorithm for simulating them isn't exponential.
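A toy illustration of what space-partitioned means computationally (a made-up local averaging rule, not real physics): each cell's update reads only its neighbours, so one step over the whole grid is linear in the number of cells:

    # Toy 1-D "local physics": each cell's next value depends only on its
    # immediate neighbours, so one step over the whole grid costs O(cells),
    # not anything exponential in the number of cells.
    def step(grid):
        n = len(grid)
        return [(grid[(i - 1) % n] + grid[i] + grid[(i + 1) % n]) / 3.0
                for i in range(n)]

    state = [0.0] * 10
    state[5] = 1.0          # a single disturbance
    for _ in range(3):
        state = step(state)
    print(state)            # the disturbance has only spread to nearby cells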
If the containing universe has, say, 21 dimensions but otherwise has computers of similar technology to ours today, then you should be able to simulate us in a datacenter just fine, because computational ability grows exponentially with the number of dimensions. With 3 dimensions you have a 2-dimensional computation surface; with 21 dimensions you have a 20-dimensional computation surface, so our current computation raised to the power of 10. GPT-3 used more than a petaflop of real-time compute during training, i.e. 10^15 flops. The same hardware in our fictive universe would give 10^150 flops. We estimate the atoms in our universe at about 10^80, so this computer would have 10^70 flops of compute per atom, which should be enough even if entanglement gets a bit messy. It would have around that much memory per atom as well, so it could compute lots of small boxes and sum over all of them, etc., to emulate particle waves. We wouldn't be able to detect computational anomalies on that small a scale, so we can't say there isn't such a computer emulating us.
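The arithmetic in that estimate, written out with its own assumptions taken at face value:

    # The comment's own numbers, written out.
    log10_gpt3_flops = 15                 # "more than a petaflop" -> 10^15
    scaling = 20 // 2                     # 2-D surface -> 20-D surface: exponent x10

    log10_host_flops     = log10_gpt3_flops * scaling       # 10^150 flops
    log10_atoms          = 80                                # ~10^80 atoms
    log10_flops_per_atom = log10_host_flops - log10_atoms    # 10^70 flops per atom

    print(f"host compute: 10^{log10_host_flops} flops")
    print(f"per atom:     10^{log10_flops_per_atom} flops")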
Heat death would be part of our simulation, not necessarily of the host. But since all of these simulation theories are basically religious beliefs, reasoning about them isn't very interesting.