Because deterministic compilation lets you (or someone else) do this automatically.
If you introduce a backdoor into the compilation step, you run a much greater risk of detection. As long as multiple machines are compiling packages and verifying that the checksums match, any single backdoored machine is caught immediately. This matters far more for package managers that do their own builds and ship their own binaries than for ones that just ship whatever they got from the developer.
Without deterministic compilation, two builds of the exact same code can legitimately differ, so a checksum mismatch proves nothing. That makes backdoors very hard to detect unless you already suspect that one is present in a particular program.
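To make the cross-check concrete, here's a minimal sketch in Python. The build command ("./build.sh") and artifact path ("out/app.bin") are hypothetical stand-ins for whatever your project actually uses. The idea: build the same source twice in clean directories, hash the artifacts, and compare. With a deterministic toolchain, any mismatch is a red flag; without one, it's just expected noise.

    import hashlib, shutil, subprocess, tempfile
    from pathlib import Path

    def build_and_hash(src: Path) -> str:
        """Copy the source into a clean temp dir, run the build,
        and return the SHA-256 of the resulting artifact."""
        with tempfile.TemporaryDirectory() as tmp:
            workdir = Path(tmp) / "src"
            shutil.copytree(src, workdir)
            # "./build.sh" and "out/app.bin" are hypothetical;
            # substitute your project's real build command and artifact.
            subprocess.run(["./build.sh"], cwd=workdir, check=True)
            artifact = workdir / "out" / "app.bin"
            return hashlib.sha256(artifact.read_bytes()).hexdigest()

    src = Path("path/to/project")
    h1, h2 = build_and_hash(src), build_and_hash(src)
    # Deterministic toolchain: a mismatch means tampering or a dirty build.
    # Nondeterministic toolchain: a mismatch tells you nothing.
    print("match" if h1 == h2 else f"MISMATCH: {h1} vs {h2}")

The same comparison works across machines: each builder publishes its hash, and anyone can check the shipped binary against them.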
Deterministic compilation forces people to embed backdoors directly in the source code repository, which creates an audit trail, shows up plainly in diffs, and is much easier to catch in review. You can still get away with it (see the XZ situation), but it takes far more work.
See also a related comment elsewhere in the thread: "crates.io does not host compiled artifacts. If packages on crates.io differ from their Git repository it's because of a custom pre-build step of that particular package, so a deterministic compilation toolchain won't help here."
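That caveat is checkable independently of the compiler, though: you can diff the published package contents against the corresponding Git tag. A minimal sketch (the two directory paths are hypothetical; point them at an unpacked registry tarball and a checkout of the matching tag):

    import hashlib
    from pathlib import Path

    def tree_hashes(root: Path) -> dict[str, str]:
        """Map each file's path (relative to root) to its SHA-256 digest."""
        return {
            str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(root.rglob("*")) if p.is_file()
        }

    # Hypothetical locations: the unpacked crates.io tarball and a
    # checkout of the matching Git tag.
    published = tree_hashes(Path("pkg-1.2.3-from-registry"))
    repo = tree_hashes(Path("pkg-checkout-at-v1.2.3"))

    for path in sorted(published.keys() | repo.keys()):
        if published.get(path) != repo.get(path):
            print(f"differs or missing: {path}")

Any file that differs is then either a legitimate pre-build artifact you can account for, or something that deserves a closer look.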
Prove me wrong! I'm open to it.
Or be the person who's too lazy to respond with an actual comment, just downvotes, and probably assumes they're right.
It means we can build the same thing and check the output hashes. As a method for establishing trust in build infrastructure, it's basically the be-all and end-all. I almost don't know how to answer the question.
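To spell out "check the output hashes": once several independent builders report hashes for the same release, spotting the odd one out is trivial. A minimal sketch, with made-up builder names and digests:

    from collections import Counter

    # Hypothetical hash reports from independent build machines
    # for the same release.
    reports = {
        "builder-a": "3f5c0aa1",
        "builder-b": "3f5c0aa1",
        "builder-c": "9d42ff17",  # the outlier
    }

    # The most common digest is the consensus; anything else is suspect.
    consensus, _ = Counter(reports.values()).most_common(1)[0]
    for builder, digest in reports.items():
        if digest != consensus:
            print(f"{builder} disagrees with consensus; audit its toolchain")

This only works because determinism guarantees honest builders all produce the same digest; without it, disagreement carries no signal.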
"the best way"? Please make the argument for why. To do it properly, you must steel-man the alternatives (not shoot down straw-men)