They are both interesting systems and I agree they try to capture every single dependency.
Now try adding Ragel into the mix, which has no `-M` flag or any other kind of dependency output. There are plenty of other tools like it, and practical projects eventually run into this limitation as they grow.
> I agree they try to capture every single dependency.
No... they do capture every dependency. They don't try. They do. Try getting something to build that has dependencies outside the Nix store. You can't, because you can't access the local file system, the internet, or anything else while putting things into the Nix store.
Nix doesn't care about `-M` output. It's a package manager that captures dependencies. If your package's build system doesn't have `-M`, then you must declare your dependencies yourself. This is how people used to do it before `-M`.
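For context, the pre-`-M` approach was roughly this: header dependencies written out by hand in the Makefile (a minimal sketch; the file names are hypothetical):

```
# If util.h or config.h changes, main.o is rebuilt. Forget to list a
# header here and you silently get stale builds.
main.o: main.c util.h config.h
	$(CC) $(CFLAGS) -c main.c -o main.o
```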
Do they even capture things like the C++ headers that your compiler uses? I think that's what he meant. When you compile a C++ file the compiler goes off and reads all kinds of files that are totally unknown to your build system.
Simpler languages like Go don't really have this problem, but they also have sane build systems so I don't know why you'd need CMake or whatever.
This is a problem that was solved long ago, and the solutions are only getting better. GCC has the `-M` family of options, which are explicitly designed to feed header dependencies into the build system, and those flags have been around since the dark ages. More modern systems like Bazel sandbox the build, so if you try to `#include` a file that isn't listed as a dependency, you just get a compiler error. If you want, you can specify the compiler itself as a dependency.
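To make the first point concrete, here is a minimal sketch of the `-M` family in action (`main.c` and `util.h` are hypothetical). `-MMD` writes a dependency file as a side effect of compilation, and `-MP` adds a phony target per header so a deleted header doesn't break the build:

```
$ gcc -MMD -MP -c main.c -o main.o
$ cat main.d
main.o: main.c util.h
util.h:
```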
Hermetic builds are the way to go for a ton of reasons.
They are not necessarily totally unknown. If your build system also handles dependencies, then it might be the one providing those header files in the first place.
The problem comes when some part of your toolchain wants to access something outside of the build environment.
That's not at all how it works. The compiler emits dependency files that the build system reads back in on subsequent passes to know whether any dependencies changed. There's no special knowledge of "these are all of the headers in the world we care about".
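A minimal sketch of that loop in Make (file names are hypothetical):

```
CFLAGS += -MMD -MP        # ask the compiler to emit .d files as it compiles
OBJS := main.o util.o

app: $(OBJS)
	$(CC) -o $@ $(OBJS)

# Read back the dependency files emitted on earlier passes; none exist on
# the first build, so the leading '-' tells Make not to complain.
-include $(OBJS:.o=.d)
```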
I disagree. The compiler knows about possible include paths from the `-I` flag, and every `#include` is resolved by searching those paths. There are a few assumed directories on Unix, such as `/usr/include`, but the compiler doesn't inherently know about "all includes". If you doubt me, run strace on a compilation and just look at how it finds headers. It doesn't know anything; it searches the include paths.
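Something like this shows it on Linux (a sketch; `main.c` and the `include/` directory are hypothetical, and the exact syscalls vary by libc):

```
# -f follows child processes, since gcc forks the real compiler (cc1).
$ strace -f -e trace=open,openat gcc -Iinclude -c main.c 2>&1 | grep '\.h"'
```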