While there are valid arguments against vendoring dependencies, I’m not convinced this is one of them in the typical case. It’s exceptionally easy to ignore certain directories when reviewing PRs in GitHub (although I still wish this were available as a repo-level preference), and I’d hope the same is true in GitLab, Bitbucket, etc. I don’t review vendored dependencies, and I wouldn’t expect anyone else to, although the utility of that is admittedly domain-dependent.
Go also has the benefit that its dependencies tend not to be in deep chains, so the level of repo bloat when vendoring is usually not too terrible, at least relatively speaking.
Yeah, and if you have a problem with it, split it into two separate commits to review separately.
But WTF is this about not reading your dependencies? Read your dependencies! It is the most amazing superpower when someone says “Uh, I don't know how Redux handles that” and you can just tell them, because you have read Redux. And that's also how you'll know: hey, do they have tests? Are they doing weird things like monkeypatching classes or binaries at runtime? “Oh, the request is lazy: it doesn't get sent unless you attach a listener to hear the response.” What would it look like for the debugger to step through their code, and is that reasonable for me to do, or will I end up 50 layers deep in the call stack before the code actually does the thing?
I get it: this dependency is 100,000 LOC, and if you printed it out that's basically 5 textbooks of code; you'd need a year to read all of it and truly understand it... Well, don't use that dependency! “But I need it for dependency injection...” If that's really all, then use a lightweight one, or roll a quick solution in a day, or explicitly inject your dependencies in 5 pages of code, or, or, or. My point is just that you have so many options. If that thing comes in at 5 or 50 textbooks or whatever it is, what it actually means is that you are pulling in something with a huge number of bells and whistles and you plan on using 0.1% of them.
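To make the “explicitly inject your dependencies” option concrete, here's a minimal sketch in Go of constructor injection with a consumer-defined interface; no framework needed. All names (`Store`, `Handler`, `mapStore`) are made up for illustration.

```go
package main

import "fmt"

// Store is the dependency the handler needs. Defining the interface on the
// consumer side is the idiomatic Go alternative to a DI framework.
type Store interface {
	Get(key string) string
}

// Handler receives its dependency explicitly at construction time.
type Handler struct {
	store Store
}

// NewHandler wires the dependency in; callers decide which Store to pass.
func NewHandler(s Store) *Handler { return &Handler{store: s} }

func (h *Handler) Greet(key string) string {
	return "hello, " + h.store.Get(key)
}

// mapStore is a trivial in-memory implementation, handy for tests.
type mapStore map[string]string

func (m mapStore) Get(key string) string { return m[key] }

func main() {
	h := NewHandler(mapStore{"user:1": "alice"})
	fmt.Println(h.Greet("user:1")) // prints "hello, alice"
}
```

Swapping in a real database-backed `Store` (or a fake for tests) is just a different argument to `NewHandler`; that's the whole "5 pages of code" version.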
In this context, what would be useful is something like linker pruning at the source level. That is, when your code is compiled, the linker can prune code that is never called. A feedback mechanism could then show which parts of a dependency are actually used (like inspecting the linker's .map file).
Google's Closure Compiler did this for JavaScript, where it matters because network bandwidth is a limited resource in some places. There it was called “tree shaking,” if you want the jargon name for it.
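The feedback mechanism described above boils down to a reachability pass over a static call graph: anything not transitively callable from the entry point is prunable. Here's a toy sketch in Go; the call graph and function names are invented for illustration, and a real tool would extract the graph from the compiler or linker.

```go
package main

import "fmt"

// reachable computes the set of functions transitively callable from root,
// given a static call graph. This mirrors the dead-code-elimination pass a
// linker (or a tree-shaking bundler) performs.
func reachable(callGraph map[string][]string, root string) map[string]bool {
	seen := map[string]bool{root: true}
	stack := []string{root}
	for len(stack) > 0 {
		fn := stack[len(stack)-1]
		stack = stack[:len(stack)-1]
		for _, callee := range callGraph[fn] {
			if !seen[callee] {
				seen[callee] = true
				stack = append(stack, callee)
			}
		}
	}
	return seen
}

func main() {
	// Hypothetical graph: main only uses a sliver of the dependency.
	callGraph := map[string][]string{
		"main":         {"dep.Parse"},
		"dep.Parse":    {"dep.lex"},
		"dep.lex":      {},
		"dep.Render":   {"dep.template"}, // never reached from main
		"dep.template": {},
	}
	used := reachable(callGraph, "main")
	for fn := range callGraph {
		if !used[fn] {
			fmt.Println("prunable:", fn)
		}
	}
}
```

Running that reports `dep.Render` and `dep.template` as prunable, which is exactly the "how much of this dependency am I actually using" feedback the comment is asking for.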