Products should be made of programs. A program should be able to be simple enough to be "complete," while how programs get built, replaced, planned, and orchestrated stays as fluid as the problem being solved.
Many systems have tried to be this, but people too frequently side-step these decomposition schemes because they aren't complete enough. Unix wasn't enough, so now we have Docker, etc.
OOP tries to achieve this from a different level, on the inside of complex systems, but still falls short. (For example, to do live code replacement, or to run multiple independently developed implementations of the same concept simultaneously and collect data on which does the better job, you need IPC or a message-passing VM where objects ≈ processes, so this sort of thing is not common.)
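A minimal sketch of the "objects as message-passing processes" idea described above, with hypothetical names and Python threads standing in for real processes: two independently written implementations of the same concept sit behind mailboxes, a caller can compare their answers, and replacing an implementation is just swapping the function behind the mailbox, with no restart.

```python
import queue
import threading

def sort_v1(xs):
    # Implementation A: the builtin sort.
    return sorted(xs)

def sort_v2(xs):
    # Implementation B: a naive insertion sort, independently written.
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] < x:
            i += 1
        out.insert(i, x)
    return out

class Cell:
    """An 'object' that is really a process: it only talks via its mailbox."""
    def __init__(self, impl):
        self.impl = impl                  # swappable behaviour
        self.mailbox = queue.Queue()
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while True:
            msg, reply_to = self.mailbox.get()
            reply_to.put(self.impl(msg))  # impl looked up per message

    def call(self, msg):
        reply = queue.Queue()
        self.mailbox.put((msg, reply))
        return reply.get()

    def swap(self, new_impl):
        # Live code replacement: future messages use the new behaviour.
        self.impl = new_impl

a, b = Cell(sort_v1), Cell(sort_v2)
data = [3, 1, 2]
agree = a.call(data) == b.call(data)      # both implementations, same input
a.swap(sort_v2)                           # hot-swap A's behaviour
```

Because `impl` is looked up on every message, the swap takes effect without tearing down the "process" or its mailbox, which is the property the comment says ordinary in-process OOP makes hard.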
It'd be nice if, within my lifetime, progress toward building smaller systems that avoid complexity began to outpace progress toward building larger systems that defend against complexity, but I can't say I'm hopeful.
It's mutual. Unix failed developers of large systems that stand on the shoulders of open-source software libraries by providing no dependency resolution and no versioning or namespacing of source and built files. (An obvious [non-]decision given the timeline of its inception and its intended usage, but a failure nonetheless.)
So then we got some of the former from system package managers designed for system administrators, and some of the latter from version-control, build, and dependency tools designed by and for software developers, but the two never meet in the middle.
Docker is now mostly used to bridge that gap by forcing the system package manager to act as a per-app-namespaced dependency and build-support tool.
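A sketch of what that looks like in practice; the base image, package names, and paths here are illustrative, not from the source. Each image carries its own copy of the system package database, so `apt` effectively becomes a dependency manager namespaced to one app:

```dockerfile
FROM debian:bookworm-slim

# The system package manager, scoped to this one app's image:
# another app on the same host can install different packages
# (or versions) without any conflict.
RUN apt-get update && apt-get install -y --no-install-recommends \
        libpq5 \
        ca-certificates \
    && rm -rf /var/lib/apt/lists/*

COPY ./app /opt/app
CMD ["/opt/app/run"]
```

The per-app namespacing comes entirely from the image boundary, not from anything the package manager itself understands, which is the "forcing" the comment describes.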
> Unix didn't solve problems for developers of large systems
That's a big problem right there. Why are systems large in the first place? http://vpri.org/ has already shown we can do smaller, likely by two to four orders of magnitude. Solving the problems of large systems will only mitigate the symptoms. The root cause is size. We should solve that instead.
Sometimes problems really are just large. Sometimes you need results faster than a single microcomputer's CPU(s) can produce them, and UNIX was invented too early to come with solutions for doing that in ways that prevent complexity. HPC is paying the price.
We're fixing it (efforts like Kubernetes, Hadoop, Mesos), but it's tough and we have a long way to go.