
Sure, but that feels like a band-aid on an existing problem. And sometimes a band-aid is what you need, especially with legacy systems, so I'm not judging that. But I think for a new project, these days, it's a mistake to plan on needing Docker from the outset.

Modern languages with modern tooling can automatically take care of dependency management, running your code cross-platform, etc. [0] And several of them even have an upgrade model where you should never need an older version of the compiler/runtime; you just keep updating to the latest and everything will work. I find that when circumstances allow me to use one of these languages/toolsets, most of the reasons for using Docker just disappear.

[0] Examples include Rust/cargo, Go, Deno, Node (to an extent)




How do you mean?

Say the new project has no tests and runs on a PHP 7.2 and MySQL installation, and I need to upgrade it to PHP 8.2: I first need to write tests, and I can't use PHP 8.2 features until I've upgraded.

Composer is the dependency manager, but I still need PHP to run the app later on. And the same project might behave differently depending on whether it runs on PHP 7.2 or PHP 8.2. And sometimes PHP is just one part of the equation: there might be an Angular frontend and an nginx / php-fpm backend, maybe Redis, Postgres, logging, etc. They need to be wired together as well.

I'm into backend web development and do a bit of frontend development, but whoa, I get confused with all the node.js, npm, yarn versioning with corepack and nvm and whatnot. Here even a "yarn install" behaves differently depending on which yarn version I have. I'd rather have docker take care of that one for me.

I feel like "docker (compose)" and "make" are widely available and language-agnostic enough and great for my use cases (small to medium sized web apps), especially since I develop on Linux.

Something language-specific like pyenv might work as well, but might be too lightweight for wiring up other tools. I used to work a lot with Vagrant, but that seems to be more on the "heavy" side.

Edit: I just saw your examples. Unfortunately I've only dabbled a bit with Go and haven't worked with Rust yet, so I can't comment on those, but I'd be interested to know how they work re: local dev setup.


> and I can't use PHP 8.2 features until I've upgraded

Yeah, so in Rust, the compiler/tooling never introduces breaking changes, as (I think) a rule. For any collection of Rust projects written at different times, you can always upgrade to the very latest version of the compiler and it will compile all of them.

The way they handle (the very rare) breaking changes to the language itself is really clever: instead of a compiler version, you target a Rust "edition", where a new edition is established every three years. And then any version of the Rust compiler can compile all past Rust editions.
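Concretely, the edition is just one line in Cargo.toml, so a brand-new compiler will still build a crate that declares an old edition (the crate name below is made up):

    [package]
    name = "old-crate"   # hypothetical crate
    version = "0.1.0"
    edition = "2015"     # today's rustc still compiles 2015-edition code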

Node.js isn't quite as strict with this, though it very rarely gets breaking changes these days (partly because JavaScript itself virtually never gets breaking changes, because you never want to break the web). Golang similarly has a major goal of not introducing breaking changes (again, with some wiggle-room for extreme scenarios).

> I get confused with all the node.js, npm, yarn versioning with corepack and nvm and whatnot. Here even a "yarn install" behaves differently depending on which yarn version I have.

Hmm. I may be biased, but I feel like the Node ecosystem (including yarn) is pretty good about this stuff. Yarn had some major changes to how it works under the hood between its major versions, but that stuff is mostly supposed to be transient implementation details. I believe it still keys off of the same package.json/yarn.lock files (which are the only things you check in), and it still exposes an equivalent interface to the code that imports dependencies from it.
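One thing that should help with the "yarn install behaves differently" part, if I understand corepack right: newer Node bundles it, and after a one-time "corepack enable" it reads the package manager pin out of package.json and fetches that exact yarn version. Something like (name and version here are just examples):

    {
      "name": "my-app",
      "packageManager": "yarn@4.1.0"
    }

So the project, not the machine, decides which yarn runs.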

nvm isn't ideal, though I find I don't usually have to use it because, like I said, Node.js rarely gets breaking changes. Mostly I can just keep my system version up to date and be fine, regardless of the project.

Configuring the Node ecosystem's build tools gets really hairy, but once they're configured I find them to mostly be plug and play in a new checkout or on a new machine (or a deployment); install the latest Node, npm install, npm run build, done. Deno takes it further and mostly eliminates even those steps (and I really hope Deno overtakes Node for this and other reasons).

> maybe Redis, Postgres, logging, etc. They need to be wired together as well

I think this - grabbing stock pieces off the shelf - is the main place where Docker feels okay to use in a local environment. No building dev images, minimal state/configuration. Just "give me X". I'd still prefer to just run those servers directly if I can (or even better, point the code I'm working on at a live testing/staging environment), but I can see scenarios where that wouldn't be feasible.
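To be concrete, I mean something like this per project, with versions pinned (service names, ports, and the throwaway password are illustrative):

    services:
      postgres:
        image: postgres:16        # pinned per project
        environment:
          POSTGRES_PASSWORD: dev  # local-only throwaway credential
        ports:
          - "5432:5432"
      redis:
        image: redis:7
        ports:
          - "6379:6379"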


I'm not gonna lie, I didn't read all that, but the Node example alone proves you either didn't read the guy you replied to or haven't been coding long enough to grok the problem.

What if you want to start a new project on the latest Postgres version because it has a new feature that will be handy, but you already maintain another project that uses a Postgres feature or relies on behaviour that was removed or changed in the latest version? You're going to set up a whole new VM on the internet to be a staging environment, and instead of setting up a testing and deployment pipeline you're going to just FTP / remote-ssh into it and change live code?

You define an app's entire chain of dependencies, including external services, in a compose file / set of kube manifests / terraform config for ECS. Then in the container definition itself you lock down things like C library and distro versions: maybe you use a specially patched imagemagick on one project or a PDF generator on another, and fontconfig defaults were updated between distro releases in a way that changed how aliasing works, and now your fonts are all fugly in generated exports... Stick all those definitions in a Dockerfile and deploy onto any Linux distro / kernel, and it'll look identical to how it does locally.
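The shape of it, if you've never seen it (the image tag, packages, and config file here are placeholders, not a real project):

    FROM debian:12.5    # pin the exact distro release, never "latest"
    RUN apt-get update \
     && apt-get install -y --no-install-recommends imagemagick fontconfig \
     && rm -rf /var/lib/apt/lists/*
    # ship a known-good font config so rendering matches local output
    COPY fonts.conf /etc/fonts/local.conf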

Never mind this; check out this thread to destroy your illusion that simply having Node installed locally will make your next project super future-proof: https://github.com/webpack/webpack/issues/14532 (and note that some of the packages referencing this old issue in newly opened bug reports are very popular!)

If you respond, please do not open with "yeah but rust"; I can still compile Fortran code too.


Your comment makes it clear you didn't read the guy you replied to, and I'm not sure it would've been in good faith even if you had, so I'm not going to spend time writing a full response.


I do work with Go and it doesn’t preclude the utility of containers in development for out-of-process services.


I'm sorry, but a complex system goes WAY beyond just the language being used. Not using Docker (or something very similar) would only result in a massive waste of time right out of the gate.

I have to assume you work by yourself or at an extremely small company to be able to handle project complexity without Docker.



