
* I never skip (minor) versions.

* I have automated backups created every 2 days and stored in S3; I've done a full restore twice in 5+ years of uptime.

* I run Gitlab at home and at work.

None of those points touches on a maintenance burden ... Just saying. Skipping versions while updating any software is just being a lazy sysadmin and praying it works. Skipping across major versions during upgrades typically comes with breaking changes, so operator beware.




> Skipping versions while updating any software is just being a lazy sysadmin and praying it works. Skipping across major versions during upgrades typically comes with breaking changes, so operator beware.

It's nice that GitLab actually prevents you from doing that, gives you a message that it's unsupported, and directs you to their documentation, which describes the supported upgrade paths...

However, at the same time one cannot help but wonder why you can't go from version #1 to version #999 in one go. Most of the software at my dayjob (at least the software I've written) absolutely can do that, since the DB migrations are fully automated, even if I had to push back a lot against other devs going: "Well, it would be easier just to tell the clients who are running this software to do X manually when going from version Y to Z."
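
To give a concrete (if simplified) idea of what I mean by fully automated, here's a minimal sketch, assuming a schema_version table and numbered SQL files (all the names here are my own invention, not any particular product's):

    import sqlite3
    from pathlib import Path

    def migrate(db_path, migrations_dir):
        """Apply every pending migration in order, so any old install
        can reach the latest version in a single run."""
        conn = sqlite3.connect(db_path)
        conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
        current = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()[0] or 0
        # Files are named like 0001_create_users.sql, 0002_add_index.sql, ...
        for script in sorted(Path(migrations_dir).glob("*.sql")):
            version = int(script.name.split("_")[0])
            if version <= current:
                continue  # already applied during some earlier upgrade
            conn.executescript(script.read_text())
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
            conn.commit()

As long as every change ships as one of those numbered scripts, the gap between the installed version and the target version stops mattering.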

But GitLab's updates are largely automated (the Chef scripts, the PostgreSQL migrations, etc.); it's just that for some reason they either don't ship all of the migrations, or they require that you visit certain key points throughout the process which cannot be skipped (e.g. certain milestone releases, as described in their docs).
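
Conceptually the milestone requirement just turns a single jump into a path with mandatory stops, something like this sketch (the version numbers below are made up for illustration, not GitLab's actual milestone list):

    # Made-up milestone list, purely to illustrate the "required stops" idea.
    MILESTONES = [(8, 17), (9, 5), (11, 11), (12, 10)]

    def upgrade_path(current, target):
        stops = [m for m in MILESTONES if current < m < target]
        return stops + [target]

    # Upgrading from 9.2 straight to 13.12 still has to visit each stop:
    print(upgrade_path((9, 2), (13, 12)))
    # [(9, 5), (11, 11), (12, 10), (13, 12)]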

Of course, I acknowledge that it's extremely hard to sustain backwards compatibility, and I've seen numerous projects start out that way only for the devs to give up on the idea at the first sign of difficulty, since it's not like they care much for it and it doesn't always lead to a clear value add - it's a nice-to-have, and they won't earn any less of a salary for making some ops person's life harder down the line.

I also have automated backups with BackupPC; however, I expect software to remain reasonably secure and stable without having to update that often. Props to GitLab for disclosing the important releases, but I'm migrating over to Gitea for my personal needs as we speak, even if having someone else manage a GitLab install at work is still like having a superpower (with GitLab CI, GitLab Registry, etc.).

I actually wrote an article about how really frequent updates cause problems and lots of churn: https://blog.kronis.dev/articles/never-update-anything (though the title is a bit tongue in cheek, as explained by the disclaimer at the top of the article).


Your db migrations may support updates from #1 to #999, but your OS does not directly support updates of MySQL 5 to MySQL 8 without issues. For instance, there are plenty of my.cnf configuration values that were deprecated or removed between those versions. Similarly, APT on Ubuntu will prompt you about how to handle a my.cnf that differs from the distribution's release when upgrading versions. Often this is more painful than minor version updates.
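
About the best you can do there is a pre-flight check before the jump; a rough sketch (the query cache options genuinely were removed in MySQL 8.0, but treat the list as incomplete):

    # Flag my.cnf options that MySQL 8.0 no longer accepts before upgrading.
    REMOVED_IN_8 = {"query_cache_size", "query_cache_type", "innodb_file_format"}

    def check_mycnf(path):
        flagged = []
        with open(path) as f:
            for line in f:
                key = line.split("=")[0].strip().replace("-", "_")
                if key in REMOVED_IN_8:
                    flagged.append(key)
        return flagged

    print(check_mycnf("/etc/mysql/my.cnf"))  # e.g. ['query_cache_size']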

I think the version milestones in GitLab are akin to dependency changes for self-hosted GitLab. An example is the GitLab v9 (?) upgrade to Postgres v11, I think; it was opt-in for a prior version of GitLab, then required at that version milestone. It's difficult to make db migration scripts for GitLab, as in your example, when they may depend on newer Postgres idioms not available in the legacy db version. So you can't simply support GitLab updates from version X to version Y due to underlying dependency constraints...
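
In that situation the migration runner can only refuse and tell you to upgrade the database first; a hypothetical sketch using psycopg2 (the version gate and the reason for it are invented for illustration):

    import psycopg2

    MIN_PG = 110000  # say the migration uses CREATE PROCEDURE, added in Postgres 11

    def apply_migration(dsn, sql):
        conn = psycopg2.connect(dsn)
        with conn, conn.cursor() as cur:
            cur.execute("SHOW server_version_num")
            if int(cur.fetchone()[0]) < MIN_PG:
                raise RuntimeError("migration needs Postgres >= 11; "
                                   "upgrade the database before this milestone")
            cur.execute(sql)
        conn.close()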

Thanks for the insightful discourse.


> Your db migrations may support updates from #1 to #999, but your OS does not directly support updates of MySQL 5 to MySQL 8 without issues.

That's just the thing - more software out there should have a clear separation between the files needed to run it (binaries, other libraries), its configuration (either files, environment variables or a mix of both) and the data that's generated by it.

The binaries and libraries can easily have breaking changes and be incompatible with one another (essentially treat them as a blob that fits together, though dynamic linking muddies this). The configuration can also change, though it should be documented, and the binaries should output warnings in the logs in such cases (like GitLab actually already does!). The data should have extra care taken to keep it compatible between most versions, with at least forward-only migrations available in all other cases (since backward, i.e. rollback, migrations are just too hard to do in practice).
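
The configuration part of that is cheap to implement, too; a tiny sketch with invented key names, just to show the pattern:

    import logging, os

    log = logging.getLogger("myapp.config")  # app name invented for the example

    RENAMED = {"APP_DB": "APP_DATABASE_URL"}  # old env var -> new one

    def get_setting(name, default=None):
        """Honour deprecated names, but warn loudly in the logs."""
        for old, new in RENAMED.items():
            if name == new and new not in os.environ and old in os.environ:
                log.warning("%s is deprecated, use %s instead", old, new)
                return os.environ[old]
        return os.environ.get(name, default)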

Alas, I don't install most software on my servers anymore, merely Docker (or Podman, basically any OCI-compatible technology) containers with specific volumes or bind mounts for the persistent data. GitLab is pretty good in this regard with its Omnibus install, though there are certainly a few problems with it if you try to do too many updates or have a non-standard configuration.

I actually wrote more about it, and about why I just migrated away from GitLab to Gitea, Sonatype Nexus and Drone CI, on my blog: https://blog.kronis.dev/articles/goodbye-gitlab-hello-gitea-...

Of course, I'll still use GitLab at my company, because there it's someone else's job to keep it running, hopefully with an appropriate amount of resources to do so with minimal downtime and all the relevant updates. But at the same time, in certain circumstances (like my memory-constrained homelab setup), it makes sense to look into multiple lightweight, integrated solutions.

You can find more information in that article about what broke for me while doing updates in particular: seemingly something cgroups-related, with Gitaly not having the write permissions it needed inside of the container, which later led to the embedded PostgreSQL failing catastrophically. In comparison, right now I just have Gitea for similar goals, which is a single binary that uses an SQLite database, alongside the other aforementioned tools for CI and artefact storage, which are similarly decoupled.

It's probably all about constraints, drawbacks and finding what works best for you!




