
Great article, I wish I'd had something like that three years ago.

Adding my personal tips:

- Do not use GitLab-specific caching features unless you love vendor lock-in. Instead, use multi-stage Docker builds. This way you can also run your pipeline locally and all your GitLab jobs will consist of "docker build ..." (sketch after this list).

- Upvote https://gitlab.com/gitlab-org/gitlab-runner/-/issues/2797. Testing GitLab pipelines should not be such a PIA.
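
For the multi-stage tip, a minimal sketch of what I mean; the stage names, base image, and commands here are made up for illustration:

    # Dockerfile -- stage names, base image, and commands are illustrative
    FROM node:20 AS build
    WORKDIR /app
    COPY package*.json ./
    RUN npm ci
    COPY . .
    RUN npm run build

    # Test stage builds on top of the build stage
    FROM build AS test
    RUN npm test

    # Release stage copies only the built artifacts
    FROM nginx:alpine AS release
    COPY --from=build /app/dist /usr/share/nginx/html

Each GitLab job then boils down to "docker build --target test ." or "docker build --target release .", which runs the same on your laptop as in CI.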




In a previous life, I set up CI runner images (Amazon AMIs) that had all of our Docker base images pre-cached and ran a custom Docker cleanup script that excluded images with certain tags. This meant a new runner was relatively quick off the blocks and got faster as it built/pulled more images.
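
Roughly like this, if anyone wants the idea; the keep-list convention (sparing tags that start with base- or cache-) is invented for illustration:

    #!/usr/bin/env bash
    # Remove local images except those whose tag matches the keep-list.
    docker images --format '{{.Repository}}:{{.Tag}} {{.ID}}' \
      | grep -Ev ':(base|cache)-' \
      | awk '{print $2}' \
      | sort -u \
      | xargs -r docker rmi 2>/dev/null || true  # images in use are skipped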

You can get better cache hit rates by tagging your GitLab runners and pinning projects to certain tags.
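
Concretely, that's just the tags: keyword in .gitlab-ci.yml plus the matching tag on the runners you register; the tag name here is hypothetical:

    # .gitlab-ci.yml -- the runner tag name is hypothetical
    build:
      tags:
        - docker-warm-cache  # only runners registered with this tag take the job
      script:
        - docker build -t myapp .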

Also this, on sharing cache across multiple hosts using BuildKit + buildx: https://medium.com/titansoft-engineering/docker-build-cache-...
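
The short version, if you don't want to click through; the registry and image refs are placeholders:

    # Push the layer cache to a registry so other hosts can reuse it.
    docker buildx build \
      --cache-to   type=registry,ref=registry.example.com/myapp:buildcache,mode=max \
      --cache-from type=registry,ref=registry.example.com/myapp:buildcache \
      -t registry.example.com/myapp:latest \
      --push .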


I use GitLab's private registry + scheduled pipelines to prebuild our base images, but that's definitely some extra spice. Thanks for sharing!
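
For anyone curious, a rough sketch of that kind of job, using GitLab's predefined CI_REGISTRY_* variables; the job name and Dockerfile path are made up:

    # .gitlab-ci.yml -- job name and Dockerfile path are made up
    prebuild-base-image:
      rules:
        - if: $CI_PIPELINE_SOURCE == "schedule"
      script:
        - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
        - docker build -f docker/base.Dockerfile -t "$CI_REGISTRY_IMAGE/base:latest" .
        - docker push "$CI_REGISTRY_IMAGE/base:latest"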


Great tips, thank you!

> Instead, use multi-stage Docker builds. This way you can also run your pipeline locally and all your GitLab jobs will consist of "docker build ..."

There's a section in the pipeline efficiency docs with more tips and tricks for optimizing Docker images: https://docs.gitlab.com/ee/ci/pipelines/pipeline_efficiency....



