I'm the Founder at Distelli and we've been helping our customers deploy to VMs for almost a year.
Now we've gotten several requests from folks asking us if we can build pipelines to their Kubernetes clusters, and we're excited to share what we've built.
We're a small group of engineers and we value honest feedback so tell us what you think.
Wercker will build your Docker images, but deployments from Wercker to k8s require kubectl, which makes it hard to set up pipelines that deploy from dev -> test automatically and then promote to prod manually once test is working fine.
In the scenario above, dev and test are assumed to be different namespaces on the same or different clusters.
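For reference, a kubectl-based deploy step in a wercker.yml ends up looking roughly like this - the step name, env var names and deployment name here are just illustrative:

    deploy:
      steps:
        - script:
            name: deploy to k8s
            code: |
              # assumes kubectl is available in the pipeline container and the
              # cluster endpoint/token are set as protected environment variables
              kubectl --server=$KUBE_SERVER --token=$KUBE_TOKEN \
                --namespace=dev \
                set image deployment/myapp myapp=$IMAGE_REPO:$GIT_COMMIT

Promoting from dev to test to prod then amounts to running the same thing again with a different namespace or cluster.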
Finally, we have a complete cluster management dashboard so you can launch new clusters and administer existing ones right in the UI.
So before this weekend I'd only set up continuous deployment of a simple Rails app & database to a cluster using CircleCI [0]. This was quite a lot more complicated than I'd have liked, and this weekend I thought I'd give wercker and distelli a go (based on seeing this post).
I found the wercker docs a little lacking. The wercker.yml docs in particular felt incomplete [1] - I already know how to write yaml... However, when I looked around a little with the help of Google I found they had some better tutorial-style content on their blog and followed this guide [2]. While the guide could be easier to follow, it was basically what I needed, and I've been able to get one of their sample apps up and running OK on a test cluster.
Since this was a little awkward/unclear at times, I thought I'd give distelli a go at the same task:
- Nice being able to take containers from different repos
- Good to be able to create clusters but I wasn't able to connect to an existing cluster on GKE.
- Wasn't able to find anything in the documentation [3] (but managed all the same - credit to the UI?)
- Not clear how I create other Kubernetes artefacts (services etc)
- I actually quite like the Kubernetes yml files; distelli seems to abstract this away.
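For reference, by the yml files I mean something like this minimal deployment spec (the names and image here are placeholders):

    apiVersion: extensions/v1beta1
    kind: Deployment
    metadata:
      name: web
    spec:
      replicas: 2
      template:
        metadata:
          labels:
            app: web
        spec:
          containers:
          - name: web
            image: gcr.io/my-project/web:latest
            ports:
            - containerPort: 3000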
So sorry for not responding sooner - I really do appreciate you following up like this. I've finally had a bit of time to look into this again & while I didn't get quite as far as I'd have hoped, I do have a few new comments / questions.
I'd like to push this to github; then have a CI service pick it up; run the tests for each service; build all the separate images; push these to GCR; and trigger a kubectl apply.
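Roughly, I imagine the CI config as something like the following sketch (provider-agnostic, with made-up service names and env vars, and assuming GCR auth is already configured):

    build:
      steps:
        - script:
            name: test and build images
            code: |
              docker-compose run web bundle exec rake test
              docker-compose run api bundle exec rake test
              docker build -t gcr.io/$PROJECT_ID/web:$GIT_COMMIT ./web
              docker build -t gcr.io/$PROJECT_ID/api:$GIT_COMMIT ./api
              docker push gcr.io/$PROJECT_ID/web:$GIT_COMMIT
              docker push gcr.io/$PROJECT_ID/api:$GIT_COMMIT
    deploy:
      steps:
        - script:
            name: apply manifests
            code: |
              kubectl apply -f k8s/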
So far the closest I've been to getting this working is with Wercker. However, there I've found it hard to get into their no-Dockerfile system and have found the `wercker dev --expose 3000` workflow to be frustratingly slow. I'd rather not have Dockerfiles for development and then have wercker build them in a different way for production.
Hope that all makes sense; if you've got the time to explain how this might work on distelli or point me in the direction of a relevant guide that'd be awesome. If not, I hope the comments are helpful.
Thanks again for keeping in touch & sorry for taking so long to review your changes.
Hi, I'm an engineer at Distelli. I have recently written a series of tutorials that I believe addresses your use case.
The tutorials take a simple application composed of 2 services that can be brought up together locally using docker-compose, and use distelli to facilitate the following workflow:
1. Commit to GitHub to trigger builds of both services that run tests and, on completion, push both images separately to GCR image repositories.
2. Set up pipelines that apply the new images to an existing Kubernetes deployment specification and deploy to one or more deployment destinations (a combination of cluster, namespace, and name).
3. Receive notifications once actions are complete for builds, deployments, etc., and monitor the status of deployed containers, including logs.
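For concreteness, the sample app's docker-compose setup looks roughly like this (the service names here are illustrative rather than the exact ones used in the tutorials):

    version: '2'
    services:
      web:
        build: ./web
        ports:
          - "3000:3000"
        depends_on:
          - api
      api:
        build: ./api
        ports:
          - "8080:8080"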
Here are the tutorials. Each is pretty short on its own: