
(disclosure: I do work for Codeship :P )

There are a lot of really great reasons to use Docker, or any container technology for CI.

First off, containers give you a standard interface to a reproducible build. You can run something in a container and expect it to behave the same way as the same container run on a co-worker's workstation, or in the staging or production environments. For CI this is an absolute necessity. Rather than running tests locally and hoping a CI server that closely tracks production/staging will catch issues caused by different versions of the OS or libraries, you can expect any build that passes locally to also pass on CI. This cuts down on a lot of potential back and forth. The only shared dependency between CI/local/prod/staging is Docker itself.
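To make that concrete, here's a minimal sketch of what a test image might look like. This is a hypothetical Dockerfile.test for a Node.js app; the base image, paths, and commands are illustrative, not anything from our actual setup:

```dockerfile
# Hypothetical Dockerfile.test -- all names here are illustrative
FROM node:0.12

# Install dependencies before copying source, so this layer is
# cached between builds and only re-runs when package.json changes
WORKDIR /app
COPY package.json /app/
RUN npm install

# Copy the rest of the source and run the test suite by default
COPY . /app
CMD ["npm", "test"]
```

Anyone with Docker installed gets the exact same Node version, packages, and test entry point, which is the whole point of the reproducibility argument above.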

Another benefit is (almost) complete isolation. Rather than maintaining a different VM image per project, you can have a single VM image with Docker installed and run a container on that VM for any version of any build across your system. From a CI perspective you can abstract most of the complex configuration for your applications into something like "docker build -t myapp_test -f Dockerfile.test . && docker run myapp_test".
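As a sketch, the shell step you'd drop into a CI job (Jenkins, or whatever you run) can be just those two commands; the image tag is arbitrary and `Dockerfile.test` is a hypothetical file in the repo root:

```shell
#!/bin/sh
set -e  # fail the CI step as soon as any command fails

# Build the test image from the repo root;
# -f points docker build at the test-specific Dockerfile
docker build -t myapp_test -f Dockerfile.test .

# Run the suite; --rm cleans up the container afterwards,
# and its exit code becomes the build status
docker run --rm myapp_test
```

All the project-specific complexity lives inside the Dockerfile, so the CI server itself never needs per-project configuration.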

Containers use a layered, copy-on-write filesystem, so N running containers for an application take up 1x the size of the container image, plus N times the average amount of data the running containers write on top of that base image. For example, ten containers started from a 1 GB image, each writing around 50 MB, consume roughly 1.5 GB rather than 10 GB. This makes even large images space efficient, without different instances treading on each other's files.

The line between dev and ops blurs a little (devops), but responsibilities stay clear: ops maintains the Docker infrastructure, and dev owns everything inside the container boundary, i.e. the container image, installed packages, code compilation, and how the containers interact. A container mantra is "no more 'well it worked on MY machine'". If it works for the dev, it really will work in prod.

Besides this, there are a number of benefits around speed, accessibility, debugging, standardization; the list goes on. There are also a ton of great and varied Docker CI solutions out there, from Docker-based CI services like us (Codeship), Shippable, Drone, and CircleCI, to standard solutions like Jenkins via plugins. Many hosting providers are adding Docker redeploy hooks for CI purposes. The standardized nature of containers makes it trivial for vendors to provide integrations. Even if you don't use Docker yourselves, this is certainly a great space to watch.

Technically you can use Docker for CI/CD without using it to deploy your app. You lose some of the benefits listed above, most notably the cohesion between CI/local and prod, but you still gain a whole lot in terms of speed and reduced complexity within your CI infrastructure.

Thomas Shaw gave a great talk at DockerCon on introducing Docker at Demonware for CI across a variety of projects. I don't think the video is up yet, but it's well worth a watch if you're thinking of bringing Docker into your company. In the meantime we wrote a blog post on his talk: http://blog.codeship.com/dockercon-2015-using-docker-to-driv....

We are just starting a beta for our new CI flow, which follows the container paradigm very closely. It lets you define Docker Compose stacks for your various application images and run your CI/CD pipeline locally, exactly as it would run on our hosted platform.
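For anyone unfamiliar with Compose stacks, here's a rough sketch of the kind of thing I mean. The service names, images, and environment values are hypothetical, and this is plain docker-compose syntax, not our beta's actual format:

```yaml
# Hypothetical docker-compose.yml for a test stack -- names are illustrative
web:
  build: .
  dockerfile: Dockerfile.test
  links:
    - db
  environment:
    - DATABASE_URL=postgres://postgres@db/myapp_test
db:
  image: postgres:9.4
```

The app and its database come up together with one command, locally or on the CI host, which is what makes "run the pipeline locally, exactly as hosted" possible.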

If anyone is interested in joining our beta, just drop me an email: brendan at codeship.com.
