Ask HN: How do you handle CI/CD for databases
16 points by totaldude87 33 days ago | 7 comments
How do you handle database setup, scripts, and other nominal DML/DDL steps via CI/CD pipelines? Looking for suggestions on tools and software (both OSS and commercial) for any of the major databases, such as MySQL, Oracle, or Postgres.



Take a look at Database Lab: https://postgres.ai/products/database-migration-testing

The idea is to use block-level copy-on-write (ZFS by default; LVM2 is an option too, though less powerful) to get thin clones really fast. Currently it supports only Postgres. The engine is open source. The result: a single machine with a Postgres DB of any size, you pay 1x for storage and compute, and can have dozens of independent DBs; provisioning takes a few seconds.
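
For anyone unfamiliar with the thin-clone idea, here is a generic illustration using plain ZFS commands (this is not the Database Lab CLI itself, and the pool/dataset names are made up): a snapshot of the data directory is nearly free, and each clone is a writable copy-on-write view of it.

    # Hypothetical pool/dataset names; illustrates the copy-on-write mechanism,
    # not Database Lab's actual commands.

    # One-time: snapshot the dataset holding the Postgres data directory
    zfs snapshot dbpool/pgdata@base

    # Per CI job: create a writable thin clone from that snapshot (takes seconds,
    # uses almost no extra space until the clone diverges)
    zfs clone dbpool/pgdata@base dbpool/pgdata-ci-1234

    # Point a throwaway Postgres instance at the clone, run migrations/tests,
    # then destroy the clone when the job finishes
    zfs destroy dbpool/pgdata-ci-1234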

This is used in a CI/CD context, so getting temporary full-size clones for any tests is not a problem any more.

We are constantly adding new features and looking forward to new use cases.

Some case studies with numbers are here: https://postgres.ai/resources

Here is how GitLab uses it for DB testing: https://docs.gitlab.com/ee/architecture/blueprints/database_...

(disclaimer: I'm the founder of Postgres.ai, the company that develops Database Lab).


Previously I used an ephemeral database for all environments except stage/prod. The ephemeral DBs just ran in Docker containers. In all environments we used Flyway to manage the database. Flyway can run again and again and won't do anything if your database is already up to date. For the ephemeral databases, it could initialize them from scratch in the same way.
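
A minimal sketch of that setup with Docker Compose (the commenter didn't specify the engine, so Postgres, the service names, credentials, and the migration path are all assumptions): start a throwaway database, point Flyway at it, and let it apply whatever is pending.

    # docker-compose.yml sketch; names and credentials are placeholders
    services:
      db:
        image: postgres:15
        environment:
          POSTGRES_DB: app
          POSTGRES_USER: app
          POSTGRES_PASSWORD: secret

      migrate:
        image: flyway/flyway
        depends_on:
          - db
        # Versioned migrations (V1__init.sql, V2__add_users.sql, ...) mounted in;
        # "flyway migrate" is idempotent, so re-running is a no-op on an up-to-date DB.
        volumes:
          - ./sql:/flyway/sql
        command: -url=jdbc:postgresql://db:5432/app -user=app -password=secret -connectRetries=10 migrate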


We created and use SchemaHero (https://schemahero.io) for parts of this. We use Terraform to create the database (RDS, etc.), and then SchemaHero to manage the schema declaratively.
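
For context, SchemaHero expresses tables as Kubernetes custom resources that a controller reconciles into DDL. The manifest below is only a rough sketch from memory (the apiVersion and exact field names may differ from the current CRD; consult the SchemaHero docs), but it shows the declarative shape:

    # Rough sketch of a SchemaHero Table resource; names are illustrative only
    apiVersion: schemas.schemahero.io/v1alpha4
    kind: Table
    metadata:
      name: users
    spec:
      database: app-db          # references a SchemaHero Database resource
      name: users
      schema:
        postgres:
          primaryKey: [id]
          columns:
            - name: id
              type: integer
            - name: email
              type: varchar(255)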


Locally, docker compose will spin up N MySQL instances and the relevant services under test. Makefiles or bin scripts init things and run tests. Some teams have seed scripts or a snapshot of some SQL state they want. When you run local integration tests, data is cleared/reset.
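
A bare-bones compose file in that spirit (service names, credentials, and the seed path are invented for illustration):

    # docker-compose.yml sketch for local integration tests; all names are placeholders
    services:
      mysql:
        image: mysql:8.0
        environment:
          MYSQL_ROOT_PASSWORD: root
          MYSQL_DATABASE: app_test
        volumes:
          # optional seed scripts / SQL snapshot applied on first start
          - ./seed:/docker-entrypoint-initdb.d
        healthcheck:
          test: ["CMD", "mysqladmin", "ping", "-h", "localhost", "-proot"]
          interval: 5s
          retries: 10

      service-under-test:
        build: .
        depends_on:
          mysql:
            condition: service_healthy
        environment:
          DATABASE_URL: mysql://root:root@mysql:3306/app_test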

We use Buildkite for CI/CD. It runs the same docker compose stuff we do locally. It runs unit and acceptance tests. If green, it auto-deploys to our k8s staging cluster (it worked similarly when we deployed to VMs). If staging is green, it can auto-promote to a canary node in prod and, depending on team setup, will auto roll out or allow manual gating for any additional QA. Version numbers are auto-bumped by internal tooling tied into the build system.
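
A stripped-down Buildkite pipeline illustrating that flow (step labels, scripts, and the compose file are hypothetical; a real pipeline would add plugins, artifacts, and so on):

    # .buildkite/pipeline.yml sketch; commands and scripts are placeholders
    steps:
      - label: "unit + acceptance tests"
        command: docker-compose -f docker-compose.ci.yml run --rm tests

      - wait

      - label: "deploy to staging"
        command: ./bin/deploy staging
        branches: main

      # manual gate before the canary, for teams that want it
      - block: "promote canary to prod"

      - label: "canary rollout"
        command: ./bin/deploy prod --canary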

When we push to stage, the staging env has DB instances, but they are usually 1-2 nodes, whereas in prod we will have more (3-30 nodes or more, depending on the service). In staging, we run a suite of tests that exercise the end-user experience. We also regularly run a slimmed-down version of those tests in prod.

When it comes to unit tests, most of our teams do not mock the DB. We use Go, and we test with test fakes: a struct that satisfies an interface is created in the test file and passed around. These tests validate error handling, log generation, sometimes metric generation, and, lastly, the happy path.
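
In case the test-fake pattern is unfamiliar, here is a tiny Go sketch of the idea (the interface and all names are invented for illustration): the fake lives in the test file, satisfies the same interface as the real store, and lets the test drive both the error path and the happy path without a database.

    // users_test.go - minimal sketch of the fake pattern; all names are invented
    package store

    import (
        "context"
        "errors"
        "fmt"
        "testing"
    )

    // UserStore is the interface the code under test depends on.
    type UserStore interface {
        GetEmail(ctx context.Context, id string) (string, error)
    }

    // greet stands in for real business logic that takes the interface.
    func greet(ctx context.Context, s UserStore, id string) (string, error) {
        email, err := s.GetEmail(ctx, id)
        if err != nil {
            return "", fmt.Errorf("looking up user %s: %w", id, err)
        }
        return "hello " + email, nil
    }

    // fakeStore is declared in the test file and satisfies UserStore without a DB.
    type fakeStore struct {
        email string
        err   error
    }

    func (f *fakeStore) GetEmail(ctx context.Context, id string) (string, error) {
        return f.email, f.err
    }

    func TestGreetErrorPath(t *testing.T) {
        _, err := greet(context.Background(), &fakeStore{err: errors.New("boom")}, "42")
        if err == nil {
            t.Fatal("expected error, got nil")
        }
    }

    func TestGreetHappyPath(t *testing.T) {
        got, err := greet(context.Background(), &fakeStore{email: "a@example.com"}, "42")
        if err != nil || got != "hello a@example.com" {
            t.Fatalf("unexpected result: %q, %v", got, err)
        }
    }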

TL;DR - docker compose and scripts.


We use sqlx: https://github.com/launchbadge/sqlx. The devs create and commit the up/down migration files, and the CI/CD runner runs the migrations against the DB.
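
With sqlx-cli that workflow looks roughly like this (the migration name and connection string are just examples): `migrate add -r` creates paired up/down files that get committed, and the CI runner applies whatever is pending with `migrate run`.

    # sketch of the sqlx-cli migration workflow; names are examples
    cargo install sqlx-cli

    # creates migrations/<timestamp>_create_users.up.sql and .down.sql for devs to fill in
    sqlx migrate add -r create_users

    # in the CI/CD runner: apply any pending migrations against the target database
    DATABASE_URL=postgres://app:secret@db:5432/app sqlx migrate run

    # roll back the most recent migration if needed
    sqlx migrate revert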


IaC + migrations is the easiest way:

IaC defines the database hardware + setup.

Migrations make sure that the data you need to start with is in there, and that any changes are captured and applied on each deployment. (A sketch of the ordering follows.)
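
As a concrete sketch of that split, a deploy script might look like the following (the tool choices and paths are arbitrary assumptions; the point is the ordering: provision with IaC first, then apply migrations, then roll out the app):

    # hypothetical deploy script illustrating the IaC-then-migrations ordering
    set -euo pipefail

    # 1. IaC: create/update the database instance (engine, size, networking, backups)
    terraform -chdir=infra apply -auto-approve

    # 2. Migrations: bring schema and seed data up to the version this release expects
    flyway -url="$DB_URL" -user="$DB_USER" -password="$DB_PASSWORD" migrate

    # 3. Deploy application code that assumes the migrated schema
    kubectl apply -f k8s/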


We use FluentMigrator for our C#-based applications.

https://fluentmigrator.github.io/



