PreProd for Database

Deploying changes to a production database is risky. Forget one single simple detail and the migration will fail, customers will start complaining and, even worse, they will go to the competitors.

In this article, we’ll solve this issue by creating a PreProd environment to run migrations and tests. Then, we’ll detail the challenges of this approach.

In a previous article, we detailed the main options for applying migrations to the database from the CI/CD pipeline. We recommend reading it first, as this article builds on it.

Our goal is to make sure database migrations will always work on production.

The PreProd environment pipeline for the database

To make sure database migrations will work on production, we’ll create a similar environment. We’ll call it PreProd. Similar here means the same RDBMS version, settings and service tier, because a migration might work under one configuration but fail under another. We want to be as close as possible to the prod settings; this makes our tests more meaningful.

Technically speaking, this means creating a new Resource Group in Azure dedicated to this environment, then creating a new Azure SQL instance similar to the one running in production.

We can automate the creation of PreProd using IaC, through ARM templates, CloudFormation or Terraform, so that we can create and delete it on the fly. We only need this environment temporarily, just long enough to run the migrations and tests.
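To illustrate the create-then-delete lifecycle, here is a minimal Python sketch. The `deploy_stack` and `destroy_stack` callables are hypothetical placeholders for whatever your IaC tooling actually runs (a `terraform apply`/`destroy`, an ARM deployment, etc.):

```python
from contextlib import contextmanager

@contextmanager
def ephemeral_environment(deploy_stack, destroy_stack, name="preprod"):
    """Create the PreProd infrastructure, yield it, and always tear it down."""
    env = deploy_stack(name)      # e.g. runs the IaC deployment
    try:
        yield env
    finally:
        destroy_stack(name)       # delete the whole Resource Group to save money

# Usage sketch: the deploy/destroy callables below are stand-ins that just record events.
events = []
with ephemeral_environment(lambda n: events.append(f"deploy {n}") or n,
                           lambda n: events.append(f"destroy {n}")) as env:
    events.append(f"run migrations on {env}")
```

The `finally` block guarantees the environment is deleted even if migrations or tests throw, so a failed pipeline never leaves a billed instance behind.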

We could test database migrations on a new database containing simple fake data. But, again, a migration might succeed there and still fail on prod, because sometimes it is the data itself that makes the migration fail.

Again, the key point is to stay as close as possible to prod. So we’ll import the same database from prod into PreProd. We could do this through CLI tools; on Azure, it can be done with the following PowerShell script (more details here):

Azure PS script to copy database

We can copy the entire database to PreProd, or import only the data and sanitise it on the fly.

We have just copied the prod database with all our sensitive customer data. Hello GDPR :) GDPR has some rules here, so we need to anonymise the data, i.e. make certain customer data unreadable. In addition, we need to hide sensitive values like credit card numbers.

Masking data could be done using SQL scripts for simple scenarios, like replacing email addresses with static characters. But in other cases we actually need the customer emails: our tests might run queries against them, or an email might even be used as a foreign key. In that case, we can use a data masking tool to substitute the data: instead of showing xxxxx for every email, we’ll get values like email01@email.com.
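The substitution idea can be sketched in a few lines of Python. This is an illustrative sketch, not a real masking tool: the point is that the same original address always maps to the same fake address, so joins and foreign keys on the email column still line up after masking.

```python
def mask_emails(rows, column="email"):
    """Replace each distinct email with a stable fake one (email01@..., email02@...)."""
    mapping = {}   # original email -> masked email, kept stable across rows
    masked = []
    for row in rows:
        original = row[column]
        if original not in mapping:
            mapping[original] = f"email{len(mapping) + 1:02d}@email.com"
        masked.append({**row, column: mapping[original]})
    return masked

customers = [{"id": 1, "email": "alice@corp.com"},
             {"id": 2, "email": "bob@corp.com"},
             {"id": 3, "email": "alice@corp.com"}]  # same address appears twice
masked = mask_emails(customers)
```

Here both rows for alice@corp.com get the same masked value, while the real addresses never reach PreProd.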

Data masking/sanitisation

At this stage, we have a database server with the same settings as prod and a database with the same (sanitised) data as prod. We now have an environment really similar to prod, but totally isolated. If the changes would fail on prod, they will fail first in PreProd. We can now run the migrations securely and confidently.

Even if the migration runs successfully, we still want to run tests to make sure the database behaves as the app expects. For this, we can run unit and integration tests. Frameworks like tSQLt make it easy to write unit tests against the database. Cave Tester makes sure you have the required data for each unit/integration test: it guarantees that we end up with exactly the same data as before running the tests, by taking a snapshot of the data before each test and restoring it afterwards.
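The snapshot-and-restore pattern can be sketched with Python and SQLite. This is only an illustration of the idea, not how Cave Tester is actually implemented; the table and test function are made up for the example:

```python
import sqlite3

def run_with_snapshot(conn, test):
    """Snapshot the table before the test and restore it afterwards."""
    snapshot = list(conn.execute("SELECT id, email FROM customers"))
    try:
        return test(conn)
    finally:
        # Restore: the database ends up exactly as it was before the test.
        conn.execute("DELETE FROM customers")
        conn.executemany("INSERT INTO customers VALUES (?, ?)", snapshot)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'email01@email.com')")

def destructive_test(c):
    # The test deliberately wipes the table...
    c.execute("DELETE FROM customers")
    return c.execute("SELECT COUNT(*) FROM customers").fetchone()[0]

rows_seen_by_test = run_with_snapshot(conn, destructive_test)
rows_after = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
```

Even though the test deleted everything, the wrapper restores the original row, so the next test starts from the same data.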

After running the migrations and tests, if one of them fails, we can investigate the problem in PreProd, and we can celebrate because we detected it before it happened in prod. Then, we can delete the entire environment to save money. Since we use IaC, we can recreate the infrastructure in minutes when needed.

Typically, the PreProd pipeline runs just before deploying to production.
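The whole flow can be summarised as a small pipeline runner. The stage names and callables below are illustrative placeholders, not a real CI/CD API; in practice each stage would call your IaC, import, masking, migration and test tooling:

```python
def preprod_pipeline(steps):
    """Run the PreProd stages in order; stop at the first failure.

    Returns (succeeded, failed_step). On failure we keep the environment
    around to investigate, then delete it; on success we delete it right away.
    """
    for name, step in steps:
        if not step():
            return False, name
    return True, None

# Each stage is a placeholder returning True on success.
stages = [
    ("create PreProd (IaC)", lambda: True),
    ("import prod database", lambda: True),
    ("mask sensitive data", lambda: True),
    ("run migrations", lambda: False),   # simulate a failing migration
    ("run tests", lambda: True),
]
result = preprod_pipeline(stages)
```

Stopping at the first failing stage mirrors the article’s flow: a failed migration means no tests run and prod is never touched.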

The steps described above are the main lines, but they can be optimised. For example, copying the database could take a long time if it is big, and the database performance will degrade during the copy. A solution is to run a job during the lowest-traffic period that copies the database to cloud storage, so that the PreProd pipeline runs faster because the database copy is already available.

Video on PreProd for database


FAUN Publication

The Must-Read Publication for Creative Developers & DevOps Enthusiasts


Written by Houssem Dellai

Premier Field Engineer at Microsoft, ex MVP, Cloud & DevOps enthusiast. I share what I learn in my professional experience through articles and videos.
