Run EF Core Integration Tests in a repeatable and isolated way using a public Docker Image with a…

Before we get our hands dirty, you may ask: why a real database and not an InMemory one?

Well, if you are running, or want to start writing, tests that involve InMemory databases in Entity Framework (EF) Core, it is likely that, at some point, you will want to run those tests against a real database to get more accurate and valuable tests. In Microsoft's words:

The EF in-memory database often behaves differently than relational databases. Only use the EF in-memory database after fully understanding the issues and trade-offs involved, as discussed in Testing code that uses EF Core.

Especially if you are using SQL Server, you may want to use in-memory databases with SQLite, as it closely matches common relational database behaviors, but it is still important to remember that it is not exactly the same: the SQLite provider has a number of migration limitations, and most of them are a result of limitations in the underlying SQLite database engine, not anything specific to EF.

With that being said, you may still have valuable integration tests using an InMemory DB, but if you can run them in an acceptable time against a real database, then you have nothing to lose and something to win. For example, in my case, I'm using a Postgres DB with built-in JSON support. Neither SQLite nor the InMemory DB supports Postgres JSON types, so I have no choice but to use a real database for the tests I want to run!


Just clone the eShopOnWeb repo and get a Postgres database running locally. Getting a Postgres DB instance running locally is as simple as running one command using Docker (the same concept applies to SQL Server if that is your preference):

docker run --name some-postgres -e POSTGRES_PASSWORD=mysecretpassword -p 5432:5432 -d postgres

Next, I just update the SetQuantities.RemoveEmptyQuantities integration test to use my local Postgres DB (and replace InMemoryDB with the Npgsql Entity Framework Core Provider):
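The change boils down to building the DbContext options with UseNpgsql instead of UseInMemoryDatabase. A minimal sketch, assuming the eShopOnWeb CatalogContext and the Docker container started above; the database name and connection string are illustrative:

// Sketch only: point the test's CatalogContext at the local Postgres container
// instead of the in-memory provider.
// Requires Microsoft.EntityFrameworkCore and the Npgsql.EntityFrameworkCore.PostgreSQL package.
var options = new DbContextOptionsBuilder<CatalogContext>()
    .UseNpgsql("Host=localhost;Port=5432;Database=eshop_tests;Username=postgres;Password=mysecretpassword")
    .Options;
var catalogContext = new CatalogContext(options);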

If you now run the test (dotnet test tests/IntegrationTests/IntegrationTests.csproj), it succeeds the first time but fails on consecutive executions, no matter how many times you retry. It fails because, unlike InMemoryDB, the database is not destroyed after the test run, so it is polluted with data from previous runs that affects new runs. In other words, the tests are not repeatable.

Before moving forward, let's agree that:

First, let’s agree that, all things being equal, we’d like for our automated tests to be isolated, repeatable, and fast. It shouldn’t matter what order we run them in. It shouldn’t matter if we run them in parallel. It shouldn’t matter when we run them. Every test should live in its own little branch of reality in which only it exists, and setting up and destroying that reality should take zero (or, preferably, negative) time and resources.

It’s unlikely we’ll be able to hit this ideal state, but it’s worth calling out some of the things we’re going for as we evaluate different options.

A way to achieve the ideal state is to spin up and shut down a new database every time I run an integration test. That way the tests will be isolated and repeatable for sure, and we will evaluate "how fast" later on. So, let's first create a BaseEfRepoTestFixture:
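A minimal sketch of such a fixture, assuming xUnit, the eShopOnWeb CatalogContext and EfRepository, and the connection details used above; member names and namespaces are illustrative and may differ from the repo's exact code:

using System;
using Microsoft.EntityFrameworkCore;
// Namespaces follow the eShopOnWeb layout; adjust to the version you cloned.
using Microsoft.eShopWeb.ApplicationCore.Entities.BasketAggregate;
using Microsoft.eShopWeb.Infrastructure.Data;

public abstract class BaseEfRepoTestFixture : IDisposable
{
    protected readonly CatalogContext _catalogContext;

    protected BaseEfRepoTestFixture()
    {
        // One unique database per test class: the class name plus a GUID suffix.
        var dbSuffix = $"{GetType().Name}_{Guid.NewGuid()}";
        var options = new DbContextOptionsBuilder<CatalogContext>()
            .UseNpgsql($"Host=localhost;Port=5432;Database=eshop_{dbSuffix};Username=postgres;Password=mysecretpassword")
            .Options;
        _catalogContext = new CatalogContext(options);
        _catalogContext.Database.EnsureCreated();
    }

    protected EfRepository<Basket> GetRepository() => new EfRepository<Basket>(_catalogContext);

    public void Dispose()
    {
        // Drop the unique database so local runs do not accumulate test databases.
        _catalogContext.Database.EnsureDeleted();
    }
}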

and our SetQuantities integration test now extends BaseEfRepoTestFixture:
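Something along these lines, assuming the fixture sketched above and the eShopOnWeb Basket API; the exact arrange/act/assert of the repo's RemoveEmptyQuantities test may differ, and the item values are illustrative:

using System.Threading.Tasks;
using Microsoft.eShopWeb.ApplicationCore.Entities.BasketAggregate;
using Xunit;

public class SetQuantities : BaseEfRepoTestFixture
{
    [Fact]
    public async Task RemoveEmptyQuantities()
    {
        // Arrange: a basket with a single zero-quantity item, persisted to the
        // fresh database created by the fixture.
        var basket = new Basket("testBuyerId");
        basket.AddItem(1, 1.50m, 0);
        var repository = GetRepository();
        await repository.AddAsync(basket);

        // Act: remove empty items and persist the change.
        basket.RemoveEmptyItems();
        await repository.UpdateAsync(basket);

        // Assert: the zero-quantity item is gone; because the database is unique
        // to this test class, previous runs cannot affect the result.
        Assert.Empty(basket.Items);
    }
}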

As you can see:

  1. BaseEfRepoTestFixture.cs always ensures that a new, unique DB is created per test class (var dbSuffix = $"{testName}_{Guid.NewGuid()}");
  2. This database is automatically deleted when the test finishes running so you don’t pollute your local DB instance. This is only relevant when you run locally because in AWS CodeBuild, like in many other CI tools, the container that contains the database will be destroyed after the build ends (see below);

At this point, we have a way to run our DB integration tests against a real database in an isolated & repeatable way, regardless of the CI tool you may end up using. However, most of the value of these tests comes from having them run automatically in our CI/CD pipeline (like every time someone pushes code).

Simple! 😎. All we need to do is "just" run dotnet test tests/IntegrationTests/IntegrationTests.csproj on a server that has a Postgres DB installed! Most CI tools allow you to configure the server/host which is used to run your builds. In AWS CodeBuild there are multiple ways of doing this:

  1. By customizing the build environment:

You can bring your own build environments to use with AWS CodeBuild, such as for the Microsoft .NET Framework. You can package the runtime and tools for your build into a Docker image and upload it to a public Docker Hub repository or Amazon EC2 Container Registry (Amazon ECR). When you create a new build project, you can specify the location of your Docker image, and CodeBuild will pull the image and use it as the build project configuration.

  2. By specifying build commands that make sure a Postgres DB is installed and running:

You can define the specific commands that you want AWS CodeBuild to perform, such as installing build tool packages, running unit tests, and packaging your code. The build specification is a YAML file that lets you choose the commands to run at each phase of the build and other settings. CodeBuild helps you get started quickly with sample build specification files for common scenarios, such as builds using Apache Maven, Gradle, or npm.

I'll go with option 1 because:

So, let’s create an AWS CodeBuild project:

(Screenshot: creating the AWS CodeBuild project)

At this step, you can configure webhooks to trigger builds automatically in response to events like push or pull request (which I'm omitting for brevity). Next, at the environment step, be sure that you choose an official Microsoft Ubuntu docker image that has the .NET 5 SDK installed. This image does not contain a Postgres DB, so we will take care of installing one in the next step. Less important, but still relevant to mention, is that the docker image is public, so you don't need to set up any credentials to be able to download it:

(Screenshot: the environment step, with a public Microsoft .NET 5 SDK Ubuntu image selected)
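For reference, the official .NET 5 SDK image based on Ubuntu 20.04 is one such public image (whether it is the exact tag shown in the screenshot is an assumption):

mcr.microsoft.com/dotnet/sdk:5.0-focal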

Last, but not least, we need to specify a buildspec file where we tell AWS CodeBuild to run our tests:

The build specification is a YAML file that lets you choose the commands to run at each phase of the build and other settings

(Screenshot: the buildspec file)

As you can see, I'm (in this order; a sketch of such a buildspec follows the list):

  1. installing a Postgres DB. This takes time, so in the future I may want to use a docker image that has a Postgres DB already installed 😉
  2. making sure that the DB is running and that the postgres user has the password used in the connection string;
  3. running the tests!
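A minimal buildspec.yml sketch matching those three steps, assuming the Ubuntu-based image above; the apt package name, service command, and psql call are assumptions, not necessarily the exact file shown in the screenshot:

version: 0.2

phases:
  install:
    commands:
      # 1. Install a Postgres DB (slow; a pre-baked image would avoid this step).
      - apt-get update && apt-get install -y postgresql
  pre_build:
    commands:
      # 2. Make sure the DB is running and the postgres user has the password
      #    expected by the tests' connection string.
      - service postgresql start
      - su postgres -c "psql -c \"ALTER USER postgres WITH PASSWORD 'mysecretpassword';\""
  build:
    commands:
      # 3. Run the integration tests.
      - dotnet test tests/IntegrationTests/IntegrationTests.csproj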

and voilà! If you now trigger a build, tests run successfully. Every time you do it! No matter how many times you do it. Because they are repeatable and isolated by using an independent real database per test. Congrats! 👏👏🚀

(Screenshots: a successful build, with the 2 integration tests passing in under 3.5 s)

Well, above all:

Don’t write integration tests for every possible permutation of data and file access with databases and file systems. Regardless of how many places across an app interact with databases and file systems, a focused set of read, write, update, and delete integration tests are usually capable of adequately testing database and file system components. Use unit tests for routine tests of method logic that interact with these components. In unit tests, the use of infrastructure fakes/mocks result in faster test execution.

Secondly, you can see from the screenshot above that it takes less than 3.5 s to run 2 integration tests on a 15 GB memory, 8 vCPU AWS CodeBuild instance. With simple math, and assuming similar complexity in future integration tests, I'd guess it would take around 175 s to run 100, for example.

Thirdly, you should also consider whether the server is capable of running all of those tests in parallel (as it does by default). For example, I had a customer/project with 35 tests, and running them locally on my 2.3 GHz 8-Core Intel Core i9 / 32 GB Mac was fast, but when I tried to get them running on AWS CodeBuild 15 GB memory, 8 vCPU instances, the build failed with a timeout! They only ran on AWS CodeBuild 145 GB memory, 72 vCPU instances, but, naturally, that is expensive and an indication that I wasn't on the right path. Solution? Well, I just added a xunit.runner.json file to the project that ensures that tests run in sequence (not in parallel). To me, they were still executing in an acceptable time, at every push to a stable branch. Do they still take too long for you? Do you have a lot of tests? Alternatives? Some people use a shared DB; you may also consider splitting the integration tests into a separate test suite and potentially running them on a different schedule than your unit test suite (e.g. nightly instead of every commit), etc., but all of those solutions seem like a plan B, as it is harder to get to a state in which tests are repeatable and isolated.
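For reference, a minimal xunit.runner.json that turns off parallel test collections looks like this (the file also needs to be copied to the output directory, e.g. via CopyToOutputDirectory in the test csproj):

{
  "$schema": "https://xunit.net/schema/current/xunit.runner.schema.json",
  "parallelizeAssembly": false,
  "parallelizeTestCollections": false
}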

I hope I made this helpful by showing a way to:

  1. write EF Core Integration Tests that run in a repeatable and isolated way (regardless of the CI tool you use);
  2. get them running in a CI tool (AWS CodeBuild) in a way that you can transpose to another CI tool of your preference;
  3. discuss some common problems (and solutions) around integration tests;

All code is available in my GitHub account, including minor improvements like setting up different connection strings per environment.

