Enhance your .NET Testing #4: InMemory vs Repository pattern

Pierre Belin

In the ever-evolving landscape of software development, the way we manage data is paramount to performance and efficiency. In the context of C#/.NET, InMemory databases have long been a staple for various applications. However, the emergence of SQLite in-memory databases offers exciting new prospects.

💡 I wanted to add a second layer of nuance to the use of InMemory presented in a previous article, after readers pointed out its limits, so I'm back on the subject!

This article introduces SQLite in-memory databases and sheds light on why they stand as a preferable alternative to InMemory. We'll also delve into the Repository Pattern for tests using stubs/mocks, offering a comprehensive understanding of these concepts, their use cases, and limitations.

SQLite In-Memory Database: A Comprehensive Alternative

InMemory databases are widely used for temporary data storage and testing. However, they have some limitations that can create challenges.

If you haven't already, read that article to learn more about the subject:

Elevate Your .NET Testing Strategy #2: InMemoryDatabase
The InMemoryDatabase offers an efficient avenue for performing tests that need database interactions without the overhead of database operations.

Firstly, InMemory lacks provider-specific translations: queries that rely on them behave differently than they would against a real database, which leads to inconsistencies when testing real-world scenarios.

Secondly, its functionality is limited. Features that depend on the database provider (e.g., unique constraints) aren't enforced in InMemory. Above all, as Microsoft's documentation states, new EF Core functionality will not be added to InMemory.

Microsoft now recommends a replacement: the SQLite in-memory database. It supports provider-specific translations, as well as transactions, which is a real plus.

If you're already using InMemory, switching to SQLite in-memory is trivial. All you need to do is modify two parts: the creation of the connection, and the DbContext, to force the opening of the connection.

Setting up SQLite is identical to that of other database providers: simply call UseSqlite in the OnConfiguring method.

protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    optionsBuilder.UseSqlite(_connectionString);
}

The connection string is similar to a plain :memory: declaration, except that we also specify a database name (the database itself remains temporary) and a shared cache.

var connectionString = "Data Source=myshareddb;Mode=Memory;Cache=Shared";
_ = builder.ConfigureTestServices(services =>
{
    services.AddScoped<EfDbContext>(_ => new EfDbContextFake(connectionString));
});

At this point, you're probably asking yourself: why are we specifying a name and sharing our database cache?

Every :memory: database is distinct from every other. Without sharing, it is impossible to retrieve the contents of a database from two different connections, even if both are in memory.

To share it successfully, we use SQLite's shared in-memory databases option.
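As a minimal sketch of this behavior, using Microsoft.Data.Sqlite directly (outside of EF Core), two connections to the same named, shared in-memory database see each other's data, which plain :memory: connections cannot do:

```csharp
using Microsoft.Data.Sqlite;

const string connectionString = "Data Source=myshareddb;Mode=Memory;Cache=Shared";

// The first connection creates the schema and inserts a row.
using var first = new SqliteConnection(connectionString);
first.Open();

using (var create = first.CreateCommand())
{
    create.CommandText =
        "CREATE TABLE Goats (Name TEXT); INSERT INTO Goats VALUES ('Billy');";
    create.ExecuteNonQuery();
}

// A second connection to the same name sees the same database,
// thanks to Mode=Memory and Cache=Shared.
using var second = new SqliteConnection(connectionString);
second.Open();

using var read = second.CreateCommand();
read.CommandText = "SELECT COUNT(*) FROM Goats";
var count = (long)read.ExecuteScalar()!;
// count is 1: the row written on the first connection is visible here.
```

Note that the first connection must still be open when the second one reads; otherwise the database has already been destroyed, which is exactly the point covered next.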

In-Memory Databases

Last but not least, the database ceases to exist as soon as the database connection is closed.

To guard against that, I override the DbContext to force the connection open on creation with the Database.OpenConnection() method, and to create the schema directly in the constructor with Database.EnsureCreated(). This way, the database is not deleted until all associated contexts have been disposed, usually at the end of the test run.
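A sketch of such a context, matching the EfDbContextFake registered in the snippet above (this assumes EfDbContext exposes a constructor the fake can call; adapt to your own base class):

```csharp
public class EfDbContextFake : EfDbContext
{
    private readonly string _connectionString;

    public EfDbContextFake(string connectionString)
    {
        _connectionString = connectionString;

        // Keep the connection open for the lifetime of the context:
        // a shared in-memory database survives only as long as at least
        // one connection to it remains open.
        Database.OpenConnection();

        // The database starts empty, so create the schema immediately.
        Database.EnsureCreated();
    }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        optionsBuilder.UseSqlite(_connectionString);
    }
}
```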

There are several use cases for in-memory databases, including repository testing. With Entity Framework Core, it's impossible to test repositories without a database behind them.

Above all, don't try to create fake `DbSet`s: Microsoft advises against it, and having tested it myself, I can confirm that they're right.

It's important to test how repositories really behave, to make sure the data accesses are the right ones. For this, it is recommended to use a real database, i.e. a local, remote or containerized one.

But this comes at a cost, especially in terms of time. At each test start-up, you have to make sure the environment has been cleaned and set up, and these are not things you want to deal with at the start of a project. The advantage of in-memory is that you can start with it and, once the project is stable enough, run the same tests against a real database.

The Repository Pattern for Tests Using Stubs/Mocks

The Repository Pattern in conjunction with stubs/mocks enables streamlined testing by creating controlled environments.

It's very easy to set up: all you have to do is create a stub of repository interfaces and call them in the tests. This cuts out all dependencies on database tools, and in particular EntityFramework Core.

By creating these mocks, you can define the result of each data access predictably, using frozen values. This data serves to highlight the precise case targeted by the test that uses it.

It's not uncommon to have several stubs for the same repository when you want a method to behave in a precise way. Of course, each stub doesn't have to implement all the methods exposed by the interface, only those you actually use: implementing all of them hides the real purpose of the stub.

Setting one up is as quick as the example below:

public class GoatRepositoryStub : IGoatRepository
{
    public IEnumerable<Goat> GetAll()
    {
        return new List<Goat>
        {
            new Goat { Name = "Billy" },
            new Goat { Name = "Nanny" }
        };
    }
}
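For example, a second stub dedicated to the empty-herd case keeps each test's intent explicit (the name EmptyGoatRepositoryStub is illustrative):

```csharp
// A stub dedicated to a single scenario: no goats in the repository.
public class EmptyGoatRepositoryStub : IGoatRepository
{
    public IEnumerable<Goat> GetAll()
    {
        return Enumerable.Empty<Goat>();
    }
}
```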

The only step left is to change the dependency injection configuration inside the test, so the stub is supplied wherever the repository is needed.

public void ConfigureServices(IServiceCollection services)
{
    services.AddTransient<IGoatRepository, GoatRepositoryStub>();
}
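A test then resolves the repository as usual. As a sketch with xUnit, building a service provider by hand and reusing the GoatRepositoryStub above:

```csharp
[Fact]
public void GetAll_returns_the_stubbed_goats()
{
    var services = new ServiceCollection();
    services.AddTransient<IGoatRepository, GoatRepositoryStub>();
    using var provider = services.BuildServiceProvider();

    var repository = provider.GetRequiredService<IGoatRepository>();

    var goats = repository.GetAll().ToList();

    // Frozen values from the stub: no database involved.
    Assert.Equal(2, goats.Count);
    Assert.Contains(goats, goat => goat.Name == "Billy");
}
```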

This pattern is the one recommended by Microsoft for testing the operation of an application, and I agree with them.

It's perfectly suited to testing our solution's domain and application, i.e. without worrying about external elements such as the database. It also lets you handle very specific cases without adding all the complexity of EntityFrameworkCore and database management.

Summary

InMemory is no longer actively developed by Microsoft, so it would be a mistake to adopt it on new projects. Fortunately, its behavior is very similar to that of SQLite, which can also run in memory. This is the solution to turn to.

Bear in mind that there are restrictions on the use of in-memory databases before jumping in. If in doubt, use a real database.

If you're looking to test the end-to-end behavior of your code, such as an API call, I recommend using a real database or a SQLite in-memory database.

If you're looking to test the functional behavior of your code, i.e. its business rules, I recommend the Repository pattern with stubs.

To go further:

Choosing a testing strategy - EF Core
Different approaches to testing applications that use Entity Framework Core

Have a goat day 🐐


