Working with the EF Core and the In-Memory Database


If you've spent any time working with the full .NET Framework's Entity Framework and have wondered whether the product is supported by .NET Core, fear not. There is a version available that is specifically designed for use with .NET Core.

Many of the features, and the way you work with the framework, are similar if not identical to its bigger counterpart, but there are some subtle differences. If you would like to spend some time glancing over the basics of EF Core, you can find the documentation here.

For this article, we'll be focusing on the in-memory database option, which is extremely useful when you need to test.

Building The Application

For what we'll be doing here, I'll be using an empty ASP.NET Core application, and I've added a few packages. These packages are…

"Microsoft.EntityFrameworkCore": "1.1.0",
"Microsoft.EntityFrameworkCore.InMemory": "1.1.0"

I've created a db context which looks something like what we have below…

public class MyContext : DbContext
{
   public MyContext(DbContextOptions options)
      : base(options)
   {
   }

   public DbSet<Item> Items { get; set; }
}

And the Item class looks like this…

public class Item
{
   public int Id { get; set; }
   public DateTime Time { get; set; }
}

To start with, let's enter some data using the db-context, then from a new instance of the context, read the data out, and display it in our view.

In our empty ASP.NET Core project, the ConfigureServices method in Startup.cs looks like this…

public void ConfigureServices(IServiceCollection services)
{
   services.AddDbContext&lt;MyContext&gt;(options =>
   {
      options.UseInMemoryDatabase();
   });
}
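One point worth flagging for readers on newer versions: the parameterless UseInMemoryDatabase() overload shown above belongs to the EF Core 1.x packages used in this article. From EF Core 2.0 onwards, the method requires a database name, so the equivalent registration would look something like the sketch below (the name "TestDb" is purely an illustrative choice):

```csharp
// EF Core 2.0+ sketch: the in-memory provider requires a database name.
// Contexts registered with the same name share one in-memory store
// for the lifetime of the process.
services.AddDbContext<MyContext>(options =>
{
   options.UseInMemoryDatabase("TestDb");
});
```

Picking different names gives you independent stores, which becomes useful later when isolating tests.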

The piece of code above is pretty much all we need to configure Entity Framework to switch to using an in-memory database. Now, looking at the code in our Configure method, which by default should be below the ConfigureServices method…

public void Configure(IApplicationBuilder app,
   IHostingEnvironment env,
   ILoggerFactory loggerFactory)
{
   using (var dbContext =
      app.ApplicationServices.GetService&lt;MyContext&gt;())
   {
      for (int i = 0; i &lt; 20; i++)
      {
         dbContext.Items.Add(new Models.Item()
         {
            Time = DateTime.UtcNow.AddHours(i)
         });
      }

      dbContext.SaveChanges();
   }

   app.Run(async (context) =>
   {
      string output = "";
      using (var dbContext =
         context.RequestServices.GetService&lt;MyContext&gt;())
      {
         output = string.Join(" || ",
            await dbContext.Items.Select(t =>
               t.Time.ToLocalTime()).ToArrayAsync());
      }
      await context.Response.WriteAsync(output);
   });
}

In the first context, we're using a simple for loop to add a number of dates to the context; each one will be one hour later than the last. Then, we'll invoke the SaveChanges method on the context to commit this new data to our in-memory database.

At the end of the Configure method, we're using a custom middleware component to output the values from the table in our database to the view. Quickly running this application and looking at the results, we'll see something like what we see in Figure 1 if all worked correctly.

Figure 1: The output from our code in the browser

But, what about testing? In our code, we've made use of the built-in dependency injection to resolve the db context for us, and the DbContextOptions object passed to the context is instantiated from an abstract class. How do we get around this without having to make changes to the db context itself, which may in fact be code we don't own? Enter the DbContextOptionsBuilder; consider the following code.

I've added these packages to the project…

"xunit": "2.2.0-rc1-build3507",
"dotnet-test-xunit": "2.2.0-preview2-build1029"

And "testRunner": "xunit", to the project.json (and yes, this particular file is going to disappear in due course). The code in the test class looks like this…
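For readers unsure where those pieces sit, here is a minimal sketch of how the relevant parts of a project.json for the test project might fit together; the version values are taken from the packages above, and anything else is illustrative:

```json
{
  "testRunner": "xunit",
  "dependencies": {
    "Microsoft.EntityFrameworkCore": "1.1.0",
    "Microsoft.EntityFrameworkCore.InMemory": "1.1.0",
    "xunit": "2.2.0-rc1-build3507",
    "dotnet-test-xunit": "2.2.0-preview2-build1029"
  }
}
```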

public class Tests
{
   [Fact]
   public void DbTest()
   {
      var dbContextOptions =
            new DbContextOptionsBuilder<MyContext>()
         .UseInMemoryDatabase()
         .Options;

      using (var dbContext =
         new MyContext(dbContextOptions))
      {
         dbContext
            .Items
            .Add(new Models.Item()
            {
               Time = DateTime.UtcNow
            });

         dbContext
            .SaveChanges();
      }

      using (var dbContext =
         new MyContext(dbContextOptions))
      {
         Assert.True(dbContext.Items.Any());
      }
   }
}

Again, we've used an instance of the context to add data to our in-memory database. Then, we use a second instance to read data, revealing that the data has indeed been stored in memory for the lifetime of the process.
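One caveat worth knowing: because the in-memory store lives for the duration of the process, tests that resolve the same database can see each other's data. If you move to EF Core 2.0 or later, where UseInMemoryDatabase takes a name, a common way to isolate each test is to give it a unique database name; the Guid here is just one illustrative choice:

```csharp
// Sketch for EF Core 2.0+: a unique name per test yields
// a fresh, isolated in-memory store for each test run.
var dbContextOptions =
   new DbContextOptionsBuilder<MyContext>()
      .UseInMemoryDatabase(Guid.NewGuid().ToString())
      .Options;
```

With the 1.1.0 packages used in this article, the parameterless overload shares a single store, so tests that need isolation must clean up after themselves instead.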

Conclusion

Even though this in-memory feature of EF Core is very simple to implement, its effectiveness should not be overlooked. At the time of writing, I run a number of console applications, originally built with Entity Framework, that use this feature to run regular, scheduled batch-processing tasks.

If you have any questions or just want a chat on the subject, I'm always hovering around on Twitter @GLanata.



About the Author

Gavin Lanata

Gavin has been building front ends for software applications (desktop, web, and mobile) for several years now. He often attends developer events, expanding his knowledge of the fast-paced world of IT and lending a helping hand to developers needing a little direction navigating the world of design. Gavin has dedicated his time to studying the many factors that influence how people use and react to software; art, design, and coding are a few of the tools he uses to find the truths behind why we like and use the things we do.
