.NET Core web API + Dapper – Distributed Cache

I have used LazyCache for several projects (The Things Network V2 HTTP, The Things Industries V2 MQTT, The Things Industries V3, the Swarm Space Azure IoT Connector, etc.) to cache Azure IoT Hub DeviceClient and other object instances.

The note on the wiki page, "For LazyCache v2+ users, you should consider switching away from LazyCache to IDistributedCache. More information at #59", caught my attention.

I have written other posts about caching Dapper query results with the Dapper Extension Library, which worked well but had some configuration limitations. I also have posts about off-loading read-only workloads with Azure Active geo-replication or SQL Data Sync for Azure, which worked well in some scenarios but had limitations (performance and operational costs).

IDistributedCache has Memory, SQL Server and Redis implementations, so I built an Azure AppService application to explore the functionality in more detail. In another project I had been working with the Azure SignalR Service, and its use of the MessagePack library (rather than serialised JSON) caught my attention, so I have added basic support for that as well.

I explored the in-memory implementation (AddDistributedMemoryCache) on my development machine and found “tinkering” with the configuration options had little impact on the performance of my trivial sample application.

public static void Main(string[] args)
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddApplicationInsightsTelemetry();

    // Register the Dapper context.
    builder.Services.AddSingleton<IDapperContext>(s => new DapperContext(builder.Configuration));

    builder.Services.AddControllers();

#if SERIALISATION_MESSAGE_PACK
    //MessagePackSerializer.DefaultOptions = MessagePack.Resolvers.ContractlessStandardResolver.Options;
    //MessagePackSerializer.DefaultOptions = MessagePack.Resolvers.ContractlessStandardResolver.Options.WithCompression(MessagePackCompression.Lz4Block);
    MessagePackSerializer.DefaultOptions = MessagePack.Resolvers.ContractlessStandardResolver.Options.WithCompression(MessagePackCompression.Lz4BlockArray);
#endif

#if DISTRIBUTED_CACHE_MEMORY
    builder.Services.AddDistributedMemoryCache(options =>
    {
        options.SizeLimit = 1000 * 1024 * 1024; // 1000MB
    });
#endif

#if DISTRIBUTED_CACHE_REDIS
    var configurationOptions = new ConfigurationOptions
    {
        EndPoints = { builder.Configuration.GetSection("RedisConnection").GetValue<string>("EndPoints") },
        AllowAdmin = true,
        Password = builder.Configuration.GetSection("RedisConnection").GetValue<string>("Password"),
        Ssl = true,
        ConnectRetry = 5,
        ConnectTimeout = 10000,
        SslProtocols = System.Security.Authentication.SslProtocols.Tls12,
        AbortOnConnectFail = false,
    };

    builder.Services.AddStackExchangeRedisCache(options =>
    {
        options.InstanceName = "Dapper WebAPI Instance";
        options.ConfigurationOptions = configurationOptions;
    });
#endif

#if DISTRIBUTED_CACHE_SQL_SERVER
    builder.Services.AddDistributedSqlServerCache(options =>
    {
        options.ConnectionString = builder.Configuration.GetConnectionString("CacheDatabase");
        options.SchemaName = "dbo";
        options.TableName = "StockItemsCache";
    });
#endif

    var app = builder.Build();

    // Configure the HTTP request pipeline.
    app.UseHttpsRedirection();
    app.MapControllers();
    app.Run();
}

I tested the SQL Server implementation (AddDistributedSqlServerCache) using the SQL Server instance on my development machine and Azure SQL as backing stores. I did consider using SQL Azure In-Memory OLTP, but the performance improvement with my trivial example would most probably not be worth the additional cost of the required SKU.

CREATE TABLE [dbo].[StockItemsCache](
	[Id] [nvarchar](449) NOT NULL,
	[Value] [varbinary](max) NOT NULL,
	[ExpiresAtTime] [datetimeoffset](7) NOT NULL,
	[SlidingExpirationInSeconds] [bigint] NULL,
	[AbsoluteExpiration] [datetimeoffset](7) NULL,
PRIMARY KEY CLUSTERED 
(
	[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO

The table used to store the data wasn’t very complex, and I could view the data associated with a cache key in SQL Server Management Studio.

SQL Server Management Studio displaying cache table contents

One of the applications I work on uses a complex SQL Server stored procedure to load reference data (updated daily), and being able to purge the cache at the end of this process might be useful. For a geographically distributed application, putting the Azure SQL instance “closer” to the application’s users might also be worth considering.
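
For that scenario the cache entry could be removed with an HTTP DELETE action calling IDistributedCache.RemoveAsync. A minimal sketch, assuming the same “StockItems” key and the injected distributedCache used in the controller code below:

[HttpDelete]
public async Task<ActionResult> Delete()
{
    // Remove the cached list so the next GET reloads it from the database
    await distributedCache.RemoveAsync("StockItems");

    return this.Ok();
}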

I trialed the Redis implementation with Memurai (on my development machine) and with Azure Cache for Redis from multiple Azure AppService clients, and there was a significant improvement in performance.
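
The controller methods below assume the IDistributedCache and a Dapper database connection are constructor injected. A rough sketch of that wiring (the field names and sqlCommandText value are illustrative; IDapperContext, ConnectionCreate and QueryWithRetryAsync are from my sample project):

[Route("api/[controller]")]
[ApiController]
public class StockItemsController : ControllerBase
{
    private const string sqlCommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]";
    private static readonly TimeSpan StockItemListAbsoluteExpiration = new TimeSpan(23, 59, 59);

    private readonly IDistributedCache distributedCache;
    private readonly IDbConnection dbConnection;

    public StockItemsController(IDistributedCache distributedCache, IDapperContext dapperContext)
    {
        this.distributedCache = distributedCache;
        this.dbConnection = dapperContext.ConnectionCreate(); // connection lifetime management omitted for brevity
    }

    // Get(), GetNoLoad() and Delete() methods go here
}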

[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
    var utcNow = DateTime.UtcNow;

    var cached = await distributedCache.GetAsync("StockItems");
    if (cached != null)
    {
#if SERIALISATION_JSON
        return this.Ok(JsonSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
#if SERIALISATION_MESSAGE_PACK
        return this.Ok(MessagePackSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
    }

    var stockItems = await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);

#if SERIALISATION_JSON
    await distributedCache.SetAsync("StockItems", JsonSerializer.SerializeToUtf8Bytes(stockItems), new DistributedCacheEntryOptions()
#endif
#if SERIALISATION_MESSAGE_PACK
    await distributedCache.SetAsync("StockItems", MessagePackSerializer.Serialize(stockItems), new DistributedCacheEntryOptions()
#endif
    {
        AbsoluteExpiration = new DateTime(utcNow.Year, utcNow.Month, DateTime.DaysInMonth(utcNow.Year, utcNow.Month), StockItemListAbsoluteExpiration.Hours, StockItemListAbsoluteExpiration.Minutes, StockItemListAbsoluteExpiration.Seconds)
    });

    return this.Ok(stockItems);
}

[HttpGet("NoLoad")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetNoLoad()
{
    var cached = await distributedCache.GetAsync("StockItems");
    if (cached == null)
    {
        return this.NoContent();
    }

#if SERIALISATION_JSON
    return this.Ok(JsonSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
#if SERIALISATION_MESSAGE_PACK
    return this.Ok(MessagePackSerializer.Deserialize<List<Model.StockItemListDtoV1>>(cached));
#endif
}

In my test environment the JSON payload for a list of stock items was a bit “chunky” at 25K bytes, so I added compile-time configurable support for the MessagePack library. This significantly reduced the size of the payload, LZ4Block (5K bytes) and LZ4BlockArray (5.2K bytes), which should reduce network traffic.

Assuming the overheads of JSON and MessagePack serialisation are similar, the much smaller MessagePack payload means I would most probably use MessagePack with LZ4BlockArray compression (for improved compatibility with other implementations).
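
The payload sizes are easy enough to compare by serialising the same result set both ways. A rough sketch of how that comparison could be done (illustrative, not the exact code I used; the numbers above came from my environment):

var stockItems = await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);

// System.Text.Json payload
byte[] jsonPayload = JsonSerializer.SerializeToUtf8Bytes(stockItems);

// MessagePack payload with LZ4BlockArray compression
byte[] messagePackPayload = MessagePackSerializer.Serialize(stockItems,
    MessagePack.Resolvers.ContractlessStandardResolver.Options.WithCompression(MessagePackCompression.Lz4BlockArray));

Console.WriteLine($"JSON:{jsonPayload.Length} bytes MessagePack LZ4BlockArray:{messagePackPayload.Length} bytes");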

.NET Core web API + Dapper – Distributed Cache Redis HMGET Error

Initially, I was running the WebAPI Dapper DistributedCache application on my development box, and it ran on the third/fourth attempt after sorting out the Azure Cache for Redis connection string and firewall configuration.

StockItems list served from the cache running on my Desktop
The Azure Cache for Redis firewall configuration for my home office

I then deployed the WebAPI Dapper DistributedCache application to an Azure AppService and it failed (spoiler alert: it wasn’t DNS).

Azure AppService hosted StockItems list code failing in web browser

I then fired up Azure Application Insights to look at the logging information.

Azure Application Insights diagnostic information

There were many questions and tentative answers on Stack Overflow; I tried some of them, but they didn’t work in my scenario.

No connection is active/available to service this operation: HMGET Dapper WebAPI InstanceStockItems; 
UnableToConnect on xxxxxxxxxxxxxxxxxxxxxxxxxxx.redis.cache.windows.net:6380/Interactive, 
Initializing/NotStarted, 
last: NONE, 
origin: BeginConnectAsync, 
outstanding: 0, 
last-read: 10s ago, last-write: 10s ago, keep-alive: 60s, 
state: Connecting, 
mgr: 10 of 10 available, 
last-heartbeat: never, global: 10s ago, 
v: 2.2.4.27433, 
mc: 1/1/0, mgr: 10 of 10 available, 
clientName: XXXXXXXXXXXX, <----------------------This was important
IOCP: (Busy=0,Free=1000,Min=1,Max=1000), 
WORKER: (Busy=5,Free=1018,Min=1,Max=1023), 
v: 2.2.4.27433 
UnableToConnect on xxxxxxxxxxxxxxxxxxxxxxxxxxx.cache.windows.net:6380/Interactive, 
Initializing/NotStarted, 
last: NONE, 
origin: BeginConnectAsync, 
outstanding: 0, 
last-read: 10s ago, last-write: 10s ago, keep-alive: 60s, 
state: Connecting, 
mgr: 10 of 10 available, 
last-heartbeat: never, global: 10s ago, 
v: 2.2.4.27433 

I then recreated my configuration from scratch so that none of my random changes based on Stack Overflow posts would mess up my next round of debugging.

Azure AppService Virtual, Outbound and Additional Outbound IP Addresses

The error appeared to be a networking issue, most probably the firewall blocking connections, so I added the Azure AppService Virtual IP Address to the firewall, which didn’t work…

Firewall configuration for Azure AppService Virtual IP Address

I then went back and re-read the error message and noticed “clientName: XXXXXXXXXXXX,”. The Azure AppService client name wasn’t the default domain…

Firewall configuration for Azure AppService Virtual and outbound IP Addresses

I then added the Outbound IP addresses to the firewall and the Azure AppService could connect to my Azure Cache for Redis instance.

StockItems list served from the cache running in an AppService

As soon as I realised it wasn’t DNS I should have “stopped, counted to ten”, debugged the issue carefully, and not immediately gone to Stack Overflow.

.NET Core web API + Dapper – LazyCache

I have used Alastair Crabtree’s LazyCache (read-through in-memory cache) on a couple of other projects e.g. The Things Industries V3 Connector (DeviceClient instances) and Swarm Space Azure IoT Connector (DeviceClient Instances and CS-Script binaries).

Adding LazyCache support to an ASP.Net Core WebAPI application is one line of code.

public static void Main(string[] args)
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddApplicationInsightsTelemetry();

    // Register the Dapper context.
    builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));

    builder.Services.AddControllers();

    // Register IAppCache as a singleton CachingService
    builder.Services.AddLazyCache();

    var app = builder.Build();

    // Configure the HTTP request pipeline.

    app.UseHttpsRedirection();

    app.MapControllers();

    app.Run();
}

One of the applications I work on uses a lot of “reference” data which is effectively static, e.g. national/regional holidays in the countries where they have customers, and data which is updated on a schedule, e.g. exchange rates downloaded at a known time every day.

//...
private readonly TimeSpan StockItemListAbsoluteExpirationTime = new TimeSpan(23, 59, 59);
//...

private async Task<IEnumerable<Model.StockItemListDtoV1>> GetStockItemsAsync()
{
    return await dbConnection.QueryWithRetryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
}

[HttpGet]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> Get()
{
    MemoryCacheEntryOptions memoryCacheEntryOptions = new MemoryCacheEntryOptions()
    {
        Priority = CacheItemPriority.NeverRemove,
        AbsoluteExpiration = DateTime.UtcNow.Date.Add(StockItemListAbsoluteExpirationTime)
    };

    return this.Ok(await cache.GetOrAddAsync("StockItems", () => GetStockItemsAsync(), memoryCacheEntryOptions));
}

[HttpDelete()]
public ActionResult Delete()
{
    cache.Remove("StockItems");

    return this.Ok();
}

For the holiday information updates, the dataset could be deleted (“purged”) and then refreshed the next time it is referenced, or deleted and then immediately re-loaded. For exchange rate updates, the cache could automatically expire the dataset shortly after the scheduled download.
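
A minimal sketch of how the exchange rate expiration could be calculated, assuming a daily download at a known UTC time (the exchangeRateDownloadTimeUtc value, “ExchangeRates” key and GetExchangeRatesAsync method are illustrative):

// Daily exchange rate download at 02:30 UTC, cached values expire 5 minutes after the next download
TimeSpan exchangeRateDownloadTimeUtc = new TimeSpan(2, 30, 0);
TimeSpan expirationSlack = new TimeSpan(0, 5, 0);

DateTime utcNow = DateTime.UtcNow;

// Next scheduled download, rolled over to tomorrow if today's download has already happened
DateTime nextDownloadUtc = utcNow.Date.Add(exchangeRateDownloadTimeUtc);
if (nextDownloadUtc <= utcNow)
{
    nextDownloadUtc = nextDownloadUtc.AddDays(1);
}

MemoryCacheEntryOptions memoryCacheEntryOptions = new MemoryCacheEntryOptions()
{
    Priority = CacheItemPriority.NeverRemove,
    AbsoluteExpiration = nextDownloadUtc.Add(expirationSlack)
};

return this.Ok(await cache.GetOrAddAsync("ExchangeRates", () => GetExchangeRatesAsync(), memoryCacheEntryOptions));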

When I deployed my demo application to an Azure AppService, the requests (made with the composer tab of Telerik Fiddler) took longer than when the application was running on my development box, which was unexpected. I think this was because the JavaScript Object Notation (JSON) response payload was larger than the Azure SQL Tabular Data Stream (TDS) payload.

The LazyCache source code hasn’t been updated for quite a while (but it is quite stable and works well) and moving to other libraries has been recommended in the wiki.

.NET Core web API + Dapper – MiniProfiler

The time to open the home page of one of the “legacy” applications I was working on was slowly increasing. After a quick investigation it looked like a couple of Azure SQL stored procedures, which were called many times as the home page was opening, were the problem.

The application used a combination of ADO.Net and Dapper to access the Azure SQL database, so Stack Overflow MiniProfiler looked like a good option.

There were quite a few samples online like the one below which “inspired” my approach.

List<Model.StockItemListDtoV1> response = new List<Model.StockItemListDtoV1>();

using (SqlConnection sqlConnection = new SqlConnection(_configuration.GetConnectionString("default")))
{
    using (ProfiledDbConnection connection = new ProfiledDbConnection(sqlConnection, MiniProfiler.Current))
    {
        await connection.OpenAsync();

        using (ProfiledDbCommand command = (ProfiledDbCommand)connection.CreateCommand())
        {
            command.CommandText = @"SELECT [StockItemID] as ""ID"", [StockItemName] as ""Name"", [RecommendedRetailPrice], [TaxRate] FROM [Warehouse].[StockItems]";
            command.CommandType = CommandType.Text;

            using (ProfiledDbDataReader reader = (ProfiledDbDataReader)await command.ExecuteReaderAsync())
            {
                var rowParser = reader.GetRowParser<Model.StockItemListDtoV1>();

                while (await reader.ReadAsync())
                {
                    response.Add(rowParser(reader));
                }

                await reader.CloseAsync();
            }
        }

        await connection.CloseAsync();
    }
}

A simple modification of the WebAPI Dapper PerformanceProfiling project program.cs file enabled MiniProfiler for the project.

public static void Main(string[] args)
{
    var builder = WebApplication.CreateBuilder(args);

    // Add services to the container.
    builder.Services.AddTransient<IDapperContext>(s => new DapperContext(builder.Configuration));

    builder.Services.AddControllers();

    builder.Services.AddMiniProfiler(options =>
    {
        options.RouteBasePath = "/profiler";
        //options.SqlFormatter = new StackExchange.Profiling.SqlFormatters.InlineFormatter();
        options.TrackConnectionOpenClose = true;
        //options.PopupShowTrivial = false
    });

    var app = builder.Build();

    app.UseMiniProfiler();

    // Configure the HTTP request pipeline.

    app.UseHttpsRedirection();

    app.MapControllers();

    app.Run();
}

For simple Dapper calls, the modifications to WebAPI controllers were minimal, and could be done largely with Visual Studio 2022 “search & replace.”

[HttpGet("Dapper")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapper()
{
    IEnumerable<Model.StockItemListDtoV1> response;

    using (IDbConnection db = dapperContext.ConnectionCreate())
    {
        response = await db.QueryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
    }

    return this.Ok(response);
}

World Wide Importers database list of StockItems

The World Wide Importers database has approximately 250 StockItems, which was representative of one of the problematic queries.

[HttpGet("DapperProfiled")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapperProfiled()
{
    IEnumerable<Model.StockItemListDtoV1> response;

    using (IDbConnection db = new ProfiledDbConnection((DbConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
    {
        response = await db.QueryAsync<Model.StockItemListDtoV1>(sql: sqlCommandText, commandType: CommandType.Text);
    }

    return this.Ok(response);
}

When I executed the GetDapperProfiled() method of the StockItems controller on my development box, it took roughly 2.4 seconds.

MiniProfiler results for StockItems query running on my desktop

I sometimes ran the website on my development box, so when I used “toggle trivial gaps” it was easier to see where the delays were.

When I executed the GetDapperProfiled() method of the StockItems controller running in an Azure AppService, it took roughly 20 msec.

MiniProfiler results for StockItems query running in an Azure AppService

In one application, a QueryMultipleAsync is used to retrieve information about a product and a list of its attributes. The World Wide Importers database has Invoices, which have invoice lines and stock item transactions, which was representative of another problematic query.

[HttpGet("DapperProfiledQueryMultiple")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapperProfiledQueryMultiple([Required][Range(1, int.MaxValue, ErrorMessage = "Invoice id must be greater than 0")] int id)
{
    Model.InvoiceSummaryGetDtoV1 response = null;

    using (ProfiledDbConnection db = new ProfiledDbConnection((DbConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
    {
        var invoiceSummary = await db.QueryMultipleAsync("[Sales].[InvoiceSummaryGetV1]", param: new { InvoiceId = id }, commandType: CommandType.StoredProcedure);

        response = await invoiceSummary.ReadSingleOrDefaultAsync<Model.InvoiceSummaryGetDtoV1>();
        if (response == default)
        {
            return this.NotFound($"Invoice:{id} not found");
        }

        response.InvoiceLines = await invoiceSummary.ReadAsync<Model.InvoiceLineSummaryListDtoV1>();

        response.StockItemTransactions = await invoiceSummary.ReadAsync<Model.StockItemTransactionSummaryListDtoV1>();
    }

    return this.Ok(response);
}

World Wide Importers database list of invoice lines and transactions for a StockItem
MiniProfiler results for Invoice Item Query Multiple running in an Azure AppService

I couldn’t see any results for reading the StockItem invoice lines and transactions, so I wrapped each ReadAsync with a MiniProfiler.Current.Step.

[HttpGet("DapperProfiledQueryMultipleStep")]
public async Task<ActionResult<IEnumerable<Model.StockItemListDtoV1>>> GetDapperProfiledQueryMultipleStep([Required][Range(1, int.MaxValue, ErrorMessage = "Invoice id must be greater than 0")] int id)
{
    Model.InvoiceSummaryGetDtoV1 response = null;

    using (IDbConnection db = new ProfiledDbConnection((DbConnection)dapperContext.ConnectionCreate(), MiniProfiler.Current))
    {
        SqlMapper.GridReader invoiceSummary;

        using (MiniProfiler.Current.Step("db.QueryMultipleAsync"))
        {
            invoiceSummary = await db.QueryMultipleAsync("[Sales].[InvoiceSummaryGetV1]", param: new { InvoiceId = id }, commandType: CommandType.StoredProcedure);
        }

        using (MiniProfiler.Current.Step("invoiceSummary.ReadSingleOrDefaultAsync"))
        {
            response = await invoiceSummary.ReadSingleOrDefaultAsync<Model.InvoiceSummaryGetDtoV1>();
        }
        if (response == default)
        {
            return this.NotFound($"Invoice:{id} not found");
        }

        using (MiniProfiler.Current.Step("invoiceSummaryLine.ReadAsync"))
        {
            response.InvoiceLines = await invoiceSummary.ReadAsync<Model.InvoiceLineSummaryListDtoV1>();
        }

        using (MiniProfiler.Current.Step("TransactionSummary.ReadAsync"))
        {
            response.StockItemTransactions = await invoiceSummary.ReadAsync<Model.StockItemTransactionSummaryListDtoV1>();
        }
    }

    return this.Ok(response);
}

With larger lists, every so often there were ReadAsync calls that took more than a “trivial” amount of time. I surmise this was some sort of batching done by the underlying ReadAsync + NextResultAsync methods of the SqlDataReader.

Need to investigate the use of

using (IDbConnection db = new ProfiledDbConnection(new SqlConnection(_configuration.GetConnectionString("default")), MiniProfiler.Current))
{
    //...
}